Google is introducing a new policy allowing anyone under 18, or their parent or guardian, to request the removal of the young person’s images from Google Image search results, the company said in a blog post on Tuesday, as part of several changes regarding young users.
The company said it would turn off its “location history” feature, which tracks users’ whereabouts, for all users under 18 globally.
It will also expand the types of age-sensitive ad categories that are blocked for users under 18 and turn on safe-search filters for those users.
Major online platforms have long faced scrutiny from lawmakers and regulators over their impact on the safety, privacy, and wellbeing of younger users.
“Some countries are implementing regulations in this area, and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens globally,” said Mindy Brooks, Google’s general manager for kids and families.
Policy for minors
In recent months, online platforms’ approach to younger users has been in the spotlight as US lawmakers and attorneys general criticized Facebook Inc’s plans to create a kids-focused version of Instagram. Facebook recently announced changes to ad targeting for users under 18, though advertisers can still target those younger users based on age, gender, or location.
Google’s video-streaming site YouTube said on Tuesday that, in the coming weeks, it would change the default upload setting for teens aged 13-17 to its most private option, in which content is seen only by the user and people they choose. Users will still be able to make their content public.
YouTube will also remove “overly commercial content” from its YouTube Kids app, “such as a video that only focuses on product packaging or directly encourages children to spend money,” said the site’s kids and family product management director, James Beser.