Tech giant Google has launched a new safety tool that lets minors, their parents or guardians, and legal representatives request the removal of images of under-18s from Google Search results.
Google said that it will remove images of anyone below the age of 18 from search results at the request of the individual under 18 or their parent or guardian, with the exception of cases of compelling public interest or newsworthiness.
“This means these images won’t appear in the Images tab or as thumbnails in any feature in Google Search,” the company said on its updated help page.
Google said that it can prevent an image from appearing in its search results, but can’t remove it from websites that host it.
“This is why you might wish to contact the site’s webmaster and ask them to remove the content,” the company added.
The feature was originally announced in August (along with new restrictions on ad targeting of minors) and is now widely available.
Applicants will need to supply the URLs of the images they want removed from search results, the search terms that surface those images, the name and age of the minor, and the name and relationship of any individual acting on their behalf — a parent or guardian, for example.
Google won’t comply with requests unless the person in the image is currently under 18: “So, if you’re 30, you can’t apply to remove pictures of you when you were 15.”
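The request is submitted through a web form, not an API, but the required fields and the under-18 eligibility rule can be sketched as a simple data model. Everything below — the class name, field names, and helper method — is a hypothetical illustration, not Google’s actual schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RemovalRequest:
    """Hypothetical model of the information the removal form asks for."""
    image_urls: list[str]        # URLs of the images to be removed
    search_terms: list[str]      # queries that surface those images
    minor_name: str
    minor_birthdate: date
    requester_name: str
    requester_relationship: str  # e.g. "self", "parent", "guardian"

    def minor_is_under_18(self, today: date) -> bool:
        # Requests are only honored while the pictured person is under 18;
        # being under 18 at the time the photo was taken is not enough.
        birthday_passed = (today.month, today.day) >= (
            self.minor_birthdate.month,
            self.minor_birthdate.day,
        )
        age = today.year - self.minor_birthdate.year - (not birthday_passed)
        return age < 18
```

A request for someone born in 2010 would qualify today, while a 30-year-old asking about photos from age 15 would not, matching the policy quoted above.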
Google already offers other tools for requesting the removal of specific types of harmful content, such as non-consensual explicit imagery, fake pornography, financial or medical information, and “doxxing”.