The feature was originally announced in August (along with new restrictions on ad targeting of minors) but is now widely available. Anyone can start the removal process from this help page, the report said.
Applicants will need to supply the URLs of the images they want removed from search results, the search terms that surface those images, the name and age of the minor, and the name and relationship of any individual acting on their behalf – a parent or guardian, for example.
The company notes it will remove images of any minors “with the exception of cases of compelling public interest or newsworthiness”.
It also seems from Google’s language that it won’t comply with requests unless the person in the image is currently under 18. So if you’re 30, you can’t apply to remove pictures of yourself at 15, the report said.
That limits the tool’s usefulness for preventing abuse or harassment, but it presumably makes verification much easier: it’s far harder to prove how old you were in a given photo than to prove how old you are right now.
Google also stresses that removing an image from its search results does not, of course, remove it from the web.
In addition to these new removal options for images of minors, Google already offers other avenues for requesting the removal of specific types of harmful content.
These include non-consensual explicit imagery, fake pornography, financial or medical information, and “doxxing” information including home addresses and phone numbers.