Google is partnering with the U.K. nonprofit StopNCII to bolster its efforts to combat the spread of nonconsensual intimate images, also known as revenge porn.
The search giant will begin using StopNCII’s hashes, which are digital fingerprints of images and videos, to proactively identify and remove nonconsensual intimate imagery from Search.
StopNCII helps adults prevent their private images from being shared online by creating a unique identifier, or hash, that represents their intimate imagery. These hashes are then shared with partner platforms such as Facebook, which can automatically identify and remove matching content from their services.
It’s worth noting that the private imagery itself never leaves the device, as only the hash is uploaded to StopNCII’s system.
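To make the mechanics concrete, here is a minimal sketch of what such a hash-and-match workflow can look like. It uses the open-source Python imagehash library as a stand-in; StopNCII’s actual hashing algorithm, APIs, and matching thresholds are not described in this article, so the function names and the Hamming-distance cutoff below are illustrative assumptions.

```python
# Minimal sketch of a hash-and-match workflow (illustrative only; not
# StopNCII's actual implementation).
# pip install pillow imagehash
from PIL import Image
import imagehash


def fingerprint(path: str) -> str:
    """Compute a perceptual hash locally, on the user's device.

    Only this short hex string would ever be uploaded; the image itself
    never leaves the device.
    """
    return str(imagehash.phash(Image.open(path)))


def matches(candidate_hash: str, registry: set[str], max_distance: int = 4) -> bool:
    """Platform-side check against a registry of reported hashes.

    Perceptual hashes of re-encoded or resized copies land close together,
    so near-duplicates are caught by Hamming distance rather than exact
    equality. The distance threshold here is a hypothetical value.
    """
    cand = imagehash.hex_to_hash(candidate_hash)
    return any(cand - imagehash.hex_to_hash(h) <= max_distance for h in registry)
```

The design point the sketch illustrates is that hashing is local and one-way: the platform receives only the fingerprint, never the image, yet altered copies of the same picture still produce nearby hashes it can match.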
“Our existing tools allow people to request the removal of NCII from Search, and we’ve continued to launch ranking improvements to reduce the visibility of this type of content,” Google wrote in a blog post. “We have also heard from survivors and advocates that given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it.”
Google has been slow to adopt StopNCII’s system; its partnership with the nonprofit comes a year after Microsoft integrated the tool into Bing. Other StopNCII partners include Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, OnlyFans, and X.
The search giant’s partnership with the nonprofit marks its latest move to combat nonconsensual intimate imagery. Last year, Google made it easier to remove deepfake nonconsensual intimate images from Search and made such content harder to find.