Advances in generative AI tools have created a new problem on the internet: the proliferation of synthetic nude images that resemble real people. On Thursday, Microsoft took a significant step against it, giving victims of revenge porn a tool to stop its Bing search engine from returning such images.
Microsoft announced a partnership with StopNCII, an organization that lets victims of revenge porn create digital fingerprints of explicit images (whether real or fake) on their own devices. StopNCII's partners then use those digital fingerprints, technically called "hashes," to remove the images from their platforms. Microsoft's Bing joins Facebook, Instagram, Threads, TikTok, Snapchat, Reddit, PornHub and OnlyFans in partnering with StopNCII to use digital fingerprinting to stop the spread of revenge porn.
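The matching flow above can be sketched in a few lines. Note this is a toy illustration, not StopNCII's actual system: it uses an exact cryptographic hash, whereas real image-matching systems of this kind rely on perceptual hashing so that resized or re-encoded copies still match. The function and variable names here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Toy stand-in for a real image fingerprint. A cryptographic hash only
    # matches byte-identical copies; production systems use perceptual hashes.
    return hashlib.sha256(image_bytes).hexdigest()

def is_blocked(image_bytes: bytes, blocklist: set[str]) -> bool:
    # A platform compares content against the shared hash list. Only the
    # hash ever leaves the victim's device, never the image itself.
    return fingerprint(image_bytes) in blocklist

# Example flow: a victim hashes an image locally and submits only the hash.
blocklist = {fingerprint(b"example-image-bytes")}
print(is_blocked(b"example-image-bytes", blocklist))    # exact copy: True
print(is_blocked(b"slightly-altered-bytes", blocklist)) # altered copy: False
```

The key privacy property, which the sketch preserves, is that participating platforms exchange only fingerprints, so victims never have to upload the images themselves.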
Microsoft said in a blog post that it had already used StopNCII's database to take action against 268,000 explicit images returned by Bing image searches in a pilot program that ran through the end of August. Microsoft previously offered direct reporting tools, but the company said they proved insufficient.
“We have heard concerns from victims, experts and others that user reporting alone may not be effective in mitigating the risk that images may be accessible through search,” Microsoft said in a blog post on Thursday.
It's not hard to imagine how much worse this problem is on the far more popular search engine, Google.
Google offers its own tools to report and remove explicit images from search results, but former employees and victims have criticized the company for not partnering with StopNCII, according to a Wired investigation. Since 2020, Google users in South Korea have reported 170,000 Search and YouTube links containing unwanted sexual content, Wired reported.
The problem of AI deepfake nudity is already widespread. While StopNCII's tools are only available to people over the age of 18, "undressing" sites are already causing problems for high school students across the country. The United States has no federal law governing AI deepfake pornography, leaving no one to hold accountable at the national level, so the country is addressing the issue through a patchwork of state and local laws.
In August, prosecutors in San Francisco announced a lawsuit to take down 16 of the most-visited "undressing" sites. According to Wired's tracker of deepfake porn laws, 23 U.S. states have passed laws addressing non-consensual deepfakes, while nine have rejected such bills.