Microsoft Bing now has more power to scrub AI-generated and deepfake intimate images, a form of nonconsensual intimate image (NCII) abuse, from its search results, the company announced alongside a new nonprofit partnership.
In collaboration with victim advocacy tool StopNCII, Microsoft is supplementing its user-reporting system with a more “victim-centered” approach that incorporates a more in-depth detection process, the company explained. StopNCII, a platform run by UK nonprofit SWGfL and the Revenge Porn Helpline, lets individuals create digital fingerprints (also known as “hashes”) of intimate images, which participating platforms can then use to track and remove those images as they appear.
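The article doesn’t detail StopNCII’s exact algorithm, but the general idea behind perceptual image hashing can be sketched in a few lines. Below is a minimal Python illustration, assuming the Pillow imaging library; the average_hash function is a generic textbook approach, not StopNCII’s actual implementation:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    """Condense an image into a 64-bit perceptual fingerprint.

    Shrinks the image to an 8x8 grayscale grid, then sets one bit per
    pixel: 1 if the pixel is brighter than the grid's mean, else 0.
    Visually similar images (resized or recompressed copies) produce
    fingerprints that differ in only a few bits.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits  # only this number is ever shared, never the image
```

Because only the fingerprint travels, a platform can check uploads against it without the victim’s image ever leaving their device.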
Building on a pilot that ran through August, Microsoft’s new system uses StopNCII’s hash database to flag intimate images immediately and keep them from surfacing in Bing results. Microsoft says it has already “taken action” on 268,000 explicit images.
StopNCII’s hashes are used by social sites like Facebook, Instagram, TikTok, Threads, Snapchat, and Reddit, as well as platforms like Bumble, OnlyFans, Aylo (owner of several popular pornography sites, including Pornhub), and even Niantic, the AR developer behind Pokémon Go. Bing is the first search engine to join the partner coalition.
Google, also grappling with nonconsensual deepfake content, has taken similar steps to address deepfake images in Search results, alongside nonconsensual real images. Over the last year, the company has been revamping its Search ranking system to demote explicit synthetic content, replacing the surfaced results with “high-quality, non-explicit content,” such as news articles, the company explained. Google also announced it was streamlining its reporting and review process to expedite removal of such content; the search platform already has a similar system for removing nonconsensual real images, commonly called revenge porn.
But Google has yet to join StopNCII or adopt its hashing tech. “Search engines are inevitably the gateway for images to be found, so this proactive step from Bing is putting the wellbeing of those directly affected front and center,” said Sophie Mortimer, manager of the Revenge Porn Helpline.
Microsoft has similar reporting processes for NCII abuse involving real images, as well as strict conduct policies against intimate extortion, also known as sextortion. Earlier this year, Microsoft provided StopNCII with its in-house PhotoDNA technology, a similar “fingerprinting” tool that has been used to detect and help remove child sexual abuse material.
How to report intimate images with StopNCII
If you believe an image of you (explicit or non-explicit) is at risk of being released or manipulated by bad actors, you can add your own fingerprint to StopNCII for future detection. The tool does not require you to upload or store personal photos or videos on the site; instead, images are retained on your personal device.
Visit Stopncii.org.
Click on “Create your case” in the top right corner.
Navigate through the personalized prompts, which gather information about the content of the image or video.
The website will then ask you to select photos or videos from your device’s library. StopNCII scans the content on your device and creates a hash of each item; only those hashes are sent to participating platforms, and no images or videos are shared. (A rough sketch of how platforms can match these hashes follows these steps.)
Save your case number, which will allow you to check if your image or video has been detected online.
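For a sense of what participating platforms do with those hashes, here is a hedged sketch of the matching side, reusing the 64-bit fingerprints from the earlier example. The hamming_distance and matches helpers, and the threshold value, are illustrative assumptions, not StopNCII’s or Bing’s actual API:

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Count how many of the 64 bits differ between two fingerprints."""
    return bin(h1 ^ h2).count("1")

def matches(candidate: int, reported: list[int], threshold: int = 10) -> bool:
    """Flag content whose fingerprint is within `threshold` bits of any
    reported hash, tolerating the small bit changes introduced by
    resizing, recompression, or light cropping."""
    return any(hamming_distance(candidate, h) <= threshold for h in reported)
```

Comparing bit distances rather than exact values is what lets a fingerprint survive routine edits to a reposted copy.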
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.