Google Adds StopNCII Hashing to Search to Fight Revenge Porn

Google has partnered with UK nonprofit StopNCII to use privacy-preserving image hashes to detect and remove nonconsensual intimate imagery from Search. Users upload only hashes, never the images themselves, and partner platforms use those hashes to match and take down abusive content automatically. The move brings Google in line with Microsoft and major social apps in reducing the burden on survivors.

Published September 18, 2025 at 01:09 PM EDT in Cybersecurity

Google partners with StopNCII to curb nonconsensual intimate images

Google announced it will begin using hashes provided by StopNCII, a U.K. nonprofit, to proactively identify and remove nonconsensual intimate images — commonly called revenge porn — from Search. The move comes after requests from survivors and advocates who say current tools still leave too much manual burden on affected people.

StopNCII’s approach is simple and privacy-preserving: an adult who wants to protect an image generates a unique digital fingerprint, or hash, of that photo or video. Only the hash is uploaded to StopNCII — the intimate image itself never leaves the user’s device — and partner platforms use those hashes to find and remove matching content automatically.
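The matching described above can be sketched in a few lines. This is an illustrative toy, not StopNCII's actual algorithm: partner platforms use production-grade perceptual hashes (such as PDQ), while the "average hash" and the match threshold below are simplified assumptions chosen only to show why a hash can survive small edits without the image ever being shared.

```python
# Toy sketch of privacy-preserving hash matching. NOT StopNCII's real
# system: the average-hash scheme and threshold here are illustrative
# assumptions, not the production algorithm.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 values).

    Each bit records whether a pixel is brighter than the image's
    mean, so small edits leave most bits unchanged. Only this hash,
    never the pixels, would be uploaded.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(h1, h2, threshold=2):
    """Treat hashes within `threshold` differing bits as one image."""
    return hamming_distance(h1, h2) <= threshold

# A toy 4x4 "image" and a lightly edited copy (one pixel brightened).
original = [[10, 200, 10, 200],
            [200, 10, 200, 10],
            [10, 200, 10, 200],
            [200, 10, 200, 10]]
edited = [row[:] for row in original]
edited[0][0] = 40  # small edit; the hash should still match

h_orig = average_hash(original)
h_edit = average_hash(edited)
print(matches(h_orig, h_edit))  # True: the edit flips no bits
```

A platform holding only `h_orig` could flag the edited copy without ever seeing either image, which is the privacy property the article describes; heavier edits (cropping, overlays) push the Hamming distance past the threshold, which is exactly the matching limitation discussed later in the piece.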

Google already offered removal tools and ranking changes to make these images harder to find. Adding StopNCII hashes means Search will be able to match known private images at scale, complementing manual requests and algorithmic ranking adjustments.

The search giant was slower than some peers to adopt the system: Microsoft integrated StopNCII into Bing last year. Large social and content platforms including Facebook, Instagram, TikTok, Reddit, Snapchat, Bumble, OnlyFans and X have already partnered with StopNCII to block matching content.

Why it matters: proactive hashing reduces the workload for survivors who otherwise must hunt down URLs and submit removal requests across dozens of sites. It also creates a shared defensive signal across platforms so the same image can be blocked in many places quickly.

There are operational and technical challenges. Perceptual differences — cropping, watermarks, or edits — can affect matching. Platforms need policies for appeals, clear survivor support channels, and safeguards to prevent misuse or mistaken takedowns. Transparency reporting and measurable metrics will be crucial to evaluate real-world impact.

For policy makers, the move underscores a growing industry consensus: technical tools that protect privacy and empower survivors can scale only when paired with cross-platform cooperation, robust redress, and legal clarity.

What organizations should consider next

Quicker adoption of hash-based matching can reduce harm, but implementers should follow a few practical steps:

  • Integrate privacy-preserving hashing into content moderation pipelines and takedown automation.
  • Design survivor-first flows: clear submission, quick response, and robust appeal procedures.
  • Track effectiveness with privacy-safe metrics and publish transparency reports to build trust.

Google’s adoption of StopNCII hashing is a notable step toward a web where private images are less likely to be weaponized. But technology alone won’t finish the job — it must be paired with survivor-centered policies, cross-platform coordination, and careful monitoring to avoid overreach and ensure real-world relief.

QuarkyByte can help organizations evaluate the technical trade-offs, design responsible operational playbooks, and measure impact so hash-based defenses deliver protection without compromising privacy or due process.
