Jul 21 2020

Safer: Building the internet we deserve

Posted by: Safer / 4 min read


Helping companies fight the spread of child sexual abuse material

We began this journey almost a decade ago: mobilizing, researching, learning, then building. After focusing our formative years on building software to accelerate the identification of juvenile trafficking victims, we saw a need to address the viral spread of child sexual abuse material (CSAM) on the internet. It was a problem that many knew little about, and the solution seemed even less clear.

[Graph showing the sharp rise in CSAM reports.]

In 2019 alone, 69.1 million files of child sexual abuse material were reported to the National Center for Missing and Exploited Children (NCMEC). The number of reports of CSAM found on the internet has nearly doubled with each passing year, which raises the question:

What is technology’s role in protecting the most vulnerable? And how does this intersect with the challenges faced by companies hosting user-generated content?

So we got to work.

Safer’s vision is to build a world where all technology companies processing user-generated content can easily and proactively scan content to identify, remove, and report CSAM, providing their employees, their users, and the larger technology ecosystem a safer community. As one piece of a larger global strategy to eliminate child sexual abuse material from the internet, Safer equips technology companies to be a part of building a world where every child can be safe, curious, and happy.

The virtual spaces we’ve built for connection have created a sense of belonging and wonder in ways the physical world never provided. An internet where people can explore, connect, and create without accidentally encountering child sexual abuse material, and where survivors can build communities without fear of coming across triggering content, is the internet we deserve.

As we set out to build a solution, we quickly learned that platform protection is inconsistent across the industry and costly to build, and that databases of known child abuse content remain siloed. Companies looking to protect their platforms, and ultimately the victims of child sexual abuse whose content is shared across the web, face the challenge of figuring out how to do so.

When we first launched Safer as a beta solution in October 2018, our goal was to provide companies of any size with an accessible solution to scan their platforms for known CSAM. If a file had been seen and classified as child abuse content by NCMEC, it had no business being hosted anywhere on the web. Our CSAM detection solution began with cryptographic and perceptual hashing techniques that allowed companies to identify images of CSAM and swiftly take action. Providing this technology en masse to companies unable to build a comparable in-house solution was the first step in Thorn’s mission to eliminate CSAM from the internet.
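
To make the two hashing approaches concrete, here is a minimal sketch of how a platform might combine them, using the open-source imagehash library for the perceptual side. This is not Safer’s implementation: the hash sets, threshold, and function names below are all illustrative, and Safer’s actual hash formats and matching infrastructure differ.

```python
import hashlib

import imagehash  # open-source perceptual hashing library (pip install ImageHash)
from PIL import Image

# Hypothetical stand-ins for a database of known-CSAM hashes. In practice
# these sets come from NCMEC and other vetted sources, never built locally;
# they are empty here purely for illustration.
KNOWN_MD5_HASHES: set[str] = set()
KNOWN_PERCEPTUAL_HASHES: list[imagehash.ImageHash] = []

PHASH_DISTANCE_THRESHOLD = 8  # illustrative Hamming-distance cutoff


def check_image(path: str) -> str:
    """Return a rough verdict for one image: exact, perceptual, or no match."""
    # Cryptographic hash: flags byte-for-byte identical copies only.
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    if digest in KNOWN_MD5_HASHES:
        return "exact match"

    # Perceptual hash: survives resizing, re-encoding, and small edits.
    phash = imagehash.phash(Image.open(path))
    for known in KNOWN_PERCEPTUAL_HASHES:
        if phash - known <= PHASH_DISTANCE_THRESHOLD:  # Hamming distance
            return "perceptual match"

    return "no match"
```

Cryptographic hashes catch exact duplicates cheaply, while perceptual hashes also catch re-encoded or lightly edited copies, which is why Safer generates both.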

“Eliminating child abuse content from the internet is only possible if every company - small, medium and large - is equipped to find it, report it to NCMEC and remove it. We’ve built a solution that makes it easier for companies to do just that.”

— Julie Cordua, Thorn CEO

Since then, we’ve expanded the ways companies are able to find CSAM and, as a result, created post-detection tools to support seamless removal and reporting to NCMEC.

Today, Safer exists as a comprehensive solution for companies to identify, remove, and report CSAM at scale. Each feature addresses a distinct facet of the CSAM problem.

These features include:

Image Hash Matching - Our flagship feature that generates both cryptographic and perceptual hashes and matches against a set of over 5.9M hashes of known CSAM

CSAM Image Classifier - A machine learning classification model that returns a prediction for whether a file is CSAM, regardless of whether the hash of that file matches against a known CSAM hash

Video Hash Matching - As video has grown in popularity, we’re applying the same hash matching technology used on images to videos, enabling platforms to detect known CSAM videos

Reporting - Streamlined integration with NCMEC’s CyberTipline API that aims to introduce efficiencies for both the platform and NCMEC. This capability is also accessible through a user interface in the Review Tool

Review Tool - A content moderation tool for reviewing and reporting CSAM, built with specific features to support employee wellness and resilience. If content is suspected to be CSAM, either by hash matching or the CSAM Classifier, it is added to a queue for review (a simplified sketch of this flow follows this list)

SaferList - A collective effort by the Safer community to share and match against a broader set of hashes beyond CSAM, specifically those representing content that is sexually exploitative of children

False Positive Feedback Loop - An API to share feedback on false positive hashes to improve the efficacy of CSAM detection across the industry
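
Taken together, these features form a detect, review, report pipeline. The sketch below shows one plausible way the pieces could fit together; the threshold, types, and function names are all hypothetical, not Safer’s actual API.

```python
from dataclasses import dataclass
from queue import Queue

# Every name here is illustrative; this is one plausible wiring of the
# features above, not Safer's implementation.

CLASSIFIER_THRESHOLD = 0.9  # hypothetical score above which a file is flagged


@dataclass
class Detection:
    file_id: str
    reason: str          # "hash_match" or "classifier"
    score: float = 1.0   # hash matches are treated as certain


review_queue: "Queue[Detection]" = Queue()


def process_file(file_id: str, hash_matched: bool, classifier_score: float) -> None:
    """Route one scanned file: a hash match or a high classifier score
    sends it to the review queue for a human moderator."""
    if hash_matched:
        review_queue.put(Detection(file_id, "hash_match"))
    elif classifier_score >= CLASSIFIER_THRESHOLD:
        review_queue.put(Detection(file_id, "classifier", classifier_score))
    # Files confirmed as false positives during review would be reported
    # back through the false-positive feedback API to improve detection.
```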

To learn more about the specific features, read about the release in Product Updates.

To learn more about Thorn and our mission to eliminate child sexual abuse material from the web, please visit Thorn.org.

