Safer is a complete solution to help stop child sexual abuse material from spreading across your platform. Keeping you, your company, and your users safer.
Safer is an end-to-end solution consisting of four modules developed to quickly identify and remove CSAM from your platform, and report it. By activating the technology ecosystem with a simple and cost-effective integration, Safer seeks to arm companies with the right tools to eliminate CSAM from their platforms.
Four tools: use them together as a complete solution, or individually to augment your existing workflow.
Your content doesn’t leave your control.
A simple integration allows all new images and videos to be hashed in multiple industry-standard formats. Hashing creates a unique digital fingerprint for each image or video, so your content never leaves your platform and no user data is shared.
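As a minimal sketch of the fingerprinting idea: a hash of the media bytes is computed locally, and only that hash leaves the process. SHA-256 is used here purely as a stand-in; the multiple industry-standard formats Safer actually produces (including perceptual hashes that can match near-duplicates) are not specified in this document.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Create a digital fingerprint of media bytes.

    SHA-256 stands in for the industry-standard hash formats a real
    integration would produce. The media itself is never transmitted;
    only this fingerprint is shared for matching.
    """
    return hashlib.sha256(data).hexdigest()

# Hash computed locally; the bytes stay on your platform.
print(fingerprint(b"example image bytes"))
```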
Match against known CSAM hashes.
These digital fingerprints are then compared against our large databases of known child sexual abuse material. We'll return any matches with an industry-standard severity rating, allowing you to quickly remove abusive content.
Quick, efficient, complete reports.
Companies have a legal requirement to report any child sexual abuse material they encounter on their platforms, and Safer handles all the details. When you use Safer, you can be confident that the content will be reported accurately and quickly. In the US, all reports go to the National Center for Missing & Exploited Children (NCMEC).
Our law enforcement portal is built to streamline your experience with any legal process. All communications are encrypted and logins are protected with two-factor authentication, giving you confidence and making review easy for your company.
Working together for the greater good.
Whether you are the originator of all internet memes, or you power imaginations around the world, Safer will keep you covered.
In 2004, 450,000 pieces of CSAM were reported to NCMEC; in 2015, there were 25,000,000. This year the number is expected to exceed 30 million. We can only stop the spread of this abuse if everyone is prepared to look for it. And we aren't just looking for new victims; we're also looking for victims who were rescued 20 years ago but whose images are still traded daily.