Child abusers are abusing your platform. They’re costing you time, money, and user reputation. safer is a complete solution that helps stop child sexual abuse material from spreading across your platform. Keeping you, your company, and your users safer.
You build it, we make it safer.
safer is a complete pipeline to quickly identify, remove, and report Child Sexual Abuse Material (CSAM) on your platform. It’s simple, cost-effective, and easy to integrate, so you can focus on building your platform to be whatever you dreamed it could be.
Who uses safer?
Whether you are the originator of all internet memes, or you power imaginations around the world, safer will keep you covered.
Five tools: use them together as a complete solution, or individually to augment your existing workflow.
Your content doesn’t leave your control.
A simple integration hashes all new images and videos in multiple industry-standard formats.
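For illustration, the cryptographic side of media hashing can be sketched with Python’s standard library. This is a minimal sketch, not safer’s actual integration: the function name is hypothetical, and the perceptual hash formats used in industry matching (such as PhotoDNA) require proprietary libraries and are omitted here.

```python
import hashlib


def hash_media(data: bytes) -> dict:
    """Compute common cryptographic fingerprints of a media file's bytes.

    Matching systems typically also compute perceptual hashes, which
    survive re-encoding and resizing; those need third-party or
    proprietary libraries and are not shown in this sketch.
    """
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
    }


fingerprints = hash_media(b"raw image bytes here")
print(fingerprints["md5"])  # 32-character hex digest
```

In practice the raw bytes would come from each newly uploaded file, and the resulting fingerprints, not the content itself, are what get compared, which is how the content stays under your control.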
Match against known CSAM hashes.
We compare your hashes against our large database of known child sexual abuse material and return matches with a severity rating and industry-standard labels.
Review multiple types of abuse content.
Reviewing content is hard work. That’s why we’ve built our review tool with mental health in mind. You can choose which images you want to review, and how frequently you want to review them. We support fully automated reporting, or you can review content according to your company’s policies. safer makes it as easy as possible to review, and move on.
Quick, efficient, complete reports.
Companies have a legal obligation to report the child sexual abuse material they encounter on their platforms, and safer handles all the details. When you use safer, you can be confident that the content will be reported completely, accurately, and quickly. In the US, all reports go to the National Center for Missing and Exploited Children (NCMEC).
Our law enforcement portal streamlines your experience with any legal process. All communications are encrypted, and logins are protected with two-factor authentication, giving you confidence and making the process easy for your company to manage.
In 2004, 450,000 pieces of CSAM content were reported to NCMEC; in 2015, there were 25,000,000. This year, reports are expected to exceed 30 million. We can only stop the spread of this abuse if everyone is prepared to look for it. And we aren’t just looking for new victims; we’re also looking for the victims who were rescued 20 years ago, but whose images are still traded daily.