Self-Service

CSAM Keyword Hub

In partnership with the Technology Coalition, Thorn has developed an API containing child sexual abuse material (CSAM) terms and phrases in multiple languages to improve your content moderation process.

Get Access
FAQs

1 What can I do with the CSAM Keyword Hub?

Technology companies with a chat function that are seeking to protect children from online predation on their platforms can apply for cost-free access to the Keyword Hub. Law enforcement and non-governmental organizations (NGOs) are also eligible to apply.

2 How should the CSAM Keyword Hub be used?

Rather than using the Keyword Hub strictly to block specific keywords that match the list, the strong preference is to use the list to kickstart the training of machine learning models. In doing so, we hope companies can proactively detect and remove CSAM.
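To make the distinction concrete, here is a minimal, self-contained sketch (not Thorn's API) of what "kickstarting" a model with a keyword list can look like: the list weakly labels chat messages, and those noisy labels seed a simple naive Bayes scorer that can then flag paraphrases sharing context words, rather than only exact keyword matches. All terms and messages below are harmless placeholders.

```python
# Sketch: keyword list -> weak labels -> naive Bayes scorer.
# KEYWORDS is a harmless stand-in for Keyword Hub terms.
import math
from collections import Counter

KEYWORDS = {"badterm1", "badterm2"}

def weak_label(text):
    """1 (flag) if any listed keyword appears as a token, else 0 — noisy seed labels."""
    return int(bool(set(text.lower().split()) & KEYWORDS))

def train_nb(messages):
    """Accumulate per-class word counts from weakly labeled messages."""
    counts = {0: Counter(), 1: Counter()}
    totals = {0: 0, 1: 0}
    for m in messages:
        y = weak_label(m)
        for w in m.lower().split():
            counts[y][w] += 1
        totals[y] += 1
    return counts, totals

def score(counts, totals, text):
    """Log-odds that a message belongs to the flagged class (Laplace smoothing)."""
    vocab = set(counts[0]) | set(counts[1])
    n0, n1 = sum(counts[0].values()), sum(counts[1].values())
    s = math.log((totals[1] + 1) / (totals[0] + 1))  # class prior
    for w in text.lower().split():
        p1 = (counts[1][w] + 1) / (n1 + len(vocab))
        p0 = (counts[0][w] + 1) / (n0 + len(vocab))
        s += math.log(p1 / p0)
    return s

messages = [
    "hello how are you",
    "selling badterm1 pics",
    "badterm2 pics available now",
    "meeting at noon tomorrow",
]
counts, totals = train_nb(messages)

# The model flags a paraphrase that contains no listed keyword,
# because it shares context words with flagged training messages.
print(score(counts, totals, "new pics available") > 0)
```

The point of the design: a strict blocklist misses the paraphrase entirely, while even this toy model generalizes beyond exact matches; production systems would of course use far richer features and far more data.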

3 What kind of terms can be added?

Terms must be relevant to the detection and elimination of online material involving sex trafficking, exploitation, or the endangerment of children; they should help identify, prevent, or remove that material. Terms may include an identifier of a known child abuse image or video (for example, a filename, MD5 hash, or street name). Personally identifiable information (PII) included in CSAM terms (such as a series name) is allowed, but PII that includes user account information is not. Once added, terms are intended to remain even if a company later decides to no longer participate.
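Identifiers such as MD5 hashes are typically used for exact-match lookups: compute the hash of incoming content and check it against a set of known identifiers. Here is a minimal sketch of that pattern using Python's standard library; the hash set is a harmless placeholder (the MD5 of a well-known test sentence), not a real identifier list.

```python
# Sketch: hash-matching content against a set of known identifiers.
import hashlib

# Placeholder entry: MD5 of "The quick brown fox jumps over the lazy dog".
KNOWN_HASHES = {"9e107d9d372bb6826bd81d3542a419d6"}

def md5_of_bytes(data: bytes) -> str:
    """Hex MD5 digest of raw content bytes."""
    return hashlib.md5(data).hexdigest()

def is_known(data: bytes) -> bool:
    """True if the content's MD5 matches a listed identifier."""
    return md5_of_bytes(data) in KNOWN_HASHES

print(is_known(b"The quick brown fox jumps over the lazy dog"))  # → True
print(is_known(b"unrelated content"))                            # → False
```

Exact hash matching only catches byte-identical copies, which is why the keyword list and model-based detection described above complement it.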

4 Who can participate?

This technology is available to approved internet service companies and NGOs that have a core mission of combating CSAM.

Apply Now

Anyone interested in accessing the CSAM Keyword Hub must complete the application on this page, describe their intended use, and agree to the terms and conditions.