Protect your content and online community from child exploitation videos

Free technology to remove illegal video content

At YouTube, we’re dedicated to stopping the spread of online child exploitation videos. In this spirit, we believe our commitment to the fight means making our innovative technology available to technology companies and NGOs across the industry.

CSAI Match is YouTube’s proprietary technology for combating CSAI (Child Sexual Abuse Imagery) content online. It allows us to identify known CSAI content in a sea of innocent content. When a match is found, the content is flagged so partners can responsibly report it in accordance with local laws and regulations.

How it works

Online platforms can prevent this illegal and abhorrent content from being displayed and shared on their sites by:

  • Using YouTube’s proprietary video match technology to detect CSAI
  • Matching against one of the largest indices of known CSAI content for optimal coverage
  • Scaling challenging content management through simple integration with your system

Who can use CSAI Match

YouTube makes CSAI Match available to industry and NGO partners. We provide access to fingerprinting software and an API for identifying matches against our database of known abusive content.
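
At a high level, partner integration follows a fingerprint-then-match pattern: generate a fingerprint of each uploaded video, then query the index of known CSAI for matches. The sketch below illustrates that flow only; the endpoint, function names, and response fields are assumptions made for this example, not the actual fingerprinting software or API provided to partners.

```python
# Illustrative sketch only. "fingerprint_video", the endpoint URL, and the
# response fields are hypothetical stand-ins, not the actual fingerprinting
# software or CSAI Match API that YouTube provides to partners.
import requests

MATCH_ENDPOINT = "https://example.com/csai-match/v1/lookup"  # placeholder URL


def fingerprint_video(path: str) -> bytes:
    """Stand-in for the partner-side fingerprinting step.

    In practice the fingerprint would be produced by the fingerprinting
    software supplied to partners; reading the raw file is used here only
    so the sketch is self-contained.
    """
    with open(path, "rb") as f:
        return f.read()


def check_upload(path: str, api_key: str) -> dict:
    """Send a video fingerprint to a (hypothetical) match-lookup endpoint."""
    fingerprint = fingerprint_video(path)
    response = requests.post(
        MATCH_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"fingerprint": fingerprint},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed shape: {"match": bool, "segments": [...], "category": "..."}
    return response.json()
```

A partner pipeline could call check_upload on each new upload and route any positive result into its review and reporting workflow.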

Companies and NGOs already participating include:

Adobe
Thorn
Yahoo
Tumblr
Reddit
Canadian Centre for Child Protection

“CSAI Match was straightforward to deploy and quickly proved itself a very effective tool in our ongoing fight against the online distribution of child sexual exploitation content.”

Mary Catherine Wirth, Associate General Counsel and Director, Trust & Safety at Adobe

“The YouTube CSAI match service enhances our ability to prioritize videos involving child sexual abuse, reduces our analyst team’s exposure, and will greatly assist in accelerating video removal.”

Lianna McDonald, Executive Director, Canadian Centre for Child Protection

FAQ

Find answers to your CSAI Match questions below.

  • Does CSAI Match work for images?

    CSAI Match is designed for video. For images, Google’s Content Safety API offers industry and NGO partners a collection of machine-learning-powered classification tools.

  • What information is returned with an identified match?

    The match identifies which portion of the video matches known CSAI, as well as a standardized categorization of the type of content that was matched (an illustrative sketch of this result follows the FAQ).

  • What happens when a query returns a positive match for CSAI?

    Child sexual abuse content is illegal in most jurisdictions, and a provider may have legal obligations to report the incident. In the US, service providers are required to report such content to the National Center for Missing & Exploited Children (NCMEC). CSAI Match allows you to find this content before it’s exposed to your users.

  • What happens in the event of a false positive?

    In the rare case of a false positive, we will work with the provider to resolve the issue.
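
As noted in the FAQ above, a positive match identifies the matched portion of the video and a standardized categorization, and US providers must report such content to NCMEC. The sketch below shows one way a partner might model and handle that result; every type, field, and helper here is a hypothetical assumption for illustration, not the actual CSAI Match response format.

```python
# Hypothetical model of a positive match result, based on the FAQ above:
# the matched portion of the video plus a standardized categorization.
# All names below are illustrative assumptions, not the real API surface.
from dataclasses import dataclass


@dataclass
class MatchedSegment:
    start_seconds: float  # start of the portion that matched known CSAI
    end_seconds: float    # end of the matched portion
    category: str         # standardized categorization of the matched content


def quarantine(video_id: str) -> None:
    # Placeholder: block the video from being served pending review.
    print(f"quarantined {video_id}")


def file_report(video_id: str, segments: list[MatchedSegment]) -> None:
    # Placeholder: e.g. a report to NCMEC, as required of US service providers.
    print(f"filed report for {video_id} covering {len(segments)} segment(s)")


def handle_match(video_id: str, segments: list[MatchedSegment]) -> None:
    """Quarantine the video and route it for reporting when segments matched."""
    if not segments:
        return
    quarantine(video_id)
    file_report(video_id, segments)
```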