⚖️ CSAM Detection - Classifier API
Integrate with our CSAM Detection API for both images and videos
This model is only available to our Enterprise customers. Please contact us and indicate that you'd like to integrate with our CSAM Detection API, and we'll be glad to help you get set up.
Overview
Hive has partnered with Thorn to offer our proprietary CSAM detection API, which uses embeddings to detect novel child sexual abuse material (CSAM). This classifier differs from our other collaboration with Thorn, the CSAM matching API, which only covers known CSAM. Both images and videos are accepted as inputs.
The classifier works by first creating embeddings of the media. An embedding is a list of computer-generated scores between 0 and 1. After we create the embeddings, we permanently delete all of the original media. The classifier then labels the content as CSAM or not based solely on the embeddings. This process ensures that we do not store any CSAM.
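To make the flow concrete, here is a minimal conceptual sketch in Python of the three steps described above: embed the media, delete the original, then classify from the embedding alone. The helper functions are hypothetical placeholders; Hive's actual models and internal pipeline are not exposed to API users.

```python
# Illustrative sketch of the processing flow described above.
# embed_media and classify_embedding are hypothetical stand-ins,
# not Hive's real implementation.
import os
from typing import List

def embed_media(path: str) -> List[float]:
    """Hypothetical stand-in: produce a list of scores between 0 and 1."""
    return [0.0] * 128  # a real embedding model runs here

def classify_embedding(embedding: List[float]) -> float:
    """Hypothetical stand-in: return a CSAM score between 0 and 1."""
    return 0.0

def process(path: str) -> float:
    embedding = embed_media(path)          # 1. create the embedding
    os.remove(path)                        # 2. permanently delete the original media
    return classify_embedding(embedding)   # 3. classify from the embedding only
```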
Response
The classifier returns a score between 0 and 1 indicating the likelihood that the image or video is CSAM.
To see an annotated example of an API response object for this model, you can visit our API Reference.
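For illustration, here is a hedged sketch of how a client might act on that score once it has been parsed from the response. The 0.9 threshold and the routing labels are assumptions made for the example; consult the API Reference for the exact response schema and choose a threshold that fits your moderation policy.

```python
# Hypothetical example of consuming the classifier's 0-1 score.
def handle_result(csam_score: float, threshold: float = 0.9) -> str:
    """Route content based on the score returned by the classifier."""
    if csam_score >= threshold:
        return "escalate"  # send to mandatory review / reporting workflow
    return "allow"

print(handle_result(0.97))  # -> "escalate"
print(handle_result(0.02))  # -> "allow"
```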
Supported File Types
Image Formats:
jpg
png
Video Formats:
mp4
mov
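A simple client-side check (an illustrative sketch, not part of the API) can confirm that a file uses one of the supported extensions before submission:

```python
# Illustrative pre-submission check against the supported formats listed above.
SUPPORTED_EXTENSIONS = {".jpg", ".png", ".mp4", ".mov"}

def is_supported(filename: str) -> bool:
    return any(filename.lower().endswith(ext) for ext in SUPPORTED_EXTENSIONS)

print(is_supported("upload.PNG"))  # True
print(is_supported("clip.avi"))    # False
```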