Hive's Combined CSAM Detection API runs two CSAM (Child Sexual Abuse Material) detection models, created in partnership with Thorn:
- Hash Matching: Capable of detecting known CSAM. If no match is found, we send the input to the classifier as well.
- Classifier: Capable of detecting novel CSAM.
The endpoint combines both models' results into a single response.
Note: This API currently accepts only images as input. We plan to add video capabilities in the near future.
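For context, a request to this endpoint might look like the sketch below. This is a minimal illustration only: the endpoint URL, authorization header format, and multipart field name used here are assumptions for the sake of example, so refer to your project's API reference and credentials for the actual values.

# Minimal sketch of submitting an image for combined CSAM detection.
# The URL, auth header, and multipart field name are illustrative
# assumptions -- consult the official request reference for real values.
import requests

API_URL = "https://api.thehive.ai/api/v2/task/sync"  # assumed endpoint
API_KEY = "YOUR_API_KEY"

with open("image.jpg", "rb") as f:
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Token {API_KEY}"},
        files={"media": f},  # field name is an assumption
    )
resp.raise_for_status()
result = resp.json()
print(result["status"][0]["response"]["output"])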
This is an example of a matched response, i.e., a match for the input has been found in Thorn's CSAM database. Note the non-empty reasons and hashes properties, which reflect that CSAM was flagged.
{
  "status": [
    {
      "status": { "code": 200, "message": "SUCCESS" },
      "response": {
        "input": {
          "id": <ID>,
          "created_on": "2025-04-24T00:32:11.782Z",
          "user_id": <USER_ID>,
          "project_id": <PROJECT_ID>
        },
        "output": {
          "file": {
            "fileType": "image",
            "reasons": ["matched"],
            "classifierPrediction": null
          },
          "hashes": [
            {
              "hashType": "saferhashv0",
              "matchTypes": ["CSAM"],
              "reasons": ["matched"],
              "sources": [
                {
                  "hashSet": "test",
                  "matches": [
                    {
                      "sourceId": <SOURCE_ID>,
                      "matchDistance": 45,
                      "matchMetadata": null,
                      "matchTransformations": []
                    }
                  ]
                }
              ],
              "classifierPrediction": null,
              "isLetterboxed": false,
              "startTimestamp": null,
              "endTimestamp": null,
              "frameIndex": null
            }
          ]
        }
      }
    }
  ]
}
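Downstream code usually only needs to know whether the input matched. As a minimal sketch against the example response above (assuming it has been parsed into a Python dict named result, with field names exactly as shown), the check could look like this:

# Sketch: deciding whether a parsed response represents a hash match.
# Assumes `result` is the parsed JSON shown above; key names are taken
# directly from the example response.
def is_hash_match(result: dict) -> bool:
    output = result["status"][0]["response"]["output"]
    # A matched response carries non-empty reasons on the file object
    # and at least one entry in the hashes array.
    matched_file = "matched" in output["file"]["reasons"]
    has_hashes = len(output["hashes"]) > 0
    return matched_file and has_hashes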
This is an example of a non-matched response, i.e., a match for the input has not been found in Thorn's CSAM database. Note that the reasons and hashes arrays are empty.
{
  "status": [
    {
      "status": {
        "code": 200,
        "message": "SUCCESS"
      },
      "response": {
        "input": {
          "id": <ID>,
          "created_on": "2025-04-24T00:12:44.137Z",
          "user_id": <USER_ID>,
          "project_id": <PROJECT_ID>
        },
        "output": {
          "file": {
            "fileType": "image",
            "reasons": [],
            "classifierPrediction": {
              "csam_classifier": {
                "csam": 0.00003219065911252983,
                "pornography": 0.001365269417874515,
                "other": 0.998602569103241
              }
            }
          },
          "hashes": []
        }
      }
    }
  ]
}
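When no hash match is found, the classifier scores become the signal to act on: classifierPrediction holds per-class confidence scores (csam, pornography, other) that sum to roughly 1. As an illustrative sketch, a caller might flag inputs whose csam score exceeds a threshold; the 0.9 cutoff below is an arbitrary placeholder, not an officially recommended value.

# Sketch: reading classifier scores from a non-matched response.
# `result` is the parsed JSON above; the 0.9 threshold is an
# arbitrary illustrative value, not an official recommendation.
def flag_novel_csam(result: dict, threshold: float = 0.9) -> bool:
    file_out = result["status"][0]["response"]["output"]["file"]
    prediction = file_out["classifierPrediction"]
    if prediction is None:  # hash-matched responses carry no prediction
        return False
    scores = prediction["csam_classifier"]
    return scores["csam"] >= threshold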