CSAM Detection - Combined API

This API is available to Enterprise customers only. Contact us to enable access for your organization.


What this API does

Hive's Combined CSAM Detection API runs two CSAM (Child Sexual Abuse Material) detection models, created in partnership with Thorn:

  1. Hash Matching: Finds known CSAM. Hash Matching always runs as a first pass.
  2. Classifier: Capable of detecting novel CSAM. Runs only if no hash match is found.
    1. Within the response, the Classifier makes predictions for three possible classes, assigning each a confidence score between 0 and 1, inclusive. The confidence scores across the three classes sum to 1:
      1. pornography: Pornographic media that does not involve children
      2. csam: Child sexual abuse material
      3. other: Non-pornographic and non-CSAM media

The endpoint combines both models' results into a single response.

Note: Currently, this API accepts only images as input. We plan to add video support in the near future.

CSAM Detection Project and API Key

Your CSAM Detection Project appears in the list of Projects for your organization. To find your API key, open your CSAM Detection Project and click Integration & API Keys.
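
The quick-start requests below read the key from an environment variable named CSAM_KEY; the variable name is just a convention used in these examples:

# Replace the placeholder with the key shown under Integration & API Keys.
export CSAM_KEY="<YOUR_API_KEY>"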

Quick-start Requests

Sync via image URL

curl --location --request POST "https://api.thehive.ai/api/v2/task/sync" \
  --header "Authorization: Token $CSAM_KEY" \
  --form "url=https://s3.amazonaws.com/docs.thehive.ai/client_demo/moderation_image.png"

Async via image file + webhook

curl --location --request POST "https://api.thehive.ai/api/v2/task/async" \
  --header "Authorization: Token $CSAM_KEY" \
  --form "media=@/absolute/path/to/file.jpg" \
  --form "callback_url=https://example.com/webhooks/csam"

Example Responses

This is an example of a matched response, i.e., a match for the input was found in NCMEC's CSAM database. Note the non-empty reasons and hashes properties, which indicate that CSAM was flagged. Because a hash match was found, the Classifier did not run, and classifierPrediction is null.

{
  "status": [
    {
      "status": { "code": 200, "message": "SUCCESS" },
      "response": {
        "input": {
          "id": <ID>,
          "created_on": "2025-04-24T00:32:11.782Z",
          "user_id": <USER_ID>,
          "project_id": <PROJECT_ID>
        },
        "output": {
          "file": {
            "fileType": "image",
            "reasons": ["matched"],
            "classifierPrediction": null
          },
          "hashes": [
            {
              "hashType": "saferhashv0",
              "matchTypes": ["CSAM"],
              "reasons": ["matched"],
              "sources": [
                {
                  "hashSet": "test",
                  "matches": [
                    {
                      "sourceId": <SOURCE_ID>,
                      "matchDistance": 45,
                      "matchMetadata": null,
                      "matchTransformations": []
                    }
                  ]
                }
              ],
              "classifierPrediction": null,
              "isLetterboxed": false,
              "startTimestamp": null,
              "endTimestamp": null,
              "frameIndex": null
            }
          ]
        }
      }
    }
  ]
}
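
One way to act on a matched response is to check whether the file-level reasons array is non-empty. A minimal shell sketch, assuming the response above has been saved to response.json (a filename used here only for illustration) and that jq is installed:

# A non-empty reasons array on the file object means the input was flagged;
# in the matched case the hashes array carries the hash-match details.
if [ "$(jq '.status[0].response.output.file.reasons | length' response.json)" -gt 0 ]; then
  echo "Flagged: $(jq -c '.status[0].response.output.file.reasons' response.json)"
else
  echo "Not flagged"
fi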

This is an example of a classified response, i.e., no hash match was found (the hashes array is empty), but the Classifier detected novel CSAM. Note the csam entry in reasons and the high csam confidence score.

{
  "status": [
    {
      "status": { "code": 200, "message": "SUCCESS" },
      "response": {
        "input": {
          "id": <ID>,
          "created_on": "2025-04-24T00:32:11.782Z",
          "user_id": <USER_ID>,
          "project_id": <PROJECT_ID>
        },
        "output": {
          "file": {
            "fileType": "image",
            "reasons": ["csam"],
            "classifierPrediction": {
                "csam_classifier": {
                    "pornography": 0.01,
                    "csam": 0.98,
                    "other": 0.01
                }
            }
          },
          "hashes": []
        }
      }
    }
  ]
}
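
When the Classifier runs, its scores appear under classifierPrediction.csam_classifier. A minimal sketch of comparing the csam score against a threshold, again assuming the response has been saved to response.json and that jq is installed; the 0.9 threshold is an illustrative placeholder, not a recommendation from Hive:

# Read the Classifier's csam confidence score (falling back to 0 when
# classifierPrediction is null, as in the matched case) and compare it
# against a caller-chosen threshold.
CSAM_SCORE=$(jq '.status[0].response.output.file.classifierPrediction.csam_classifier.csam // 0' response.json)
if [ "$(jq '(.status[0].response.output.file.classifierPrediction.csam_classifier.csam // 0) > 0.9' response.json)" = "true" ]; then
  echo "Classifier flagged possible CSAM (csam score: $CSAM_SCORE)"
else
  echo "csam score below threshold ($CSAM_SCORE)"
fi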

This is an example of a non-pertinent response, i.e., no match for the input was found in NCMEC's CSAM database and the Classifier did not detect CSAM. Note that the reasons and hashes arrays are empty, while classifierPrediction still contains the Classifier's confidence scores.

{
  "status": [
    {
      "status": { "code": 200, "message": "SUCCESS" },
      "response": {
        "input": {
          "id": <ID>,
          "created_on": "2025-04-24T00:12:44.137Z",
          "user_id": <USER_ID>,
          "project_id": <PROJECT_ID>
        },
        "output": {
          "file": {
            "fileType": "image",
            "reasons": [],
            "classifierPrediction": {
              "csam_classifier": {
                "csam": 0.00003219065911252983,
                "pornography": 0.001365269417874515,
                "other": 0.998602569103241
              }
            }
          },
          "hashes": []
        }
      }
    }
  ]
}