Visual Moderation

Visual Classification Overview

Visual classification models categorize an entire image by assigning a confidence score to each class.

Classification models can be multi-headed, where each group of mutually exclusive model classes belongs to a single model head. For example, when an image is run through Hive's visual moderation model, one head might classify sexually not-safe-for-work (NSFW) content while another head might classify the presence of guns.

This concept is illustrated below with an imaginary model that has two heads:

  • NSFW classification: general_nsfw, general_suggestive, general_not_nsfw_not_suggestive
  • Gun classification: gun_in_hand, animated_gun, gun_not_in_hand, no_gun

The confidence scores across the classes within each model head sum to 1.
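To make this concrete, here is a minimal sketch (in Python) of how the two imaginary heads above could be read. The scores are made up for illustration and do not reflect Hive's actual response schema; within each head they sum to 1, and the highest-scoring class is that head's prediction.

```python
# Illustration only: made-up scores for the imaginary two-head model above.
# This is not Hive's actual response schema.
scores = {
    "nsfw": {
        "general_nsfw": 0.02,
        "general_suggestive": 0.08,
        "general_not_nsfw_not_suggestive": 0.90,
    },
    "gun": {
        "gun_in_hand": 0.85,
        "animated_gun": 0.05,
        "gun_not_in_hand": 0.07,
        "no_gun": 0.03,
    },
}

# Within each head the scores sum to 1; the highest-scoring class is that head's prediction.
for head, class_scores in scores.items():
    top_class, confidence = max(class_scores.items(), key=lambda kv: kv[1])
    print(f"{head}: {top_class} ({confidence:.2f})")
```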

When a video is submitted for processing, Hive's backend splits the video into frames, runs the model on each frame, and then recombines the results into an aggregated response for the entire video. The video output for a classifier is effectively a list of classification output objects, each associated with a timestamp.
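As an illustration only (not Hive's actual video response format), the sketch below shows how per-frame results with timestamps might be scanned to find where a class of interest fires:

```python
# Illustration only: a simplified stand-in for a video classifier response,
# where each entry holds a frame timestamp (in seconds) and one head's class scores.
frames = [
    {"time": 0.0, "classes": {"general_nsfw": 0.01, "general_suggestive": 0.02,
                              "general_not_nsfw_not_suggestive": 0.97}},
    {"time": 1.0, "classes": {"general_nsfw": 0.93, "general_suggestive": 0.04,
                              "general_not_nsfw_not_suggestive": 0.03}},
    {"time": 2.0, "classes": {"general_nsfw": 0.10, "general_suggestive": 0.85,
                              "general_not_nsfw_not_suggestive": 0.05}},
]

# Collect timestamps where the general_nsfw score crosses a chosen threshold.
flagged_times = [f["time"] for f in frames if f["classes"]["general_nsfw"] > 0.90]
print(f"Frames flagged for general_nsfw at: {flagged_times}")  # -> [1.0]
```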

Visual Content Moderation

Note: Older versions of the API might not perfectly match the outline below. Please reach out to [email protected] if you would like to access the latest content moderation classes.

Sexual

NSFW Head:

  • general_nsfw - genitalia, sexual activity, nudity, buttocks, sex toys
  • general_suggestive - shirtless men, underwear / swimwear, sexually suggestive poses without genitalia
  • general_not_nsfw_not_suggestive - none of the above, clean

Sexual Activity Head:

  • yes_sexual_activity - a sex act or stimulation of genitals is present in the scene
  • no_sexual_activity - no sex act is present in the scene

Realistic NSFW Head:

  • yes_realistic_nsfw - live nudity, sex acts, or photo-realistic representations of nudity or sex acts
  • no_realistic_nsfw - non-photorealistic representations of nudity or sex acts (statues, crude drawings, paintings, etc.); lack of any NSFW content

Female Underwear Head:

  • yes_female_underwear - lingerie, bras, panties
  • no_female_underwear

Male Underwear Head:

  • yes_male_underwear - fruit-of-the-loom, boxers
  • no_male_underwear

Sex Toy Head:

  • yes_sex_toy - dildos, certain lingerie
  • no_sex_toy

Female Nudity Head:

  • yes_female_nudity - breasts or female genitalia
  • no_female_nudity

Male Nudity Head:

  • yes_male_nudity - male genitalia
  • no_male_nudity

Female Swimwear Head:

  • yes_female_swimwear - bikinis, one-pieces, not underwear
  • no_female_swimwear

Shirtless Male Head:

  • yes_male_shirtless - shirtless below mid-chest
  • no_male_shirtless

Sexual Intent Head: (beta)

  • yes_sexual_intent - occluded, blurred, or hidden sexual activity
  • no_sexual_intent

Animal Genitalia Head: (beta)

  • animal_genitalia_and_human - sexual activity including both animals and humans
  • animal_genitalia_only - animals mating and pictures of animal genitalia
  • animated_animal_genitalia - drawings of sexual activity involving animals
  • no_animal_genitalia - none of the above, clean

Violence

Gun Head:

  • gun_in_hand - person holding rifle, handgun
  • gun_not_in_hand - rifle, handgun, not in hand
  • animated_gun - gun in games, cartoons, etc.; can be in-hand or not
  • no_gun

Knife Head:

  • knife_in_hand - person holding knife, sword, machete, razor blade
  • knife_not_in_hand - knife, sword, machete, razor blade, not in hand
  • culinary_knife_in_hand - knife being used for preparing food
  • no_knife

Blood Head:

  • very_bloody - gore, visible bleeding, self-cutting
  • a_little_bloody - fresh cuts / scrapes, light bleeding
  • no_blood - minor scabs, scars, acne, etc. are not considered ‘blood’ by the model
  • other_blood - animated blood, fake blood, animal blood such as game dressing

Hanging Head:

  • hanging - the presence of a human hanging by a noose (dead or alive)
  • noose - a noose is present in the image with no human hanging from it
  • no_hanging_no_noose - no person hanging and no noose present

Corpses Head: (beta)

  • human_corpse - human dead body present in image
  • animated_corpse - animated dead body present in image
  • no_corpse

Emaciated Bodies Head:

  • yes_emaciated_body - emaciated human or animal body present in image
  • no_emaciated_body

Self Harm Head: (beta)

  • yes_self_harm - self-cutting, burning, suicide, or other self-harm methods present in image
  • no_self_harm

Drugs

Pill Head:

  • yes_pills - pills and / or drug powders
  • no_pills - no pills or drug powders

Injectable Head:

  • illicit_injectables - heroin and other illegal injectables
  • medical_injectables - injectables for medical use
  • no_injectables - no injectable drug paraphernalia

Smoking Head:

  • yes_smoking - cigarettes, cigars, marijuana, vapes, or other smoking paraphernalia
  • no_smoking - no cigarettes, cigars, marijuana, vapes, or other smoking paraphernalia

Hate

Nazi Head:

  • yes_nazi - Nazi symbols
  • no_nazi - absence of the above

Terrorist Head:

  • yes_terrorist - ISIS flag
  • no_terrorist - absence of the above

White Supremacy Head:

  • yes_kkk - KKK symbols
  • no_kkk - absence of the above

Middle Finger Head:

  • yes_middle_finger - middle finger
  • no_middle_finger - absence of the above

Other Attributes

Text Head:

  • text - any form of text or writing is present somewhere in the image
  • no_text - no text present in the image

Overlay Text Head:

  • yes_overlay_text - digitally overlaid text is present on an image (think meme text)
  • no_overlay_text - lack of digitally overlaid text in the image

Child Presence:

  • yes_child_present - a baby or toddler is present in the image
  • no_child_present

Drawings: (beta)

  • yes_drawing - a drawing, painting, or sketch is the central part of the image
  • no_drawing

Image Type Head:

  • animated - the image is animated
  • hybrid - the image is partially animated
  • natural - the image has no animation

Brand Safety GARM Moderation (Global Alliance for Responsible Media)

GARM is a cross-industry initiative established by the World Federation of Advertisers to address the challenge of harmful content on digital media platforms and its monetization via advertising. Hive has committed to building out additional class support in our Visual Content Moderation suite to power this framework. These GARM classes are defined through the presence of visual cues relevant to the GARM framework's categories. The beta release of these GARM classes is defined below, and they can be added to your Hive Visual Moderation API upon request.

GARM Classes (beta)

  • garm_adult_and_explicit_sexual_content - genitalia, sexual activity, nudity, buttocks, sex toys
  • garm_arms_and_ammunition - guns, knives, and guns-in-use
  • garm_death_injury_or_military_conflict - gore, blood, hanging, military conflict
  • garm_hate_speech_and_acts_of_aggression - KKK imagery, Nazi imagery
  • garm_obscenity_and_profanity - middle finger
  • garm_illegal_drugs_tobacco_ecigarettes_vaping_alcohol - pilled or powdered drugs; tobacco, marijuana, or vaping paraphernalia
  • garm_terrorism - ISIS symbol imagery
  • garm_suggestive_sexual_content - nude or suggestive content (swimwear, underwear, shirtlessness)

Choosing Thresholds

For each of the classes mentioned above, you will need to set thresholds that decide when to take action based on our model results. For optimal results, we recommend a proper threshold analysis on a natural distribution of your data (for more on this, please contact Hive at the email below). Generally, a model confidence score above 0.90 is a good starting point for flagging an image for any class of interest. For questions on best practices, please message your point of contact at Hive or contact our API team directly at [email protected].
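As a starting point, a thresholding pass might look like the sketch below. The 0.90 default mirrors the guidance above; the per-class overrides are purely hypothetical and should come from your own threshold analysis.

```python
# Hypothetical thresholding sketch; values below are examples, not Hive recommendations.
DEFAULT_THRESHOLD = 0.90
CLASS_THRESHOLDS = {
    "general_nsfw": 0.90,
    "gun_in_hand": 0.95,  # example override chosen for illustration
}

def flag_classes(scores: dict) -> list:
    """Return the classes whose confidence meets or exceeds their threshold."""
    return [
        cls for cls, score in scores.items()
        if score >= CLASS_THRESHOLDS.get(cls, DEFAULT_THRESHOLD)
    ]

# Example: scores for a single image, keyed by class name.
print(flag_classes({"general_nsfw": 0.96, "gun_in_hand": 0.40, "yes_smoking": 0.91}))
# -> ['general_nsfw', 'yes_smoking']
```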


What’s Next

See the API reference for more details on the API interface and response format.