Class Descriptions – Violence, Weapons & Gore
How to Use This Page – Overview
This page gives a more detailed overview of Hive's visual moderation classes related to violence, gore, and weapons, expanding on our main Visual Moderation page. We'll enumerate, as clearly as possible, which types of subject matter are covered by each class in our models.
Because all platforms have different moderation requirements and risk sensitivities, we recommend that you consult these descriptions carefully as you decide which classes to build into your moderation logic. At the end of the day, it's up to you to decide which classes are important to monitor based on your content policies.
NOTE:
To determine which class(es) cover specific types of visual content, it may be helpful to search this page (Ctrl/Cmd + F) with terms for that subject matter (e.g., gun, injury) rather than looking for it in specific class descriptions.
General Notes
Before looking at subject matter breakdowns for each class, it may be helpful to understand the following:
- Hive's visual classifier is multi-headed. Each model head defines a group of categorizations we call classes. Each model head includes at least one positive class (e.g., yes_hot_dog) and a negative class (e.g., no_hot_dog). Scores returned for each class correlate with the model's certainty the image meets our ground truth definition for the category. This page attempts to explain these ground truth definitions as clearly as possible.
- The model makes classifications for each model head independently. In other words, if an image scores highly in multiple classes, the image meets our definitions for each class. Confidence scores from each model head are generated separately and are not correlated in and of themselves. It's easiest to think of this as asking multiple, narrower models (that may or may not overlap in scope) to each make a prediction on an image.
- As a corollary, a high confidence score in a negative class (e.g., no_gun, no_knife) does not mean the image is clean in general. It is simply the logical opposite of the positive classification: the subject matter captured by the positive class in that model head is not present. For example, an image that scores 0.99 in no_gun can still score highly in very_bloody, human_corpse, or any other class trained to flag other subject matter. For this reason, we accompany each negative class with a non-exhaustive list of subject matter that is not captured by the positive classes in that model head but that may help distinguish what is and is not flagged by that model head (e.g., borderline content, content captured by other classes).
- For some classes, the model classifies animations, drawings, diagrams, paintings, or other artwork in the same way as photographs or photorealistic images. We will call out which model heads this does or does not apply to in the detailed description.
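Because each head scores independently, downstream moderation logic typically thresholds the positive classes from every head it cares about, rather than relying on any negative class as an all-clear. The sketch below illustrates this pattern; the score layout and threshold values are assumptions for illustration, not the actual API schema.

```python
# Hypothetical score layout: each model head returns scores for its own
# classes, and scores across heads are independent. The head and class
# names mirror this page; the dict structure is assumed for illustration.
scores = {
    "gun": {"gun_in_hand": 0.01, "gun_not_in_hand": 0.02,
            "animated_gun": 0.01, "no_gun": 0.99},
    "blood": {"very_bloody": 0.97, "a_little_bloody": 0.02,
              "other_blood": 0.01, "no_blood": 0.00},
}

# Per-class thresholds chosen by your own content policy (placeholder values).
THRESHOLDS = {"gun_in_hand": 0.90, "very_bloody": 0.90, "human_corpse": 0.90}

def flagged_classes(scores, thresholds):
    """Return every positive class whose score meets its threshold.

    Each head is checked independently: the 0.99 no_gun score above says
    nothing about other heads, so negative classes are never treated as
    meaning the image is clean overall.
    """
    flags = []
    for head, classes in scores.items():
        for cls, score in classes.items():
            if cls in thresholds and score >= thresholds[cls]:
                flags.append(cls)
    return flags

print(flagged_classes(scores, THRESHOLDS))  # prints ['very_bloody']
```

Note that the image scoring 0.99 in no_gun is still flagged for very_bloody, matching the corollary described above.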
Gun Head
This model head can be used to flag images depicting handguns, rifles, machine guns, etc. These classes are specific to firearms and do not capture other weapons or military activity more generally. Other types of guns that are not visually distinguishable from actual firearms – airsoft or paintball guns, model guns, etc. – will also be flagged.
NOTE:
Art, animations, drawings and other representational depictions of (photorealistic) guns are classified as animated_gun. This class can be used to distinguish real weapons from those depicted in video games and animated content.
Subject Matter Breakdown by Class
gun_in_hand: the image is a photograph of a person holding or handling a gun. Captures:
- A person holding a firearm of any kind (pistol, rifle, etc.)
- A person handling or touching a firearm, even if holstered or mounted
gun_not_in_hand: the image is a photograph showing a gun that is not being held or handled. Captures:
- Firearms displayed on a stand or mount
- Firearms strapped to or slung over a person's back but otherwise not being touched or handled
- Firearms in a holster that is not being touched or handled
- Guns mounted to a vehicle or turret
animated_gun: the image is a drawing, animation, or diagram depicting a gun. Captures:
- Guns depicted in video games
- Guns depicted in cartoons and animations
- Guns depicted in diagrams or schematics
- Guns depicted in drawings and photorealistic art
no_gun: the image does not depict a gun. For clarity, the following subject matter would not be captured by other classes in this model head:
- Photographs or animations, drawings, etc. where no gun is visible
- Large weapons mounted on ships, tanks, and planes
- Rocket-propelled grenades and other handheld rocket launchers
- Cannons and artillery
- Explosives such as grenades, dynamite, and C4
- Non-realistic toy guns such as water guns, nerf guns, obvious cosplay guns, props, etc.
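As an illustration of how these classes might feed moderation logic (a sketch only; the class names come from this page, while the thresholds and actions are placeholder policy choices):

```python
def gun_action(head_scores, threshold=0.90):
    """Map gun-head scores to a moderation action.

    Illustrative policy only: photographs of guns being held are removed,
    displayed real guns are routed to review, and animated guns (video
    games, cartoons) are allowed. Adjust thresholds and actions to match
    your own content policy.
    """
    if head_scores.get("gun_in_hand", 0.0) >= threshold:
        return "remove"
    if head_scores.get("gun_not_in_hand", 0.0) >= threshold:
        return "review"
    if head_scores.get("animated_gun", 0.0) >= threshold:
        return "allow"  # e.g., video-game screenshots
    return "allow"      # includes high no_gun scores

print(gun_action({"gun_in_hand": 0.95}))  # prints remove
```

This kind of per-head policy function is one way to use animated_gun to treat video-game content differently from photographs of real firearms.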
Knife Head
This model head can be used to flag images of blades such as knives, box cutters, machetes, swords, and other bladed weapons. This does not include scissors, saws, and other bladed tools that are not easily usable or intended to be used as weapons. This model head distinguishes between knives used in a culinary context and those used as weapons; it contains a separate class for each.
For this model head, animations, drawings, and other non-photographic depictions of knives and blades are not flagged. These images are classified as no_knife.
Subject Matter Breakdown by Class
knife_in_hand: the image is a photograph showing a knife or blade being held or handled by a person (outside of culinary/agricultural settings). This applies to:
- Common knives (including plastic knives), machetes, box cutters, daggers, throwing knives or shuriken, exposed razor blades, swords, and bayonet blades
- Sheathed knives, swords, or other blades
- Bayonets attached to a gun that is being held or handled
culinary_knife_in_hand: the image shows a person holding or using a knife or blade to prepare or harvest food. This class is used to distinguish these uses from the other positive classes in this model head. Captures:
- Knives and blades being handled or used when processing or preparing food
- A person holding a knife in a kitchen with ingredients or cutting boards visible
- A person using a knife as a utensil when eating
- A butcher processing meat with knives and other bladed tools
- Blades being used to harvest plants, grains, vegetables in agricultural settings
knife_not_in_hand: the image is a photograph showing a knife or blade that is not being held or handled by a person (outside of culinary/agricultural settings). This applies to:
- Knives stored in knife blocks or drawers
- Sheathed knives and blades that are displayed or otherwise not being handled
- Unsheathed knives and other blades/bladed weapons being displayed
- Knives and other blades lodged into objects
culinary_knife_not_in_hand: the image is a photograph showing any type of blade that is not being held or handled by a person and appears in a culinary/agricultural setting. This applies to:
- Knives stored in kitchen knife blocks or drawers
- Knives that are displayed in a kitchen setting, e.g., on a chopping board
- Knives and other blades lodged into vegetables, meat, or other food
no_knife: the image does not show a knife or blade or depicts an animated or illustrated knife. To be clear, the following subject matter is not captured by the other classes in the knife model head:
- Scissors
- Axes and hatchets, even if crafted as weapons
- Spears
- Batons and nightsticks
- Saws, including chainsaws
- Shaving razors and razor heads
- Animated or illustrated knives and blade weapons
Blood Head
This model head can be used to flag images showing blood, wounds, active bleeding, and gore. Generally, these classes do not capture images of injuries where blood is not present. Blood depicted in art, animations, drawings, etc. is classified as other_blood.
Subject Matter Breakdown by Class
very_bloody: the image is a photograph showing substantial amounts of blood, major wounds that are actively bleeding, or gore. This includes:
- Gunshot wounds
- Stab wounds
- Deep cuts
- Other major injuries resulting in visible bleeding: loss of limbs, fingers, etc., animal attacks/bites, and the like
- Profuse bloody noses
a_little_bloody: the image is a photograph showing minor amounts of blood or evidence of a major injury that has been treated or healed. This includes:
- Minor cuts, scrapes, and scratches
- Small amounts of blood on surfaces
- Stitches and scar tissue
other_blood: the image shows animal blood, blood in a medical or laboratory setting, or blood depicted in art, animations, or illustrations. This includes:
- Blood in test tubes, transfusion bags, dialysis machines, etc.
- Animal blood and injuries, including slaughter and butchering
- Depictions of blood or graphic injury in animations, art, illustrations, etc.
- Imitation blood used such that it is evidently fake (e.g., Halloween costumes or decorations)
- Liquids that could be blood where the image lacks contextual evidence of human injury
no_blood: the image does not show blood or imitation blood. For clarity, the following subject matter is not captured by the other classes in this model head:
- Bruises and contusions
- Accidents or injuries with no visible blood
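The blood head's positive classes effectively describe a severity scale, which a moderation pipeline might collapse into a single label. A minimal sketch (class names are from this page; the threshold and labels are placeholder policy choices):

```python
def blood_severity(head_scores, threshold=0.90):
    """Map blood-head scores to a coarse severity label.

    Checks classes from most to least severe and returns the first that
    meets the threshold. Labels are illustrative, not part of the API.
    """
    for cls, label in (
        ("very_bloody", "graphic"),     # major wounds, gore
        ("a_little_bloody", "mild"),    # minor cuts, healed injuries
        ("other_blood", "contextual"),  # medical, animal, animated, fake
    ):
        if head_scores.get(cls, 0.0) >= threshold:
            return label
    return "none"

print(blood_severity({"very_bloody": 0.97}))  # prints graphic
```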
Hanging Head
This model head can be used to flag images of nooses, hangings, and hanging bodies. Art, animations, and illustrations follow the same definitions for these classes as photographic images.
Subject Matter Breakdown by Class
hanging: the image depicts corpses hanging from a rope or a person being hanged.
noose: the image depicts a rope tied as a noose hanging from gallows, trees, or other objects.
no_hanging_no_noose: the image does not depict a noose or hanging. To be clear, the following subject matter is not captured by the other classes in this model head:
- Corpses that are not hanging
- Ropes hanging that are not tied as a noose
- Nooses not hanging from an object (e.g., on a surface such as a table or floor)
- Bondage and hanging in BDSM situations not intended to cause death
- Knots, lassos, and regular ropes
Corpse Head
This model head can be used to flag images of human corpses or images of bodies with enough contextual evidence to assume the person is dead. It does not flag injuries and accidents, even graphic ones, if the victim is visibly alive (use the Blood model head instead).
The human_corpse class flags photographic images only. Art, animations, and illustrations of dead bodies are classified as animated_corpse.
Subject Matter Breakdown by Class
human_corpse: the image is a photograph of a dead human body. This includes:
- Motionless bodies with evidence of potentially fatal injury
- Bodies that are clearly identifiable as dead based on color, lividity, decomposition, etc. even if no injuries are visible
- Bodies that are clearly identifiable as dead based on contextual factors (e.g., body in a casket, body in a morgue) even if no injuries are visible
- Autopsy photos
- Any of the above depicted by actors and/or makeup and effects in a movie or TV show
animated_corpse: the image is an animation or illustration of a dead human body. This includes:
- Deaths and corpses depicted in video games
- Bodies and death scenes depicted in cartoons, anime, etc.
no_corpse: the image does not explicitly show a dead human body. To be clear, the following subject matter would not be flagged by the other classes in the corpse model head:
- Graphic injuries without evidence that the victim is dead (use very_bloody to flag this instead)
- Body bags or caskets where no corpse is visible
- Urns and funerary receptacles
- Fully decomposed bones and skeletons
- Mummified or embalmed bodies displayed in museums or mausoleums
- Staged deaths such as in a theater production or renaissance fair (note: staged deaths with convincing fake blood will likely be flagged)
- A person sleeping (e.g., in a sleeping bag)
- A homeless person lying on a bench or the ground
- A person lying on the ground while under arrest
Emaciated Body Head
This head flags images of people and animals that appear severely underweight, malnourished, or sickly. It is not sensitive to people or animals that are slim or skinny but otherwise healthy. For these classes, art, illustrations, and animations follow the same definitions as photographic images.
Subject Matter Breakdown by Class
yes_emaciated_body: the image depicts a person or animal that is so underweight they appear ill or severely malnourished. In general, this includes:
- A person with ribs, hip bones, arm bones, and/or facial bone structure clearly visible through the skin
- Starving animals with clearly visible rib cages or pelvic bones
no_emaciated_body: the image does not depict the above. Generally, the following will not be flagged by yes_emaciated_body:
- Skinny/underweight individuals that appear healthy
- Muscular individuals with low body fat
- Skeletons and corpses
- Animals that are naturally slim with short fur, such as greyhounds
Self Harm Head
This model head flags images of intentional self-inflicted injuries and other indicators of self-harm. Injuries sustained from accidents and other causes are ignored. Animations and illustrations follow the same definitions as photographic images for these classes.
Subject Matter Breakdown by Class
yes_self_harm: the image depicts acts or evidence of self-harm or self-inflicted injuries. This includes:
- Images of someone cutting or burning themselves
- Self-inflicted cuts or burn scars (e.g., as evidenced by location, number, dimensions, direction, hesitation, etc.)
- A person pointing a gun to their own head or chest
- A person holding knives, razor blades, fire, or hot objects to their body
- Religious self-harm such as self-flagellation or self-immolation
no_self_harm: the image does not depict self-harm or self-inflicted injuries. To be clear, the following subject matter is not flagged by yes_self_harm:
- Corpses and hanging
- Graphic injuries with no evidence that they are self-inflicted
- Smoking and drug use
- BDSM and sexual torture
- People handling knives, razor blades, etc. without evidence of intent to self-harm
- Surgical wounds or scars
- Scar tissue or burn marks without additional evidence of self-harm
Animal Abuse Head (Beta)
This model head can be used to flag images of animals being beaten, tortured, or treated inhumanely as well as animals with graphic injuries. Only photographic images are flagged — art, animations, and illustrations of abused or otherwise mistreated animals are classified as no_animal_abuse.
Subject Matter Breakdown by Class
yes_animal_abuse: the image is a photograph of an animal being abused, tortured, or otherwise mistreated. This includes:
- A person beating or torturing an animal. This includes but is not limited to the following actions:
- Kicking
- Punching
- Hitting with an object
- Burning
- Throwing
- A person threatening to do one of the above actions to an animal
- Two animals fighting for the purpose of human entertainment or gambling
- An animal with severe, graphic injuries
- A dead animal that shows excessive gore, evidence of abuse, or decomposition
- An animal that is being kept in inhumane and torturous conditions, including:
- Cages that are too small
- Cages with more animals than their intended capacity
- Dirty cages (e.g., filled with feces, standing water, bugs, rats, or dead animals)
- Collars, leashes, or other accessories that are too tight and cause injury
- The uncooked full or half body of an animal
- Animals being branded
- Animals held with chains directly on their skin (chains attached to a collar do not count)
no_animal_abuse: the image does not explicitly show animal abuse. To be clear, the following subject matter would not be flagged by the animal abuse model head:
- Dead animals where there is no gore, blood, or other signs of abuse shown
- Animals that are shown in clean cages for transportation purposes and not for long-term living
- Livestock farming, unless excessive injuries or dead livestock are visible
- Cuts of meat such as steak, ribs, etc.
- Cooked meat
- Scars or visible effects of previous branding
- Matted or dirty fur without excessive injuries
- Fur that is missing due to age or minor rash
- A person in a ring with a bull with no violence or injury shown
- Hunting paraphernalia with no animal visible in the image
- Taxidermy
- Animals that are severely underfed with no other signs of abuse or physical violence
- Injury or violence to fish, reptiles, or amphibians
- Images in which no animal is visible
- Bestiality
- Minor cosmetic changes such as dying an animal’s hair, temporary painting, etc.
- Humans fighting animals in self-defense
- Animals attacking other animals in the wild and not for human entertainment
- Animals play-fighting
Fights Head (Beta)
This model head can be used to flag images of physical fights. Only photographic images are flagged; art, animations, and illustrations of fights are classified as no_fight.
Subject Matter Breakdown by Class
yes_fight: the image is a photograph in which two or more individuals are engaging in a physical fight. This includes:
- Two or more individuals engaging in any of the following:
- Punching each other
- Kicking each other
- Choking each other (unless it is in a sexual context)
- Physically attacking each other
- Two or more individuals that are about to start fighting, such as:
- People standing face-to-face in attack-ready position
- Fists raised in order to punch someone
no_fight: the image does not include a physical fight. The following subject matter will not be flagged by yes_fight:
- Physical fights that take place in a sports context (e.g., boxing, wrestling, or martial arts)
- Punching, kicking, choking, or attacking with no opponent, such as shadow boxing, punching a punching bag, or air fighting
- Cartoons or animations of fighting
- A person fighting an animal
- An animal fighting another animal
- A person being executed
- Police brutality
- A person being choked, spanked, or restrained during sexual activity
Child Safety Head (Beta)
This model head flags images of shirtless children. For the purposes of the model, a child is considered to be anyone 11 years of age or younger.
Subject Matter Breakdown by Class
yes_child_safety: the image contains a child 11 years old or younger who is unclothed from the waist up.
no_child_safety: the image does not contain a shirtless child 11 years old or younger. The following subject matter would not be flagged by the child safety head:
- Shirtless children that are older than 11 years of age
- Children 11 years old or younger whose chests are covered by clothing, towels, blankets, etc.