ComfyUI Extension: Underage Filter
An extension for ComfyUI that detects underage subjects in images.
ComfyUI Underage Content Filter & Moderation Nodes
This extension adds moderation-focused nodes for ComfyUI that filter, block, or gate content based on predicted age, classification confidence, and dynamic logic gates. It uses the nateraw/vit-age-classifier model to estimate age from images.
🚀 Features
✅ AgeCheckerNode
Performs age classification using a ViT-based model and optionally blocks underage content.
Inputs:
- `image`: Image tensor (1 image)
- `gate_enabled`: Boolean toggle to block underage output
- `use_local_model`: Toggle to load a local model from the `LOCAL_AGE_MODEL_PATH` environment variable
Outputs:
- `is_underage`: Boolean
- `predicted_age`: Integer (age bucket or class)
- `confidence`: Float (probability)
- `status`: String (`Underage` or `OK`)
- `gate_output`: Boolean for workflow continuation
If `gate_enabled` is `True` and the subject is underage, the node raises a `PermissionError`.
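For reference, the sketch below shows how age classification against `nateraw/vit-age-classifier` can be run with the Hugging Face `transformers` library. It is an illustration only: the helper name `classify_age` and the example file path are assumptions, and the actual node consumes ComfyUI image tensors rather than PIL images.

```python
# Minimal sketch, not the node's actual code. Assumes torch, transformers and
# Pillow are installed; classify_age and photo.jpg are illustrative names.
import torch
from PIL import Image
from transformers import ViTImageProcessor, ViTForImageClassification

MODEL_ID = "nateraw/vit-age-classifier"  # model named in this README

processor = ViTImageProcessor.from_pretrained(MODEL_ID)
model = ViTForImageClassification.from_pretrained(MODEL_ID)

UNDERAGE_LABELS = {"0-2", "3-9", "10-19"}  # buckets treated as underage

def classify_age(image: Image.Image) -> tuple[str, float]:
    """Return (predicted_label, confidence) for a single PIL image."""
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = logits.softmax(dim=-1)[0]
    idx = int(probs.argmax())
    return model.config.id2label[idx], float(probs[idx])

label, confidence = classify_age(Image.open("photo.jpg").convert("RGB"))
if label in UNDERAGE_LABELS:
    # Mirrors the gate behaviour described above when gate_enabled is True.
    raise PermissionError(f"Underage content detected ({label}, p={confidence:.2f})")
```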
🔎 UnderageFilterNode
A lightweight classifier that checks whether the predicted age falls into one of the underage classes (0-2, 3-9, 10-19) at or above a confidence threshold.
Inputs:
- `image`: Image tensor
- `score`: Minimum confidence threshold (default: `0.85`)
Outputs:
- `is_underage`: Boolean
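One plausible way to apply the `score` threshold, shown purely as an assumption about the logic rather than the node's actual implementation, is to flag the image when an underage bucket receives at least `score` probability:

```python
# Hypothetical threshold check; the node's real logic may differ.
UNDERAGE_LABELS = ("0-2", "3-9", "10-19")

def is_underage(probs_by_label: dict, score: float = 0.85) -> bool:
    """Return True when any underage bucket reaches the confidence threshold."""
    return any(probs_by_label.get(label, 0.0) >= score for label in UNDERAGE_LABELS)

# A distribution dominated by the 10-19 bucket trips the filter:
print(is_underage({"10-19": 0.91, "20-29": 0.06, "30-39": 0.03}))  # True
print(is_underage({"10-19": 0.40, "20-29": 0.55, "30-39": 0.05}))  # False
```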
⛔ MultiTypeGateNode
A flexible gate node that can conditionally halt workflows based on any value type.
Inputs:
- `value`: Supports `BOOLEAN`, `INT`, `FLOAT`, `STRING`
- `block_on`: Mode (`falsy`, `truthy`, `equal`)
- `match_value`: Value to match when using `equal` mode
- `message`: Custom error message to raise
Outputs:
- (None): raises a `PermissionError` when the blocking condition is met
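The three modes can be pictured with a small helper like the sketch below; the function name `gate` and the exact mode semantics are assumptions for illustration, not the node's API.

```python
# Illustrative gate logic; names and semantics are assumptions.
def gate(value, block_on="falsy", match_value=None, message="Workflow blocked by gate"):
    """Raise PermissionError when the blocking condition is met."""
    blocked = (
        (block_on == "falsy" and not value)
        or (block_on == "truthy" and bool(value))
        or (block_on == "equal" and value == match_value)
    )
    if blocked:
        raise PermissionError(message)

try:
    gate(True, block_on="truthy", message="Underage content detected")
except PermissionError as err:
    print(err)  # Underage content detected
```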
📦 Installation
- Clone or download this repository into your `ComfyUI/custom_nodes/` directory:

  ```bash
  git clone https://github.com/your-repo/comfyui-underage-filter.git
  ```