ComfyUI Node: Intercept NSFW Outputs
Category
image/processing
Inputs
- image (IMAGE)
- threshold (FLOAT)
- cuda (BOOLEAN)
Outputs
- IMAGE
- STRING
Extension: ComfyUI YetAnotherSafetyChecker
A simple node to filter out NSFW outputs. It uses AdamCodd/vit-base-nsfw-detector to score each image. I chose this model because it's small, fast, and performed very well in my testing. Nudity tends to score in the 0.95+ range, but I've set the default threshold to 0.8 as a safe baseline.
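The gating logic described above can be sketched as follows. This is a minimal illustration, not the node's actual implementation: it assumes the detector returns an NSFW probability in [0, 1], and the function names (`should_block`, `intercept`) are hypothetical. The model call itself is omitted so the sketch stays self-contained.

```python
from typing import Any, Optional, Tuple

def should_block(nsfw_score: float, threshold: float = 0.8) -> bool:
    """Return True when the score meets or exceeds the threshold.

    With the default of 0.8, typical nudity scores (0.95+) are caught
    while most benign images fall well below the cutoff.
    """
    return nsfw_score >= threshold

def intercept(image: Any, nsfw_score: float,
              threshold: float = 0.8) -> Tuple[Optional[Any], str]:
    """Mimic the node's two outputs: (IMAGE, STRING).

    When blocked, a real node might substitute a blank placeholder
    image; here we return None plus a status string for illustration.
    """
    if should_block(nsfw_score, threshold):
        return None, f"blocked: score {nsfw_score:.2f} >= {threshold}"
    return image, f"passed: score {nsfw_score:.2f} < {threshold}"
```

In practice the score would come from running the ViT classifier over the incoming IMAGE tensor (on CUDA when the `cuda` input is true), with the STRING output carrying the score for downstream logging.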
Authored by BetaDoggo