This project is designed to detect whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify images as either safe or not safe for work and returns a confidence score for the NSFW classification.
Using the score, you can add logical filters to your workflow. A threshold of 0.95 works well for most cases.
While this node can be used to quickly check the NSFW score of an image, it is most useful when running a workflow as an API. Using this node's output, you can programmatically filter out NSFW images and apply dynamic thresholds.
The output is a list of scores, one per image in the batch.
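As an illustration, here is a minimal sketch of how the scores might be used for programmatic filtering. The function name `filter_safe_images` and the variables `images`, `scores`, and `threshold` are illustrative assumptions, not part of this node's API; the node simply returns the list of NSFW scores.

```python
# Minimal sketch: filter a batch of generated images by their NSFW scores.
# `images` and `scores` are assumed to come from an API-driven workflow run,
# where scores[i] is the NSFW confidence this node returned for images[i].

def filter_safe_images(images, scores, threshold=0.95):
    """Keep only images whose NSFW score is below the threshold."""
    return [img for img, score in zip(images, scores) if score < threshold]

# Example usage with a dynamic, per-request threshold:
# safe = filter_safe_images(batch_images, nsfw_scores, threshold=request_threshold)
```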
This project is based on ComfyUI-NSFW-Detection by trumanwong.
Clone this repo into the custom_nodes directory of your ComfyUI installation.
Run pip install -r requirements.txt to install the dependencies.
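A typical install sequence looks like the following. The repository URL and the cloned folder name are placeholders; substitute this repo's actual URL and directory.

```bash
# Run from the root of your ComfyUI installation (adjust the path to your setup)
cd custom_nodes
git clone <this-repo-url>          # placeholder: use this repository's URL
cd <cloned-repo-directory>         # placeholder: the folder created by the clone
pip install -r requirements.txt    # install the Python dependencies
```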