NSFWScore
Output type: FLOAT
This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). A machine learning model classifies each image as safe or not safe for work and returns a confidence score for the NSFW classification.
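The node's interface can be sketched as a standard ComfyUI node class. This is a minimal skeleton, not the project's actual code: the input/output names, category string, and the stub `_classify` method are all assumptions, and a real implementation would call the underlying ML model instead of the placeholder shown here.

```python
class NSFWScore:
    """Sketch of a ComfyUI-style node that returns an NSFW confidence score."""

    @classmethod
    def INPUT_TYPES(cls):
        # ComfyUI convention: declare required inputs by type.
        return {"required": {"image": ("IMAGE",)}}

    RETURN_TYPES = ("FLOAT",)       # matches the FLOAT output in the listing
    RETURN_NAMES = ("nsfw_score",)  # assumed output name
    FUNCTION = "score"
    CATEGORY = "image/analysis"     # assumed category

    def score(self, image):
        # ComfyUI nodes return a tuple matching RETURN_TYPES.
        confidence = self._classify(image)
        return (confidence,)

    def _classify(self, image):
        # Placeholder standing in for the project's ML classifier;
        # a real version would run inference and return a probability in [0, 1].
        return 0.0
```

In a workflow, the FLOAT output could then be wired into downstream nodes (for example, a gate that blocks saving when the score exceeds a threshold).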
Authored by iamandeepsandhu