An implementation of NSFW Detection for ComfyUI
This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). A machine learning model classifies each image as either safe or NSFW; if an image is classified as NSFW, a configurable alternative image is returned instead.
Clone this repo into the `custom_nodes` directory of your ComfyUI installation, then run `pip install -r requirements.txt` from the cloned directory.
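For example, assuming a standard ComfyUI layout (adjust paths to your setup):

```bash
# Run from the root of your ComfyUI installation.
cd custom_nodes
git clone https://github.com/trumanwong/ComfyUI-NSFW-Detection.git
cd ComfyUI-NSFW-Detection
pip install -r requirements.txt
```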
The main functionality of the project is encapsulated in the `NSFWDetection` class in the `node.py` file. This class has a `run` method that takes three parameters:

- `image`: The image to be classified.
- `score`: The threshold score above which an image is treated as NSFW.
- `alternative_image`: The image to return if the input image is classified as NSFW.

An example workflow is available at https://github.com/trumanwong/ComfyUI-NSFW-Detection/blob/main/workflow.json, and a sketch of the `run` logic follows below.
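The following is a minimal, hypothetical sketch of what `run` does, assuming a Hugging Face image-classification pipeline; the model name and the PIL-based interface are assumptions for illustration, not necessarily what `node.py` actually uses:

```python
# Hypothetical sketch of the run logic (illustrative, not the project's code).
# Requires: pip install transformers torch pillow
from PIL import Image
from transformers import pipeline

# Assumption: an off-the-shelf NSFW classifier from the Hugging Face Hub.
classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

def run(image: Image.Image, score: float, alternative_image: Image.Image) -> Image.Image:
    """Return alternative_image when image's NSFW score exceeds the threshold."""
    predictions = classifier(image)  # e.g. [{"label": "nsfw", "score": 0.97}, ...]
    nsfw_score = next(
        (p["score"] for p in predictions if p["label"].lower() == "nsfw"), 0.0
    )
    return alternative_image if nsfw_score > score else image
```

Note that ComfyUI nodes pass images around as tensors rather than PIL images, so the real node additionally converts between the two formats.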
Contributions are welcome. Please submit a pull request if you have any improvements or bug fixes.
This project is licensed under the terms of the MIT license.