NSFW Check for ComfyUI

Project Overview

This project detects whether images generated by ComfyUI are Not Safe For Work (NSFW). It uses a machine learning model to classify each image as safe or NSFW and returns a confidence score for the NSFW classification.

Using this score, you can add logical filters to your workflow.

A threshold of 0.95 works well for most cases.
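For example, here is a minimal Python sketch of such a filter; the names `images` and `nsfw_scores` are hypothetical stand-ins for a batch of images and this node's matching list of scores:

```python
NSFW_THRESHOLD = 0.95  # the threshold suggested above

def filter_safe(images, nsfw_scores, threshold=NSFW_THRESHOLD):
    """Keep only the images whose NSFW confidence is below the threshold.

    `images` and `nsfw_scores` are hypothetical names for a batch of
    images and this node's per-image score list, in the same order.
    """
    return [img for img, score in zip(images, nsfw_scores) if score < threshold]
```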


Usage

While you can use this node to quickly check the NSFW score of an image, it is most useful when running a workflow as an API. Using the output from this node, you can programmatically filter out NSFW images and apply dynamic thresholds.

The output is a list of scores, one for each image in the batch; see the sketch below.
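As a rough sketch of the API-side pattern: the `/prompt` submission below follows ComfyUI's standard HTTP API, and the filtering helper shows how a per-request (dynamic) threshold could be applied to this node's score list. How you extract the scores from the finished job depends on how your graph is wired, so that step is left out.

```python
import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188"  # default address of a local ComfyUI server

def queue_prompt(workflow: dict) -> str:
    """Submit a workflow graph to ComfyUI's /prompt endpoint and return the prompt id."""
    data = json.dumps({"prompt": workflow}).encode("utf-8")
    req = urllib.request.Request(
        f"{COMFYUI_URL}/prompt",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["prompt_id"]

def safe_indices(nsfw_scores: list[float], threshold: float) -> list[int]:
    """Return the batch indices that pass the NSFW check.

    Because `threshold` is an argument, each API request can apply
    its own (dynamic) threshold to this node's score list.
    """
    return [i for i, score in enumerate(nsfw_scores) if score < threshold]
```

For instance, `safe_indices([0.02, 0.98, 0.40], threshold=0.95)` returns `[0, 2]`, keeping the first and third images of the batch.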

Credits

This project is based on ComfyUI-NSFW-Detection by trumanwong.

Install

1. Clone this repo into the custom_nodes directory of your ComfyUI installation.

2. Run pip install -r requirements.txt inside the cloned directory.