ComfyUI Rating Checker
This is a custom node for ComfyUI that classifies images into NSFW (Not Safe For Work) rating categories.
Features
This node is designed specifically for classifying NSFW content in illustrated images.
Existing similar nodes perform well on photos of real people, but they tend to classify anime-style illustrations as NSFW even when there is minimal nudity, and they make it difficult to distinguish between the R15 and R18 categories.
To address these issues, Rating Checker (NudeNet) combines an object detection model (NudeNet) with an NSFW classification model to categorize images into three labels: `SFW`, `NSFW (R15)`, and `NSFW (R18)`.
Installation
ComfyUI Manager
Search for `ComfyUI Rating Checker`.
Manual
Clone the repository into your `custom_nodes` directory:

```
git clone https://github.com/tighug/comfyui-rating-checker.git
```
Usage
This package includes three nodes for NSFW rating. The primary node is the NudeNet version; the other two, created during evaluation, are also included.
Rating Checker (NudeNet)
Classifies images into the following three labels based on these conditions (see the code sketch below):
- `nsfw_r18`: At least one of the following body parts is detected with `detect_[body part] = True`:
  - armpits
  - female_breast
  - male_breast
  - female_genitalia
  - male_genitalia
  - belly
  - buttocks
  - anus
  - feet
- `nsfw_r15`: Not `nsfw_r18`, but `nsfw_score > threshold_nsfw`
- `sfw`: Does not meet any of the above conditions
Models used: NudeNet for body-part detection, plus an NSFW classification model for scoring.
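As a rough illustration of how these conditions combine, here is a minimal Python sketch of the decision logic. The function name, argument names, and flag handling are hypothetical and are not taken from the node's actual code:

```python
# Minimal sketch of the three-label decision described above.
# All names here are hypothetical; the real node may differ.
R18_PARTS = [
    "armpits", "female_breast", "male_breast", "female_genitalia",
    "male_genitalia", "belly", "buttocks", "anus", "feet",
]

def rate_image(detected_parts: set[str], nsfw_score: float,
               threshold_nsfw: float, detect_flags: dict[str, bool]) -> str:
    """Classify one image as sfw / nsfw_r15 / nsfw_r18.

    detected_parts: body-part labels returned by NudeNet detection
    nsfw_score:     score from the NSFW classification model
    detect_flags:   the node's detect_[body part] boolean inputs
    """
    # nsfw_r18: an R18 body part was detected and its detect flag is on
    if any(part in detected_parts and detect_flags.get(part, False)
           for part in R18_PARTS):
        return "nsfw_r18"
    # nsfw_r15: not R18, but the classifier score exceeds the threshold
    if nsfw_score > threshold_nsfw:
        return "nsfw_r15"
    # sfw: neither condition met
    return "sfw"
```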
Rating Checker (GantMan)
Classifies images into the following five labels:
- `drawings`: Illustrations
- `hentai`: Anime- or manga-style content
- `neutral`: General-purpose images
- `porn`: Real-world explicit content
- `sexy`: Images with a sexual vibe
Useful for distinguishing real photos from illustrations, as well as for general NSFW classification.
Model used: GantMan's nsfw_model.
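Since the node reports a score per label, a downstream workflow can collapse the five labels into a coarse SFW/NSFW decision. The grouping below (drawings/neutral as SFW; hentai/porn/sexy as NSFW) is just one possible policy, and the helper is hypothetical, not part of the node:

```python
# Hypothetical helper: collapse the five GantMan labels into a coarse
# SFW/NSFW decision. The grouping is an example policy, not the node's
# own behavior.
SFW_LABELS = {"drawings", "neutral"}

def coarse_rating(label_scores: dict[str, float]) -> str:
    """label_scores maps each of the five labels to its probability."""
    top_label = max(label_scores, key=label_scores.get)
    return "sfw" if top_label in SFW_LABELS else "nsfw"

# Example: a typical anime illustration might score highest on "drawings".
print(coarse_rating({"drawings": 0.7, "hentai": 0.1, "neutral": 0.1,
                     "porn": 0.05, "sexy": 0.05}))  # -> "sfw"
```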
Rating Checker (Marqo)
Calculates an NSFW score from the image and outputs it as `scores`. It also performs binary classification (`sfw` / `nsfw`) using `threshold_nsfw`, and outputs the result as `ratings`.
Model used: Marqo's NSFW image detection model.
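The binary classification described above is a simple threshold comparison. A minimal sketch, with hypothetical names and an assumed default threshold value:

```python
# Sketch of the thresholding step; names and the 0.5 default are
# assumptions, not taken from the node's code.
def rate_batch(scores: list[float], threshold_nsfw: float = 0.5) -> list[str]:
    """Map each NSFW score to a binary rating using threshold_nsfw."""
    return ["nsfw" if s > threshold_nsfw else "sfw" for s in scores]

print(rate_batch([0.12, 0.87]))  # -> ['sfw', 'nsfw']
```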