This project provides custom safety checkers for image classification using Falcons AI and CompVis models. The safety checkers are designed to detect and filter out NSFW content from images.
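For context, both checkers are standard Hugging Face models. As a rough illustration of what the Falcons AI detector does on its own (a minimal sketch using the `transformers` image-classification pipeline; the file name `example.png` is just a placeholder):

```python
from PIL import Image
from transformers import pipeline

# Load the NSFW detector; the model card labels outputs as "nsfw" / "normal".
classifier = pipeline(
    "image-classification",
    model="Falconsai/nsfw_image_detection",
)

scores = classifier(Image.open("example.png"))
print(scores)  # e.g. [{"label": "normal", "score": 0.99}, {"label": "nsfw", "score": 0.01}]
```

The custom nodes in this repository wrap this kind of classification so it can run inside a ComfyUI workflow.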
Clone the repository:

```bash
git clone https://github.com/shabri-arrahim/ComfyUI-Safety-Checker.git
cd ComfyUI-Safety-Checker
```

Install the required dependencies:

```bash
pip install -r requirements.txt
```
Download the models and place them in the `diffusers` folder inside the `models` directory:
Falcon AI model:

```bash
huggingface-cli download Falconsai/nsfw_image_detection --local-dir models/diffusers/Falconsai_nsfw_image_detection
```

CompVis model:

```bash
huggingface-cli download CompVis/stable-diffusion-safety-checker --local-dir models/diffusers/CompVis_stable_diffusion_safety_checker
```
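If you prefer to script the downloads instead of using the CLI, the same repositories can be fetched with `huggingface_hub.snapshot_download` (a sketch assuming the same target paths as above):

```python
from huggingface_hub import snapshot_download

# Fetch both model repositories into the ComfyUI models/diffusers folder.
snapshot_download(
    repo_id="Falconsai/nsfw_image_detection",
    local_dir="models/diffusers/Falconsai_nsfw_image_detection",
)
snapshot_download(
    repo_id="CompVis/stable-diffusion-safety-checker",
    local_dir="models/diffusers/CompVis_stable_diffusion_safety_checker",
)
```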
Ensure you have ComfyUI installed and set up. If not, follow the instructions on the ComfyUI GitHub page.
Place this cloned repository into the ComfyUI `custom_nodes` directory:

```bash
cp -r /path/to/ComfyUI-Safety-Checker /path/to/comfyui/custom_nodes/
```
Start ComfyUI:

```bash
cd /path/to/comfyui
python main.py
```
In the ComfyUI interface, you should now see the custom nodes `FalconsAISafetyChecker` and `CompVisSafetyChecker` available under the "SafetyChecker" category.
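The nodes appear under that category because of how ComfyUI discovers custom nodes: at startup it imports each package in `custom_nodes` and reads its `NODE_CLASS_MAPPINGS`. The skeleton below is a hypothetical sketch of that convention, not the actual classes in this repository (the real input/output signatures and filtering logic may differ):

```python
# Hypothetical sketch of the ComfyUI custom-node convention; the real
# classes in this repository may declare different inputs, outputs, and logic.

class FalconsAISafetyChecker:
    CATEGORY = "SafetyChecker"      # menu category shown in the ComfyUI UI
    RETURN_TYPES = ("IMAGE",)       # node outputs a batch of images
    FUNCTION = "check"              # method ComfyUI calls to execute the node

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"images": ("IMAGE",)}}

    def check(self, images):
        # Placeholder: classify each image and filter or mask flagged ones.
        return (images,)


# ComfyUI scans custom_nodes packages for these mappings at startup.
NODE_CLASS_MAPPINGS = {"FalconsAISafetyChecker": FalconsAISafetyChecker}
NODE_DISPLAY_NAME_MAPPINGS = {"FalconsAISafetyChecker": "Falcons AI Safety Checker"}
```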
Use these nodes in your ComfyUI workflows to filter images for NSFW content.
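For reference, the CompVis checker can also be exercised outside ComfyUI. The sketch below assumes the download path from the installation step and uses the `diffusers` `StableDiffusionSafetyChecker` together with the CLIP image processor Stable Diffusion pairs it with; `example.png` is a placeholder:

```python
import numpy as np
from PIL import Image
from diffusers.pipelines.stable_diffusion.safety_checker import (
    StableDiffusionSafetyChecker,
)
from transformers import CLIPImageProcessor

model_dir = "models/diffusers/CompVis_stable_diffusion_safety_checker"
safety_checker = StableDiffusionSafetyChecker.from_pretrained(model_dir)
feature_extractor = CLIPImageProcessor.from_pretrained("openai/clip-vit-large-patch14")

image = Image.open("example.png").convert("RGB")
clip_input = feature_extractor(images=image, return_tensors="pt")
np_images = np.array(image)[None, ...]  # batch of one HWC uint8 image

# Returns the (possibly blacked-out) images and a per-image NSFW flag.
checked_images, has_nsfw = safety_checker(
    images=np_images, clip_input=clip_input.pixel_values
)
print(has_nsfw)  # e.g. [False]
```

Flagged images are replaced with black frames, the same behavior Stable Diffusion pipelines use when their safety checker is enabled.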
This project is licensed under the MIT License.