ComfyUI LLM SDXL Adapter
A comprehensive set of ComfyUI nodes for using Large Language Models (LLM) as text encoders for SDXL image generation through a trained adapter.
(workflow screenshot: https://github.com/user-attachments/assets/e8e5f047-37e7-4f8b-9bbd-78d70e2a7d80)
🎯 Available Adapters
RouWei-Gemma Adapter
A trained adapter that lets Gemma-3-1b serve as the text encoder for Rouwei v0.8 (vpred, epsilon, or base).
Download Links:
📦 Installation
Requirements
- Python 3.8+
- ComfyUI
- Latest transformers library (tested on 4.53.1)
Install Dependencies
pip install "transformers>=4.53.1" safetensors einops torch
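If you want to confirm the environment matches the tested setup, a minimal check like the one below (run inside the Python environment ComfyUI uses) verifies the transformers version; the 4.53.1 threshold is the tested version listed above.
# Minimal sketch: confirm transformers meets the tested minimum version.
from packaging.version import Version  # packaging is pulled in by transformers' dependencies
import transformers

assert Version(transformers.__version__) >= Version("4.53.1"), \
    f"transformers {transformers.__version__} found; 4.53.1+ is recommended"
print("transformers", transformers.__version__, "OK")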
Install Nodes
- Clone the repository to ComfyUI/custom_nodes/:
cd ComfyUI/custom_nodes/
git clone https://github.com/NeuroSenko/ComfyUI_LLM_SDXL_Adapter.git
- Restart ComfyUI
Setup RouWei-Gemma Adapter
- Download the adapter:
  - Download from CivitAI or HuggingFace
  - Place the adapter file in ComfyUI/models/llm_adapters/ (an optional integrity check is sketched after this list)
- Download the Gemma-3-1b-it model:
  - Download gemma-3-1b-it (non-gated mirror)
  - Place it in ComfyUI/models/llm/gemma-3-1b-it/
  - Note: you need ALL files from the original model for proper functionality, not just the .safetensors weights
- Download a Rouwei checkpoint:
  - Get Rouwei v0.8 (vpred, epsilon, or base) if you don't have it
  - Place it in your regular ComfyUI checkpoints folder
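If you want to confirm the adapter file downloaded intact, you can list the tensors it stores with the safetensors library. This is only an illustrative sketch; the path below assumes the rouweiGemma_g31b27k.safetensors file placed as shown in the file-structure example that follows.
# Minimal sketch: list the tensors stored in the downloaded adapter file.
# Assumes the layout from the file-structure example below.
from safetensors import safe_open

adapter_path = "ComfyUI/models/llm_adapters/rouweiGemma_g31b27k.safetensors"

with safe_open(adapter_path, framework="pt", device="cpu") as f:
    keys = list(f.keys())
    print(f"{len(keys)} tensors in adapter")
    for name in keys[:5]:
        print(" ", name, tuple(f.get_tensor(name).shape))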
📁 File Structure Example
ComfyUI/models/
├── llm/gemma-3-1b-it/
│   ├── added_tokens.json
│   ├── config.json
│   ├── generation_config.json
│   ├── model.safetensors
│   ├── special_tokens_map.json
│   ├── tokenizer.json
│   ├── tokenizer.model
│   └── tokenizer_config.json
├── llm_adapters/
│   └── rouweiGemma_g31b27k.safetensors
└── checkpoints/
    └── rouwei_v0.8_vpred.safetensors
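To check that the gemma-3-1b-it folder really contains all required files (tokenizer and config, not just the weights), a quick offline load works as a smoke test. This is a sketch assuming the path from the layout above; it only loads the tokenizer and config, not the full model.
# Minimal sketch: verify the local gemma-3-1b-it folder loads without
# reaching the Hugging Face Hub (i.e. all tokenizer/config files are present).
from transformers import AutoConfig, AutoTokenizer

model_dir = "ComfyUI/models/llm/gemma-3-1b-it"

config = AutoConfig.from_pretrained(model_dir, local_files_only=True)
tokenizer = AutoTokenizer.from_pretrained(model_dir, local_files_only=True)

print("model type:", config.model_type)
print("vocabulary size:", tokenizer.vocab_size)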
🐛 Debugging
To enable detailed logging, edit __init__.py:
# Change from:
logger.setLevel(logging.WARN)
# To:
logger.setLevel(logging.INFO)