ComfyUI Node: Nunchaku FLUX DiT Loader

Authored by nunchaku-tech


Category

Nunchaku

Inputs

model_path
cache_threshold (FLOAT)
attention
    • nunchaku-fp16
    • flash-attention2
cpu_offload
    • auto
    • enable
    • disable
device_id (INT)
data_type
    • bfloat16
    • float16
i2f_mode
    • enabled
    • always

Outputs

MODEL
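
For orientation, here is a minimal sketch of how a loader node with these inputs and this single MODEL output is typically declared in ComfyUI. Only the input names, option lists, and category above come from this page; the class name, method name, widget types, and default values are illustrative assumptions, not the extension's actual code.

```python
# Illustrative sketch only: a ComfyUI-style node declaration mirroring the
# inputs/outputs listed above. Class name, function name, and defaults are
# assumptions, not the real ComfyUI-nunchaku implementation.
class NunchakuFluxDiTLoaderSketch:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "model_path": ("STRING", {"default": ""}),  # real node likely offers a dropdown of local models
                "cache_threshold": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 1.0}),
                "attention": (["nunchaku-fp16", "flash-attention2"],),
                "cpu_offload": (["auto", "enable", "disable"],),
                "device_id": ("INT", {"default": 0, "min": 0}),
                "data_type": (["bfloat16", "float16"],),
                "i2f_mode": (["enabled", "always"],),
            }
        }

    RETURN_TYPES = ("MODEL",)  # single MODEL output, as listed above
    FUNCTION = "load_model"
    CATEGORY = "Nunchaku"

    def load_model(self, model_path, cache_threshold, attention,
                   cpu_offload, device_id, data_type, i2f_mode):
        # The real node loads an SVDQuant 4-bit FLUX DiT through the nunchaku
        # runtime; this placeholder only shows the expected signature.
        raise NotImplementedError("sketch only")
```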

Extension: ComfyUI-nunchaku

ComfyUI nodes for Nunchaku. Nunchaku is the inference engine that supports SVDQuant. SVDQuant is a post-training quantization paradigm for diffusion models that quantizes both the weights and activations of FLUX.1 to 4 bits, achieving a 3.5× memory reduction and an 8.7× latency reduction on a 16GB laptop 4090 GPU. See more details: https://github.com/mit-han-lab/nunchaku
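
As a usage illustration, in a ComfyUI API-format workflow the loader's MODEL output would feed a downstream sampler. The node type string, model path, and all values below are assumptions made for illustration, not values taken from this page.

```python
# Hypothetical API-format workflow fragment (Python dict form). The node type
# string "NunchakuFluxDiTLoader", the model path, and the wiring are assumptions.
workflow = {
    "1": {
        "class_type": "NunchakuFluxDiTLoader",  # assumed node identifier
        "inputs": {
            "model_path": "svdq-int4-flux.1-dev",  # hypothetical model folder name
            "cache_threshold": 0.0,
            "attention": "nunchaku-fp16",
            "cpu_offload": "auto",
            "device_id": 0,
            "data_type": "bfloat16",
            "i2f_mode": "enabled",
        },
    },
    "2": {
        "class_type": "KSampler",
        "inputs": {
            "model": ["1", 0],  # MODEL output of the loader node above
            # ...other sampler inputs omitted for brevity
        },
    },
}
```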

