ComfyUI Node: Unet Loader (GGUF)

Authored by city96

Category

bootleg

Inputs

unet_name

Outputs

MODEL

Extension: ComfyUI-GGUF

GGUF quantization support for native ComfyUI models. This is currently very much a work in progress. These custom nodes provide support for model files stored in the GGUF format popularized by llama.cpp. While quantization was not feasible for regular UNet models (which rely on conv2d layers), transformer/DiT models such as Flux appear much less affected by it. This makes it possible to run such models at much lower bits per weight, using variable-bitrate quants, on low-end GPUs.
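For readers unfamiliar with the container these nodes load, the sketch below parses the fixed-size header at the start of a GGUF file. The layout (little-endian magic `GGUF`, a uint32 version, then uint64 tensor and metadata key/value counts) is taken from the llama.cpp GGUF specification, not from this extension's own code; `parse_gguf_header` is a hypothetical helper for illustration, not part of ComfyUI-GGUF.

```python
import struct

# Assumed layout per the llama.cpp GGUF spec:
# 4-byte magic "GGUF", uint32 version, uint64 tensor count,
# uint64 metadata key/value count, all little-endian.
GGUF_MAGIC = b"GGUF"

def parse_gguf_header(data: bytes) -> dict:
    """Parse the fixed-size GGUF header from the first bytes of a file."""
    magic, version, n_tensors, n_kv = struct.unpack_from("<4sIQQ", data, 0)
    if magic != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    return {"version": version, "tensor_count": n_tensors, "kv_count": n_kv}

# Demonstration only: build a minimal fake header (version 3,
# 2 tensors, 5 metadata key/value pairs) and parse it back.
fake_header = struct.pack("<4sIQQ", GGUF_MAGIC, 3, 2, 5)
header = parse_gguf_header(fake_header)
print(header)  # {'version': 3, 'tensor_count': 2, 'kv_count': 5}
```

In practice you would read the first 24 bytes of a real `.gguf` checkpoint and pass them to the helper; the quantized tensor data and metadata follow after this header.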
