ComfyUI Extension: ComfyUI-OpenVINO

Authored by openvino-dev-samples


ComfyUI nodes for OpenVINO.


    OpenVINO Node for ComfyUI

    This node optimizes model inference performance in ComfyUI by leveraging the Intel OpenVINO toolkit.


    Supported Hardware

    This node supports running models on Intel CPU, GPU, and NPU devices. You can find more detailed information in the OpenVINO System Requirements.
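    To check which of these devices OpenVINO can actually see on your machine, you can query its Python API. A minimal sketch, guarded so it also runs where OpenVINO is not installed:

    ```python
    import importlib.util

    # OpenVINO exposes the detected devices via ov.Core().available_devices.
    # Guard the import so this snippet degrades gracefully without OpenVINO.
    if importlib.util.find_spec("openvino"):
        import openvino as ov
        devices = ov.Core().available_devices  # e.g. ["CPU", "GPU"]
    else:
        devices = []

    print(devices)
    ```

    Devices listed here (such as "CPU", "GPU", or "NPU") are the values the node can target.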

    Install

    Prerequisites

    The recommended installation method is to use the Comfy Registry.

    Comfy Registry

    These nodes can be installed via the Comfy Registry.

    comfy node registry-install comfyui-openvino
    

    ComfyUI-Manager

    This node can be installed via ComfyUI-Manager in the UI or via the CLI:

    comfy node install comfyui-openvino
    

    Manual

    This node can also be installed manually by cloning the repository into your custom_nodes folder and installing its dependencies:

    cd ComfyUI/custom_nodes
    git clone https://github.com/openvino-dev-samples/comfyui_openvino 
    cd comfyui_openvino
    pip install -r requirements.txt
    

    Instructions

    To use the OpenVINO node in ComfyUI, you can follow this example as a reference:

    1. Start a ComfyUI server.

      cd ComfyUI
      python3 main.py --cpu --use-pytorch-cross-attention
      
    2. Prepare a standard workflow in ComfyUI.

      Step 1

    3. Add OpenVINO Node.

      Step 2

    4. Connect OpenVINO Node with Model/LoRa Loader.

      Step 3

    5. Run the workflow. Note that an additional warm-up inference may be needed after switching to a new model.

      Step 4
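    For context on step 3, a ComfyUI custom node is a Python class registered through NODE_CLASS_MAPPINGS. The sketch below illustrates that convention only; the class name, inputs, and logic are hypothetical placeholders, not this extension's actual code:

    ```python
    # Hypothetical sketch of the ComfyUI custom-node convention; the real
    # node's class, inputs, and logic live in this repository's source.
    class OpenVINONodeSketch:
        @classmethod
        def INPUT_TYPES(cls):
            # ComfyUI reads this to draw the node's input sockets and widgets
            return {
                "required": {
                    "model": ("MODEL",),
                    "device": (["CPU", "GPU", "NPU"],),
                }
            }

        RETURN_TYPES = ("MODEL",)   # socket types this node outputs
        FUNCTION = "optimize"       # method ComfyUI calls when the node runs
        CATEGORY = "OpenVINO"

        def optimize(self, model, device):
            # a real implementation would convert and compile the model
            # with OpenVINO for the chosen device before returning it
            return (model,)

    # ComfyUI discovers nodes through this mapping at startup
    NODE_CLASS_MAPPINGS = {"OpenVINO Sketch (example)": OpenVINONodeSketch}
    ```

    Connecting the node between a Model/LoRA loader and a sampler, as in step 4, simply routes the MODEL value through the optimize method above.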