ComfyUI Extension: ComfyUI-OpenVINO
ComfyUI nodes for OpenVINO.
OpenVINO Node for ComfyUI
This node optimizes model-inference performance in ComfyUI by leveraging the Intel OpenVINO toolkit.
Supported Hardware
This node supports running models on Intel CPU, GPU, and NPU devices. You can find more detailed information in the OpenVINO System Requirements.
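Because several device types may be present on one machine, a node like this typically has to choose which one to run on. The sketch below is a hypothetical helper (not part of this extension) showing one way to pick a device by preference; in a real setup the list of devices would come from `openvino.Core().available_devices`, and the NPU > GPU > CPU preference order is an assumption.

```python
# Hypothetical helper: pick the best available OpenVINO device by preference.
# In practice the `available` list would come from openvino.Core().available_devices,
# e.g. ["CPU", "GPU", "NPU"]; the preference order below is an assumption.
def pick_device(available, preferred=("NPU", "GPU", "CPU")):
    for device in preferred:
        if device in available:
            return device
    raise RuntimeError("no supported OpenVINO device found")

print(pick_device(["CPU", "GPU"]))  # -> GPU
```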
Install
Prerequisites
- Install comfy-cli
The recommended installation method is to use the Comfy Registry.
Comfy Registry
This node can be installed via the Comfy Registry:
comfy node registry-install comfyui-openvino
ComfyUI-Manager
This node can be installed via ComfyUI-Manager in the UI or via the CLI:
comfy node install comfyui-openvino
Manual
This node can also be installed manually by cloning the repository into your custom_nodes
folder and then installing its dependencies:
cd ComfyUI/custom_nodes
git clone https://github.com/openvino-dev-samples/comfyui_openvino
cd comfyui_openvino
pip install -r requirements.txt
Instructions
To use the OpenVINO node in ComfyUI, follow these steps:
- Start a ComfyUI server.
  cd ComfyUI
  python3 main.py --cpu --use-pytorch-cross-attention
- Prepare a standard workflow in ComfyUI.
- Add the OpenVINO node.
- Connect the OpenVINO node to the Model/LoRA Loader.
- Run the workflow. Note that an additional warm-up inference may be needed after switching to a new model.