ComfyUI Node: Download Huggingface Model | Ollama Nodes

Authored by slyt


Category

LLM/Ollama

Inputs

repo_id STRING
filename STRING
local_dir STRING

Outputs

STRING
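The three inputs map naturally onto a Hugging Face Hub file download. The sketch below is illustrative only (not the node's actual implementation), showing how `repo_id`, `filename`, and `local_dir` combine into a download URL and the local path that a node like this would emit on its STRING output; the repo and filename values are hypothetical examples.

```python
import os

def plan_download(repo_id: str, filename: str, local_dir: str):
    # Hub files resolve at huggingface.co/<repo_id>/resolve/<revision>/<filename>
    url = f"https://huggingface.co/{repo_id}/resolve/main/{filename}"
    # The local destination; a path like this is what the STRING output carries
    dest = os.path.join(local_dir, filename)
    return url, dest

url, dest = plan_download(
    "TheBloke/Llama-2-7B-GGUF",       # example repo_id
    "llama-2-7b.Q4_K_M.gguf",         # example filename
    "models/llama",                   # example local_dir
)
print(url)
print(dest)
```

In practice the `huggingface_hub` library's `hf_hub_download(repo_id=..., filename=..., local_dir=...)` performs this download and returns the local file path, which matches the node's input and output signature.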

Extension: comfyui-ollama-nodes

ComfyUI custom nodes for working with Ollama. NOTE: Assumes that an Ollama server is running at http://127.0.0.1:11434 and accessible by the ComfyUI backend.
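To verify that assumption before running a workflow, you can probe the default endpoint; a running Ollama server answers a plain GET on its root URL. A minimal stdlib-only sketch (the default address comes from the note above):

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str = "http://127.0.0.1:11434", timeout: float = 2.0) -> bool:
    # Returns True if an Ollama server responds at base_url, False otherwise.
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(ollama_reachable())
```

If this prints False, start Ollama (e.g. `ollama serve`) before loading the workflow.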

