ComfyUI Node: Pull Model | Ollama Nodes

Authored by slyt


Category

LLM/Ollama

Inputs

model_name STRING
stream BOOLEAN

Outputs

STRING

STRING
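Given its inputs, the node presumably issues a pull request to the Ollama server's `/api/pull` endpoint. Below is a minimal sketch of that call, assuming the default server address stated by the extension; the function names (`build_pull_payload`, `pull_model`) are hypothetical, not part of the node's actual code.

```python
import json
import urllib.request

# Default server address assumed by the extension (see the extension notes).
OLLAMA_URL = "http://127.0.0.1:11434"

def build_pull_payload(model_name: str, stream: bool) -> dict:
    # Mirrors the node's two inputs: model_name (STRING) and stream (BOOLEAN).
    return {"model": model_name, "stream": stream}

def pull_model(model_name: str, stream: bool = False) -> str:
    # POST /api/pull asks the Ollama server to download the named model;
    # with stream=False the server replies with a single JSON status object,
    # with stream=True it sends a stream of JSON progress lines.
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=json.dumps(build_pull_payload(model_name, stream)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

A model name such as `"llama3"` would be passed straight through from the node's `model_name` input.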

Extension: comfyui-ollama-nodes

ComfyUI custom nodes for working with Ollama. NOTE: Assumes that an Ollama server is running at http://127.0.0.1:11434 and is accessible by the ComfyUI backend.
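Since the extension assumes the server is already running at that address, it can help to verify reachability before building a workflow. A minimal check, assuming only the default address above (the helper name `ollama_reachable` is illustrative):

```python
import urllib.request

def ollama_reachable(base_url: str = "http://127.0.0.1:11434") -> bool:
    # A running Ollama server answers GET / with HTTP 200
    # (plain-text body "Ollama is running").
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False
```

If this returns False, start the server (e.g. `ollama serve`) before using the nodes.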

