ComfyUI Node: List Models | Ollama Nodes
Category
LLM/Ollama
Inputs
delimiter STRING
Outputs
LIST
STRING
Extension: comfyui-ollama-nodes
ComfyUI custom nodes for working with Ollama. NOTE: Assumes that an Ollama server is running at http://127.0.0.1:11434 and is accessible by the ComfyUI backend.
Authored by slyt
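Example (sketch): the extension's source is not reproduced here, but given the category, inputs, and outputs listed above, a node like this can be outlined as follows. This is a minimal sketch assuming the Ollama `/api/tags` endpoint and the standard ComfyUI node conventions; the class name, default delimiter, and output order are illustrative, not taken from the actual comfyui-ollama-nodes code.

```python
import json
import urllib.request

# Assumed default Ollama endpoint, matching the note above.
OLLAMA_URL = "http://127.0.0.1:11434"

class ListModels:
    # Standard ComfyUI node interface: declare inputs, outputs, and category.
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "delimiter": ("STRING", {"default": ", "}),
            }
        }

    RETURN_TYPES = ("LIST", "STRING")
    FUNCTION = "list_models"
    CATEGORY = "LLM/Ollama"

    def list_models(self, delimiter):
        # Query the Ollama REST API for locally available models.
        with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
            data = json.load(resp)
        names = [model["name"] for model in data.get("models", [])]
        # Return both the raw list and a delimiter-joined string,
        # mirroring the LIST and STRING outputs listed above.
        return (names, delimiter.join(names))
```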