ComfyUI Node: LLM Model (API)
Category
YALLM
Inputs
- model — one of the configured providers:
  - [LOCAL] llama.cpp
  - [LOCAL] LM Studio
  - [LOCAL] Ollama
Outputs
- LLMMODEL
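The local providers listed above all expose OpenAI-compatible HTTP endpoints, which is what "OpenAI-like APIs" refers to below. As a rough sketch of what such a request looks like, the snippet below builds a chat-completion request for each provider. The base URLs use common default ports (llama.cpp server 8080, LM Studio 1234, Ollama 11434), but these are assumptions; your local setup may differ, and `build_chat_request` is a hypothetical helper, not part of this extension.

```python
import json

# Assumed default base URLs for the local providers; adjust to your setup.
PROVIDER_URLS = {
    "llama.cpp": "http://localhost:8080/v1",
    "LM Studio": "http://localhost:1234/v1",
    "Ollama": "http://localhost:11434/v1",
}

def build_chat_request(provider: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build an OpenAI-style chat-completion URL and JSON body for a local provider."""
    url = f"{PROVIDER_URLS[provider]}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload).encode("utf-8")

url, body = build_chat_request("Ollama", "llama3", "Hello!")
print(url)  # http://localhost:11434/v1/chat/completions
```

Sending `body` via POST to `url` (with `Content-Type: application/json`) is all that an OpenAI-compatible client does; the node abstracts this behind the LLMMODEL output.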
Extension: ComfyUI-YALLM-node
Yet another set of LLM nodes for ComfyUI, targeting local or remote OpenAI-like APIs; multi-modal models are supported.
Authored by asaddi