ComfyUI Node: LLM Provider (API)
Category
YALLM
Inputs
- provider — which backend to connect to:
  - [LOCAL] llama.cpp
  - [LOCAL] LM Studio
  - [LOCAL] Ollama
- model — populated from the selected provider's model list
  - fetch.models.first
Outputs
- LLMMODEL
Extension: ComfyUI-YALLM-node
Yet another set of LLM nodes for ComfyUI, targeting local and remote OpenAI-like APIs; multi-modal models are supported.
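All three local providers listed above (llama.cpp server, LM Studio, Ollama) expose an OpenAI-compatible chat-completions endpoint, which is what an "OpenAI-like API" node ultimately talks to. As a rough sketch of the request body such a provider accepts — the base URL and model name below are hypothetical examples, not values from this extension:

```python
import json

# Hypothetical defaults for illustration: Ollama's OpenAI-compatible
# endpoint usually lives at http://localhost:11434/v1, and the model
# name depends on what you have pulled locally.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for an OpenAI-style /chat/completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("llama3", "Hello!")
print(json.dumps(payload, indent=2))
```

The same payload shape works against llama.cpp's server and LM Studio by changing only `BASE_URL`, which is why a single provider-selection node can cover all three backends.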
Authored by asaddi