ComfyUI Node: Omost LLM HTTP Server
Category
omost
Inputs
- address (STRING)
- api_type
  - OpenAI
  - TGI
Outputs
- OMOST_LLM
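The node takes a server address and a backend API type and returns an OMOST_LLM handle for downstream Omost nodes. Below is a minimal sketch of how a node with this interface could be declared using ComfyUI's standard custom-node conventions; the class name, function name, and the dictionary used as the OMOST_LLM payload are illustrative assumptions, not the extension's actual source.

```python
# Sketch of a ComfyUI node matching the inputs/outputs listed above.
# Names and the OMOST_LLM payload shape are assumptions for illustration.

class OmostLLMHTTPServerNode:
    CATEGORY = "omost"
    RETURN_TYPES = ("OMOST_LLM",)
    FUNCTION = "connect"

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                # Base URL of the HTTP endpoint serving the LLM.
                "address": ("STRING", {"default": "http://localhost:8080"}),
                # Backend protocol: OpenAI-compatible API or Text Generation Inference.
                "api_type": (["OpenAI", "TGI"],),
            }
        }

    def connect(self, address: str, api_type: str):
        # Hypothetical handle holding the connection details; the real
        # extension defines its own OMOST_LLM representation.
        llm = {"address": address, "api_type": api_type}
        return (llm,)


# Standard ComfyUI registration mappings.
NODE_CLASS_MAPPINGS = {"OmostLLMHTTPServerNode": OmostLLMHTTPServerNode}
NODE_DISPLAY_NAME_MAPPINGS = {"OmostLLMHTTPServerNode": "Omost LLM HTTP Server"}
```

Typically the address points at a running OpenAI-compatible or Text Generation Inference (TGI) endpoint serving the Omost LLM, and the OMOST_LLM output is wired into the downstream Omost nodes in the same workflow.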
Extension: ComfyUI_omost
ComfyUI implementation of Omost and everything about regional prompting. NOTE: You need to install ComfyUI_densediffusion to use this node.
Authored by huchenlei