ComfyUI Node: Omost LLM Chat

Authored by huchenlei


434 stars

Category

omost

Inputs

llm OMOST_LLM
text STRING
max_new_tokens INT
top_p FLOAT
temperature FLOAT
seed INT
conversation OMOST_CONVERSATION

Outputs

OMOST_CONVERSATION

OMOST_CANVAS_CONDITIONING
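The inputs and outputs above map onto a standard ComfyUI node interface. The following is a minimal sketch of how such a node could be declared — the class name, defaults, ranges, and the choice to make `conversation` optional are assumptions for illustration, not the actual ComfyUI_omost source:

```python
# Hypothetical sketch of the Omost LLM Chat node interface (assumed names
# and defaults; not the actual ComfyUI_omost implementation).
class OmostLLMChatNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "llm": ("OMOST_LLM",),
                "text": ("STRING", {"multiline": True}),
                "max_new_tokens": ("INT", {"default": 4096, "min": 128, "max": 4096}),
                "top_p": ("FLOAT", {"default": 0.9, "min": 0.0, "max": 1.0}),
                "temperature": ("FLOAT", {"default": 0.6, "min": 0.0, "max": 2.0}),
                "seed": ("INT", {"default": 0, "min": 0}),
            },
            # Passing an existing conversation continues a multi-turn chat;
            # treated as optional here (an assumption).
            "optional": {
                "conversation": ("OMOST_CONVERSATION",),
            },
        }

    RETURN_TYPES = ("OMOST_CONVERSATION", "OMOST_CANVAS_CONDITIONING")
    FUNCTION = "run_llm"
    CATEGORY = "omost"
```

ComfyUI reads `INPUT_TYPES`, `RETURN_TYPES`, `FUNCTION`, and `CATEGORY` to build the node's UI sockets; the sampling parameters (`max_new_tokens`, `top_p`, `temperature`, `seed`) are forwarded to the LLM's text generation call.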

Extension: ComfyUI_omost

ComfyUI implementation of Omost, covering everything about regional prompts. NOTE: You need to install ComfyUI_densediffusion to use this node.

