ComfyUI Node: ChatGPT & Local LLM ♾️Mixlab
Category
♾️Mixlab/GPT
Inputs
- prompt (STRING)
- system_content (STRING)
- model (choice):
  - gpt-3.5-turbo
  - gpt-3.5-turbo-16k
  - gpt-4o
  - gpt-4o-2024-05-13
  - gpt-4
  - gpt-4-0314
  - gpt-4-0613
  - gpt-3.5-turbo-0301
  - gpt-3.5-turbo-0613
  - gpt-3.5-turbo-16k-0613
  - qwen-turbo
  - qwen-plus
  - qwen-long
  - qwen-max
  - qwen-max-longcontext
  - glm-4
  - glm-3-turbo
  - moonshot-v1-8k
  - moonshot-v1-32k
  - moonshot-v1-128k
  - deepseek-chat
  - Qwen/Qwen2-7B-Instruct
  - THUDM/glm-4-9b-chat
  - 01-ai/Yi-1.5-9B-Chat-16K
  - meta-llama/Meta-Llama-3.1-8B-Instruct
- seed (INT)
- context_size (INT)
- api_url (choice):
  - openai
  - api2d
  - Kimi
  - DeepSeek-V2
  - SiliconCloud
- api_key (STRING)
- custom_model_name (STRING)
- custom_api_url (STRING)
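
The listed api_url presets and the custom_api_url/custom_model_name inputs generally point at OpenAI-compatible chat completion endpoints, so the inputs map naturally onto a standard chat request. Below is a minimal sketch of that mapping, assuming the official openai Python client; the key, base URL, and prompt text are placeholders, and the node's actual request handling may differ.

```python
# Hypothetical sketch of the request the node's inputs correspond to.
# The key, URL, and message text are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                      # api_key input
    base_url="https://api.openai.com/v1",  # selected via api_url or custom_api_url
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",                 # model or custom_model_name input
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},       # system_content
        {"role": "user", "content": "Write a prompt for a sunset scene."},   # prompt
    ],
    seed=42,                               # seed input (honored by some backends)
)

print(response.choices[0].message.content)
```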
Outputs
- STRING
- STRING
- STRING
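
The node returns three STRING outputs, and in multi-turn use a rolling conversation history is typically kept between calls. The sketch below shows one way such a history could be trimmed before each request; treating context_size as the number of retained non-system messages is an assumption, and trim_history is a hypothetical helper, not the node's own code.

```python
# Hypothetical helper: keep the system message plus the most recent
# `context_size` conversation messages before sending the next request.
def trim_history(messages, context_size):
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-context_size:]

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Describe a forest at dawn."},
    {"role": "assistant", "content": "Mist drifts between the pines..."},
    {"role": "user", "content": "Now make it a night scene."},
]

# Only the system message and the latest 2 conversation messages are sent.
messages = trim_history(history, context_size=2)
```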
Extension: comfyui-mixlab-nodes
Provides 3D, ScreenShareNode & FloatingVideoNode, SpeechRecognition & SpeechSynthesis, GPT, LoadImagesFromLocal, Layers, and other nodes.
Authored by shadowcz007