ComfyUI Node: Generate Stable Diffusion Prompt With LLM
Category
LLM
Inputs
prompt STRING
template_system STRING
template_user STRING
stop STRING
response_pattern STRING
temperature FLOAT
max_tokens INT
model_name
- nousresearch/nous-capybara-7b:free
- mistralai/mistral-7b-instruct:free
- gryphe/mythomist-7b:free
- undi95/toppy-m-7b:free
- openrouter/cinematika-7b:free
- google/gemma-7b-it:free
- neversleep/noromaid-mixtral-8x7b-instruct
- nousresearch/nous-hermes-llama2-13b
- nousresearch/nous-hermes-2-mixtral-8x7b-dpo
- nousresearch/nous-hermes-2-mixtral-8x7b-sft
- gryphe/mythomax-l2-13b
- nousresearch/nous-capybara-7b
- teknium/openhermes-2-mistral-7b
- open-orca/mistral-7b-openorca
- huggingfaceh4/zephyr-7b-beta
- openai/gpt-3.5-turbo-0125
- google/gemini-pro
- perplexity/sonar-small-chat
- perplexity/sonar-medium-chat
- cognitivecomputations/dolphin-mixtral-8x7b
- anthropic/claude-instant-1.2
Outputs
STRING
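Below is a minimal sketch of how this node plausibly wires the inputs together: it fills the system and user templates with the incoming prompt, calls an OpenAI-compatible chat completions endpoint (the model IDs above are OpenRouter identifiers), and extracts the generated Stable Diffusion prompt from the reply using the response_pattern regex. The endpoint, the "{prompt}" placeholder convention, and the function name are assumptions for illustration, not the extension's actual implementation.

```python
# Hypothetical sketch of the node's behavior; names and endpoint are assumptions.
import re
import requests

def generate_sd_prompt(prompt, template_system, template_user,
                       stop, response_pattern, temperature,
                       max_tokens, model_name, api_key):
    # Substitute the user's idea into both templates; "{prompt}" is an
    # assumed placeholder convention, not confirmed by the extension.
    system_msg = template_system.replace("{prompt}", prompt)
    user_msg = template_user.replace("{prompt}", prompt)

    # Call an OpenAI-compatible chat completions API (OpenRouter shown here,
    # since the free model list matches OpenRouter model IDs).
    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": model_name,
            "messages": [
                {"role": "system", "content": system_msg},
                {"role": "user", "content": user_msg},
            ],
            "temperature": temperature,
            "max_tokens": max_tokens,
            "stop": [stop] if stop else None,
        },
        timeout=120,
    )
    resp.raise_for_status()
    text = resp.json()["choices"][0]["message"]["content"]

    # If a response_pattern is provided, return the first captured group
    # (or the whole match); otherwise return the raw completion.
    if response_pattern:
        match = re.search(response_pattern, text, re.DOTALL)
        if match:
            return (match.group(1) if match.groups() else match.group(0)).strip()
    return text.strip()
```

The returned string is the node's single STRING output, which can then be routed into a CLIP Text Encode node as the positive prompt.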
Extension: comfyui-llm-assistant
Nodes: Generate Stable Diffusion Prompt With LLM, Translate Text With LLM, Chat With LLM
Authored by longgui0318