ComfyUI Node: Tara Preset LLM Config Node

Authored by ronniebasak

Category

tara-llm

Inputs

llm_models (one of):
  • openai/gpt-3.5-turbo
  • openai/gpt-4-turbo-preview
  • groq/llama2-70b-4096
  • groq/llama3-70b-8192
  • groq/llama3-8b-8192
  • groq/mixtral-8x7b-32768
  • groq/gemma-7b-it
  • together/coming-soon
temperature (FLOAT)
seed (INT)
max_tokens (INT)
top_p (FLOAT)
frequency_penalty (FLOAT)
presence_penalty (FLOAT)
timeout (INT)
use_loader (BOOLEAN)
loader_temporary (BOOLEAN)
api_key (STRING)

Outputs

TARA_LLM_CONFIG
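
The inputs and output listed above correspond to ComfyUI's standard custom-node interface (an INPUT_TYPES class method plus RETURN_TYPES, FUNCTION, and CATEGORY attributes). The following is a minimal sketch of how a node exposing this interface could be declared; the class name, defaults, and value ranges are illustrative assumptions and not the actual ComfyUI-Tara-LLM-Integration source.

```python
# Sketch only: class name, defaults, and ranges are assumptions,
# not the real ComfyUI-Tara-LLM-Integration implementation.

LLM_MODELS = [
    "openai/gpt-3.5-turbo",
    "openai/gpt-4-turbo-preview",
    "groq/llama2-70b-4096",
    "groq/llama3-70b-8192",
    "groq/llama3-8b-8192",
    "groq/mixtral-8x7b-32768",
    "groq/gemma-7b-it",
    "together/coming-soon",
]

class TaraPresetLLMConfigSketch:
    @classmethod
    def INPUT_TYPES(cls):
        # Each entry mirrors the Inputs list above: name plus ComfyUI type.
        return {
            "required": {
                "llm_models": (LLM_MODELS,),
                "temperature": ("FLOAT", {"default": 0.4, "min": 0.0, "max": 2.0, "step": 0.05}),
                "seed": ("INT", {"default": 0, "min": 0}),
                "max_tokens": ("INT", {"default": 1000, "min": 1}),
                "top_p": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 1.0}),
                "frequency_penalty": ("FLOAT", {"default": 0.0, "min": -2.0, "max": 2.0}),
                "presence_penalty": ("FLOAT", {"default": 0.0, "min": -2.0, "max": 2.0}),
                "timeout": ("INT", {"default": 60, "min": 1}),
                "use_loader": ("BOOLEAN", {"default": True}),
                "loader_temporary": ("BOOLEAN", {"default": False}),
                "api_key": ("STRING", {"default": ""}),
            },
        }

    RETURN_TYPES = ("TARA_LLM_CONFIG",)
    FUNCTION = "configure"
    CATEGORY = "tara-llm"

    def configure(self, llm_models, temperature, seed, max_tokens, top_p,
                  frequency_penalty, presence_penalty, timeout,
                  use_loader, loader_temporary, api_key=""):
        # Bundle the settings into a single object that downstream Tara
        # nodes can consume through the TARA_LLM_CONFIG output.
        config = {
            "model": llm_models,
            "temperature": temperature,
            "seed": seed,
            "max_tokens": max_tokens,
            "top_p": top_p,
            "frequency_penalty": frequency_penalty,
            "presence_penalty": presence_penalty,
            "timeout": timeout,
            "use_loader": use_loader,
            "loader_temporary": loader_temporary,
            "api_key": api_key,
        }
        return (config,)
```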

Extension: ComfyUI-Tara-LLM-Integration

Tara is a powerful set of nodes for ComfyUI that integrates Large Language Models (LLMs) to enhance and automate workflow processes. With Tara, you can create complex, intelligent workflows that refine and generate content, manage API keys, and seamlessly integrate various LLMs into your projects.
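
The provider-prefixed model names (openai/..., groq/..., together/...) follow the convention used by OpenAI-compatible routing libraries such as litellm, which hints at how a downstream Tara node might consume the TARA_LLM_CONFIG it receives. The sketch below is only an illustration of what the config fields control; whether the extension actually uses litellm, and its exact call path, are assumptions here.

```python
# Illustrative only: maps the TARA_LLM_CONFIG fields onto a chat-completion
# request. Assumes the litellm package; the extension's real call path may differ.
import litellm

def run_completion(config: dict, system_prompt: str, user_prompt: str) -> str:
    response = litellm.completion(
        model=config["model"],                 # e.g. "groq/llama3-70b-8192"
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        temperature=config["temperature"],
        max_tokens=config["max_tokens"],
        top_p=config["top_p"],
        frequency_penalty=config["frequency_penalty"],
        presence_penalty=config["presence_penalty"],
        seed=config["seed"],
        timeout=config["timeout"],
        api_key=config["api_key"] or None,     # empty string: fall back to provider env vars
    )
    return response.choices[0].message.content
```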
