ComfyUI Node: ✨ Groq LLM API

Authored by MNeMoNiCuZ

Category

⚡ MNeMiC Nodes

Inputs

model
  • mixtral-8x7b-32768
  • llama3-70b-8192
  • llama3-8b-8192
  • gemma-7b-it
preset
  • Use [system_message] and [user_input]
  • Generate a prompt about [user_input]
  • Create a negative prompt for [user_input]
  • List 10 ideas about [user_input]
  • Return JSON prompt about [user_input]
  • Add your own presets in UserPrompts.json
system_message STRING
user_input STRING
temperature FLOAT
max_tokens INT
top_p FLOAT
seed INT
max_retries INT
stop STRING
json_mode BOOLEAN
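
These inputs map closely onto the parameters of a Groq chat-completion request. The sketch below is a minimal, hypothetical illustration of how such a call could be assembled with the official groq Python client; it is not the node's actual implementation. The function name, the GROQ_API_KEY environment variable, the preset placeholder substitution, and the retry loop are assumptions for illustration only.

```python
# Minimal sketch (not the node's actual code): how the node's inputs could map
# onto a Groq chat-completion request using the official `groq` Python client.
import os
from groq import Groq

def groq_completion_sketch(
    model="llama3-70b-8192",
    preset="Generate a prompt about [user_input]",  # hypothetical preset template
    system_message="You are a prompt-writing assistant.",
    user_input="a misty forest at dawn",
    temperature=0.7,
    max_tokens=512,
    top_p=1.0,
    seed=42,
    max_retries=3,
    stop=None,
    json_mode=False,
):
    client = Groq(api_key=os.environ["GROQ_API_KEY"])

    # Presets appear to be templates with [system_message] / [user_input]
    # placeholders; plain string substitution is assumed here.
    prompt = preset.replace("[system_message]", system_message)
    prompt = prompt.replace("[user_input]", user_input)

    kwargs = dict(
        model=model,
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": prompt},
        ],
        temperature=temperature,
        max_tokens=max_tokens,
        top_p=top_p,
        seed=seed,
        stop=stop,
    )
    if json_mode:
        # Groq's OpenAI-compatible API supports a JSON response format.
        kwargs["response_format"] = {"type": "json_object"}

    # max_retries: retry the request a few times before giving up.
    last_error = None
    for _ in range(max_retries):
        try:
            response = client.chat.completions.create(**kwargs)
            return response.choices[0].message.content, True
        except Exception as exc:  # broad catch is fine for a sketch
            last_error = exc
    return f"Request failed: {last_error}", False
```

The two return values loosely mirror the node's STRING and BOOLEAN outputs (generated text plus a success flag); the extension's source is the authority on the exact output semantics.
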

Outputs

  • STRING
  • BOOLEAN
  • STRING

Extension: ComfyUI-mnemic-nodes

Added new models to the Groq LLM node, and added a new node: Tiktoken Tokenizer Info.
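
For context, the Tiktoken Tokenizer Info node is built around the tiktoken library. The stand-alone example below shows the kind of information such a tokenizer can report (token count, token IDs, decoded tokens); it is illustrative only, and the cl100k_base encoding name is an assumption, not necessarily what the node uses.

```python
# Stand-alone tiktoken example (illustrative only; not the node's implementation).
import tiktoken

text = "A quick look at how tiktoken splits text into tokens."
enc = tiktoken.get_encoding("cl100k_base")  # assumed encoding; the node may differ

token_ids = enc.encode(text)
print("token count:", len(token_ids))
print("token ids:  ", token_ids)
print("tokens:     ", [enc.decode([t]) for t in token_ids])
```
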

