ComfyUI Node: MinusZone - ModelConfigDownloaderSelect(LLamaCPP)

Authored by MinusZoneAI

Category

MinusZone - Prompt/others

Inputs

model_name
  • Meta-Llama-3-8B-Instruct.Q4_K_M
  • llama3_if_ai_sdpromptmkr_Q4_K_M
  • qwen2-7b-instruct-q5_k_m
  • qwen2-0_5b-instruct-q5_k_m
  • omost-llama-3-8b-Q4_K_M
  • omost-phi-3-mini-128k-Q4_K_M
  • Meta-Llama-3-8B.Q4_K_M
  • Phi-3-mini-4k-instruct-q4
  • llama3-zh.Q4_K_M
  • llama3_8b_instruct_dpo_zh-Q4_K_M
  • qwen1_5-14b-chat-q4_k_m
  • qwen1_5-7b-chat-q4_k_m
  • qwen1_5-4b-chat-q4_k_m
  • qwen1_5-1_8b-chat-q4_k_m
  • qwen1_5-0_5b-chat-q4_k_m
chat_format
  • auto
  • llama-2
  • llama-3
  • alpaca
  • qwen
  • vicuna
  • oasst_llama
  • baichuan-2
  • baichuan
  • openbuddy
  • redpajama-incite
  • snoozy
  • phind
  • intel
  • open-orca
  • mistrallite
  • zephyr
  • pygmalion
  • chatml
  • mistral-instruct
  • chatglm3
  • openchat
  • saiga
  • gemma
  • functionary
  • functionary-v2
  • functionary-v1
  • chatml-function-calling
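
This node only selects a configuration; downloading and loading the model happen in the extension's downstream LLamaCPP nodes. As a rough, hedged sketch of what the two inputs correspond to, they map naturally onto the `model_path` and `chat_format` arguments of llama-cpp-python's `Llama` constructor. The file path, context size, and the assumption that "auto" defers template detection to the loader are illustrative, not taken from the extension's code:

```python
# Sketch only: roughly what the selected model_name and chat_format amount to
# when a downstream node loads the model with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    # Assumed local path for the downloaded GGUF file (illustrative).
    model_path="models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",
    # Mirrors the chat_format input; with "auto" the loader would instead
    # infer the template from the GGUF metadata (chat_format=None here).
    chat_format="llama-3",
    # Context window size; an illustrative value, not the node's default.
    n_ctx=4096,
)
```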

Outputs

LLamaCPPModelConfig

Extension: ComfyUI-Prompt-MZ

Nodes that use llama.cpp to assist with prompt-related work, such as generating and refining prompt text.
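
For a sense of this prompt-related work, a chat-completion call against the loaded model is the underlying mechanism. The model file, system prompt, and sampling settings below are illustrative assumptions, not the extension's actual prompts or defaults:

```python
# Illustrative prompt-expansion call using llama-cpp-python's
# OpenAI-style chat completion API.
from llama_cpp import Llama

# Assumed local GGUF path; adjust to wherever the model was downloaded.
llm = Llama(
    model_path="models/qwen2-7b-instruct-q5_k_m.gguf",
    chat_format="qwen",
    n_ctx=4096,
)

result = llm.create_chat_completion(
    messages=[
        {"role": "system",
         "content": "Expand the user's idea into a detailed image-generation prompt."},
        {"role": "user", "content": "a lighthouse at dusk"},
    ],
    max_tokens=256,
    temperature=0.7,
)
print(result["choices"][0]["message"]["content"])
```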
