ComfyUI Node: LLM Min-P
Category: YALLM/samplers

Inputs:
- min_p (FLOAT)
- previous (LLMSAMPLER)

Outputs:
- LLMSAMPLER
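Min-p sampling keeps only those tokens whose probability is at least min_p times the probability of the single most likely token, then renormalizes before sampling. The sketch below is a minimal illustration of that filtering step in NumPy; it is not the node's actual implementation, and the function name is illustrative.

```python
import numpy as np

def min_p_filter(logits, min_p=0.05):
    """Zero out tokens whose probability is below min_p times the
    top token's probability, then renormalize the distribution."""
    # Softmax (shifted by the max logit for numerical stability)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Threshold scales with the most likely token's probability
    threshold = min_p * probs.max()
    filtered = np.where(probs >= threshold, probs, 0.0)
    return filtered / filtered.sum()

logits = np.array([3.0, 2.0, 0.0, -4.0])
filtered = min_p_filter(logits, min_p=0.1)
```

Because the threshold is relative to the top token, the filter adapts automatically: a confident distribution prunes aggressively, while a flat one keeps more candidates.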
Extension: ComfyUI-YALLM-node
Yet another set of LLM nodes for ComfyUI, targeting local and remote OpenAI-compatible APIs; multi-modal models are supported.
Authored by asaddi