ComfyUI-WaveSpeedAI-API
This is a custom node for ComfyUI that allows you to use the WaveSpeed AI API directly in ComfyUI. WaveSpeed AI is a high-performance AI image and video generation service platform offering industry-leading generation speeds.
NEW: Dynamic Node Approach [WIP] - We've introduced a streamlined workflow using dynamic nodes that replace the previous extensive collection of individual model nodes. You can now select models and configure parameters dynamically using the WaveSpeedAI Task Create [WIP] ➜ WaveSpeedAI Task Submit [WIP] workflow for a more flexible and maintainable experience.
For more information, see WaveSpeed AI Documentation.
Requirements
Before using this node, you need a WaveSpeed AI API key. You can obtain your API key from the WaveSpeed AI website.
Installation
Installing manually
1. Navigate to the ComfyUI/custom_nodes directory.
2. Clone this repository:
   git clone https://github.com/WaveSpeedAI/wavespeed-comfyui.git
3. Install the dependencies:
   - Windows (ComfyUI portable): python -m pip install -r requirements.txt
   - Linux or macOS: pip install -r requirements.txt
4. If you don't want to expose your API key in the node, rename the config.ini.tmp file to config.ini and add your API key there (see the sketch after these steps).
5. Start ComfyUI and enjoy using the WaveSpeed AI API node!
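As a rough illustration of the config.ini option, here is a minimal sketch of reading such a key with Python's configparser. The [API] section and api_key entry shown here are assumptions made for illustration only; check config.ini.tmp in this repository for the actual section and key names.

```python
# Hypothetical reader for config.ini, assuming a layout like:
#
#   [API]
#   api_key = your-wavespeed-api-key
#
# The real section/key names are defined by config.ini.tmp in this repository.
import configparser
from pathlib import Path
from typing import Optional


def load_api_key(config_path: str = "config.ini") -> Optional[str]:
    """Return the WaveSpeed AI API key from config.ini, or None if not configured."""
    if not Path(config_path).exists():
        return None
    parser = configparser.ConfigParser()
    parser.read(config_path)
    return parser.get("API", "api_key", fallback=None)
```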
How to Use
The following are typical workflows and result demonstrations (each group includes a ComfyUI workflow screenshot). The workflow images embed the workflow metadata, so you can drag them directly into ComfyUI to load them.
NEW: Dynamic Node Examples [WIP]
1. Dynamic Nodes - Nano Banana
- Workflow Example: dynamic-nodes-nano-banana.json
- This example demonstrates the new dynamic node approach using the Nano Banana model with the WaveSpeedAI Task Create [WIP] ➜ WaveSpeedAI Task Submit [WIP] workflow.
2. Dynamic Nodes - Seedream V4
- Workflow Example: dynamic-nodes-seedreamv4.json
- This example shows how to use the Seedream V4 model through the dynamic node system.
3. Dynamic Nodes - Seedream V4 Sequential
- Workflow Example: dynamic-nodes-seedreamv4-sequential.json
- This example demonstrates sequential processing with Seedream V4 using the dynamic node workflow.
Hot
- We have launched very powerful video nodes for Seedance. Please enjoy them freely.
- Workflow Example:
- Result Video:
https://github.com/user-attachments/assets/b9902503-f8b1-46b2-bc8e-48fcba84e5bc
1. Dia TTS
- Workflow Example:
2. Flux Control LoRA Canny
- Workflow Example:
3. Flux Dev Lora Ultra Fast
- Workflow Example:
4. Hunyuan Custom Ref2V 720p Workflow and Result
- Workflow Example:
- Result Video:
https://github.com/user-attachments/assets/46220376-4341-4ce3-a7f4-46f12ff7ccf6
5. Wan2.1 I2V 720p Ultra Fast Workflow and Result
- Workflow Example:
- Result Video:
https://github.com/user-attachments/assets/77fc1882-6d74-43b0-a4eb-6d8883febcdc
New Recommended Approach: Dynamic Parameter Nodes [WIP]
We now recommend using our new dynamic node system for a cleaner and more flexible workflow:
Core Dynamic Nodes:
- WaveSpeedAI Task Create [WIP] - Select any available model and configure parameters dynamically
- WaveSpeedAI Task Submit [WIP] - Execute your configured task
- WaveSpeedAI Client - Client configuration (still required)
Workflow Benefits:
- Dynamic Model Selection: Choose from all available models within a single node interface
- Dynamic Parameters: Configure model-specific parameters without needing individual nodes
- Simplified Setup: Cleaner workflows with fewer node types
- Future-Proof: New models are automatically available without requiring new node releases
How to Use:
- Add WaveSpeedAI Task Create [WIP] to your workflow
- Select your desired model from the dropdown (Flux, Hunyuan, Wan2.1, etc.)
- Configure the dynamic parameters based on your selected model
- Connect to WaveSpeedAI Task Submit [WIP] to execute (a conceptual API-level sketch follows below)
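Conceptually, the Task Create ➜ Task Submit pair builds a request for the selected model and then executes it against the WaveSpeed AI API. The sketch below shows what such a flow could look like in plain Python; the endpoint paths, model id, polling logic, and response fields are assumptions made for illustration, not the nodes' actual implementation, so refer to the WaveSpeed AI API documentation for the real interface.

```python
# A rough sketch of a create-then-submit flow against the WaveSpeed AI API.
# BASE_URL, the endpoint paths, and the response shapes below are assumed for
# illustration; consult the official API documentation for the real ones.
import time
import requests

API_KEY = "your-wavespeed-api-key"     # e.g. loaded via the load_api_key() sketch above
BASE_URL = "https://api.wavespeed.ai"  # assumed base URL


def run_task(model_id: str, params: dict, timeout_s: float = 300.0) -> dict:
    """'Create' a task for model_id with params, 'submit' it, and wait for the result."""
    headers = {"Authorization": f"Bearer {API_KEY}"}

    # Task Create: pick the model and its parameters, then send the request.
    created = requests.post(f"{BASE_URL}/api/v3/{model_id}", json=params, headers=headers)
    created.raise_for_status()
    task_id = created.json()["data"]["id"]  # assumed response shape

    # Task Submit: wait until the task finishes and return its outputs.
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        result = requests.get(f"{BASE_URL}/api/v3/predictions/{task_id}/result", headers=headers)
        result.raise_for_status()
        data = result.json()["data"]        # assumed response shape
        if data.get("status") in ("completed", "failed"):
            return data
        time.sleep(1)
    raise TimeoutError(f"Task {task_id} did not finish within {timeout_s} seconds")
```

In ComfyUI you never write this code yourself; the dynamic nodes expose the same model and parameter choices through their widgets.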
Legacy Individual Model Nodes:
While we still support the individual model nodes listed below, we recommend migrating to the dynamic approach for new workflows. If you encounter any issues with the new dynamic nodes, please submit an issue.
<details>
<summary>Click to view legacy individual model nodes</summary>

Legacy Nodes List (Still Supported):
- "WaveSpeedAI Client"
- "WaveSpeedAI Dia TTS"
- "WaveSpeedAI Flux Control LoRA Canny"
- "WaveSpeedAI Flux Control LoRA Depth"
- "WaveSpeedAI Flux Dev"
- "WaveSpeedAI Flux Dev Fill"
- "WaveSpeedAI Flux Dev Lora"
- "WaveSpeedAI Flux Dev Lora Ultra Fast"
- "WaveSpeedAI Flux Dev Ultra Fast"
- "WaveSpeedAI Flux Pro Redux"
- "WaveSpeedAI Flux Redux Dev"
- "WaveSpeedAI Flux Schnell"
- "WaveSpeedAI Flux Schnell Lora"
- "WaveSpeedAI Flux and SDXL Loras"
- "WaveSpeedAI Framepack"
- "WaveSpeedAI Ghibli"
- "WaveSpeedAI Hidream E1 Full"
- "WaveSpeedAI Hidream I1 Dev"
- "WaveSpeedAI Hidream I1 Full"
- "WaveSpeedAI Hunyuan 3D V2 Multi View"
- "WaveSpeedAI Hunyuan Custom Ref2V 480p"
- "WaveSpeedAI Hunyuan Custom Ref2V 720p"
- "WaveSpeedAI Hunyuan Video I2V"
- "WaveSpeedAI Hunyuan Video T2V"
- "WaveSpeedAI Instant Character"
- "WaveSpeedAI Kling v1.6 I2V Pro"
- "WaveSpeedAI Kling v1.6 I2V Standard"
- "WaveSpeedAI Kling v1.6 T2V Standard"
- "WaveSpeedAI LTX Video I2V 480p"
- "WaveSpeedAI LTX Video I2V 720p"
- "WaveSpeedAI MMAudio V2"
- "WaveSpeedAI Magi 1.24b"
- "WaveSpeedAI Minimax Video 01"
- "WaveSpeedAI Preview Video"
- "WaveSpeedAI Real-ESRGAN"
- "WaveSpeedAI SDXL"
- "WaveSpeedAI SDXL Lora"
- "WaveSpeedAI Save Audio"
- "WaveSpeedAI SkyReels V1"
- "WaveSpeedAI Step1X Edit"
- "WaveSpeedAI Uno"
- "WaveSpeedAI Upload Image"
- "WaveSpeedAI Vidu Image to Video2.0"
- "WaveSpeedAI Vidu Reference To Video2.0"
- "WaveSpeedAI Vidu Start/End To Video2.0"
- "WaveSpeedAI Wan Loras"
- "WaveSpeedAI Wan2.1 I2V 480p"
- "WaveSpeedAI Wan2.1 I2V 480p LoRA Ultra Fast"
- "WaveSpeedAI Wan2.1 I2V 480p Lora"
- "WaveSpeedAI Wan2.1 I2V 480p Ultra Fast"
- "WaveSpeedAI Wan2.1 I2V 720p"
- "WaveSpeedAI Wan2.1 I2V 720p LoRA Ultra Fast"
- "WaveSpeedAI Wan2.1 I2V 720p Lora"
- "WaveSpeedAI Wan2.1 I2V 720p Ultra Fast"
- "WaveSpeedAI Wan2.1 T2V 480p LoRA"
- "WaveSpeedAI Wan2.1 T2V 480p LoRA Ultra Fast"
- "WaveSpeedAI Wan2.1 T2V 480p Ultra Fast"
- "WaveSpeedAI Wan2.1 T2V 720p"
- "WaveSpeedAI Wan2.1 T2V 720p LoRA"
- "WaveSpeedAI Wan2.1 T2V 720p LoRA Ultra Fast"
- "WaveSpeedAI Wan2.1 T2V 720p Ultra Fast"
How to Apply LoRA
- Because inference runs on the WaveSpeed AI API, you cannot use local LoRA files. However, loading LoRAs via URL is supported.
- You can use the "WaveSpeedAI Wan Loras", "WaveSpeedAI Flux Loras", or "WaveSpeedAI Flux SDXL Loras" nodes.
- Enter the LoRA URL in the lora_path field. For example: https://huggingface.co/WaveSpeedAi/WanLoras/resolve/main/wan_loras.safetensors
- Enter the LoRA weight in the lora_weight field. For example: 0.5
- If you have multiple LoRAs, you can add additional lora_path and lora_weight pairs.
- If your LoRA file is not hosted on Hugging Face, that's fine; any publicly accessible URL will work (see the parameter sketch below).
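At the API level, URL-hosted LoRAs typically end up as part of the task parameters. The snippet below reuses the hypothetical run_task() sketch from earlier; the "loras" field layout and the model id are assumptions for illustration, whereas in ComfyUI you only fill in lora_path and lora_weight on the LoRA node.

```python
# Hypothetical task parameters with a URL-hosted LoRA, reusing run_task() from
# the earlier sketch. The "loras" field layout and the model id are assumptions;
# in ComfyUI you only fill in lora_path and lora_weight on the node.
params = {
    "prompt": "a watercolor lighthouse at dawn",
    "loras": [
        {
            # Any publicly accessible URL works; it does not have to be Hugging Face.
            "path": "https://huggingface.co/WaveSpeedAi/WanLoras/resolve/main/wan_loras.safetensors",
            "scale": 0.5,  # corresponds to lora_weight in the node
        },
        # Add more {"path": ..., "scale": ...} entries for multiple LoRAs.
    ],
}
result = run_task("wavespeed-ai/wan-2.1/t2v-480p-lora", params)  # assumed model id
```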
How to Use image_url in Nodes
- You can use the "WaveSpeedAi Upload Image" node to convert a local IMAGE into an image_url.
- Connect the output to the corresponding node that requires it. You can find examples in the provided samples.
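Conceptually, the Upload Image node takes a ComfyUI IMAGE tensor (a float batch in the 0-1 range), encodes it as an image file, uploads it, and returns a public URL. The sketch below illustrates that idea in plain Python; the upload endpoint and response fields are hypothetical placeholders I'm assuming for illustration, and the real node handles the actual upload for you.

```python
# Conceptual sketch of turning a ComfyUI IMAGE tensor into an image_url.
# The upload endpoint and response shape are hypothetical placeholders; the
# WaveSpeedAI Upload Image node performs the actual upload for you.
import io

import numpy as np
import requests
from PIL import Image


def image_tensor_to_url(image, api_key: str) -> str:
    """Encode the first image of a ComfyUI IMAGE batch (B, H, W, C floats in 0-1) and upload it."""
    array = (image[0].cpu().numpy() * 255.0).clip(0, 255).astype(np.uint8)
    buffer = io.BytesIO()
    Image.fromarray(array).save(buffer, format="PNG")
    buffer.seek(0)

    response = requests.post(
        "https://api.wavespeed.ai/api/v3/media/upload",  # assumed endpoint
        headers={"Authorization": f"Bearer {api_key}"},
        files={"file": ("image.png", buffer, "image/png")},
    )
    response.raise_for_status()
    return response.json()["data"]["url"]  # assumed response shape
```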