# ComfyUI DeepSeek_R1 Chat Node
A custom node for ComfyUI that integrates DeepSeek's powerful chat and instruction API, enabling seamless AI interactions within your ComfyUI workflows.

## Features

- **Dual Operating Modes**
  - Chat mode with conversation memory
  - Single-turn instruction mode
- **Vision Integration**
  - Support for image context through vision descriptions
  - Seamless integration with ComfyUI's visual workflows
- **Advanced Configuration**
  - Customizable max token limit (up to 8192 tokens)
  - Optional reasoning display
  - Configurable chat history management
  - Built-in rate-limiting protection
## Installation

1. Clone this repository into your ComfyUI custom nodes directory:

   ```bash
   cd ComfyUI/custom_nodes
   git clone https://github.com/ShmuelRonen/comfyui-deepseek-chat.git
   ```

2. Get your DeepSeek API key.

3. Edit the `config.json` file in the node directory:

   ```json
   {
     "deepseek_api_key": "your-api-key-here"
   }
   ```
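As a rough sketch of how the node might read this key (the actual loading logic lives in `deepseek_chat_node.py`; the helper name and structure below are illustrative assumptions, not excerpts from the source):

```python
import json
import os

# Illustrative sketch: read the API key from config.json next to the node file.
# The real node's loading code may differ; this only shows the expected file layout.
def load_api_key(node_dir: str) -> str:
    config_path = os.path.join(node_dir, "config.json")
    with open(config_path, "r", encoding="utf-8") as f:
        config = json.load(f)
    return config["deepseek_api_key"]
```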
**Important:** Check API availability at [DeepSeek API status](https://platform.deepseek.com/).
## Usage

### Node Configuration

The node provides several input parameters (a rough sketch of how these might map to a ComfyUI node definition follows this list):

- `mode`: Choose between "chat" and "instruct"
- `prompt`: Your input text/question
- `max_tokens`: Maximum response length (1-8192)
- `show_reasoning`: Toggle reasoning display
- `clear_history`: Option to reset the conversation
- `remember_chat`: Enable/disable chat history persistence
- `vision_description`: Optional image context description. This can be used to:
  - Provide image descriptions for visual context
  - Connect with image generation nodes
  - Enable vision-language tasks
  - Process image analysis results
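The exact definition lives in `deepseek_chat_node.py`; the snippet below is only a hedged sketch of how the documented parameters could be declared in a ComfyUI node (field names match the list above, but defaults, widget options, and the class name are assumptions):

```python
# Illustrative sketch of a ComfyUI input declaration for the documented parameters.
# Defaults and widget options here are assumptions, not copied from the source.
class DeepSeekChatNode:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "mode": (["chat", "instruct"],),
                "prompt": ("STRING", {"multiline": True}),
                "max_tokens": ("INT", {"default": 1024, "min": 1, "max": 8192}),
                "show_reasoning": ("BOOLEAN", {"default": False}),
                "clear_history": ("BOOLEAN", {"default": False}),
                "remember_chat": ("BOOLEAN", {"default": True}),
            },
            "optional": {
                "vision_description": ("STRING", {"multiline": True, "default": ""}),
            },
        }
```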
### Example Workflow

1. Add the "DeepSeek-V3 Chat/Instruct" node to your workflow
2. Configure the node parameters
3. Connect it to other nodes as needed
4. Execute the workflow
### Outputs

The node provides two outputs:

- `latest_response`: The most recent AI response
- `full_chat_history`: The complete conversation history
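In ComfyUI terms these would typically be declared as two string outputs; the snippet below is an assumption about how the node might expose them, not an excerpt from the source:

```python
# Illustrative sketch: two string outputs matching the documented names.
RETURN_TYPES = ("STRING", "STRING")
RETURN_NAMES = ("latest_response", "full_chat_history")
FUNCTION = "generate"   # assumed entry-point method name
CATEGORY = "DeepSeek"   # assumed node category
```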
## Rate Limiting

The node enforces a 1-second minimum interval between requests to comply with API limits.
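A minimal sketch of such throttling, assuming a simple timestamp check before each request (the node's actual mechanism may differ):

```python
import time

# Illustrative throttle: wait until at least 1 second has passed since the last request.
_MIN_INTERVAL = 1.0  # seconds, per the documented limit
_last_request_time = 0.0

def wait_for_rate_limit() -> None:
    global _last_request_time
    elapsed = time.time() - _last_request_time
    if elapsed < _MIN_INTERVAL:
        time.sleep(_MIN_INTERVAL - elapsed)
    _last_request_time = time.time()
```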
## Error Handling
- Comprehensive error logging
- Graceful handling of API failures
- Clear error messages in node outputs
## Technical Details

### API Integration

- API endpoint: `https://api.deepseek.com/v1/chat/completions`
- Model: `deepseek-reasoner`
- Supports full chat context with message history
- Handles authentication via API key
- Implements rate-limiting protection
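As a hedged sketch of the request itself: the endpoint and model name are taken from this README, the payload follows the standard OpenAI-compatible chat format, and the function name and error handling shown are illustrative rather than the node's exact code.

```python
import requests

API_URL = "https://api.deepseek.com/v1/chat/completions"

def ask_deepseek(api_key: str, messages: list, max_tokens: int = 1024) -> str:
    """Send the accumulated chat history to the deepseek-reasoner model (illustrative sketch)."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": "deepseek-reasoner",
        "messages": messages,        # full chat context, e.g. [{"role": "user", "content": "..."}]
        "max_tokens": max_tokens,
    }
    try:
        response = requests.post(API_URL, headers=headers, json=payload, timeout=60)
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]
    except requests.RequestException as exc:
        # Surface a clear error message in the node output instead of crashing the workflow.
        return f"DeepSeek API error: {exc}"
```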
### File Structure

```
comfyui-deepseek-chat/
├── deepseek_chat_node.py
├── config.json
├── chat_history.json
└── README.md
```
## Dependencies

- Python 3.6+
- `requests` library
- ComfyUI environment
## Contributing

1. Fork the repository
2. Create your feature branch
3. Commit your changes
4. Push to the branch
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- Built for ComfyUI
- Powered by DeepSeek's API
- Inspired by community needs for AI integration in visual workflows