ComfyUI Extension: ComfyUI GCP Cloud Storage Integration
This custom node package provides seamless integration between ComfyUI and Google Cloud Platform (GCP) Cloud Storage, enabling you to run ComfyUI without relying on instance disk storage. Decouple your storage needs from local disks by leveraging GCP's scalable and reliable Cloud Storage service. Models can be saved and loaded directly from Cloud Storage, while outputs and temporary files are automatically synced.
Tips & Tricks:
- Initial model loading may take time depending on your network speed and model size.
- The model file is cached locally after the first download, so subsequent uses will be faster.
- To save outputs to Cloud Storage, use the "GCP Storage Upload Image" node in your workflow.
- File saving supports two naming formats to keep names unique and organized: a timestamp, or ComfyUI's standard naming with incremental numbers.
- The "GCP Storage Upload Image" node requires you to pick a file naming format, timestamp or ComfyUI standard naming. *You will encounter an error if you leave it blank.*
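The two naming formats above can be sketched as follows. `make_output_name` is a hypothetical helper for illustration, not the node's actual code:

```python
import re
from datetime import datetime

def make_output_name(prefix: str, fmt: str, existing: list[str]) -> str:
    """Sketch of the two naming schemes: 'timestamp' or ComfyUI-style
    incremental numbering ('standard')."""
    if fmt == "timestamp":
        # e.g. ComfyUI_20240101_120000.png
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        return f"{prefix}_{stamp}.png"
    if fmt == "standard":
        # Find the highest existing counter and increment it, e.g. ComfyUI_00003_.png
        pattern = re.compile(rf"{re.escape(prefix)}_(\d+)_\.png$")
        counters = [int(m.group(1)) for name in existing if (m := pattern.match(name))]
        return f"{prefix}_{max(counters, default=0) + 1:05d}_.png"
    # Mirrors the node's behavior: a blank format is an error.
    raise ValueError("naming format must be 'timestamp' or 'standard'")

print(make_output_name("ComfyUI", "standard", ["ComfyUI_00002_.png"]))
# ComfyUI_00003_.png
```

Either scheme guarantees unique object names in the bucket; the timestamp variant also sorts chronologically in listings.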
Features
- Transparent Storage: Automatically sync files between local cache and Cloud Storage
- Smart Caching: Only download files when needed, cache locally for performance
- Bidirectional Sync: Upload outputs and download models/inputs automatically
- Node Integration: Custom nodes for explicit Cloud Storage operations
- Easy Setup: Automated setup script for GCP configuration
- Scalable: Remove dependency on instance persistent storage
Architecture
┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│     ComfyUI     │    │ Storage Manager  │    │    GCP Cloud    │
│     Instance    │◄──►│   (Middleware)   │◄──►│     Storage     │
│                 │    │                  │    │                 │
│ ┌─────────────┐ │    │ ┌──────────────┐ │    │ ┌─────────────┐ │
│ │Local Cache  │ │    │ │Path Mapping  │ │    │ │  Buckets    │ │
│ │- models/    │ │    │ │- models/ →   │ │    │ │ - models/   │ │
│ │- temp/      │ │    │ │  gs://bucket │ │    │ │ - outputs/  │ │
│ │- cache/     │ │    │ │- outputs/ →  │ │    │ │ - inputs/   │ │
│ └─────────────┘ │    │ │  gs://bucket │ │    │ │ - user/     │ │
└─────────────────┘    │ └──────────────┘ │    │ └─────────────┘ │
                       └──────────────────┘    └─────────────────┘
Quick Start
1. Automatic Setup
Run the automated setup script:
cd /home/prawegko/ComfyUI/custom_nodes/comfyui-gcp-storage/
./setup_gcp_storage.sh
This script will:
- Create a GCP Cloud Storage bucket
- Set up service account and permissions
- Generate credentials
- Install Python dependencies
- Create environment configuration
2. Manual Setup
If you prefer manual setup:
Install Dependencies
pip install google-cloud-storage google-auth google-auth-oauthlib google-auth-httplib2
Set Environment Variables
export GCP_PROJECT_ID="your-project-id"
export GCP_STORAGE_BUCKET="your-bucket-name"
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"
Create Cloud Storage Bucket
gsutil mb gs://your-bucket-name
gsutil -m cp -r /dev/null gs://your-bucket-name/models/
gsutil -m cp -r /dev/null gs://your-bucket-name/outputs/
gsutil -m cp -r /dev/null gs://your-bucket-name/inputs/
gsutil -m cp -r /dev/null gs://your-bucket-name/user/
gsutil -m cp -r /dev/null gs://your-bucket-name/workflows/
3. Upload Existing Models
Transfer your existing models to Cloud Storage:
gsutil -m cp -r ./models/* gs://your-bucket-name/models/
4. Restart ComfyUI
Restart ComfyUI to load the new nodes and storage manager.
Usage
Automatic Mode (Recommended)
Once configured, the storage manager automatically:
- Downloads models from Cloud Storage when ComfyUI needs them
- Uploads outputs to Cloud Storage when generated
- Caches files locally for performance
- Syncs changes bidirectionally
No changes to your workflows are needed!
Manual Node Usage
Use the custom nodes for explicit control:
GCP Storage Upload Image
- Upload generated images directly to Cloud Storage
- Configurable paths, formats, and quality
GCP Storage Download Image
- Download images from Cloud Storage into workflows
- Supports gs:// URLs
GCP Storage Upload/Download Model
- Explicitly manage model files in Cloud Storage
- Useful for dynamic model loading
GCP Storage List Files
- List files in Cloud Storage buckets
- Browse available models and assets
Transferring existing model files to GCP Cloud Storage
- Use the gsutil command-line tool to upload existing model files to your Cloud Storage bucket:
gsutil -m cp -r ./models/* gs://your-bucket-name/models/
- You can also use a Python script to selectively migrate model files when transferring a large number of them. Because model files are large, the gsutil command-line tool is recommended for faster transfers.
- To migrate a model file with the selective_python.py script, run the following command:
python selective_python.py --source ./models/checkpoints/<safetensor model file> --destination gs://your-bucket-name/models/
Configuration
Environment Variables
| Variable | Description | Required |
|----------|-------------|----------|
| GCP_PROJECT_ID | Your GCP project ID | Yes |
| GCP_STORAGE_BUCKET | Cloud Storage bucket name | Yes |
| GOOGLE_APPLICATION_CREDENTIALS | Path to service account JSON key | Yes* |
| GCP_CREDENTIALS_JSON | Service account JSON as string | Yes* |
*Either GOOGLE_APPLICATION_CREDENTIALS or GCP_CREDENTIALS_JSON is required.
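A minimal sketch of checking this configuration at startup. The variable names come from the table above; the `validate_gcp_env` helper itself is hypothetical:

```python
import os

REQUIRED = ["GCP_PROJECT_ID", "GCP_STORAGE_BUCKET"]
CREDENTIAL_VARS = ["GOOGLE_APPLICATION_CREDENTIALS", "GCP_CREDENTIALS_JSON"]

def validate_gcp_env(env: dict) -> list[str]:
    """Return a list of problems with the GCP environment configuration."""
    problems = [f"{name} is not set" for name in REQUIRED if not env.get(name)]
    # At least one of the two credential variables must be provided.
    if not any(env.get(name) for name in CREDENTIAL_VARS):
        problems.append("set GOOGLE_APPLICATION_CREDENTIALS or GCP_CREDENTIALS_JSON")
    return problems

problems = validate_gcp_env(dict(os.environ))
if problems:
    print("GCP storage not configured:", "; ".join(problems))
```

Running a check like this before the first storage call surfaces misconfiguration as one readable message instead of a stack trace mid-workflow.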
Path Mappings
Default mappings from local paths to Cloud Storage:
{
"./models/": "models/",
"./output/": "outputs/",
"./input/": "inputs/",
"./user/": "user/",
"./saved_workflows/": "workflows/",
"./temp/": None # Keep local only
}
Bucket Structure
Recommended Cloud Storage bucket organization:
gs://your-bucket-name/
├── models/
│   ├── checkpoints/
│   ├── loras/
│   ├── vae/
│   └── ...
├── outputs/
│   ├── images/
│   └── videos/
├── inputs/
│   └── user_uploads/
├── user/
│   └── user_data/
└── workflows/
    └── saved_workflows/
Authentication
Service Account (Recommended)
- Create a service account with the Storage Admin role
- Download the JSON key file
- Set the GOOGLE_APPLICATION_CREDENTIALS environment variable
Application Default Credentials
On GCP Compute Engine instances with an attached service account, Application Default Credentials are available automatically with no key file. For local development, you can generate them with:
gcloud auth application-default login
Custom Credentials
Set credentials as environment variable:
export GCP_CREDENTIALS_JSON='{"type": "service_account", ...}'
Performance Optimization
Caching Strategy
- Smart Downloads: Files only downloaded when accessed
- Local Cache: Frequently used files cached locally
- Timestamp Checking: Avoid unnecessary downloads
- Temp Files: Temporary files stay local
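The timestamp check above can be sketched as a pure decision function. `needs_download` is a hypothetical helper; in practice the storage manager would compare the blob's `updated` time from the GCS API against the cached file's mtime:

```python
import os
from datetime import datetime, timezone

def needs_download(local_path: str, remote_updated: datetime) -> bool:
    """Download only if the cached copy is missing or older than the remote blob."""
    if not os.path.exists(local_path):
        return True  # cache miss
    local_mtime = datetime.fromtimestamp(os.path.getmtime(local_path), tz=timezone.utc)
    return local_mtime < remote_updated  # remote is newer: refresh the cache
```

Skipping the download when the local copy is current is what makes repeated workflow runs fast after the first model fetch.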
Batch Operations
Use the storage manager for batch sync:
from gcp_storage_manager import storage_manager
# Sync entire model directory
storage_manager.sync_directory('./models/', direction='download')
# Upload all outputs
storage_manager.sync_directory('./output/', direction='upload')
Monitoring and Logging
Enable detailed logging:
import logging
logging.basicConfig(level=logging.INFO)
Monitor operations:
- Download/upload activities logged
- Cache hit/miss information
- Error handling and retries
Troubleshooting
Common Issues
Authentication Error
google.auth.exceptions.DefaultCredentialsError
Solution: Verify GOOGLE_APPLICATION_CREDENTIALS points to valid JSON key file.
Permission Denied
403 Forbidden
Solution: Ensure service account has Storage Admin role on the bucket.
Bucket Not Found
404 Not Found
Solution: Verify bucket name and ensure it exists in your project.
Import Error
ModuleNotFoundError: No module named 'google.cloud'
Solution: Install dependencies:
pip install google-cloud-storage
Debug Mode
Enable debug logging for detailed troubleshooting:
export COMFYUI_GCP_DEBUG=true
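Reading the flag in Python might look like this sketch. COMFYUI_GCP_DEBUG comes from the source; the helper name is illustrative:

```python
import logging
import os

def configure_logging(env: dict) -> int:
    """Use DEBUG level when COMFYUI_GCP_DEBUG is truthy, INFO otherwise."""
    debug = env.get("COMFYUI_GCP_DEBUG", "").lower() in ("1", "true", "yes")
    level = logging.DEBUG if debug else logging.INFO
    logging.basicConfig(level=level)
    return level

print(logging.getLevelName(configure_logging(dict(os.environ))))
```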
Test Configuration
Verify your setup:
from gcp_storage_manager import get_storage_status
print(get_storage_status())
Migration Guide
From Local Storage
- Backup existing data:
tar -czf comfyui-backup.tar.gz models/ output/ input/ user/ saved_workflows/
- Run setup script:
./setup_gcp_storage.sh
- Upload existing files:
gsutil -m cp -r models/* gs://your-bucket-name/models/
gsutil -m cp -r output/* gs://your-bucket-name/outputs/
gsutil -m cp -r input/* gs://your-bucket-name/inputs/
- Test the setup:
# Clear local cache and test download
rm -rf models/*
# Start ComfyUI - models should download automatically
From Other Cloud Providers
Use gsutil to transfer from other cloud storage:
# From AWS S3
gsutil -m cp -r s3://source-bucket/* gs://target-bucket/
# From Azure: gsutil cannot read Azure URLs directly. Use GCP Storage Transfer
# Service, or copy the container to local disk first and upload with gsutil.
Security
Best Practices
- Use Service Accounts: Don't use personal credentials in production
- Minimal Permissions: Grant only necessary Storage permissions
- Rotate Keys: Regularly rotate service account keys
- Network Security: Use VPC and private Google Access if needed
- Bucket Policies: Configure bucket-level IAM policies
Recommended IAM Roles
- roles/storage.objectAdmin - for read/write access to objects
- roles/storage.legacyBucketReader - for listing bucket contents
Cost Optimization
Storage Classes
- Standard: For frequently accessed files (models, active outputs)
- Nearline: For files accessed less than once per month
- Coldline: For archival of old outputs
- Archive: For long-term backup
Lifecycle Policies
Set up automatic lifecycle management:
gsutil lifecycle set lifecycle.json gs://your-bucket-name
Example lifecycle.json:
{
"lifecycle": {
"rule": [
{
"action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
"condition": {"age": 30, "matchesPrefix": ["outputs/"]}
},
{
"action": {"type": "Delete"},
"condition": {"age": 365, "matchesPrefix": ["temp/"]}
}
]
}
}
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
License
This project is licensed under the same license as ComfyUI.
Support
For issues and questions:
- Check the troubleshooting section
- Review logs for error details
- Create an issue with detailed information
- Include configuration (without sensitive data)
Note: This integration is designed for production use on GCP. Ensure you understand GCP pricing and have appropriate monitoring in place.