ComfyUI Extension: Comfy-Pack

    Comfy-Pack: Making ComfyUI Workflows Shareable

    comfy-pack is a comprehensive toolkit for reliably packing and unpacking environments for ComfyUI workflows.

    • 📦 Pack workflow environments as artifacts: Saves the workflow environment in a .cpack.zip artifact with Python package versions, ComfyUI and custom node revisions, and model hashes.
    • ✨ Unpack artifacts to recreate workflow environments: Unpacks the .cpack.zip artifact to recreate the same environment with the exact Python package versions, ComfyUI and custom node revisions, and model weights.
    • 🚀 Deploy workflows as APIs: Deploys the workflow as a RESTful API with customizable input and output parameters.

    Motivations

    ComfyUI Manager is great for finding missing custom nodes. But when you share ComfyUI workflows with others (your audience or team members), you've likely still heard responses like:

    • "Custom Node not found"
    • "Cannot find the correct model file"
    • "Missing Python dependencies"

    These are fundamental challenges in workflow sharing: every component must match exactly, including custom nodes, model files, and Python dependencies. Modern package managers like npm and Poetry introduced a "lock" feature, which records the exact version of every requirement. ComfyUI Manager isn't designed for that.

    We heard this from our community and developed comfy-pack to address these problems. With a single click, it captures and locks your entire workflow environment into a .cpack.zip file, including Python packages, custom nodes, model hashes, and required assets.

    Users can recreate the exact environment with one command:

    comfy-pack unpack workflow.cpack.zip
    

    This means you can focus on your creative work while comfy-pack handles the rest.

    Usage

    Installation

    We recommend you use ComfyUI Manager to install comfy-pack. Simply search for comfy-pack and click Install. Restart the server and refresh your ComfyUI interface to apply changes.

    Alternatively, clone the project repository with git:

    cd ComfyUI/custom_nodes
    git clone https://github.com/bentoml/comfy-pack.git
    

    To install the comfy-pack CLI, run:

    pip install comfy-pack
    
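    Once installed, you can do a quick sanity check that the CLI is on your PATH (assuming the standard --help flag; exact output varies by version):

    comfy-pack --help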

    Pack a ComfyUI workflow and its environment

    You can package a workflow and the environment required to run the workflow into an artifact that can be unpacked elsewhere.

    1. Click the Package button to create a .cpack.zip artifact.
    2. (Optional) Select the models you want to include (only model hashes are recorded, so you won't end up with a 100 GB zip file).

    Unpack the ComfyUI environments

    Unpacking a .cpack.zip artifact will restore the ComfyUI environment for the workflow. During unpacking, comfy-pack will perform the following steps.

    1. Prepare a Python virtual environment with the exact packages used to run the workflow.
    2. Clone ComfyUI and custom nodes from the exact revisions required by the workflow.
    3. Search for and download models from common registries like Hugging Face and Civitai. Unpacking multiple workflows that use the same model will not download it more than once; instead, the model weights are shared via symbolic links.

    To unpack:

    comfy-pack unpack workflow.cpack.zip
    

    For example .cpack files, check our examples folder.

    Deploy a workflow as an API

    You can turn a ComfyUI workflow into an API endpoint that can be called from any HTTP client.

    <details> <summary> 1. Annotate input & output </summary>

    Use custom nodes provided by comfy-pack to annotate the fields to be used as input and output parameters. To add a comfy-pack node, right-click and select Add Node > ComfyPack > output/input > [Select a type]

    Input nodes:

    • ImageInput: Accepts image type input, similar to the official LoadImage node
    • StringInput: Accepts string type input (e.g., prompts)
    • IntInput: Accepts int type input (e.g., dimensions, seeds)
    • AnyInput: Accepts combo and other input types (e.g., from custom nodes)

    Output nodes:

    • ImageOutput: Outputs image type, similar to the official SaveImage node
    • FileOutput: Outputs file path as string type and saves the file under that path

    More field types are under way.

    </details> <details> <summary> 2. Serve the workflow </summary>

    Start an HTTP server at http://127.0.0.1:3000 (default) to serve the workflow under the /generate path.

    You can call the /generate endpoint by specifying parameters configured through your comfy-pack nodes, such as prompt, width, height, and seed.

    [!NOTE] The name of a comfy-pack node is the parameter name used for API calls.

    Examples to call the endpoint:

    CURL

    curl -X 'POST' \
      'http://127.0.0.1:3000/generate' \
      -H 'accept: application/octet-stream' \
      -H 'Content-Type: application/json' \
      -d '{
      "prompt": "rocks in a bottle",
      "width": 512, 
      "height": 512,
      "seed": 1
    }'
    
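    Python (requests)

    Alternatively, here is a minimal Python sketch using the requests library. It assumes the endpoint streams the generated image back as raw bytes (as the application/octet-stream accept header suggests); the output filename is arbitrary:

    import requests

    # Parameters correspond to the comfy-pack input nodes in the workflow
    payload = {
        "prompt": "rocks in a bottle",
        "width": 512,
        "height": 512,
        "seed": 1,
    }

    # POST to the /generate endpoint; raise on HTTP errors
    response = requests.post(
        "http://127.0.0.1:3000/generate",
        json=payload,
        headers={"accept": "application/octet-stream"},
        timeout=600,
    )
    response.raise_for_status()

    # Save the response body to disk (assumes it contains the image bytes)
    with open("output.png", "wb") as f:
        f.write(response.content)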

    BentoML client

    Under the hood, comfy-pack leverages BentoML, the unified model serving framework. You can invoke the endpoint using the BentoML Python client:

    import bentoml
    
    with bentoml.SyncHTTPClient("http://127.0.0.1:3000") as client:
        result = client.generate(
            prompt="rocks in a bottle",
            width=512,
            height=512,
            seed=1,
        )
    
    </details> <details> <summary> 3. (Optional) Pack the workflow and environment </summary>

    Pack the workflow and its environment into an artifact that can be unpacked elsewhere to recreate the workflow. A packed workflow can also be run directly from the CLI:

    # Get the workflow input spec
    comfy-pack run workflow.cpack.zip --help
    
    # Run
    comfy-pack run workflow.cpack.zip --src-image image.png --video video.mp4
    
    </details> <details> <summary> 4. (Optional) Deploy to the cloud </summary>

    Deploy to BentoCloud for access to a variety of GPUs and blazing fast scaling.

    Follow the instructions here to get your BentoCloud access token. If you don’t have a BentoCloud account, you can sign up for free.

    </details>

    Roadmap

    This project is under active development. Currently we are working on:

    • Enhanced user experience
    • Docker support
    • Local .cpack file management with version control
    • Enhanced service capabilities

    Community

    comfy-pack is actively maintained by the BentoML team. Feel free to reach out 👉 Join our Slack community!

    Contributing

    As an open-source project, we welcome contributions of all kinds, such as new features, bug fixes, and documentation. Here are some of the ways to contribute:

    • Report a bug by creating a GitHub issue.
    • Submit a pull request or help review other developers’ pull requests.