A PyTorch extension for ComfyUI featuring extensive wrapper nodes for seamless tensor operations and model training.
You can use ComfyUI-Pt-Wrapper to build a Transformer encoder model from scratch for text classification.
Learn how to construct a Transformer encoder using basic components such as multi-head attention, layer normalization, linear layers, embedding layers, and residual connections. This workflow allows you to train the model for IMDB text classification and achieve around 85% accuracy!
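To make the components concrete, here is a rough PyTorch sketch of one encoder block built from the same ingredients the workflow wires together (multi-head attention, layer normalization, linear layers, and residual connections). This is an illustrative example, not the extension's internal code; the dimensions are arbitrary.

```python
import torch
import torch.nn as nn

class EncoderBlock(nn.Module):
    """One Transformer encoder block: multi-head self-attention and a
    feed-forward network, each wrapped in a residual connection
    followed by layer normalization (post-norm style)."""

    def __init__(self, d_model=128, n_heads=4, d_ff=256, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(
            d_model, n_heads, dropout=dropout, batch_first=True
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x, pad_mask=None):
        a, _ = self.attn(x, x, x, key_padding_mask=pad_mask)
        x = self.norm1(x + a)            # residual + layer norm
        x = self.norm2(x + self.ff(x))   # residual + layer norm
        return x

# Shapes: (batch, seq_len, d_model); the block preserves them.
x = torch.randn(2, 16, 128)
y = EncoderBlock()(x)
```

Stacking several such blocks on top of an embedding layer, then pooling and applying a linear classifier head, yields the kind of encoder the workflow trains on IMDB.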
The complete setup of all required nodes is included in the example workflow.
Check out the Building Transformer From Scratch guide.
ComfyUI-Pt-Wrapper brings PyTorch model building and training into ComfyUI's node graph environment—no coding required.
It is built for ComfyUI users who want to explore machine learning without writing Python, and for researchers who want to prototype directly in visual workflows. Every operation, from tensor math to full training pipelines, can be configured through nodes.
Originally a focused spin-off of ComfyUI-Data-Analysis, this extension supports a wide range of ML workflows in image and text domains.
Tensor operation nodes include add, gather, scatter, where, and more.
Train an image classifier on your own dataset, entirely in ComfyUI nodes.
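For readers unfamiliar with these operations, the plain PyTorch equivalents of the named ops look like this (a minimal sketch with made-up example tensors, not node code):

```python
import torch

a = torch.tensor([[1., 2.],
                  [3., 4.]])
b = torch.ones_like(a)

# add: elementwise addition
s = torch.add(a, b)

# gather: pick one column per row along dim=1
idx = torch.tensor([[0], [1]])
g = torch.gather(a, 1, idx)

# scatter: write a value at the same per-row indices
sc = torch.zeros(2, 2).scatter(1, idx, 9.)

# where: select between two tensors by condition
w = torch.where(a > 2, a, torch.zeros_like(a))
```

Each of these has a corresponding wrapper node, so the same data flow can be expressed graphically.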
Train a ResNet achieving 94% validation accuracy. A flexible baseline for your own image classification tasks.
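The core ResNet idea the workflow relies on is the residual block: a convolutional path plus an identity shortcut. A minimal PyTorch sketch (illustrative only; channel and image sizes are arbitrary, and a real ResNet stacks many such blocks with downsampling):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two conv+batchnorm layers with an identity shortcut,
    the building block of ResNet-style classifiers."""

    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(ch)

    def forward(self, x):
        h = torch.relu(self.bn1(self.conv1(x)))
        h = self.bn2(self.conv2(h))
        return torch.relu(x + h)  # shortcut: output keeps input shape

x = torch.randn(4, 16, 32, 32)   # (batch, channels, height, width)
out = ResidualBlock(16)(x)
```

The shortcut lets gradients flow past the conv layers, which is what makes deep image classifiers like this trainable.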
Train a text classification model using a configurable Transformer model—all node-based.
Example workflow: embedding_transformer_classification.json
This project does not accept pull requests. Unsolicited PRs will be closed without review.
To suggest a feature or report an issue, open an Issue. All issues are reviewed and prioritized.
Every supported node is documented in detail. Browse the Node Reference to explore tensor operations, models, tokenization, distributions, loss functions, and more.
Links in the reference section point directly to individual node docs for quick lookup.