# LF Nodes

Custom nodes with a touch of extra UX, including: history for primitives, JSON manipulation, logic switches with visual feedback, LLM chat... and more!

## Overview
A suite of custom nodes for ComfyUI aimed at enhancing user experience with more interactive and visually engaging widgets.
Whether you're after quality-of-life improvements or specific functionalities, LF Nodes has you covered. The nodes are designed to be user-friendly and intuitive, making them accessible to users of all skill levels.
Most UI elements used by the frontend belong to LF Widgets, a modern collection of modular and customizable web components built on Stencil.js specifically to integrate with LF Nodes.
## What kind of nodes does it offer?
That's a tough one—the nodes span quite a few categories. Here's a quick breakdown:
- Analytics nodes: Visualize and track data, like checkpoint/LoRA usage or image histograms.
- Configuration nodes: Manage CivitAI metadata, and control the suite via the Control Panel.
- Image manipulation nodes: Tools to manipulate images, such as filter and resize nodes.
- IO Operations nodes: Load and save files to/from the file system.
- JSON nodes: Tools to manipulate and display JSON data.
- LLM nodes: Interface with locally running LLMs, like the Messenger node, which also manages characters.
- Logic nodes: Control flow using simple switches.
- Primitive nodes: Work with primitive data types, offering features like history.
- Seed generation nodes: Generate seeds for complex workflows.
- Selector nodes: Resource selection widgets with metadata display for models.
## Installation
### Using ComfyUI Manager
- Open ComfyUI Manager.
- Search for *LF Nodes*.
- Install the node suite and restart ComfyUI.
### Manual

- Go to the `ComfyUI/custom_nodes` folder.
- Open a terminal.
- Copy and paste this command:

  ```
  git clone https://github.com/lucafoscili/lf-nodes.git
  ```
## Workflow samples

- Compare images
- e2e
- Flux + LLM Character manager
- I2I (Refine)
- LLM Chat
- LoRA tester
- Markdown documentation
- Resize for web + blurred placeholder
## Notes

The LLM nodes were tested with Koboldcpp, but any OpenAI-compatible endpoint that does not require authentication/an API key should work. The model used in the workflow samples is UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 with ChaoticNeutrals/LLaVA-Llama-3-8B-mmproj-Updated.
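As a rough illustration of what "OpenAI-compatible" means here, the sketch below builds the JSON body for a `/v1/chat/completions` call against a local server. The endpoint URL, port, and the `build_chat_request` helper are assumptions for this example (Koboldcpp's defaults may differ on your setup), not part of the LF Nodes API.

```python
import json

# Hypothetical local endpoint; Koboldcpp and similar servers expose an
# OpenAI-compatible API without authentication. Adjust host/port as needed.
ENDPOINT = "http://localhost:5001/v1/chat/completions"

def build_chat_request(prompt, system="You are a helpful assistant."):
    """Build the JSON body for an OpenAI-compatible chat completion call."""
    return {
        # Some local servers ignore the model field and use whatever is loaded.
        "model": "UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3",
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": 256,
        "temperature": 0.7,
    }

if __name__ == "__main__":
    body = build_chat_request("Hello!")
    print(json.dumps(body, indent=2))
    # Actually sending the request requires a running server, e.g.:
    # import urllib.request
    # req = urllib.request.Request(
    #     ENDPOINT,
    #     data=json.dumps(body).encode(),
    #     headers={"Content-Type": "application/json"},
    # )
    # print(urllib.request.urlopen(req).read().decode())
```

Since no API key is involved, no `Authorization` header is needed; any client that speaks this request shape should work with the LLM nodes' backend.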
## Contributing

Contributions to this repository are welcome; feel free to submit pull requests or open issues for discussion!

To set up the environment, clone this repository, then open a terminal in the root and run:

```
pip install -r requirements.txt
```

This installs all the required dependencies for the Python backend.

To build the frontend, you will need Node.js and Yarn installed. First run:

```
yarn setup
```

This command installs all the dependencies. Note that the repository includes the compiled frontend sources, so you can skip the frontend steps if you don't plan to modify them. Then run:

```
yarn build
```

This compiles all the frontend sources and generates/refreshes the actual web directory.
## License
MIT License