An implementation of Retrieval Augmented Generation (RAG) with LlamaIndex and agents from AutoGen
The project integrates the Retrieval Augmented Generation (RAG) toolkit LlamaIndex, Microsoft's AutoGen, and LLaVA-NeXT with ComfyUI's adaptable node interface, enhancing the functionality and user experience of the platform.
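For context, the core RAG flow that LlamaIndex provides, independent of ComfyUI, looks roughly like the minimal sketch below. The ./docs path and the question are placeholders, and LlamaIndex's default setup expects an OpenAI API key for embeddings and completion:

```python
# Minimal LlamaIndex RAG sketch (llama-index >= 0.10 import paths;
# older releases used `from llama_index import ...`).
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load local files from a placeholder folder and build an in-memory vector index.
documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents)

# Ask a question; retrieved chunks are inserted into the LLM prompt.
query_engine = index.as_query_engine()
print(query_engine.query("What does this project integrate with ComfyUI?"))
```

The node pack exposes this kind of document-index-query flow through ComfyUI nodes rather than a Python script.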
May 9, 2024: Added agents; more information can be found here.
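For a sense of what the AutoGen side of the integration does, a plain two-agent AutoGen conversation (independent of this node pack) looks roughly like the sketch below; the model name and API key are placeholders:

```python
# Minimal AutoGen two-agent sketch (pyautogen 0.2-style API); the model name
# and API key are placeholders, not values this repository requires.
from autogen import AssistantAgent, UserProxyAgent

llm_config = {"config_list": [{"model": "gpt-4", "api_key": "YOUR_API_KEY"}]}

assistant = AssistantAgent(name="assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(
    name="user",
    human_input_mode="NEVER",     # fully automated, no console prompts
    code_execution_config=False,  # do not execute generated code
)

# The user proxy drives the conversation with the assistant agent.
user_proxy.initiate_chat(assistant, message="Summarize what RAG is in two sentences.")
```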
Follow these steps to set up the environment:
Set up a virtual environment as needed.
Navigate to ComfyUI/custom_nodes.
Clone the repository: git clone https://github.com/get-salt-AI/SaltAI_LlamaIndex
Change to the cloned directory: cd SaltAI_LlamaIndex
Install dependencies:
5.a Python venv:
pip install -r requirements.txt
5.b ComfyUI Portable:
path\to\ComfyUI\python_embeded\python.exe -m pip install -r requirements.txt
You may need to update your environment's packaging, wheel, and setuptools for the newer Transformers and LLaVA-NeXT models (see the loading sketch after the commands below).
pip install --upgrade packaging setuptools wheel
Or, for ComfyUI Portable:
path\to\ComfyUI\python_embeded\python.exe -m pip install --upgrade packaging setuptools wheel
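To sanity-check that the upgraded environment can actually load a LLaVA-NeXT model through Transformers, a minimal sketch looks like this. The checkpoint name is one public example and the image path is a placeholder; this is not something the node pack runs for you:

```python
# Minimal LLaVA-NeXT loading sketch via Hugging Face Transformers.
# The checkpoint is a public example; "example.jpg" is a placeholder image path.
import torch
from PIL import Image
from transformers import LlavaNextForConditionalGeneration, LlavaNextProcessor

checkpoint = "llava-hf/llava-v1.6-mistral-7b-hf"
processor = LlavaNextProcessor.from_pretrained(checkpoint)
model = LlavaNextForConditionalGeneration.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)

image = Image.open("example.jpg")
prompt = "[INST] <image>\nDescribe this image. [/INST]"  # Mistral-style chat template

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))
```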
Example workflows and images can be found in the examples folder.
If you encounter issues due to package conflicts, ensure your virtual environment is configured correctly.
You can install and use any GGUF files loaded into your ComfyUI/custom_nodes/models/llm folder.
Here is probably the world's largest repository of those:
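Once a GGUF file is in that folder, a rough illustration of loading it outside the node graph is LlamaIndex's LlamaCPP wrapper, which requires the optional llama-index-llms-llama-cpp and llama-cpp-python packages; the filename below is a placeholder:

```python
# Minimal sketch of loading a local GGUF model with LlamaIndex's LlamaCPP wrapper.
# Requires llama-index-llms-llama-cpp and llama-cpp-python; the filename is a placeholder.
from llama_index.llms.llama_cpp import LlamaCPP

llm = LlamaCPP(
    model_path="ComfyUI/custom_nodes/models/llm/your-model.Q4_K_M.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=4096,
    verbose=False,
)

print(llm.complete("Explain retrieval augmented generation in one sentence."))
```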
Detailed documentation and guidelines for contributing to the project will be provided soon.
You can find the existing documentation at https://docs.getsalt.ai/
The project is open-source under the MIT license.