github TransformerOptimus/SuperAGI v0.0.14

✨SuperAGI v0.0.14✨

🚀 Enhanced Local LLM Support with Multi-GPU 🎉

New Feature Highlights 🌟

⚙️ Local Large Language Model (LLM) Integration:

  • SuperAGI now supports the use of local large language models, allowing users to leverage their own models seamlessly within the SuperAGI framework.
  • Easily configure and integrate your preferred LLMs for enhanced customization and control over your AI agents.

⚡️ Multi-GPU Support:

  • SuperAGI now provides multi-GPU support for improved performance and scalability.

How to Use

To enable a local Large Language Model (LLM) with multi-GPU support, follow these steps:

  1. LLM Integration:
    • Add your model path to the celery and backend service volumes in the docker-compose-gpu.yml file.
    • Run the command:
      docker compose -f docker-compose-gpu.yml up --build
    • Open localhost:3000 in your browser.
    • Add a local LLM model from the model section.
    • Use the added model for running your agents.
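
The volume change in step 1 can be sketched as follows. This is a minimal excerpt, not the full compose file: the host path /path/to/your/models and the container mount point /app/local_model_path are placeholders, and the service names are assumed to match the stock docker-compose-gpu.yml.

```yaml
# docker-compose-gpu.yml (excerpt, illustrative) — mount local model weights
# into both containers so SuperAGI can load them.
services:
  backend:
    volumes:
      - /path/to/your/models:/app/local_model_path   # host path is a placeholder
  celery:
    volumes:
      - /path/to/your/models:/app/local_model_path   # should match the backend mount
```

After editing the file, rebuild and start the stack with the command from step 1: docker compose -f docker-compose-gpu.yml up --build.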
