github zylon-ai/private-gpt v0.6.0


0.6.0 (2024-08-02)

What's new

Introducing Recipes!

Recipes are high-level APIs that represent AI-native use cases. Under the hood, recipes execute complex pipelines to get the work done.

With the introduction of the first recipe, summarize, our aim is not only to include this useful use case in PrivateGPT but also to get the project ready to onboard community-built recipes!

Summarization Recipe

summarize is the first recipe included in PrivateGPT. The new API lets users summarize ingested documents, customize the resulting summary, and stream the response. Read the full documentation here.

POST /v1/summarize
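
As a rough illustration, a request could look like the sketch below, assuming the default local port 8001. The field names (use_context, instructions, stream) are assumptions inferred from the feature description, so check the linked documentation for the exact request schema:

    # Hypothetical request: summarize ingested documents with custom
    # instructions and stream the result. Field names and port are
    # assumptions; verify against the official /v1/summarize docs.
    curl -X POST http://localhost:8001/v1/summarize \
      -H "Content-Type: application/json" \
      -d '{
            "use_context": true,
            "instructions": "Summarize in three bullet points",
            "stream": true
          }'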

Improved cold-start

We've put a lot of effort into making PrivateGPT as straightforward as possible to run from a fresh clone: defaulting to Ollama, auto-pulling models, making the tokenizer optional, and more.
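
For reference, a fresh-clone run with the new Ollama defaults looks roughly like the sketch below; the extras names are taken from the existing installation docs and may change, so treat this as an illustration rather than the canonical quickstart:

    # Clone and install with the local Ollama setup
    git clone https://github.com/zylon-ai/private-gpt.git
    cd private-gpt
    poetry install --extras "ui llms-ollama embeddings-ollama vector-stores-qdrant"
    # With auto-pull enabled, the default models are fetched on first run
    make run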

More model and database support

Added support for Gemini (both LLM and embeddings) and for the Milvus and Clickhouse vector databases.
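
If you want to try the new backends, the corresponding install extras presumably follow the same naming pattern as the existing ones; the names below are assumptions, so confirm them in the installation documentation:

    # Extras names assumed from the existing naming pattern
    poetry install --extras "llms-gemini embeddings-gemini"
    poetry install --extras "vector-stores-milvus"
    poetry install --extras "vector-stores-clickhouse"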

Breaking changes

  • The minimum required Python version is now 3.11.9, and Poetry must be >= 1.7.1. However, we recommend updating to Poetry 1.8.3. Instructions for updating:

    • Python 3.11.9:

      1. Before proceeding, make sure pyenv is installed on your system. If it isn't, you can install it by following the instructions in the PrivateGPT documentation.

      2. Use pyenv to install that specific Python version:

        pyenv install 3.11.9

      3. Verify the installation by running python --version in your terminal.

    • Poetry 1.8.3:

      1. Update Poetry if already installed:

        poetry self update 1.8.3

      2. Verify the installation by running poetry --version in your terminal.

  • Default LLM model is now LLaMA 3.1 for both Ollama and Llamacpp local setups. If you want to keep using the v0.5.0 defaults, place this settings-legacy.yaml file next to your settings.yaml file and run privateGPT with PGPT_PROFILES=legacy make run. Learn more about profiles here.

  • Default Embeddings model is now nomic-embed-text for both Ollama and Llamacpp local setups. This embeddings model may use a different dimension than the one you were using before, making it incompatible with already-ingested files. If you want to keep using the v0.5.0 defaults and not lose your ingested files, place this settings-legacy.yaml file next to your settings.yaml file and run privateGPT with PGPT_PROFILES=legacy make run. Learn more about profiles here. Alternatively, if you prefer to start fresh, you can wipe your existing vector database by removing the local_data folder. Both migration paths are sketched right after this list.
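
To make the two migration paths concrete, here is a minimal sketch of both options, assuming the standard make-based workflow described above:

    # Option A: keep the v0.5.0 defaults via the legacy profile
    # (requires settings-legacy.yaml next to settings.yaml)
    PGPT_PROFILES=legacy make run

    # Option B: adopt the new defaults and start fresh by wiping
    # the local vector database
    rm -rf local_data
    make run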

Full Changelog

Features

Bug Fixes
