TabbyML/tabby v0.16.1

⚠️ Notice

  • Starting with this version, Tabby uses WebSockets for features that require streaming (e.g., the Answer Engine and the Chat Side Panel). If you deploy Tabby behind a reverse proxy, you may need to configure the proxy to support WebSocket upgrades; see the sketch below.
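
As an illustration only, a minimal Nginx `location` block that forwards the upgrade handshake might look like the following; the upstream address `localhost:8080` and the timeout value are assumptions for this sketch, not values taken from the release notes.

```nginx
location / {
    # Assumed upstream: a Tabby instance listening on localhost:8080.
    proxy_pass http://localhost:8080;
    proxy_http_version 1.1;
    # Forward the upgrade headers so streaming endpoints can switch to WebSockets.
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    # Streaming answers can be long-lived; a generous read timeout avoids premature drops.
    proxy_read_timeout 3600s;
}
```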

🚀 Features

  • Discussion threads in the Answer Engine are now persisted, allowing users to share threads with others.

🧰 Fixes and Improvements

  • Fixed an issue where the llama-server subprocess was not reused when the same model served both Chat and Completion (e.g., Codestral-22B) with the local model backend; see the configuration sketch after this list.
  • Updated llama.cpp to version b3571, adding support for the Jina series of embedding models.
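
For context on the first fix, pointing both roles at the same model in Tabby's config.toml is the scenario where the subprocess is now shared. The keys below follow Tabby's documented local-model configuration; treat the exact model ID as an assumption for this sketch.

```toml
# ~/.tabby/config.toml — local model backend.
# With the same model_id for completion and chat, Tabby can now
# reuse a single llama-server subprocess instead of starting two.

[model.completion.local]
model_id = "Codestral-22B"

[model.chat.local]
model_id = "Codestral-22B"
```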

💫 New Contributors

Full Changelog: v0.15.0...v0.16.1
