TabbyML/tabby v0.11.0


⚠️ Notice

  • BREAKING: The --webserver flag is now enabled by default in tabby serve. To turn off the webserver and only use OSS features, use the --no-webserver flag.
  • The /v1beta/chat/completions endpoint has moved to /v1/chat/completions; the old path remains available for backward compatibility (see the example request below).
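
A minimal sketch of a request against the relocated endpoint, assuming a Tabby instance on the default local port 8080 and an OpenAI-style message payload; the host, prompt, and optional auth header are illustrative and not taken from these release notes.

```python
# Sketch: POST a chat request to the new /v1/chat/completions path.
# Assumptions: Tabby is listening on localhost:8080; deployments with
# authentication enabled may also need an "Authorization: Bearer <token>" header.
import json
import urllib.request

payload = {
    "messages": [
        {"role": "user", "content": "How do I reverse a list in Python?"}
    ]
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",  # formerly /v1beta/chat/completions
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    # Print the raw body; depending on the server configuration it may be a
    # single JSON object or a streamed sequence of chunks.
    print(resp.read().decode("utf-8"))
```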

🚀 Features

  • Added storage usage statistics in the System page.
  • Added support for integrating repositories from GitHub and GitLab using personal access tokens.
  • Introduced a new Activities page to view user activities.
  • Included an Ask Tabby feature in the source code browser to provide in-context help from AI.
  • Upgraded llama.cpp to version b2715.
  • Implemented incremental indexing for faster repository context updates.

🧰 Fixes and Improvements

  • Changed the default model filename from q8_0.v2.gguf to model.gguf in MODEL_SPEC.md (see the sketch after this list).
  • Excluded activities of deactivated users from reports.
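
The filename change mainly matters for tooling that locates GGUF files in a local model directory. Below is a hypothetical helper, not part of Tabby itself, that prefers the new model.gguf name and falls back to the legacy one.

```python
# Hypothetical sketch: resolve a GGUF model file under a local model directory,
# preferring the new default filename and falling back to the legacy name.
from pathlib import Path


def resolve_model_file(model_dir: str) -> Path:
    root = Path(model_dir)
    for name in ("model.gguf", "q8_0.v2.gguf"):  # new default first, legacy second
        candidate = root / name
        if candidate.exists():
            return candidate
    raise FileNotFoundError(f"no GGUF model file found under {model_dir}")
```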

💫 New Contributors

Full Changelog: v0.10.0...v0.11.0
