mudler/LocalAI v2.7.0

This release adds LLM support to the transformers backend as well!

For instance, you can now run codellama-7b with transformers:

docker run -ti -p 8080:8080 --gpus all localai/localai:v2.7.0-cublas-cuda12 codellama-7b
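
Once the container is up, you can query it through LocalAI's OpenAI-compatible API. A minimal sketch (the prompt is arbitrary; the model name matches the one loaded above):

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "codellama-7b",
    "messages": [{"role": "user", "content": "Write a hello world program in Go"}],
    "temperature": 0.2
  }'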

There are more examples available in the quickstart: https://localai.io/basics/getting_started/#running-models.
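
If you prefer an explicit model definition over a pre-configured model name, you can drop a YAML file into the models directory and point LocalAI at it. A minimal sketch, assuming the standard LocalAI model config fields; the file name, model name, and Hugging Face repository below are placeholders:

mkdir -p models
cat > models/transformers-example.yaml <<'EOF'
# hypothetical model definition for the transformers backend
name: transformers-example   # the name to pass as "model" in API calls
backend: transformers        # use the transformers backend for text generation
parameters:
  model: facebook/opt-350m   # a Hugging Face causal LM repository
EOF
docker run -ti -p 8080:8080 --gpus all -v $PWD/models:/models localai/localai:v2.7.0-cublas-cuda12 --models-path /models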

Note: as llama.cpp is undergoing changes that could possibly cause breakage, this release does not include the changes from ggerganov/llama.cpp#5138 (future versions will).

What's Changed

Bug fixes 🐛

  • fix(paths): automatically create paths by @mudler in #1650

Exciting New Features 🎉

  • feat(transformers): support also text generation by @mudler in #1630
  • transformers: correctly load automodels by @mudler in #1643
  • feat(startup): fetch model definition remotely by @mudler in #1654 (see the sketch after this list)

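With #1654, the model definition can also be fetched remotely at startup. A sketch of the idea, assuming the startup arguments accept URLs to YAML model definition files like the one above (the URL below is a placeholder):

docker run -ti -p 8080:8080 --gpus all localai/localai:v2.7.0-cublas-cuda12 https://example.com/models/transformers-example.yaml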

Full Changelog: v2.6.1...v2.7.0
