github mudler/LocalAI v2.5.0


What's Changed

This release adds more embedded models and shrinks image sizes.

You can now run phi-2 (see here for the full list) locally by starting LocalAI with:

docker run -ti -p 8080:8080 localai/localai:v2.5.0-ffmpeg-core phi-2
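Once the container is up, you can talk to it through the OpenAI-compatible API LocalAI exposes on the mapped port. A minimal sketch, assuming the server is reachable at `http://localhost:8080` and the model name matches the short-hand used above:

```python
import json
import urllib.request

# Request body in the OpenAI chat-completions format that LocalAI accepts.
payload = {
    "model": "phi-2",
    "messages": [{"role": "user", "content": "How are you?"}],
}

def chat(base_url="http://localhost:8080"):
    """Send the payload to a running LocalAI instance and return the JSON reply."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same call works with plain `curl`; the Python wrapper is only for illustration.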

LocalAI now accepts as arguments a list of model short-hands and/or URLs pointing to valid YAML files. A popular way to host those files is GitHub gists.

For instance, you can run llava by starting local-ai with:

docker run -ti -p 8080:8080 localai/localai:v2.5.0-ffmpeg-core https://raw.githubusercontent.com/mudler/LocalAI/master/embedded/models/llava.yaml
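A model YAML file of this kind declares a model name and where to fetch the weights from. The sketch below is illustrative only, assuming a minimal configuration; for the exact schema, refer to the embedded llava.yaml linked above:

```yaml
# Hypothetical model definition, hostable as a gist and passed as a URL argument.
name: my-model            # the name you address in API requests
parameters:
  model: my-model.gguf    # model file to load (illustrative filename)
context_size: 2048
```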

Exciting New Features 🎉

  • feat: more embedded models, coqui fixes, add model usage and description by @mudler in #1556

👒 Dependencies

  • deps(conda): use transformers-env with vllm,exllama(2) by @mudler in #1554
  • deps(conda): use transformers environment with autogptq by @mudler in #1555
  • ⬆️ Update ggerganov/llama.cpp by @localai-bot in #1558

Other Changes

Full Changelog: v2.4.1...v2.5.0
