llamafile versions starting from 0.10.0 use a new build system, aimed at keeping our code more closely aligned with the latest versions of llama.cpp. This means they support more recent models and functionality, but they might also be missing some of the features you were accustomed to (check out this doc for a high-level description of what has been done).
If you preferred the "classic experience", you can always access the previous versions from our releases page. Our pre-built llamafiles show which version of the server they were bundled with (0.9.* example, 0.10.* example), so you will always know which version of the software you are downloading.
## What's Changed
- Accept array in chat message content field by @henfiber in #760
- chore: Update README.md to include call for community feedback on llamafile by @njbrake in #812
- chore: integrate whisper.cpp as a submodule by @njbrake in #813
- chore: convert stable diffusion to submodule by @njbrake in #818
- chore: llama.cpp as submodule by @njbrake in #819
- feat: move docs to mkdocs by @njbrake in #824
- chore: Add `update-llama-cpp` workflow by @daavoo in #846
- fix(update-llama-cpp): Use `new_build_wip` as base ref by @daavoo in #850
- Fixed broken llamafile URL in docs by @aittalam in #873
- update supported OpenBSD versions by @sthen in #897
- llamafile reloaded (v0.10.0) by @aittalam in #867
## New Contributors
- @henfiber made their first contribution in #760
- @njbrake made their first contribution in #812
- @daavoo made their first contribution in #846
- @sthen made their first contribution in #897
Full Changelog: 0.9.3...0.10.0