Open Interpreter's `--local` mode is now powered by Mistral 7B.

Significantly more architectures are now supported locally via `ooba`, a headless Oobabooga wrapper.
## What's Changed
- Fix bug when trying to use local non-CodeLlama model by @alexweberk in #571
- Update README_ZH.md by @orangeZSCB in #563
- chore: update test suite by @ericrallen in #594
- Fixed a bug in setup_text_llm.py by @kylehh in #560
- feat: add %tokens magic command that counts tokens via tiktoken by @ericrallen in #607
- feat: add support for loading different config.yaml files by @ericrallen in #609
- feat: add optional prompt token/cost estimate to %tokens by @ericrallen in #614
- Added powershell language by @DaveChini in #620
- Local Update by @KillianLucas in #625
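The `%tokens` magic command above counts prompt tokens via tiktoken and can optionally attach a cost estimate. As an illustration of the estimate's arithmetic only, here is a minimal sketch; the function name and the per-1K-token price are hypothetical placeholders, not Open Interpreter's actual pricing table:

```python
def estimate_cost(token_count: int, price_per_1k_tokens: float) -> float:
    """Estimate prompt cost in dollars from a token count and a
    per-1K-token price (hypothetical helper, for illustration only)."""
    return token_count / 1000 * price_per_1k_tokens

# Hypothetical example: a 1,500-token prompt at $0.03 per 1K tokens.
print(estimate_cost(1500, 0.03))  # → 0.045
```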
## New Contributors
- @alexweberk made their first contribution in #571
- @orangeZSCB made their first contribution in #563
- @kylehh made their first contribution in #560
- @DaveChini made their first contribution in #620
**Full Changelog**: v0.1.7...v0.1.8