What's Changed
- Bugfix for issue #4665 by @drew9781 in #4670
- Fix "Illegal instruction" bug in the CPU-only version of llama.cpp by @oobabooga in #4677
- Add XTTSv2 by @kanttouchthis in #4673
- Merge dev branch by @oobabooga in #4683
- Merge dev branch by @oobabooga in #4686
- Detect Orca 2 template by @oobabooga in #4697
- Bump to flash-attention 2.3.4 + switch to GitHub Actions wheels on Windows by @oobabooga in #4700
- Bump llama-cpp-python to 0.2.19 & add min_p and typical_p parameters to the llama.cpp loader by @oobabooga in #4701 (see the example after this list)
- Merge dev branch by @oobabooga in #4702
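
For reference, a minimal sketch of how the new sampling parameters can be passed directly through llama-cpp-python. The model path and prompt below are placeholders, and the sketch assumes llama-cpp-python 0.2.19 exposes `min_p` and `typical_p` on the completion call, as PR #4701 indicates; in the web UI itself these are set from the Parameters tab.

```python
# Sketch only: passing min_p and typical_p to llama-cpp-python >= 0.2.19.
from llama_cpp import Llama

# Hypothetical model path; use any local GGUF file.
llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf")

output = llm(
    "Q: Name the planets in the solar system. A:",
    max_tokens=64,
    temperature=0.7,
    min_p=0.05,     # drop tokens whose probability is below 5% of the top token's
    typical_p=1.0,  # 1.0 disables locally typical sampling
)
print(output["choices"][0]["text"])
```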
New Contributors
- @drew9781 made their first contribution in #4670
- @kanttouchthis made their first contribution in #4673
Full Changelog: snapshot-2023-11-19...snapshot-2023-11-26