## 3.0.0-beta.39 (2024-07-28)
### Bug Fixes
- Gemma chat wrapper bug (#273) (e3e0994)
- GGUF metadata nested key conflicts (#273) (e3e0994)
- adapt to `llama.cpp` breaking changes (#273) (e3e0994)
- preserve function calling chunks (#273) (e3e0994)
- format JSON objects like models expect (#273) (e3e0994)
### Features
- Llama 3.1 support (#273) (e3e0994)
- Phi-3 support (#273) (e3e0994)
- model metadata overrides (#273) (e3e0994)
- use LoRA on a context instead of on a model (#273) (e3e0994)
- `onTextChunk` option (#273) (e3e0994)
Shipped with `llama.cpp` release `b3479`
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)