## 3.0.0-beta.2 (2024-01-20)
### Bug Fixes
- adapt to breaking changes of `llama.cpp` (#117) (595a6bc)
- `threads` parameter (#139) (5fcdf9b)
- disable Metal for `x64` arch by default (#139) (5fcdf9b)
### Features
- function calling (#139) (5fcdf9b)
- chat syntax aware context shifting (#139) (5fcdf9b)
- stateless `LlamaChat` (#139) (5fcdf9b)
- improve chat wrapper (#139) (5fcdf9b)
- `LlamaText` util (#139) (5fcdf9b)
- show `llama.cpp` release in GitHub releases (#142) (36c779d)
Shipped with `llama.cpp` release `b1892`

To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest` (learn more)