withcatai/node-llama-cpp v3.7.0


3.7.0 (2025-03-28)

Features

  • extract function calling syntax from a Jinja template (#444) (c070e81)
  • full support for Qwen and QwQ via QwenChatWrapper (#444) (c070e81)
  • export a llama instance getter on a model instance (#444) (c070e81)

Bug Fixes

  • better handling for function calling with empty parameters (#444) (c070e81)
  • reranking edge case crash (#444) (c070e81)
  • limit the context size by default in the node-typescript template (#444) (c070e81)
  • adapt to breaking llama.cpp changes (#444) (c070e81)
  • bump minimum Node.js version to 20 due to dependencies' requirements (#444) (c070e81)
  • defineChatSessionFunction type (#444) (c070e81)

Shipped with llama.cpp release b4980

To use the latest llama.cpp release available, run: npx -n node-llama-cpp source download --release latest
