withcatai/node-llama-cpp v3.16.0

3.16.0 (2026-02-19)

Bug Fixes

  • adjust the default VRAM padding config to reserve enough memory for compute buffers (#553) (57e8c22)
  • support function call syntax with optional whitespace prefix (#553) (57e8c22)
  • change the default value of useDirectIo to false (#553) (57e8c22)
  • deduplicate Vulkan devices (#553) (57e8c22)
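The defaulted options mentioned above can still be set explicitly. Below is a minimal sketch of opting back into the previous behavior; the `vramPadding` option on `getLlama` and the placement of `useDirectIo` on `loadModel` are assumptions about the API surface, so check the node-llama-cpp documentation for your version:

```typescript
// Sketch: overriding the defaults changed in this release.
// Assumed option names (vramPadding, useDirectIo placement); verify against the docs.
import {getLlama} from "node-llama-cpp";

const llama = await getLlama({
    // the release adjusts the default VRAM padding to reserve memory for
    // compute buffers; a custom padding can still be supplied here
    vramPadding: (totalVram) => totalVram * 0.05
});

const model = await llama.loadModel({
    modelPath: "path/to/model.gguf",
    // v3.16.0 changes the default of useDirectIo to false;
    // opt back in explicitly if direct I/O worked well on your setup
    useDirectIo: true
});
```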

Shipped with llama.cpp release b8095

To use the latest llama.cpp release available, run: npx -n node-llama-cpp source download --release latest
