GitHub: withcatai/node-llama-cpp v3.4.0


3.4.0 (2025-01-08)

Bug Fixes

  • check for Rosetta usage on macOS x64 when using the inspect gpu command (#405) (632a7bf)
  • detect running under Rosetta on Apple Silicon and show an error message instead of crashing (#405) (632a7bf)
  • switch from "nextTick" to "nextCycle" for the default batch dispatcher (#405) (632a7bf)
  • remove deprecated CLS token (#405) (632a7bf)
  • pipe error logs in inspect gpu command (#405) (632a7bf)
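The `inspect gpu` command mentioned in the fixes above can be run directly via npx to check your GPU setup. A minimal sketch (the exact output depends on your machine; on macOS x64 under Rosetta, this release reports an error instead of crashing):

```shell
# Run node-llama-cpp's GPU inspection command without a local install.
# Per the fixes above, error logs are now piped through, and running
# under Rosetta on macOS produces a clear error message rather than a crash.
npx -y node-llama-cpp inspect gpu
```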

Shipped with llama.cpp release b4435

To use the latest llama.cpp release available, run `npx -n node-llama-cpp source download --release latest`.
