ggml-org/llama.cpp release b7339


Warning

Release Format Update: Linux releases will soon be published as .tar.gz archives instead of .zip. Update your deployment scripts accordingly.
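If a deployment script currently unpacks the Linux release with `unzip`, the extraction step needs to switch to `tar`. A minimal sketch of the change, using placeholder file names (the actual release asset names on the download page may differ):

```shell
# Build a stand-in .tar.gz archive so the example is self-contained;
# in a real script this file would be the downloaded release asset.
mkdir -p demo && echo "llama-server" > demo/llama-server
tar -czf release.tar.gz -C demo .

# Old extraction step (for .zip assets):
#   unzip release.zip -d out
# New extraction step (for .tar.gz assets):
mkdir -p out && tar -xzf release.tar.gz -C out
ls out
```

`tar -xzf` extracts into the current directory by default, so `-C out` plays the role that `-d out` did for `unzip`.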

Add DIAG for CUDA (#17873)

  • Add DIAG for CUDA

  • Refactor parameters

Prebuilt binaries are available for macOS/iOS, Linux, and Windows.
