ggml-org/llama.cpp b7397


Warning

Release Format Update: Linux releases will soon use .tar.gz archives instead of .zip. Please make the necessary changes to your deployment scripts.
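The switch from .zip to .tar.gz mainly affects the extraction step of download automation. Below is a minimal sketch, assuming a Python-based deployment script (not an official llama.cpp tool); the archive file names are hypothetical and only the standard-library tarfile and zipfile modules are used.

```python
# Hedged sketch: handle both the current .zip and the upcoming .tar.gz
# Linux release archives. File names below are illustrative, not the
# actual asset names published with this release.
import tarfile
import zipfile
from pathlib import Path

def extract_release(archive: Path, dest: Path) -> None:
    """Extract a release archive into dest, whichever format it uses."""
    dest.mkdir(parents=True, exist_ok=True)
    if archive.suffixes[-2:] == [".tar", ".gz"] or archive.suffix == ".tgz":
        with tarfile.open(archive, "r:gz") as tf:
            tf.extractall(dest)
    elif archive.suffix == ".zip":
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(dest)
    else:
        raise ValueError(f"unexpected archive format: {archive.name}")

# Usage (hypothetical asset names):
# extract_release(Path("llama-b7397-bin-ubuntu-x64.zip"), Path("llama.cpp-bin"))
# extract_release(Path("llama-b7397-bin-ubuntu-x64.tar.gz"), Path("llama.cpp-bin"))
```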

vulkan: improve mul_mat_vec_iq1_s speed (#17874)

Prebuilt binaries are provided for macOS/iOS, Linux, Windows, and openEuler.
