ggml-org/llama.cpp b7311


Warning

Release Format Update: Linux releases will soon use .tar.gz archives instead of .zip. Please make the necessary changes to your deployment scripts.
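For deployment scripts that fetch and unpack release assets, handling both archive formats during the transition avoids breakage. The sketch below is a generic Python helper, not part of llama.cpp; the example asset filename is illustrative only.

```python
import tarfile
import zipfile
from pathlib import Path

def extract_release(archive: str, dest: str = "llama.cpp-bin") -> None:
    """Extract a release archive, accepting both .zip and .tar.gz layouts."""
    Path(dest).mkdir(parents=True, exist_ok=True)
    if archive.endswith((".tar.gz", ".tgz")):
        with tarfile.open(archive, "r:gz") as tf:
            tf.extractall(dest)
    elif archive.endswith(".zip"):
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(dest)
    else:
        raise ValueError(f"unsupported archive format: {archive}")

# Illustrative usage (asset name is hypothetical):
# extract_release("llama-b7311-bin-ubuntu-x64.zip")
```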

sycl: add missing BF16 conversion support for Intel oneAPI (#17780)

  • sycl: add missing BF16 conversion support for Intel oneAPI

  • Fix Line 645: Trailing whitespace
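For context on the change above: bfloat16 (BF16) keeps the float32 sign and exponent bits and truncates the mantissa to 7 bits, so conversion reduces to 16-bit shifts on the raw bit pattern. The following is a minimal pure-Python illustration of that conversion, not the SYCL/oneAPI code added in the PR.

```python
import struct

def fp32_to_bf16(x: float) -> int:
    """Convert a float32 to its bfloat16 bit pattern (round-to-nearest-even).
    NaN handling is omitted for brevity."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    rounding = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding) >> 16) & 0xFFFF

def bf16_to_fp32(h: int) -> float:
    """Widen a bfloat16 bit pattern to float32 by shifting into the high half."""
    return struct.unpack("<f", struct.pack("<I", (h & 0xFFFF) << 16))[0]

# Values representable in BF16 round-trip exactly; extra mantissa detail is lost.
assert bf16_to_fp32(fp32_to_bf16(1.0)) == 1.0
```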

Prebuilt binaries: macOS/iOS, Linux, Windows.
