ggml-org/llama.cpp b7480


Warning

Release Format Update: Linux releases will soon use .tar.gz archives instead of .zip. Please make the necessary changes to your deployment scripts.
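
For deployment scripts that download and unpack these binaries, accepting both formats during the transition avoids a hard cutover. Below is a minimal Python sketch; the helper name and the example archive file name are placeholders for illustration, not actual release asset names.

```python
import tarfile
import zipfile
from pathlib import Path


def extract_release_archive(archive_path: str, dest_dir: str) -> None:
    """Extract a release archive, whether it is a .zip or a .tar.gz.

    Illustrative helper for deployment scripts during the format change;
    it only inspects the file suffix, nothing llama.cpp-specific.
    """
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    name = archive_path.lower()

    if name.endswith(".zip"):
        with zipfile.ZipFile(archive_path) as zf:
            zf.extractall(dest)
    elif name.endswith((".tar.gz", ".tgz")):
        with tarfile.open(archive_path, "r:gz") as tf:
            tf.extractall(dest)
    else:
        raise ValueError(f"Unsupported archive format: {archive_path}")


# Example call (hypothetical asset name):
# extract_release_archive("llama-b7480-bin-ubuntu-x64.zip", "./llama.cpp-bin")
```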

Details

presets: refactor, allow cascade presets from different sources, add global section (#18169)

  • presets: refactor, allow cascade presets from different sources (an illustrative sketch of the cascading behavior follows this commit list)

  • update docs

  • fix neg arg handling

  • fix empty mmproj

  • also filter out server-controlled args before to_ini()

  • skip loading custom_models if not specified

  • fix unset_reserved_args

  • fix crash on windows
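
To make the cascading idea concrete, here is a small Python sketch of merging INI presets from several sources with a global section. The file format, section names, keys, and merge order are assumptions for illustration only; they do not reflect llama.cpp's actual preset schema, and the PR's to_ini() and server-controlled argument filtering are only referenced by name.

```python
import configparser


def load_cascaded_presets(paths: list[str], preset: str) -> dict[str, str]:
    """Illustrative merge of INI presets from several sources.

    Later files override earlier ones, and a [global] section supplies
    defaults that a named preset section can override. Section and key
    names are made up for this sketch; they are not llama.cpp's schema.
    """
    cfg = configparser.ConfigParser()
    cfg.read(paths)  # later paths in the list take precedence

    merged: dict[str, str] = {}
    if cfg.has_section("global"):
        merged.update(cfg["global"])   # cascade step 1: global defaults
    if cfg.has_section(preset):
        merged.update(cfg[preset])     # cascade step 2: preset-specific values
    return merged


# Example with two hypothetical sources (system-wide, then user-level):
# opts = load_cascaded_presets(
#     ["/etc/llama/presets.ini", "/home/user/.config/llama/presets.ini"],
#     preset="my-model",
# )
```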

Downloads (prebuilt binaries attached to the release page): macOS/iOS, Linux, Windows, openEuler
