ggml-org/llama.cpp — release b8581


server: wrap headers for mcp proxy (#21072)

  • server: wrap headers for mcp proxy

  • Update tools/server/server-cors-proxy.h

Co-authored-by: Georgi Gerganov ggerganov@gmail.com

  • fix build

  • chore: update webui build output

  • chore: update webui build output


Co-authored-by: Georgi Gerganov ggerganov@gmail.com
Co-authored-by: Aleksander Grygier aleksander.grygier@gmail.com
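The PR title ("server: wrap headers for mcp proxy", touching `tools/server/server-cors-proxy.h`) suggests the server's CORS proxy now wraps response headers before forwarding them to an MCP client. As a rough illustration of that pattern only, here is a minimal sketch of wrapping a header map with standard CORS headers; the function name `wrap_cors_headers` and its signature are assumptions for this sketch, not the actual API from the PR:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch: take the upstream response headers and add the
// CORS headers a browser-based MCP client would need. Real header names
// (Access-Control-*) per the Fetch/CORS protocol; everything else here
// is illustrative, not llama.cpp's actual implementation.
static std::map<std::string, std::string> wrap_cors_headers(
        std::map<std::string, std::string> headers,
        const std::string & origin) {
    // echo the requesting origin when known, otherwise allow any
    headers["Access-Control-Allow-Origin"]  = origin.empty() ? "*" : origin;
    headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS";
    headers["Access-Control-Allow-Headers"] = "Content-Type, Authorization";
    return headers;
}
```

The upstream headers pass through unchanged; only the `Access-Control-*` entries are added, which is the general shape of a header-wrapping CORS proxy.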

Prebuilt binaries are attached for macOS/iOS, Linux, Windows, and openEuler (download links not preserved here).
