github ggml-org/llama.cpp b8196


impl : use 6 digits for tensor dims (#20094)

Many models have vocabulary sizes, and thus tensor shapes, with more
than 5 digits (e.g. Gemma 3's vocab size is 262,208).

An earlier change fixed this in one shape-formatting path but missed
llama_format_tensor_shape until now. Oops.
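A minimal sketch of the idea, not the actual llama.cpp implementation: the helper right-aligns each tensor dimension in a fixed-width printf field, so a 5-wide field (`%5` PRId64) misaligns any dimension of 100,000 or more, while a 6-wide field keeps the columns straight for vocab sizes like 262,208. The function name and signature below are illustrative.

```cpp
#include <cinttypes>
#include <cstdio>
#include <string>
#include <vector>

// Illustrative shape formatter: each dimension is right-aligned in a
// 6-character field ("%6" PRId64). With a 5-wide field, a 6-digit
// dimension such as 262208 would overflow its column and break the
// alignment of the printed tensor table.
static std::string format_tensor_shape(const std::vector<int64_t> & ne) {
    char buf[64];
    std::string s;
    for (size_t i = 0; i < ne.size(); ++i) {
        snprintf(buf, sizeof(buf), "%s%6" PRId64, i == 0 ? "" : ", ", ne[i]);
        s += buf;
    }
    return s;
}
```

For example, `format_tensor_shape({262208, 5376})` yields `"262208,   5376"`: the 6-digit dimension fills its field exactly, and the 4-digit one is padded with two leading spaces.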

Prebuilt binaries are provided for macOS/iOS, Linux, Windows, and openEuler.
