github ggml-org/llama.cpp b7640


ggml webgpu: add CEIL operation support (#18605)

  • ggml-webgpu: add CEIL operation support

    Add support for the CEIL unary operation in the WebGPU backend:
    - Add CEIL_FUNC shader template in unary_op.wgsl
    - Add 4 shader variants (f32 and f16, each with an inplace version)
    - Initialize CEIL pipelines in ggml-webgpu.cpp
    - Register CEIL in supports_op function
    
  • docs: update WebGPU ops support for CEIL
