github ggml-org/llama.cpp b7640


ggml webgpu: add CEIL operation support (#18605)

  • ggml-webgpu: add CEIL operation support

    Add support for the CEIL unary operation in the WebGPU backend:
    - Add CEIL_FUNC shader template in unary_op.wgsl
    - Add 4 shader variants (f32 and f16, each with a regular and an inplace version)
    - Initialize CEIL pipelines in ggml-webgpu.cpp
    - Register CEIL in supports_op function
    
  • docs: update WebGPU ops support for CEIL
