ggml-org/llama.cpp b7657

vulkan: Warptile tuning for Intel Xe2/Xe3 (#18178)

  • Modify warptile tuning for Xe3
  • Add Intel vendor check with coopmat support
  • Fix formatting
  • Fix formatting (2)
  • Move the Intel check to the chip-specific tuning section
  • Change to support both Windows and Linux
  • Use l_warptile instead of m_warptile for Intel (see the sketch after this list)
  • Modify warptile tuning for BF16 matmuls to fix a regression (m_warptile to l_warptile)
  • Code style changes
  • Code style changes (2)
  • Code style changes (3)
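
The commits above amount to selecting a larger matmul warptile when the device is an Intel GPU with cooperative-matrix support. The following is a minimal sketch of that selection logic, not the actual ggml-vulkan code: DeviceInfo, pick_matmul_warptile, and the tile dimensions are illustrative assumptions, while l_warptile/m_warptile are the names used in the commit messages and 0x8086 is Intel's PCI vendor ID as reported through VkPhysicalDeviceProperties::vendorID.

    // Illustrative sketch only: not the actual ggml-vulkan implementation.
    // l_warptile/m_warptile mirror the names from the commit messages above;
    // DeviceInfo, pick_matmul_warptile, and the tile values are assumptions.
    #include <array>
    #include <cstdint>
    #include <cstdio>

    // Intel's PCI vendor ID, as reported by VkPhysicalDeviceProperties::vendorID.
    constexpr uint32_t VENDOR_ID_INTEL = 0x8086;

    // A warptile describes the matmul shader's tile shape; only three
    // dimensions (BM, BN, BK) are modeled here for brevity.
    using Warptile = std::array<uint32_t, 3>;

    constexpr Warptile l_warptile = {128, 128, 16}; // large tile (illustrative values)
    constexpr Warptile m_warptile = { 64,  64, 16}; // medium tile (illustrative values)

    struct DeviceInfo {
        uint32_t vendor_id;       // VkPhysicalDeviceProperties::vendorID
        bool     coopmat_support; // cooperative matrix extension available and enabled
    };

    // Chip-specific tuning: Intel devices with cooperative-matrix support get
    // the large warptile; everything else keeps the medium default.
    Warptile pick_matmul_warptile(const DeviceInfo & dev) {
        if (dev.vendor_id == VENDOR_ID_INTEL && dev.coopmat_support) {
            return l_warptile;
        }
        return m_warptile;
    }

    int main() {
        const DeviceInfo xe3 = {VENDOR_ID_INTEL, true};
        const Warptile wt = pick_matmul_warptile(xe3);
        std::printf("selected warptile: %u x %u x %u\n",
                    (unsigned) wt[0], (unsigned) wt[1], (unsigned) wt[2]);
        return 0;
    }

Running the sketch prints the large tile for the Intel-with-coopmat case; any other device falls through to the medium default.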

Prebuilt binaries: macOS/iOS, Linux, Windows, openEuler
