github rjmalagon/ollama-linux-amd-apu v0.18.0


What's changed in this repo:

The ROCm v6 runtime has been discontinued.
For both older and newer AMD APUs, the official runtimes are now Vulkan and ROCm v7. Any GPU with Vulkan support can run the container image from this repository, so runtime selection switches are no longer needed.
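With runtime selection switches gone, running the image should come down to passing the GPU devices through to the container. A minimal sketch, assuming a typical Ollama-style container setup; the image tag, volume name, and device paths are assumptions, not taken from this release:

```shell
# Hypothetical invocation; image name/tag and mounts are assumptions.
# /dev/kfd and /dev/dri expose the AMD GPU for ROCm; Vulkan typically
# only needs /dev/dri.
docker run -d \
  --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  rjmalagon/ollama-linux-amd-apu:v0.18.0
```

The server then listens on port 11434 as with stock Ollama, and the runtime (Vulkan or ROCm v7) is picked automatically rather than via a switch.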

  • Fix typo in the environment variable by @stau in #35

New Contributors

  • @stau made their first contribution in #35

What's changed from upstream Ollama and its contributors

Can be read here: https://github.com/ollama/ollama/releases/tag/v0.18.0

Relevant TL;DR:

  • Fixed issue where GLM-OCR would not work due to incorrect prompt rendering
  • Fixed tool calling parsing and rendering for Qwen 3.5 models
  • build: smarter docker parallelism by @dhiltgen in ollama#14653
  • parsers: repair unclosed arg_value tags in GLM tool calls by @BruceMacD in ollama#14656
  • docs: format compat docs by @mxyng in ollama#14678
  • create: fix localhost handling by @dhiltgen in ollama#14681
  • Improved ordering of models when running ollama

Full Changelog: v0.15.6...v0.18.0
