github rjmalagon/ollama-linux-amd-apu v0.12.5


What's Changed (this repo branch)

  • Sync to v0.12.5
  • AMD GTT patches discontinued: upstream Ollama now supports AMD APUs from RDNA1-class through current generations, so Vega-class APU support is discontinued.
  • New main branch of the repo: we now host container images with additional AMD ROCm optimizations on top of current upstream Ollama, plus a container image pairing the most recent Ollama with compatibility for the old GTT patches.

What's Changed (from Ollama)

  • Ollama's app now waits until the Ollama server is running before a conversation can be started
  • Fixed issue where "think": false would show an error instead of being silently ignored
  • Fixed deepseek-r1 output issues
  • macOS 12 Monterey and macOS 13 Ventura are no longer supported
  • AMD gfx900 and gfx906 (MI50, MI60, etc) GPUs are no longer supported via ROCm. We're working to support these GPUs via Vulkan in a future release.
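The `"think": false` fix above can be illustrated with a minimal sketch of a request body for Ollama's `/api/generate` endpoint, assuming a local server on the default port 11434; the model name is just an example:

```python
import json
import urllib.request

# Request body for Ollama's /api/generate endpoint with thinking disabled.
# As of v0.12.5, "think": false on a model without thinking support is
# silently ignored rather than returning an error.
payload = {
    "model": "deepseek-r1",        # example model name; thinking models honor "think"
    "prompt": "Why is the sky blue?",
    "think": False,                # disable the thinking trace
    "stream": False,               # return a single JSON response
}

def generate(body: dict, host: str = "http://localhost:11434") -> dict:
    """POST the body to a running Ollama server and return the parsed reply."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Actually sending requires a running Ollama server; here we only show the
# serialized body that would be POSTed.
print(json.dumps(payload))
```

Calling `generate(payload)` against a running server returns the response JSON; with `"think": false` the model's thinking output is simply omitted instead of the request failing.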

@shengxinjing made their first contribution in ollama#12415

Full Changelog: v0.12.4-rc6...v0.12.5
