rjmalagon/ollama-linux-amd-apu v0.6.7

What's Changed (this repo branch)

Sync to Ollama main v0.6.7

What's Changed (from Ollama)

New models

  • Qwen 3: Qwen3 is the latest generation of large language models in the Qwen series, offering a comprehensive suite of dense and mixture-of-experts (MoE) models (see the sketch after this list).
  • Phi 4 reasoning and Phi-4-mini-reasoning: new state-of-the-art reasoning models from Microsoft
  • Llama 4: state-of-the-art multimodal models from Meta
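
To make the list above concrete, here is a minimal sketch of calling one of the new models through Ollama's local HTTP API from Python with the requests library. The model tag qwen3 and a server on the default port 11434 are assumptions; substitute whatever tag you have actually pulled.

    import requests

    # Minimal sketch: ask a locally running Ollama server (default port 11434)
    # for a single, non-streamed completion from one of the newly added models.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "qwen3",   # assumed model tag for Qwen 3; adjust to your local tag
            "prompt": "In one sentence, what is a mixture-of-experts model?",
            "stream": False,    # return one JSON object instead of a token stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["response"])

The same request shape works for the Phi 4 reasoning and Llama 4 tags once they are pulled locally.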

What's Changed

  • Added support for Meta's Llama 4 multimodal models
  • Added support for Microsoft's Phi 4 reasoning and Phi 4 mini reasoning models
  • Increased the default context window to 4096 tokens (it can still be overridden per request; see the sketch after this list)
  • Fixed an issue where image paths beginning with ~ were not recognized when passed to ollama run
  • Improved output quality when using JSON mode in certain scenarios (also shown in the sketch after this list)
  • Fixed tensor->op == GGML_OP_UNARY errors caused by conflicting inference libraries when running a model
  • Fixed an issue where a model could get stuck in the Stopping... state
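
Two of the items above are easiest to see in an actual request: the larger 4096-token default context window can still be overridden per call with the num_ctx option, and JSON mode is requested with the format field. A minimal sketch, again assuming a local server on the default port and the qwen3 tag:

    import json
    import requests

    # Minimal sketch: override the (now 4096-token) default context window for one
    # request and ask for JSON-mode output via the "options" and "format" fields
    # of the /api/generate endpoint.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "qwen3",                # assumed model tag
            "prompt": "List two new models in this release as a JSON array of strings.",
            "options": {"num_ctx": 8192},    # request a larger context window for this call
            "format": "json",                # constrain the output to valid JSON
            "stream": False,
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(json.loads(resp.json()["response"]))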

New Contributors

Full Changelog: v0.6.5...v0.6.7
