rjmalagon/ollama-linux-amd-apu v0.15.6

What's Changed (this repo branch)

  • Sync to v0.15.6

What's Changed (from Ollama)

New models

  • Qwen3-Coder-Next: a coding-focused language model from Alibaba's Qwen team, optimized for agentic coding workflows and local development.
  • GLM-OCR: a multimodal OCR model for complex document understanding, built on the GLM-V encoder–decoder architecture.

What's Changed

  • New ollama launch clawdbot command for launching Clawdbot using Ollama models
  • Renamed ollama launch clawdbot to ollama launch openclaw to reflect the project's new name
  • Improved tool calling for Ministral models
  • docs: add clawdbot by @ParthSareen in ollama#13925
  • cmd/config: Use envconfig.Host() for base API in launch config packages by @gabe-l-hart in ollama#13937
  • ollama launch will now use the value of OLLAMA_HOST when it runs (a host-resolution sketch follows this list)
  • ollama launch openclaw will now enter the standard OpenClaw onboarding flow if onboarding has not yet been completed
  • Sub-agent support for ollama launch for planning, deep research, and similar tasks
  • ollama signin will now open a browser window to make signing in easier
  • Ollama will now default to the following context lengths based on available VRAM (an override sketch follows this list):
    • < 24 GiB VRAM: 4,096-token context
    • 24-48 GiB VRAM: 32,768-token context
    • >= 48 GiB VRAM: 262,144-token context
  • GLM-4.7-Flash support on Ollama's experimental MLX engine
  • ollama signin will now open the browser to the connect page
  • Fixed an off-by-one error when using num_predict in the API
  • Fixed an issue where tokens from a previous sequence would be returned when hitting num_predict (a num_predict sketch follows this list)
  • Fixed context limits when running ollama launch droid
  • ollama launch will now download missing models instead of erroring
  • Fixed bug where ollama launch claude would cause context compaction when providing images
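
For the OLLAMA_HOST change above, a minimal Python sketch of the same host-resolution behaviour: honor OLLAMA_HOST when it is set, otherwise fall back to the usual local default. The helper name is made up for illustration; only the environment variable and the default address come from Ollama.

    # Sketch: resolve the Ollama base URL the way the CLI now does for
    # `ollama launch` -- honor OLLAMA_HOST if set, else the default local server.
    # The function name is illustrative, not part of any Ollama API.
    import os

    def resolve_ollama_host() -> str:
        # OLLAMA_HOST may be a bare host:port or a full URL; assume it is usable as-is.
        host = os.environ.get("OLLAMA_HOST", "").strip()
        if not host:
            return "http://127.0.0.1:11434"
        if not host.startswith(("http://", "https://")):
            host = "http://" + host
        return host

    print(resolve_ollama_host())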
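The VRAM-based defaults listed above apply only when no context length is set explicitly; a request can still pin its own window through the num_ctx option of the REST API. A rough sketch, assuming a placeholder model name and the default server address:

    # Override the VRAM-based default by setting num_ctx explicitly on a request.
    # Model name and prompt are placeholders; server assumed at the default address.
    import json
    import urllib.request

    payload = {
        "model": "llama3.2",           # placeholder model
        "prompt": "Summarize the release notes.",
        "stream": False,
        "options": {"num_ctx": 8192},  # explicit context window, overriding the default
    }
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])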
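The num_predict fixes can be sanity-checked by capping a generation and comparing the cap with eval_count in the final response. Another sketch under the same assumptions (placeholder model, default server address):

    # Cap generation with num_predict and check the reported token count.
    import json
    import urllib.request

    payload = {
        "model": "llama3.2",               # placeholder model
        "prompt": "Count slowly to one hundred.",
        "stream": False,
        "options": {"num_predict": 16},    # hard cap on generated tokens
    }
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    # eval_count reports how many tokens were generated; it should not exceed num_predict.
    print(body["eval_count"], "tokens generated")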

New Contributors

Full Changelog: v0.15.0...v0.15.6
