rjmalagon/ollama-linux-amd-apu v0.11.7


What's Changed (this repo branch)

  • Sync to Ollama main v0.11.7

What's Changed (from Ollama)

DeepSeek-V3.1

DeepSeek-V3.1 is now available to run via Ollama.
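
To try it locally, the model can be pulled and started with the standard Ollama CLI (the deepseek-v3.1 tag below is the same one used in the API example further down):

ollama pull deepseek-v3.1
ollama run deepseek-v3.1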

This model supports hybrid thinking, meaning reasoning can be enabled or disabled by setting the think parameter in Ollama's API:

curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-v3.1",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ],
  "think": true
}'

In Ollama's CLI, thinking can be enabled or disabled by running the /set think or /set nothink commands.
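
As a minimal sketch of an interactive session (CLI output omitted; >>> is the Ollama prompt), toggling thinking off and on before re-asking the same question looks like this:

ollama run deepseek-v3.1
>>> /set think
>>> why is the sky blue?
>>> /set nothink
>>> why is the sky blue?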

What's Changed

  • Fixed issue where multiple models would not be loaded on CPU-only systems
  • Ollama will now work with models that skip outputting the initial <think> tag (e.g. DeepSeek-V3.1)
  • Fixed issue where text would be emitted when there is no opening <think> tag from a model (see the illustrative request/response after this list)
  • Fixed issue where tool calls containing { or } would not be parsed correctly
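
These <think>-tag fixes keep the reasoning separated from the final answer for models like DeepSeek-V3.1 that may omit the opening tag. As an illustrative sketch (the response below is abridged and its field values are made up, not real output), a non-streaming request with thinking enabled returns the reasoning in a separate thinking field rather than mixed into content:

curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-v3.1",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ],
  "think": true,
  "stream": false
}'

{
  "model": "deepseek-v3.1",
  "message": {
    "role": "assistant",
    "thinking": "The question is about why the daytime sky looks blue...",
    "content": "The sky looks blue because shorter (blue) wavelengths of sunlight are scattered more strongly by air molecules."
  },
  "done": true
}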

New Contributors

  • @zoupingshi

Full Changelog: v0.11.6...v0.11.7
