What's Changed (this repo branch)
- Sync to v0.14.1
- Adopt upstream AMD GTT patches
What's Changed (from Ollama)
- Running `ollama run --experimental` now opens a new Ollama CLI that includes an agent loop and the bash tool
- Anthropic API compatibility: support for the /v1/messages API (see the example request after this list)
- A new REQUIRES command for the Modelfile allows declaring which version of Ollama the model requires (see the Modelfile sketch after this list)
- Ollama now avoids an integer underflow during memory estimation for older models on low-VRAM systems
- More accurate VRAM measurements for AMD iGPUs
- Ollama's app will now highlight Swift source code
- An error is now returned when embeddings contain NaN or -Inf values
- Ollama's Linux install bundles now use zstd compression
- New experimental support for image generation models, powered by MLX (macOS only)
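
The sketch below shows one way to exercise the Anthropic-compatible /v1/messages endpoint from Python. It assumes Ollama is listening on its default address (http://localhost:11434) and that the model name is illustrative; substitute any model you have pulled locally.

```python
# Minimal sketch of an Anthropic-style Messages request against a local Ollama server.
# Assumes the default Ollama address (http://localhost:11434); the model name is illustrative.
import requests

resp = requests.post(
    "http://localhost:11434/v1/messages",
    json={
        "model": "llama3.2",   # any locally pulled model
        "max_tokens": 256,     # required by the Anthropic Messages schema
        "messages": [
            {"role": "user", "content": "Say hello in one sentence."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
body = resp.json()
# Anthropic-style responses carry a list of content blocks; print the text ones.
print("".join(block.get("text", "") for block in body.get("content", [])))
```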
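
For the new REQUIRES command, the release notes do not show its exact syntax, so the Modelfile below is a hypothetical sketch: the version argument format is an assumption, and the FROM/PARAMETER/SYSTEM lines are ordinary Modelfile commands included only for context.

```
# Hypothetical Modelfile illustrating the new REQUIRES command.
# The version argument format is an assumption, not confirmed by these notes.
FROM llama3.2
REQUIRES 0.14.1

PARAMETER temperature 0.7
SYSTEM You are a concise assistant.
```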
New Contributors
- @Vallabh-1504 made their first contribution in ollama#13550
- @majiayu000 made their first contribution in ollama#13596
- @harrykiselev made their first contribution in ollama#13615
- @joshxfi made their first contribution in ollama#13711
- @maternion made their first contribution in ollama#13709
Full Changelog: v0.13.5...v0.14.1