v0.1.0

Ollama for Linux

Ollama for Linux is now available, with GPU acceleration enabled out-of-the-box for Nvidia GPUs.

💯 Ollama will run on cloud servers with multiple GPUs attached
🤖 Ollama will run on WSL 2 with GPU support
😍 Ollama loads as many model layers onto the GPU as possible to maximize performance without crashing
🤩 Ollama supports everything from CPU-only machines and small hobby gaming GPUs up to powerful workstation graphics cards like the H100
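As a quick sanity check (not part of the release notes, just a common workflow), you can run a model and watch GPU utilization from a second terminal with nvidia-smi; llama2 is used here only as an example model name:

ollama run llama2 "Why is the sky blue?"
nvidia-smi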

Download

curl https://ollama.ai/install.sh | sh

Manual install steps are also available.
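A minimal sketch of the manual route, assuming the standalone Linux binary is published at https://ollama.ai/download/ollama-linux-amd64 as the install docs described at the time, looks like this:

sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
ollama serve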

Changelog

  • Ollama now automatically offloads as much of the running model as your GPU supports, for maximum performance without any crashes (a way to check the offload is sketched after this list)
  • Fixed an issue where characters would be erased when running ollama run
  • Added a new community project by @TwanLuttik in #574
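To see how much of a model was offloaded, one option (assuming the install script set up a systemd service named ollama, as it does on systemd-based distros) is to watch the server logs while a model loads:

journalctl -u ollama -f

As the model starts, you should see log lines from the llama.cpp backend reporting how many layers were offloaded to the GPU.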

New Contributors

  • @TwanLuttik made their first contribution in #574

Full Changelog: v0.0.21...v0.1.0
