What's Changed
- Fix crash when deserializing chats with saved context from v2.5.x and earlier (#1859)
- New light mode and dark mode UI themes (#1876)
- Update to the latest llama.cpp, following the merge of Nomic's Vulkan PR (#1819, #1883)
- Support offloading only some of the model's layers to the GPU when VRAM is limited (#1890); see the sketch after this list
- Support Maxwell and Pascal NVIDIA GPUs (#1895)
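
The partial offload feature (#1890) follows the llama.cpp model of a per-model GPU layer count: rather than all-or-nothing GPU acceleration, you offload as many layers as fit in VRAM and run the remainder on the CPU. Below is a minimal sketch of the idea using the llama-cpp-python bindings, which wrap the same llama.cpp runtime GPT4All builds on; the model file and layer count are placeholder assumptions, not GPT4All's own API:

```python
# Sketch: partial GPU offload via llama-cpp-python (wraps the same llama.cpp
# runtime GPT4All uses). Model path and layer count are hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_0.gguf",  # hypothetical local GGUF file
    n_gpu_layers=20,  # offload only 20 layers to the GPU; the remaining layers
                      # run on the CPU, trading speed for lower VRAM use
)

out = llm("Q: Why offload only some layers? A:", max_tokens=48)
print(out["choices"][0]["text"])
```

A simple way to tune this is to raise the layer count until loading fails with an out-of-VRAM error, then back off.
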
Fixes
- Don't show the "retrieving localdocs" status when there are no collections (#1874)
- Fix potential crash when a model fails to load due to insufficient VRAM (6db5307, Issue #1870)
- Fix VRAM leak when switching models (Issue #1840)
- Support Nomic Embed as the LocalDocs embedding model via Atlas (d14b95f); see the sketch below
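
The LocalDocs change (d14b95f) lets document embeddings be generated by Nomic Embed through the Atlas API. Here is a minimal sketch of an equivalent embedding call using the official `nomic` Python client; the sample text and task type are illustrative assumptions, and an Atlas API key (via `nomic login`) is required:

```python
# Sketch: embedding text with Nomic Embed through the Atlas API, using the
# `nomic` Python client. Assumes you have already authenticated (`nomic login`).
from nomic import embed

result = embed.text(
    texts=["GPT4All LocalDocs indexes your files for retrieval."],
    model="nomic-embed-text-v1",
    task_type="search_document",  # use "search_query" when embedding queries
)

vectors = result["embeddings"]
print(len(vectors), "embedding(s), dimension", len(vectors[0]))
```
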
New Contributors
- @realKarthikNair made their first contribution in #1871
Full Changelog: v2.6.1...v2.6.2