> **Important**
> To use local models such as Gemma, Mistral, and Llama through NextChat, refer to this document for setup: https://docs.nextchat.dev/models/ollama
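As a rough sketch of what the linked document covers (the model name below is an illustrative assumption, not taken from this release), a local Ollama backend is typically brought up like this before pointing NextChat's custom endpoint at it:

```shell
# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve &

# Pull a local model to serve (model name is an example, not prescribed here)
ollama pull gemma
```

NextChat can then be configured to use the local endpoint; see the linked document for the exact setting names, which may differ between versions.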
## What's Changed
- [Cherry Pick] Fix [UI/UX] [Front End] Settings Page by @H0llyW00dzZ in #4032
- chore: fix typo in next.config.mjs by @eltociear in #4072
- feat: Add vision support by @TheRamU in #4076
- Fix temperature range by @WqyJh in #4083
- [Cherry Pick] Improve Github Issue Template by @H0llyW00dzZ in #4041
- chore: adjust for ollama support by @fred-bf in #4129
- feat: bump version by @fred-bf in #4133
## New Contributors
**Full Changelog**: v2.10.3...v2.11.2