## Headline
Add support for Vision Language Model (VLM) GGUFs in Lemonade Server's llamacpp backend (@danielholanda).
- Added `Gemma-3-4b-it-GGUF` and `Qwen2.5-VL-7B-Instruct` VLMs to the suggested models list.
- Try it by installing one of those VLMs, selecting the VLM in Open WebUI, and sending an image with your message.
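
The same workflow can be scripted against Lemonade Server's OpenAI-compatible API instead of Open WebUI. The sketch below is a minimal illustration only: the base URL, API key placeholder, image path, and prompt are assumptions that may differ on your install, and the model name should match whichever VLM you installed.

```python
import base64
from openai import OpenAI

# Assumption: Lemonade Server is running locally and serving an
# OpenAI-compatible endpoint at this base URL; adjust for your install.
client = OpenAI(base_url="http://localhost:8000/api/v1", api_key="lemonade")

# Encode a local image (hypothetical path) as a data URL for the request.
with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="Qwen2.5-VL-7B-Instruct",  # one of the newly suggested VLMs
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```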
## Additional Changes
- Add `Llama-xLAM-2-8b-fc-r-Hybrid`, the best OGA+Hybrid model for tool calling, to the suggested models list (@danielholanda) (see the tool-calling sketch after this list)
- Replace the old installation guide markdown files with a new webapp (@jeremyfowers)
- Fix the `pkg_resources` deprecation notice (@danielholanda)
- Enable randomness in repeated prompts in the `llm-prompt` tool (@amd-pworfolk)
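
As a companion to the `Llama-xLAM-2-8b-fc-r-Hybrid` item above, here is a minimal sketch of a tool-calling request against Lemonade Server's OpenAI-compatible endpoint. The base URL, API key placeholder, and the `get_weather` tool schema are illustrative assumptions, not part of the release.

```python
from openai import OpenAI

# Assumption: same local OpenAI-compatible endpoint as in the VLM example.
client = OpenAI(base_url="http://localhost:8000/api/v1", api_key="lemonade")

# Hypothetical tool definition used only to demonstrate the request shape.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="Llama-xLAM-2-8b-fc-r-Hybrid",
    messages=[{"role": "user", "content": "What's the weather in Austin?"}],
    tools=tools,
)

# A tool-calling model is expected to return a tool call rather than plain text.
print(response.choices[0].message.tool_calls)
```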
**Full Changelog**: v7.0.2...v7.0.3