[v4.82.1]
- #2021 02adf7c Thanks @chrarnoldus! - OpenRouter inference providers whose context window is smaller than that of the top provider for a particular model are now automatically ignored by default. They can still be used by selecting them specifically in the Provider Routing settings.
- #2015 e5c7641 Thanks @mcowger! - Add API key support to the Ollama provider, enabling usage of Ollama Turbo.
- #2029 64c6955 Thanks @kevinvandijk! - Add search to the provider list and sort it alphabetically.