Fix temperature error for reasoning models + OpenRouter free models
Temperature fix (Sadi's report):
Models like gpt-5-mini, o1, o3, and o4 reject custom temperature values; only the default of 1.0 is supported. The driver now detects reasoning models and omits the temperature parameter entirely, with a runtime fallback that strips temperature and retries when the API returns a 400 error. 11 new tests cover both paths.
OpenRouter free models (Marsel's report):
Added six free model variants to the catalog: gemma-2-9b-it:free, llama-3.1-8b-instruct:free, qwen-2.5-7b-instruct:free, mistral-7b-instruct:free, zephyr-7b-beta:free, and deepseek-r1:free. These now appear in the dashboard model selector.
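A sketch of the catalog addition, assuming a simple list-of-IDs shape (`OPENROUTER_FREE_MODELS` is a hypothetical name; the real catalog schema may carry more metadata per entry). On OpenRouter, the `:free` suffix selects the zero-cost variant of a model:

```python
# Hypothetical catalog fragment; the real entry shape depends on the
# catalog schema, so this is an illustrative sketch only.
OPENROUTER_FREE_MODELS = [
    "gemma-2-9b-it:free",
    "llama-3.1-8b-instruct:free",
    "qwen-2.5-7b-instruct:free",
    "mistral-7b-instruct:free",
    "zephyr-7b-beta:free",
    "deepseek-r1:free",
]


def free_models() -> list[str]:
    """Return free-tier model IDs for the dashboard model selector."""
    return [m for m in OPENROUTER_FREE_MODELS if m.endswith(":free")]
```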
Live tested: daemon startup, real Groq LLM calls, tool invocation, budget tracking, and OpenRouter models exposed via the API, all working.