v0.5.0 — Use Any LLM Provider
New Provider Connections
- Codex · ChatGPT Plus — Sign in with your ChatGPT Plus or Pro subscription to use OpenAI's Codex models directly (#2, #44)
- GitHub Copilot — Connect your GitHub Copilot subscription with one-click device code authentication (#49)
- Google AI Studio — Use Gemini models with your Google AI Studio API key. First-class support with native Google Search grounding built in. (#240)
- OpenRouter, Ollama & custom endpoints — Bring your own API key and base URL for any OpenAI-compatible provider (#14, #27)
- Searchable model picker — When a provider has many models, choose from three tiers (Best, Balanced, Fast) via searchable dropdowns with smart defaults
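Every OpenAI-compatible provider, including a local Ollama server, accepts the same chat-completions payload, which is what makes the bring-your-own-endpoint connection above possible. A minimal sketch of that request shape (the helper and placeholder key are illustrative, not this app's configuration format):

```python
import json

# Hypothetical helper: every OpenAI-compatible provider accepts the same
# /chat/completions payload, so switching providers only changes base_url.
def chat_request(base_url: str, model: str, messages: list) -> dict:
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            "Authorization": "Bearer <your-api-key>",  # provider API key
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

# Ollama exposes an OpenAI-compatible endpoint at /v1 by default.
req = chat_request("http://localhost:11434/v1", "llama3",
                   [{"role": "user", "content": "hello"}])
print(req["url"])  # http://localhost:11434/v1/chat/completions
```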
Improvements
- Provider icons — Connection settings now show the upstream provider's logo and name so you can tell your connections apart at a glance
- MCP sources work with all providers — Your connected MCP sources are now proxied to non-Anthropic backends, so tools like Linear, GitHub, and Slack work regardless of which LLM you're using
- Session list redesign — Reusable entity-list primitives for a cleaner, more consistent UI
- Skill metadata — `requiredSources` and `icon` are now displayed in the skill info page and accepted by validation (#290, #249)
- Copilot model filtering — Only models enabled by your organization's policy are shown (92200095)
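MCP tools advertise their inputs as JSON Schema, and OpenAI-style backends expect that same schema in their function-calling format, which is what makes proxying MCP sources to non-Anthropic backends tractable. A sketch of the core translation step (hypothetical helper, not the project's actual proxy code):

```python
# Hypothetical translation step: an MCP tool carries a JSON Schema in
# "inputSchema"; OpenAI-style backends expect that same schema under
# "function.parameters", so proxying is mostly a key-for-key mapping.
def mcp_tool_to_openai(tool: dict) -> dict:
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get("inputSchema", {"type": "object"}),
        },
    }

linear_tool = {
    "name": "create_issue",
    "description": "Create a Linear issue",
    "inputSchema": {"type": "object",
                    "properties": {"title": {"type": "string"}}},
}
print(mcp_tool_to_openai(linear_tool)["function"]["name"])  # create_issue
```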
Bug Fixes
- Status icon colors — Restored colors in the right-click context menu (6fa9b85b)
- Windows paths — Fixed backslash handling in workspace slug extraction (3d22e75f)
- Model tier preservation — Your selected models are no longer silently overwritten by provider defaults (1ddfe96f)
- Windows build — Fixed native module resolution (koffi), Vite OOM, and interceptor file packaging for reliable Windows builds (0ae044a2, ff791b03, 258e3e39)
- Windows runtime — Upgraded vendored Bun 1.3.5 → 1.3.9 to fix a Windows crash (c4f46abc)
- Pi SDK 0.55.0 — Bumped Pi SDK to fix streaming freezes and malformed OpenAI partial chunk JSON (718d4ad6)
- Install scripts — Fixed Windows and macOS/Linux install scripts to work without the version endpoint; scripts now fetch YAML manifests directly (#273, #276)
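The backslash fix above comes down to splitting paths on the right separator. A minimal sketch of separator-tolerant slug extraction (hypothetical helper, not the project's actual code):

```python
# Hypothetical helper: normalize Windows backslashes before splitting so
# "C:\work\my-repo" and "/home/u/my-repo" both yield the slug "my-repo".
def workspace_slug(path: str) -> str:
    return path.replace("\\", "/").rstrip("/").split("/")[-1]

print(workspace_slug("C:\\work\\my-repo"))   # my-repo
print(workspace_slug("/home/u/my-repo/"))    # my-repo
```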
Security
- Web-fetch hardening — Protection against SSRF, unbounded reads, and other edge cases (3904a786)
- Explore mode — Blocked `find -exec` and similar shell-escape vectors (#272)
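The actual web-fetch hardening (3904a786) isn't shown here; the core of a typical SSRF guard is refusing to fetch URLs that resolve to internal addresses. A minimal sketch using only the standard library (hypothetical helper, not this project's API):

```python
import ipaddress
import socket
from urllib.parse import urlparse

# Hypothetical guard: reject non-HTTP schemes and hosts that resolve to
# private, loopback, or link-local addresses (classic SSRF targets).
def is_safe_fetch_url(url: str) -> bool:
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return False
    return True

print(is_safe_fetch_url("file:///etc/passwd"))                # False
print(is_safe_fetch_url("http://127.0.0.1/admin"))            # False
print(is_safe_fetch_url("http://169.254.169.254/meta-data"))  # False
```

Real hardening also has to cap response sizes and re-check after redirects, since a public URL can redirect to an internal one.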
Infrastructure
- Cloudflare migration — Migrated R2 bucket and Workers to new Cloudflare account (55bb899e)
- CI — Windows builds now use the `windows-latest` runner (c4ac80a6)
Breaking Changes
- Codex/Copilot backends consolidated — The standalone Codex and Copilot backends have been replaced by a unified Pi SDK backend. Existing Codex and Copilot connections must be reconfigured through the new onboarding flow.
- Removed Claude OAuth via Pi — The option to use a Claude subscription through the Pi backend was removed for Anthropic ToS compliance. Use the direct Claude Pro/Max connection instead.