### ✨ Minor Changes
- 5bafaad: feat(providers): support Anthropic-compatible custom providers

  Custom providers can now speak the Anthropic Messages API (`/v1/messages`) in addition to OpenAI's `/v1/chat/completions`. When adding a custom provider, pick the API format in the new segmented control on the form; Manifest's existing Anthropic adapter handles the translation, so agents continue to call the OpenAI-compatible proxy unchanged. This is useful for Azure's Anthropic endpoint and any other gateway that exposes the native Anthropic protocol.
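Conceptually, the adapter reshapes an OpenAI-style chat request into an Anthropic Messages call: the system prompt moves out of the messages array into a top-level field, and `max_tokens` becomes required. A minimal sketch of that translation (the function name, field coverage, and default token limit are illustrative, not Manifest's actual adapter code):

```typescript
// Hypothetical sketch of the OpenAI -> Anthropic Messages translation.
// Only a small subset of fields is handled here for illustration.
interface OpenAIChatRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  max_tokens?: number;
}

interface AnthropicMessagesRequest {
  model: string;
  system?: string;
  messages: { role: "user" | "assistant"; content: string }[];
  max_tokens: number; // required by the Anthropic Messages API
}

function toAnthropicMessages(req: OpenAIChatRequest): AnthropicMessagesRequest {
  // Anthropic keeps the system prompt outside the messages array.
  const system = req.messages.find((m) => m.role === "system")?.content;
  const messages = req.messages
    .filter((m) => m.role !== "system")
    .map((m) => ({ role: m.role as "user" | "assistant", content: m.content }));
  return {
    model: req.model,
    ...(system !== undefined ? { system } : {}),
    messages,
    max_tokens: req.max_tokens ?? 1024, // illustrative default
  };
}
```

Because the proxy still exposes the OpenAI shape, agents need no changes; only the outbound request to the provider is rewritten.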
### 🐛 Patch Changes
- d6afa94: Add a llama.cpp provider tile to the API Keys tab in the self-hosted version. Clicking it probes `http://localhost:8080/v1/models` on the default llama-server port, lists every model the server exposes, and lets you connect them in one click. Pre-b3800 llama.cpp builds that don't expose `/v1/models` get a hint to upgrade or fall back to the custom-provider form. Messages and dashboard filters render llama.cpp and LM Studio as first-class providers instead of opaque `custom:<uuid>` rows.
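The probe boils down to hitting the OpenAI-compatible listing endpoint and treating a missing route as the "old build" signal. A rough sketch, assuming the standard OpenAI-style `{ data: [{ id: ... }] }` response shape (the function names and fallback behavior are illustrative, not the shipped code):

```typescript
// Hypothetical sketch of the llama-server probe. Extract model IDs from
// an OpenAI-style /v1/models response body; tolerate missing/odd fields.
function parseModelIds(body: unknown): string[] {
  const data = (body as { data?: { id?: unknown }[] })?.data;
  if (!Array.isArray(data)) return [];
  return data
    .map((m) => m.id)
    .filter((id): id is string => typeof id === "string");
}

// Returns the model list, or null when the endpoint is missing (e.g. a
// pre-b3800 llama.cpp build) or the server is unreachable, so the UI can
// suggest upgrading or the custom-provider form instead.
async function listLlamaCppModels(
  baseUrl = "http://localhost:8080",
): Promise<string[] | null> {
  try {
    const res = await fetch(`${baseUrl}/v1/models`);
    if (!res.ok) return null; // 404: endpoint not exposed by this build
    return parseModelIds(await res.json());
  } catch {
    return null; // connection refused: no server on that port
  }
}
```

Keeping the parse step pure makes the "list every model" and "hint to upgrade" branches easy to test without a running server.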