External Embedding API Support
This release adds support for using external OpenAI-compatible embedding APIs instead of local models!
Key Features
- OpenAI-Compatible API Integration: Connect to any OpenAI-compatible `/v1/embeddings` endpoint, including vLLM, Ollama, Text Embeddings Inference (TEI), OpenAI, and more
- Simple Configuration: Set up via environment variables
- Graceful Fallback: Automatically falls back to local models if the external API is unavailable
- Automatic Dimension Detection: Detects embedding dimensions from API responses
- Backend Validation: Ensures the configuration is compatible with the selected storage backend
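The dimension-detection and fallback behaviors above can be sketched as follows. This is an illustrative outline, not the service's actual internals; the function names (`parse_embedding_response`, `embed_with_fallback`) are hypothetical:

```python
def parse_embedding_response(response: dict) -> tuple[list[list[float]], int]:
    """Extract vectors from an OpenAI-style /v1/embeddings response and
    infer the embedding dimension from the first vector's length."""
    vectors = [item["embedding"] for item in response["data"]]
    if not vectors:
        raise ValueError("response contained no embeddings")
    return vectors, len(vectors[0])

def embed_with_fallback(texts, external_embed, local_embed):
    """Try the external embedding API first; on any failure,
    fall back to the local model."""
    try:
        return external_embed(texts)
    except Exception:
        return local_embed(texts)
```

Because the dimension is read from the response itself, the storage backend can be sized without the model's dimension being configured up front.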
Configuration
```bash
export MCP_EXTERNAL_EMBEDDING_URL=http://localhost:8890/v1/embeddings
export MCP_EXTERNAL_EMBEDDING_MODEL=nomic-embed-text
export MCP_EXTERNAL_EMBEDDING_API_KEY=sk-xxx  # Optional
```

Important
Only supported with sqlite_vec backend (not compatible with hybrid or cloudflare backends).
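As a rough sketch of how these variables translate into a request, an OpenAI-compatible embeddings call assembled from them looks like this. The helper name `build_embedding_request` is illustrative and not part of the service's API:

```python
import json

def build_embedding_request(texts: list[str], env: dict) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for a POST to an
    OpenAI-compatible /v1/embeddings endpoint from MCP_* variables."""
    url = env["MCP_EXTERNAL_EMBEDDING_URL"]
    headers = {"Content-Type": "application/json"}
    api_key = env.get("MCP_EXTERNAL_EMBEDDING_API_KEY")
    if api_key:  # the API key is optional; add a bearer token only if set
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({
        "model": env["MCP_EXTERNAL_EMBEDDING_MODEL"],
        "input": texts,
    }).encode("utf-8")
    return url, headers, body
```

The same request shape works against vLLM, Ollama, TEI, and OpenAI, which is what makes a single configuration surface sufficient.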
Documentation
See `docs/deployment/external-embeddings.md` for the complete setup guide, with examples for vLLM, Ollama, TEI, and OpenAI.
Testing
- 10/10 core tests passing
- Backend validation verified
Contributors
Special thanks to @isiahw1 for this excellent contribution!
Full Changelog: https://github.com/doobidoo/mcp-memory-service/blob/v10.2.0/CHANGELOG.md#1020---2026-01-28