First major release since 0.1.1. Ships the complete initial roadmap plus an MCP server for AI agent integration.
## Highlights

### AI agent integration
- MCP server (`llmwiki serve`) exposes llmwiki's automated pipelines as Model Context Protocol tools, so agents can ingest, compile, query, search, lint, and read pages programmatically. Ships with 7 tools and 5 read-only resources. See the README for Claude Desktop / Cursor setup.
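For reference, MCP clients such as Claude Desktop typically register a stdio server via a JSON config entry along these lines (a sketch, assuming a global `llmwiki` install; see the README for the exact setup):

```json
{
  "mcpServers": {
    "llmwiki": {
      "command": "llmwiki",
      "args": ["serve"]
    }
  }
}
```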
### Retrieval and search
- Semantic search via embeddings — pre-filters the wiki index to the top 15 most similar pages before calling the selection LLM. Transparent fallback to full-index selection when no embeddings store exists.
- Provider-specific embedding models: Voyage `voyage-3-lite` for Anthropic, `text-embedding-3-small` for OpenAI, `nomic-embed-text` for Ollama.
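The pre-filter step amounts to a top-k nearest-neighbor pass over the wiki index. A minimal sketch of the idea (illustrative only, not llmwiki's actual code; the `Page` shape and function names are invented):

```typescript
type Page = { path: string; embedding: number[] };

// Cosine similarity between two equal-length embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank pages by similarity to the query embedding and keep the top k
// (llmwiki uses k = 15) before handing the shortlist to the selection LLM.
function preFilter(query: number[], pages: Page[], k = 15): Page[] {
  return [...pages]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```

When no embeddings store exists, the fallback described above simply skips this step and passes the full index to the selection LLM.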
### Multi-provider support
- Swap LLM backends via `LLMWIKI_PROVIDER=anthropic|openai|ollama|minimax`.
- The Anthropic provider supports `ANTHROPIC_AUTH_TOKEN` and custom base URLs, and falls back to `~/.claude/settings.json` for credentials and model.
- MiniMax provider added via the OpenAI-compatible endpoint.
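Switching providers is an environment-variable change. `LLMWIKI_PROVIDER` and `ANTHROPIC_AUTH_TOKEN` come from these notes; the token value below is a placeholder:

```shell
# Use Anthropic with an explicit auth token
export LLMWIKI_PROVIDER=anthropic
export ANTHROPIC_AUTH_TOKEN=sk-ant-placeholder

# Or switch to a local Ollama instance (no API key needed)
export LLMWIKI_PROVIDER=ollama
```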
### Wiki quality
- `llmwiki lint` — six rule-based checks (broken wikilinks, orphaned pages, missing summaries, duplicate concepts, empty pages, broken citations). No LLM calls, no API key required.
- Paragraph-level source attribution — compiled pages now include `^[filename.md]` citation markers pointing back to the source files that contributed the content.
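A compiled paragraph with source attribution might look like this (the sentence text and filenames are invented for illustration; the `^[...]` marker format is from these notes):

```markdown
The compiler resolves wikilinks at build time. ^[architecture.md]
Pages with no inbound links are flagged by the lint step. ^[linting-notes.md]
```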
### Obsidian integration
- LLM-extracted tags (2–4 per concept) and deterministic aliases (slug, conjunction swap, abbreviation) surface in frontmatter.
- Auto-generated `wiki/MOC.md` groups all concept pages by tag.
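Putting the two together, a concept page's frontmatter might look like this (a sketch; the title, tag values, and exact alias strings are invented to illustrate the slug / conjunction-swap / abbreviation rules described above):

```yaml
---
title: Retrieval and Search
tags: [search, embeddings, retrieval]  # 2-4 LLM-extracted tags
aliases:
  - retrieval-and-search               # slug
  - Search and Retrieval               # conjunction swap
  - RS                                 # abbreviation
---
```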
### Infrastructure
- GitHub Actions CI with a Node 18/20/22 build-and-test matrix, plus a Fallow codebase health check (required for merges).
- Tests grew from 91 to 211.
## Contributors
Thanks to @FrankMa1, @PipDscvr, @goforu, and @socraticblock for their contributions.
## Install
```bash
npm install -g llm-wiki-compiler@0.2.0
```