The first public release of OpenLegion — a container-isolated multi-agent
runtime for autonomous AI agent fleets.
Highlights

- Container isolation — every agent runs in its own Docker container or Sandbox microVM
- Credential vault — API keys never leave the host; agents call through the mesh proxy
- 100+ LLM providers via LiteLLM (OpenAI, Anthropic, Gemini, xAI, Groq, DeepSeek, etc.)
- Multi-channel — CLI, Telegram, Discord, Slack, WhatsApp
- DAG workflow engine — YAML-defined multi-agent pipelines with conditions and fan-out
- Per-agent memory — SQLite + FTS5 + sqlite-vec with write-then-compact context management
- Cost tracking & budgets — per-agent daily/monthly limits with automatic enforcement
- Heartbeat system — cron-based autonomous operation with probe-before-LLM optimization
- MCP tool support — connect agents to external MCP servers
- Live dashboard — real-time agent monitoring, blackboard viewer, cost charts
- Self-authoring skills — agents can create and hot-reload their own tools at runtime
- 1107 tests passing across Python 3.11 and 3.12
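To make the DAG workflow engine concrete, here is a sketch of what a YAML-defined pipeline with a condition and fan-out could look like. The key names (`steps`, `depends_on`, `when`, `fan_out`) and agent names are illustrative assumptions, not OpenLegion's documented schema; consult the repository for the real format.

```yaml
# Hypothetical pipeline sketch — field names are illustrative,
# not OpenLegion's actual workflow schema.
name: research-and-write
steps:
  - id: research
    agent: researcher
    task: "Collect sources on the given topic"
  - id: draft
    agent: writer
    depends_on: [research]          # DAG edge: runs after research
    task: "Write a summary from the research notes"
  - id: review
    agent: editor
    depends_on: [draft]
    when: "draft.word_count > 200"  # conditional execution
  - id: publish
    depends_on: [review]
    fan_out: [telegram, discord]    # one step fanned out to channels
```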
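The per-agent memory item can be illustrated with plain SQLite: the sketch below shows keyword search over an FTS5 virtual table using only Python's standard library. The table and column names are made up for the example, not OpenLegion's schema, and the real store additionally loads the sqlite-vec extension for vector similarity, which is omitted here because it is a third-party extension.

```python
import sqlite3

# Illustrative memory table — names are hypothetical, not OpenLegion's schema.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE memory USING fts5(agent, content)")
db.executemany(
    "INSERT INTO memory VALUES (?, ?)",
    [
        ("researcher", "Summarized the quarterly LLM pricing changes"),
        ("researcher", "Drafted a comparison of container runtimes"),
        ("writer", "Outlined the release announcement blog post"),
    ],
)

# Full-text query ranked by BM25 relevance (FTS5's built-in `rank`).
rows = db.execute(
    "SELECT content FROM memory WHERE memory MATCH 'pricing' ORDER BY rank"
).fetchall()
print(rows[0][0])
```

In the real system a write-then-compact policy would periodically summarize old rows to keep the agent's context window small; that step is not shown here.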
Get started

```bash
git clone https://github.com/openlegion-ai/openlegion.git && cd openlegion
./install.sh
openlegion start
```
See QUICKSTART.md for the full setup guide.