## What's New
TerraVision can now generate architecture annotations with AI. Pass `--ai-annotate <backend>` to `draw` and TerraVision asks an LLM to suggest diagram titles, edge labels, external actors, and numbered flow sequences. The suggestions are written to a dedicated `terravision.ai.yml` file and merged into your diagram at render time.
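For illustration, a generated `terravision.ai.yml` might look roughly like the following. The `generated_by` and `flows` blocks are the ones this release describes; the remaining field names and the exact layout are assumptions for the sketch, not a guaranteed schema:

```yaml
# terravision.ai.yml -- AI-generated; merged under any user-authored terravision.yml
generated_by:
  backend: ollama
  model: llama3
  timestamp: "2024-05-01T12:00:00Z"   # ISO 8601
title: "Order processing pipeline"     # assumed field name
flows:
  - name: "Upload"                     # assumed flow layout
    steps:
      - aws_s3_bucket.uploads                              # node step
      - aws_s3_bucket.uploads -> aws_lambda_function.api   # edge step
```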
```shell
poetry run terravision draw --source ./terraform --ai-annotate ollama
poetry run terravision draw --source ./terraform --ai-annotate bedrock
```

## Highlights
- **Two backends out of the box.** `ollama` runs a local `llama3` model on `localhost:11434`. Fully offline, no cloud calls. `bedrock` uses AWS Bedrock via the infrastructure in `ai-backend-terraform/`. Great for CI pipelines already running with an AWS IAM role.
- **The graph is never touched by the LLM.** Suggestions go into `terravision.ai.yml` only; the deterministic `graphdict` is byte-identical with or without `--ai-annotate`. You can diff the file, review it, commit it, or delete it.
- **Two-file annotation model.** User-authored `terravision.yml` and AI-generated `terravision.ai.yml` are auto-discovered and merged. User annotations always win on conflicts, so the AI can't override decisions you have made.
- **Numbered flow sequences (format 0.2).** The AI can propose `flows:` blocks that render as numbered badges on nodes and edges, plus a legend. Steps can target nodes (`aws_lambda_function.api`) or edges (`source -> target`), and numbering is continuous across flows.
- **Auditable provenance.** Every AI file includes a `generated_by` block recording the backend, model, and an ISO 8601 timestamp, so you know exactly what produced the annotations.
- **Single unified prompt.** Replaces the old per-provider `*_REFINEMENT_PROMPT` constants and the graph-mutating `refine_with_llm()` path with one `ANNOTATION_PROMPT` in `modules/llm.py`. Resource references are validated before writing, so hallucinated resource names are dropped.
- **CI/CD ready.** Drop `--ai-annotate bedrock` into existing pipelines: same IAM role, no extra secrets.
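The "user annotations always win" rule can be modeled as a recursive dictionary merge where user-authored values take precedence on conflicts. This is an illustrative sketch, not TerraVision's actual merge code; the function name and annotation keys are assumptions:

```python
def merge_annotations(user: dict, ai: dict) -> dict:
    """Merge AI-generated annotations under user-authored ones.

    User values always win on key conflicts; nested dicts are merged
    recursively. Illustrative model only, not TerraVision's real code.
    """
    merged = dict(ai)  # start from AI suggestions
    for key, user_value in user.items():
        ai_value = merged.get(key)
        if isinstance(user_value, dict) and isinstance(ai_value, dict):
            merged[key] = merge_annotations(user_value, ai_value)
        else:
            merged[key] = user_value  # user annotation overrides the AI's
    return merged

# Hypothetical annotation dicts: the user keeps their title and edge label,
# while the AI's extra edge label still comes through.
user = {"title": "Payments stack", "edges": {"a -> b": "user label"}}
ai = {"title": "AI title", "edges": {"a -> b": "ai label", "b -> c": "queue"}}
print(merge_annotations(user, ai))
```

Deleting `terravision.ai.yml` is equivalent to merging with an empty `ai` dict, which is why the AI file is always safe to discard.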
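Continuous numbering across `flows:` blocks can be pictured as one counter walked over every step in order. A minimal sketch, assuming each flow is a dict with a `name` and a list of node or edge step targets (these shapes are assumptions, not the real renderer's data model):

```python
from itertools import count

def number_steps(flows: list[dict]) -> list[tuple[int, str, str]]:
    """Assign one continuous sequence of badge numbers across all flows.

    Returns (badge_number, flow_name, step_target) tuples for a legend.
    Illustrative only; the input shape is an assumption.
    """
    counter = count(1)  # numbering does not reset between flows
    legend = []
    for flow in flows:
        for target in flow["steps"]:
            legend.append((next(counter), flow["name"], target))
    return legend

flows = [
    {"name": "Upload", "steps": [
        "aws_s3_bucket.in",                              # node step
        "aws_s3_bucket.in -> aws_lambda_function.api",   # edge step
    ]},
    {"name": "Notify", "steps": [
        "aws_lambda_function.api -> aws_sns_topic.alerts",
    ]},
]
# Numbering continues across flows: 1, 2, then 3.
print(number_steps(flows))
```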
## Migration Notes
- The old `refine_with_llm()` and provider-specific refinement prompts have been removed. If you relied on LLM-modified graphs, switch to `--ai-annotate <backend>` and read annotations from `terravision.ai.yml`.
- Consider whether to commit `terravision.ai.yml` (to track AI suggestions over time) or `.gitignore` it (to regenerate on each run).
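If you opt not to track the file, a one-line `.gitignore` entry is enough:

```gitignore
terravision.ai.yml
```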
## Docs
- Annotations Guide
- Usage Guide
- CI/CD Integration