patrickchugh/terravision v0.36.0
v0.36.0 — AI-Powered Annotations


What's New

TerraVision now generates architecture annotations with AI. Pass --ai-annotate <backend> to the draw command and TerraVision asks an LLM to suggest diagram titles, edge labels, external actors, and numbered flow sequences. The suggestions are written to a dedicated terravision.ai.yml file and merged into your diagram at render time.

poetry run terravision draw --source ./terraform --ai-annotate ollama
poetry run terravision draw --source ./terraform --ai-annotate bedrock

Highlights

  • Two backends out of the box

    • ollama — runs a local llama3 model on localhost:11434. Fully offline, no cloud calls.
    • bedrock — uses AWS Bedrock via the infrastructure in ai-backend-terraform/. Great for CI pipelines already running with an AWS IAM role.
  • The graph is never touched by the LLM. Suggestions go into terravision.ai.yml only — the deterministic graphdict is byte-identical with or without --ai-annotate. You can diff the file, review it, commit it, or delete it.

  • Two-file annotation model. User-authored terravision.yml and AI-generated terravision.ai.yml are auto-discovered and merged. User annotations always win on conflicts, so the AI can't override decisions you have made.

  • Numbered flow sequences (format 0.2). The AI can propose flows: blocks that render as numbered badges on nodes/edges plus a legend. Steps can target nodes (aws_lambda_function.api) or edges (source -> target) and numbering is continuous across flows.

  • Auditable provenance. Every AI file includes a generated_by block recording backend, model, and ISO 8601 timestamp so you know exactly what produced the annotations.

  • Single unified prompt. Replaces the old per-provider *_REFINEMENT_PROMPT constants and the graph-mutating refine_with_llm() path with one ANNOTATION_PROMPT in modules/llm.py. Resource references are validated before writing, so hallucinated resource names are dropped.

  • CI/CD ready. Drop --ai-annotate bedrock into existing pipelines — same IAM role, no extra secrets.
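To make the two-file model concrete, here is a sketch of what a generated terravision.ai.yml might look like. The release notes confirm the format version, the generated_by block (backend, model, ISO 8601 timestamp), and flows: steps that target nodes or source -> target edges; the remaining field names and resource names below are illustrative assumptions, not the exact schema.

```yaml
# terravision.ai.yml -- illustrative sketch; field and resource names are assumptions
format: 0.2
generated_by:
  backend: ollama          # or bedrock
  model: llama3
  timestamp: "2024-06-01T12:00:00Z"   # ISO 8601
title: "Serverless API"    # AI-suggested diagram title
flows:
  - name: "Request path"
    steps:
      - aws_apigateway_rest_api.main                            # node step
      - aws_apigateway_rest_api.main -> aws_lambda_function.api # edge step
      - aws_lambda_function.api
```

Because user-authored terravision.yml wins on conflicts, any key you set there (for example, your own title) overrides the AI suggestion above at merge time.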

Migration Notes

  • The old refine_with_llm() and provider-specific refinement prompts have been removed. If you relied on LLM-modified graphs, switch to --ai-annotate <backend> and read annotations from terravision.ai.yml.
  • Consider whether to commit terravision.ai.yml (track AI suggestions over time) or .gitignore it (regenerate each run).
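For pipelines, a minimal sketch of the CI/CD usage mentioned above, shown as a GitHub Actions step. This assumes the workflow (job name, checkout, and Poetry setup omitted) already runs under an AWS IAM role with Bedrock access, as the release notes describe; no step names or flags beyond those in this release are implied.

```yaml
# Illustrative CI step -- assumes the runner's existing IAM role grants Bedrock access
- name: Generate annotated architecture diagram
  run: poetry run terravision draw --source ./terraform --ai-annotate bedrock
```

If you choose to regenerate terravision.ai.yml on every run rather than commit it, add it to .gitignore so the AI output never enters review.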

Docs

  • Annotations Guide
  • Usage Guide
  • CI/CD Integration
