Added
- 🧠 System prompt endpoint (`GET /system`) — returns a structured system prompt for LLM integration, grounding the model in the environment (OS, hostname, user, shell, Python version) with directives for tool usage. Gated by `OPEN_TERMINAL_ENABLE_SYSTEM_PROMPT` (default `true`); advertised via `features.system` in `GET /api/config` so consumers can check support before fetching.
- ⚙️ `OPEN_TERMINAL_ENABLE_SYSTEM_PROMPT` — environment variable (or `enable_system_prompt` in config.toml) to enable/disable the `/system` endpoint and feature flag. Defaults to `true`.
- ⚙️ `OPEN_TERMINAL_SYSTEM_PROMPT` — environment variable (or `system_prompt` in config.toml) to fully override the generated system prompt with custom content.