github machinewrapped/llm-subtrans v1.6.0
Support for terminology maps


Added support for passing a terminology map to the translator to ensure consistent translations. An opt-in system allows the LLM to add new terminology to the map itself, which is then fed forward into future batches.
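A minimal sketch of the feed-forward idea described above. The names here (`merge_terminology`, `build_prompt_section`) are illustrative, not the project's actual API: the point is that terms the LLM opts to add after one batch are merged without overwriting existing entries, then rendered into the prompt for the next batch.

```python
def merge_terminology(existing: dict[str, str], updates: dict[str, str]) -> dict[str, str]:
    """Merge terms the LLM suggested after a batch into the map.

    Existing entries win, so translations already used in earlier
    batches stay consistent.
    """
    merged = dict(existing)
    for source, target in updates.items():
        merged.setdefault(source, target)
    return merged


def build_prompt_section(terminology: dict[str, str]) -> str:
    """Render the map as a block appended to each translation request."""
    lines = [f"{src} -> {dst}" for src, dst in sorted(terminology.items())]
    return "Use these translations consistently:\n" + "\n".join(lines)


# One batch's worth of the loop: start with the current map,
# fold in the model's opt-in additions, and build the next request.
terminology = {"Heimdall": "Heimdall"}
terminology = merge_terminology(terminology, {"the Bifrost": "le Bifrost"})
print(build_prompt_section(terminology))
```

Whether entries are append-only or can be revised by later batches is a design choice; the sketch above assumes append-only, which favours consistency over correction.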

The terminology map can also be edited on the fly, to some extent.

2026-04-26 Terminology Map

The UI has been restructured to give more space to the project settings panel, as it was getting a bit crowded.

The generated terminology map depends a lot on the model - some are parsimonious, some want to record every detail.

The default instructions contain a terminology_instructions section with the prompt that is appended to the request when automatic terminology mapping is enabled - feel free to tweak them until you get the level of detail you want.

The fallback used for other instructions files is currently identical to the default, but custom terminology_instructions can be added to any instructions file.
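As a rough illustration, a custom terminology_instructions section might look something like the fragment below. This is an assumed example, not the bundled default: check the default instructions file shipped with the release for the exact section header syntax and wording before copying it.

```
### terminology_instructions
Maintain a terminology map of recurring names, places and technical terms.
Only add a term if it recurs and its translation is not obvious; do not
record every word. Reuse the supplied translations exactly in later lines.
```

Making the section terser or more permissive is the main lever for controlling how parsimonious or exhaustive the generated map is, per the note about model behaviour above.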

What's Changed

  • Persist project progress on failures and standardize CLI translation … by @machinewrapped in #399
  • Implement parent() method to fix NotImplementedError in SubtitleListModel by @PhilBug in #402
  • Handle Ollama reasoning field in CustomClient; fall back to it when content is empty by @Copilot in #406
  • Improve local LLM resilience by @machinewrapped in #407
  • Fix init_project destructively re-batching when resuming an existing project by @machinewrapped in #411
  • Add CLI progress logging and token usage reporting by @machinewrapped in #412
  • Add opt-in terminology map for consistent translations by @machinewrapped in #413

New Contributors

Full Changelog: v1.5.8...v1.6.0
