One-liner RAG has arrived in tlm v1.2! 🎉
Version 1.2 of tlm introduces one-liner Retrieval Augmented Generation (RAG) with the new `tlm ask` command. This beta feature lets you ask questions about your codebase and documentation and get contextually relevant answers, right from your terminal.

Inspired by the Repomix project, `tlm ask` provides a similar context-gathering mechanism, implemented efficiently in Go. `tlm` goes a step further, however, by bridging context retrieval with local, open-source LLM prompting, giving you security and privacy from the comfort of your terminal.
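To make that idea concrete, here is a minimal Go sketch of the general pattern: walk a directory, pack file contents into a prompt, and send it to a locally running model. The function names, the model name, and the Ollama-style `/api/generate` endpoint are assumptions chosen for illustration, not tlm's actual internals.

```go
// Illustrative sketch only: the general "pack context, prompt a local LLM"
// pattern. Names and the local endpoint are assumptions, not tlm's code.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"path/filepath"
	"strings"
)

// gatherContext walks dir and concatenates readable file contents,
// prefixing each with its path, similar in spirit to Repomix-style packing.
func gatherContext(dir string) (string, error) {
	var b strings.Builder
	err := filepath.WalkDir(dir, func(path string, d os.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		data, readErr := os.ReadFile(path)
		if readErr != nil {
			return nil // skip unreadable files
		}
		fmt.Fprintf(&b, "=== %s ===\n%s\n", path, data)
		return nil
	})
	return b.String(), err
}

// askLocalLLM sends the packed context plus the question to a local,
// Ollama-style /api/generate endpoint (an assumption; adjust for your setup).
func askLocalLLM(contextText, question string) (string, error) {
	body, _ := json.Marshal(map[string]any{
		"model":  "llama3.2", // assumed model name
		"prompt": contextText + "\n\nQuestion: " + question,
		"stream": false,
	})
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.Response, nil
}

func main() {
	ctx, err := gatherContext(".")
	if err != nil {
		panic(err)
	}
	answer, err := askLocalLLM(ctx, "What is the main purpose of this project?")
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
```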
Key Features of `tlm ask`:
- Instant Answers: Get quick answers to direct questions using `tlm ask "<prompt>"`.
- Contextual Understanding: Enhance answer accuracy by providing context. Use the `--context` flag and specify a directory for analysis, e.g., `tlm ask --context . "<prompt>"`.
- Granular Context Control: Further refine the context using the `--include` and `--exclude` flags with file patterns. Target specific files or exclude irrelevant ones, e.g., `tlm ask --context . --include *.md "<prompt>"` or `tlm ask --context . --exclude **/*_test.go "<prompt>"`. (A minimal sketch of this kind of pattern filtering follows this list.)
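For intuition, below is a minimal Go sketch of how include/exclude pattern filtering over a context directory could work. It is an illustrative assumption, not tlm's implementation, and it relies on the standard library's `filepath.Match`, which handles patterns like `*.md` but not recursive `**` patterns (those would need a more capable matcher).

```go
// Illustrative sketch only: one way include/exclude file-pattern filtering
// could be implemented. Not tlm's actual code.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// matchAny reports whether the relative path or its base name matches
// any of the given glob patterns.
func matchAny(patterns []string, rel string) bool {
	for _, p := range patterns {
		if ok, _ := filepath.Match(p, rel); ok {
			return true
		}
		if ok, _ := filepath.Match(p, filepath.Base(rel)); ok {
			return true
		}
	}
	return false
}

// selectFiles walks root and keeps files that match an include pattern
// (if any are given) and do not match an exclude pattern.
func selectFiles(root string, include, exclude []string) ([]string, error) {
	var files []string
	err := filepath.WalkDir(root, func(path string, d os.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		rel, _ := filepath.Rel(root, path)
		if len(include) > 0 && !matchAny(include, rel) {
			return nil
		}
		if matchAny(exclude, rel) {
			return nil
		}
		files = append(files, rel)
		return nil
	})
	return files, err
}

func main() {
	// Roughly analogous to: tlm ask --context . --include *.md --exclude README.md "<prompt>"
	files, err := selectFiles(".", []string{"*.md"}, []string{"README.md"})
	if err != nil {
		panic(err)
	}
	for _, f := range files {
		fmt.Println(f)
	}
}
```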
Example Usage:
tlm ask "What is the main purpose of this function?"
tlm ask --context ./src --include *.go "How does authentication work?"
tlm ask --context ./docs --include *.md --exclude README.md "Summarize the key concepts."
tlm ask --interactive "What are the dependencies?"