Breaking Changes
- The `ENTITY_TYPES` environment variable has been deprecated; please replace it with `ENTITY_TYPE_PROMPT_FILE` before launching this new version.
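As a migration sketch (the prompt file path and the old inline value shown here are illustrative assumptions, not taken from these notes):

```shell
# Before (deprecated): entity types configured directly via an environment variable
# ENTITY_TYPES=...

# After: point LightRAG at a prompt file that guides entity type extraction
ENTITY_TYPE_PROMPT_FILE=./prompts/entity_types.txt
```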
Major Improvements
- Implement role-specific LLM configuration support, introducing four distinct roles (EXTRACT, QUERY, KEYWORDS, and VLM), each with independent LLM settings.
- Enable task-aware embedding support for asymmetric models, including `voyage-3`, `text-embedding-004`, `embed-multilingual-v3.0`, and `jina-embeddings-v3`.
- Add optional JSON-formatted LLM output to enhance stability in the entity and relation extraction pipeline.
- Introduce `ENTITY_TYPE_PROMPT_FILE` to empower users with enhanced guidance for LLM-driven entity type recognition and extraction.
- Fully support Amazon and Anthropic models on the AWS Bedrock API.
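The role-specific configuration above can be sketched as environment variables. The variable names below are illustrative assumptions only; the role-specific LLM configuration guide added in this release (#2976) is the authoritative reference.

```shell
# Illustrative sketch: variable naming is assumed, not confirmed by these notes.
# Each role (EXTRACT, QUERY, KEYWORDS, VLM) can use its own model settings.
EXTRACT_LLM_MODEL=gpt-4o-mini    # entity/relation extraction
QUERY_LLM_MODEL=gpt-4o           # answering user queries
KEYWORDS_LLM_MODEL=gpt-4o-mini   # keyword extraction
VLM_LLM_MODEL=gpt-4o             # vision/multimodal processing
```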
What's Changed
- feat: explicit voyageai embed support by @laszukdawid in #2484
- feat: Add task-aware embedding support by @StoreksFeed in #2560
- feat: integrate structured extraction and multimodal processing pipeline by @MrGidea in #2830
- ♻️ refactor(documentManager): reorganize document status filtering by @danielaskdd in #2851
- feat: enhance entity extraction stability dev by @yunzhongxiaxi in #2864
- Remove config.ini from compose samples by @danielaskdd in #2906
- fix: handle OpenAI length finish reason fallback by @danielaskdd in #2913
- feat(extraction): configurable per-response entity/relation limits by @danielaskdd in #2950
- ♻️ refactor(llm): unify keyword extraction across providers by @danielaskdd in #2953
- ♻️ refactor(llm): unify structured output control via response_format by @danielaskdd in #2956
- refactor(bedrock): support default and custom endpoints by @danielaskdd in #2958
- refactor(gemini): improve default endpoint handling and sdk integration by @danielaskdd in #2957
- refactor(setup): use sentinel endpoints for Gemini and Bedrock defaults by @danielaskdd in #2959
- fix: remove `stream` parameter from `.parse()` call when `response_format` is present by @PaulTitto in #2965
- perf(postgres): use binary parameter for vector similarity queries by @wkpark in #2949
- feat(bedrock): Rename `aws_bedrock` to `bedrock` and implement comprehensive option support by @danielaskdd in #2966
- chore(deps): bump react-router-dom from 7.14.0 to 7.14.1 in /lightrag_webui in the react group by @dependabot[bot] in #2967
- feat(prompt): externalize entity type extraction profiles by @danielaskdd in #2964
- chore(deps-dev): bump the build-tools group in /lightrag_webui with 3 updates by @dependabot[bot] in #2968
- chore(deps): bump the frontend-minor-patch group across 1 directory with 3 updates by @dependabot[bot] in #2969
- Fix role LLM max async fallback by @danielaskdd in #2973
- fix(llm): tighten client and stream cleanup across LLM bindings by @danielaskdd in #2974
- chore(deps): bump lucide-react from 0.577.0 to 1.6.0 in /lightrag_webui by @dependabot[bot] in #2970
- docs: add role-specific LLM configuration guide by @danielaskdd in #2976
- refactor: unify role LLM config via ROLES registry + queue observability by @danielaskdd in #2978
- feat(status): role-based LLM observability and storage workspace info by @danielaskdd in #2980
- feat(rerank): add independent concurrency and timeout configuration by @danielaskdd in #2981
- Fix LLM cache role identity isolation by @danielaskdd in #2982
- Add Podman-compatible compose file by @tears710 in #2983
- Add role LLM provider options logging and change role provider options to start from empty not default by @danielaskdd in #2984
- Fix bedrock/gemini host leak from env.example on make server/storage by @danielaskdd in #2985
New Contributors
- @yunzhongxiaxi made their first contribution in #2864
- @tears710 made their first contribution in #2983
- @PaulTitto made their first contribution in #2965
- @laszukdawid made their first contribution in #2484
Full Changelog: v1.4.15...v1.5.0rc1