Release Notes
[2025-08-14]
llama-index-core
[0.13.2]
- feat: allow streaming to be disabled in agents (#19668); see the sketch below
- fix: respect the value of NLTK_DATA env var if present (#19664)
- fix: Order preservation and fetching in batch non-cached embeddings in a/get_text_embedding_batch() (#19536)
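
A minimal sketch of the streaming toggle from #19668. The `streaming` keyword and the model name are assumptions based on the feature description; check the agent's signature in your installed version.

```python
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI


def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


# The `streaming` keyword is an assumption based on the #19668 description;
# verify the exact parameter name in your installed version.
agent = FunctionAgent(
    tools=[add],
    llm=OpenAI(model="gpt-4o-mini"),
    streaming=False,  # do not stream tokens from the underlying LLM
)
```

With streaming disabled, the agent waits for complete LLM responses instead of emitting token-level events.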
llama-index-embeddings-ollama
[0.8.1]
llama-index-graph-rag-cognee
[0.3.0]
- fix: Update and fix cognee integration (#19650)
llama-index-llms-anthropic
[0.8.4]
- fix: Error in Anthropic extended thinking with tool use (#19642)
- chore: bump context window for Claude Sonnet 4 to 1 million tokens (#19649)
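
A hedged sketch of extended thinking combined with tool use, the path #19642 fixes. The `thinking_dict` parameter and the model name are assumptions; verify them against the installed integration.

```python
from llama_index.core.tools import FunctionTool
from llama_index.llms.anthropic import Anthropic


def get_weather(city: str) -> str:
    """Return a canned weather report for a city."""
    return f"It is sunny in {city}."


# `thinking_dict` and the model name are assumptions; check the current
# llama-index-llms-anthropic signature before relying on them.
llm = Anthropic(
    model="claude-sonnet-4-0",
    max_tokens=2048,
    thinking_dict={"type": "enabled", "budget_tokens": 1024},
)
tool = FunctionTool.from_defaults(fn=get_weather)
resp = llm.chat_with_tools([tool], user_msg="What is the weather in Paris?")
print(resp.message.content)
```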
llama-index-llms-bedrock-converse
[0.8.2]
- feat: add openai-oss models to BedrockConverse (#19653)
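
A sketch of pointing BedrockConverse at one of the new openai-oss models (#19653). The model identifier is illustrative; use the exact ID listed in your Bedrock region, and standard AWS credentials are assumed to be configured.

```python
from llama_index.llms.bedrock_converse import BedrockConverse

# Model ID is illustrative; substitute the openai-oss ID available in your region.
llm = BedrockConverse(
    model="openai.gpt-oss-120b-1:0",
    region_name="us-west-2",
)
print(llm.complete("Say hello in one short sentence.").text)
```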
llama-index-llms-ollama
[0.7.1]
- fix: Ollama role response detection (#19671)
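
Where the role fix (#19671) is visible: the message returned by chat should carry the role reported by Ollama. The model name is a placeholder for whatever model is pulled locally.

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama3.1", request_timeout=120.0)
resp = llm.chat([ChatMessage(role="user", content="Say hi.")])
# With #19671, the role on the response message is detected correctly
# (normally "assistant") rather than misreported.
print(resp.message.role, resp.message.content)
```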
llama-index-llms-openai
[0.5.3]
- fix: AzureOpenAI streaming token usage (#19633)
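
A sketch of where the streaming token-usage fix (#19633) shows up. Endpoint, deployment, key, and API version are placeholders, and the assumption that usage appears on the final chunk's `raw` payload is worth verifying in your version.

```python
from llama_index.llms.azure_openai import AzureOpenAI

# Endpoint, deployment, key, and API version are placeholders.
llm = AzureOpenAI(
    engine="my-gpt-4o-deployment",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_key="...",
    api_version="2024-08-01-preview",
)

last = None
for chunk in llm.stream_complete("Write one short sentence."):
    last = chunk
# With #19633, token usage is reported for the stream; the exact location
# (assumed here to be the final chunk's raw payload) may vary by version.
print(last.raw)
```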
llama-index-readers-file
[0.5.1]
- feat: enhance PowerPoint reader with comprehensive content extraction (#19478)
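
A minimal sketch using the PowerPoint reader touched by #19478; it assumes the reader's optional dependencies (e.g. python-pptx) are installed and that `slides.pptx` exists locally.

```python
from pathlib import Path

from llama_index.readers.file import PptxReader

# Assumes the PowerPoint reader's optional dependencies are installed.
reader = PptxReader()
docs = reader.load_data(file=Path("slides.pptx"))
print(len(docs), docs[0].text[:200])
```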
llama-index-retrievers-bm25
[0.6.3]
- fix: persist + load for bm25 (#19657)
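
A round-trip sketch of the persist-and-load path fixed by #19657; `persist` and `from_persist_dir` are assumed to be the relevant entry points.

```python
from llama_index.core import Document
from llama_index.core.node_parser import SentenceSplitter
from llama_index.retrievers.bm25 import BM25Retriever

nodes = SentenceSplitter().get_nodes_from_documents(
    [Document(text="BM25 scores documents by term frequency and inverse document frequency.")]
)
retriever = BM25Retriever.from_defaults(nodes=nodes, similarity_top_k=1)

# Persist to disk and load back; this round trip is what #19657 fixes.
retriever.persist("./bm25_persist")
loaded = BM25Retriever.from_persist_dir("./bm25_persist")
print(loaded.retrieve("term frequency")[0].node.text)
```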
llama-index-retrievers-superlinked
[0.1.0]
- feat: add Superlinked retriever integration (#19636)
llama-index-tools-mcp
[0.4.0]
- feat: Handlers for custom types and pydantic models in tools (#19601)
llama-index-vector-stores-clickhouse
[0.6.0]
- chore: Updates to ClickHouse integration based on new vector search capabilities in ClickHouse (#19647)
llama-index-vector-stores-postgres
[0.6.3]
- fix: Add other special characters in ts_query normalization (#19637)
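
A sketch of the hybrid-search setup affected by the ts_query fix (#19637); connection details are placeholders.

```python
from llama_index.vector_stores.postgres import PGVectorStore

# Connection details are placeholders. With #19637, full-text queries containing
# characters that are special to tsquery (e.g. &, |, !, :) are normalized instead
# of producing a syntax error.
store = PGVectorStore.from_params(
    host="localhost",
    port="5432",
    database="vectordb",
    user="postgres",
    password="password",
    table_name="docs",
    embed_dim=1536,
    hybrid_search=True,
    text_search_config="english",
)
```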