Release Notes
[2025-07-09]
llama-index-core
[0.12.48]
- fix: convert dict chat_history to ChatMessage objects in AgentWorkflowStartEvent (#19371) (see the chat_history sketch below)
- fix: Replace ctx.get/set with ctx.store.get/set in Context (#19350) (see the ctx.store sketch below)
- Bump the pip group across 6 directories with 1 update (#19357)
- Make fewer trips to KV store during Document Hash Checks (#19362)
- Don't store Copy of document in metadata and properly return Nodes (#19343)
- Bump llama-index-core from 0.12.8 to 0.12.41 in /docs in the pip group across 1 directory (#19345)
- fix: Ensure CallbackManager is applied to default embed_model (#19335)
- fix publish sub-package workflow (#19338)
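For the chat_history fix in #19371, a minimal sketch of the call pattern this enables; the agent, tool, and model below are illustrative (and need an OpenAI key), but the point is that plain dicts passed as chat_history are now coerced to ChatMessage objects when the agent workflow starts:

```python
import asyncio

from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI


def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b


async def main() -> None:
    agent = FunctionAgent(tools=[multiply], llm=OpenAI(model="gpt-4o-mini"))
    # Plain dicts in chat_history are converted to ChatMessage objects
    # inside AgentWorkflowStartEvent, so no manual conversion is needed.
    response = await agent.run(
        user_msg="And what is 6 * 7?",
        chat_history=[
            {"role": "user", "content": "What is 2 * 3?"},
            {"role": "assistant", "content": "2 * 3 = 6."},
        ],
    )
    print(response)


asyncio.run(main())
```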
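For #19350, state access on Context now goes through ctx.store; a minimal sketch of a workflow step using the new accessors (the workflow, step name, and keys are illustrative):

```python
from llama_index.core.workflow import (
    Context,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)


class CounterWorkflow(Workflow):
    @step
    async def count(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # ctx.store.get/set replaces the older ctx.get/ctx.set accessors.
        runs = await ctx.store.get("runs", default=0)
        await ctx.store.set("runs", runs + 1)
        return StopEvent(result=runs + 1)
```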
llama-index-embeddings-huggingface-optimum-intel
[0.3.1]
- Fix IntelEmbedding base.py (#19351)
llama-index-indices-managed-lancedb
[0.1.0]
- Fix broken lancedb tests (#19352)
llama-index-indices-managed-llamacloud
[0.7.10]
llama-index-llms-google-genai
[0.2.4]
- Add Cached Content Support to GoogleGenAI LLM Integration (#19361)
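A hedged sketch of the new cached-content usage; it assumes the support is exposed as a cached_content constructor argument carrying a cache name created with the google-genai SDK (client.caches.create), so check the integration docs for the exact parameter name:

```python
from llama_index.llms.google_genai import GoogleGenAI

# "cachedContents/your-cache-id" is a placeholder for a cache created with
# the google-genai SDK; the cached_content parameter name is an assumption.
llm = GoogleGenAI(
    model="gemini-2.0-flash-001",
    cached_content="cachedContents/your-cache-id",
)
print(llm.complete("Summarize the cached document."))
```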
llama-index-llms-oci-genai
[0.5.1]
- Add support for image prompts with OCI Generative AI Llama models (#19306)
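A minimal sketch of an image prompt, assuming it flows through llama-index-core's block-based ChatMessage API; the endpoint, compartment OCID, and model name are placeholders, and the model must be vision-capable:

```python
from llama_index.core.llms import ChatMessage, ImageBlock, TextBlock
from llama_index.llms.oci_genai import OCIGenAI

# All identifiers below are placeholders; supply your own OCI values.
llm = OCIGenAI(
    model="meta.llama-3.2-90b-vision-instruct",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
)
response = llm.chat(
    [
        ChatMessage(
            role="user",
            blocks=[
                TextBlock(text="What is shown in this image?"),
                ImageBlock(path="photo.jpg"),
            ],
        )
    ]
)
print(response)
```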
llama-index-readers-file
[0.4.11]
- Swap xml for defusedxml (#19342)
llama-index-storage-chat-stores-postgres
[0.2.2]
- Update asyncpg (#19365)