llama-index-core v0.12.48 (2025-07-09)


Release Notes

[2025-07-09]

llama-index-core [0.12.48]

  • fix: convert dict chat_history to ChatMessage objects in AgentWorkflowStartEvent (#19371)
  • fix: Replace ctx.get/set with ctx.store.get/set in Context (#19350)
  • Bump the pip group across 6 directories with 1 update (#19357)
  • Make fewer trips to KV store during Document Hash Checks (#19362)
  • Don't store Copy of document in metadata and properly return Nodes (#19343)
  • Bump llama-index-core from 0.12.8 to 0.12.41 in /docs in the pip group across 1 directory (#19345)
  • fix: Ensure CallbackManager is applied to default embed_model (#19335)
  • fix publish sub-package workflow (#19338)
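
A minimal sketch of the Context change in #19350, assuming a simple custom workflow: state reads and writes that previously went through `ctx.get()` / `ctx.set()` now go through the `ctx.store` state store. The workflow class and key names below are illustrative only.

```python
# Sketch only: ctx.store.get/set replacing the deprecated ctx.get/ctx.set.
from llama_index.core.workflow import Context, StartEvent, StopEvent, Workflow, step


class CounterWorkflow(Workflow):
    @step
    async def count(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # Previously: runs = await ctx.get("runs", default=0); await ctx.set("runs", runs + 1)
        runs = await ctx.store.get("runs", default=0)
        await ctx.store.set("runs", runs + 1)
        return StopEvent(result=runs + 1)
```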

llama-index-embeddings-huggingface-optimum-intel [0.3.1]

  • Fix IntelEmbedding base.py (#19351)

llama-index-indices-managed-lancedb [0.1.0]

  • Fix broken lancedb tests (#19352)

llama-index-indices-managed-llamacloud [0.7.10]

  • vbump llama-cloud (#19355)
  • Fix async retrieval of page figure nodes (#19334)

llama-index-llms-google-genai [0.2.4]

  • Add Cached Content Support to GoogleGenAI LLM Integration (#19361)
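
A hedged sketch of the cached-content support from #19361, assuming the integration exposes a `cached_content` constructor argument that forwards a pre-created Gemini cache name to the underlying client. The argument name, cache id, and model id are assumptions based on the PR title, not confirmed API.

```python
# Sketch only: passing a pre-created Gemini cache to the GoogleGenAI integration.
from llama_index.llms.google_genai import GoogleGenAI

llm = GoogleGenAI(
    model="gemini-2.0-flash-001",
    cached_content="cachedContents/your-cache-id",  # assumed parameter name and placeholder id
)
print(llm.complete("Summarize the cached document."))
```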

llama-index-llms-oci-genai [0.5.1]

  • Add support of Image prompt for OCI generative AI Llama models (#19306)
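
A hedged sketch of the image-prompt support from #19306, assuming it is driven through llama-index's standard multimodal content blocks (`ImageBlock` / `TextBlock`). The model id, service endpoint, and compartment OCID are placeholders.

```python
# Sketch only: sending an image alongside text to an OCI Llama vision model.
from llama_index.core.llms import ChatMessage, ImageBlock, TextBlock
from llama_index.llms.oci_genai import OCIGenAI

llm = OCIGenAI(
    model="meta.llama-3.2-90b-vision-instruct",  # placeholder vision-capable Llama model
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
)
response = llm.chat(
    [
        ChatMessage(
            role="user",
            blocks=[
                ImageBlock(path="chart.png"),
                TextBlock(text="What does this chart show?"),
            ],
        )
    ]
)
print(response)
```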

llama-index-readers-file [0.4.11]

  • swap xml for defusedxml (#19342)

llama-index-storage-chat-stores-postgres [0.2.2]
