v0.10.1 (and v0.10.0)

Today we’re excited to launch LlamaIndex v0.10.0. It is by far the biggest update to our Python package to date (see this gargantuan PR), and it takes a massive step towards making LlamaIndex a next-generation, production-ready data framework for your LLM applications.

LlamaIndex v0.10 contains some major updates:

  • We have created a llama-index-core package, and split all integrations and templates into separate packages: Hundreds of integrations (LLMs, embeddings, vector stores, data loaders, callbacks, agent tools, and more) are now versioned and packaged as separate PyPI packages, while preserving namespace imports: for example, you can still use from llama_index.llms.openai import OpenAI for an LLM (see the install/import sketch after this list).
  • LlamaHub will be the central hub for all integrations: the former llama-hub repo is being consolidated into the main llama_index repo. Instead of integrations being split between the core library and LlamaHub, every integration will be listed on LlamaHub. We are actively working on updating the site; stay tuned!
  • ServiceContext is deprecated: Every LlamaIndex user is familiar with ServiceContext, which over time became a clunky, unneeded abstraction for managing LLMs, embeddings, chunk sizes, callbacks, and more. As a result, we are deprecating it entirely; you can now either pass these components directly to the APIs that need them or set global defaults (see the Settings sketch after this list).
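
To make the packaging change concrete, here is a minimal sketch of the new workflow. It assumes the OpenAI LLM integration is installed alongside core as its own package (e.g. pip install llama-index-core llama-index-llms-openai, following the new per-integration naming); the model name below is just an illustrative choice.

```python
# Assumes: pip install llama-index-core llama-index-llms-openai
# (the OpenAI LLM integration now ships as its own PyPI package).
from llama_index.llms.openai import OpenAI  # same namespace import as before

llm = OpenAI(model="gpt-3.5-turbo")  # model name is just an example
response = llm.complete("Say hello in one sentence.")
print(response)
```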
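And a minimal sketch of what replaces ServiceContext, assuming the global Settings object exposed by llama_index.core and the llama-index-embeddings-openai integration package for the embedding model:

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

# Option 1: set global defaults once instead of threading a ServiceContext around.
Settings.llm = OpenAI(model="gpt-3.5-turbo")
Settings.embed_model = OpenAIEmbedding()
Settings.chunk_size = 512

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Option 2: pass components directly to the API that needs them.
query_engine = index.as_query_engine(llm=OpenAI(model="gpt-4"))
print(query_engine.query("What is this data about?"))
```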

Upgrading your codebase to LlamaIndex v0.10 may lead to some breakage, primarily around our integration/packaging changes, but fortunately we’ve included scripts to make migrating to v0.10 as easy as possible (a sketch of the typical import rewrite follows).
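
For a sense of what the migration amounts to in most codebases, here is a hedged before/after sketch of the import rewrite; the exact tooling and the full import mapping are in the migration guide linked below.

```python
# Before (v0.9.x): everything came from the monolithic llama_index package.
#   from llama_index import VectorStoreIndex, ServiceContext
#   from llama_index.llms import OpenAI

# After (v0.10.x): core abstractions move to llama_index.core,
# and each integration is imported from its own package.
from llama_index.core import VectorStoreIndex
from llama_index.llms.openai import OpenAI
```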

Full Blog Post

v0.10 Documentation

v0.10 Installation Guide

v0.10 Quickstart

Updated Contribution Guide

Temporary v0.10 Package Registry

v0.10 Migration Guide
