New Features
- Added an option `reuse_client` to openai/azure to help with async timeouts. Set to `False` to see improvements (#9301)
- Added support for vLLM llm (#9257)
- Added support for python 3.12 (#9304)
- Added support for `claude-2.1` model name (#9275)