Estimated end-of-life date, accurate to within three months: 05-2027
See the support level definitions for more information.
Bug Fixes
- LLM Observability: This fix resolves an issue where `cache_creation_input_tokens` and `cache_read_input_tokens` were not captured when using the LiteLLM integration with providers that support prompt caching (e.g., Anthropic, OpenAI, Deepseek).
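For illustration only, a minimal sketch of the kind of token extraction this fix implies: pulling the two cache-related fields out of a provider usage payload, defaulting to 0 when a provider does not report them. The payload shape and the helper name `extract_cache_token_counts` are assumptions for the example, not part of the library's API.

```python
def extract_cache_token_counts(usage):
    """Pull prompt-caching token counts from a usage payload.

    `usage` is assumed to be a plain dict of token counts; the two field
    names below are the ones named in the fix. Providers without prompt
    caching simply omit them, so both default to 0.
    """
    return {
        "cache_creation_input_tokens": usage.get("cache_creation_input_tokens", 0),
        "cache_read_input_tokens": usage.get("cache_read_input_tokens", 0),
    }


# Example payload from a cache-capable provider (hypothetical values).
usage = {
    "prompt_tokens": 120,
    "completion_tokens": 40,
    "cache_creation_input_tokens": 100,
    "cache_read_input_tokens": 0,
}
print(extract_cache_token_counts(usage))
# → {'cache_creation_input_tokens': 100, 'cache_read_input_tokens': 0}
```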