Estimated end-of-life date, accurate to within three months: 05-2027
See the support level definitions for more information.
Bug Fixes
- LLM Observability
- Resolves an issue where the langchain integration incorrectly marked OpenAI responses as LLM spans, producing duplicates when a downstream LLM span was already generated by the openai integration.
- litellm: This fix resolves an issue where litellm>=1.74.15 wrapped router streaming responses in `FallbackStreamWrapper` (introduced for mid-stream fallback support), which caused an `AttributeError` when attempting to access the `.handler` attribute. The integration now gracefully handles both the original response format and wrapped responses by falling back to ddtrace's own stream wrapping when needed.
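The graceful-handling approach described above can be sketched as follows. This is a minimal illustration, not ddtrace's actual code: the `RawStream`, `FallbackStreamWrapper`, and `traced_stream` names are all hypothetical stand-ins, and the real integration's logic is more involved.

```python
class RawStream:
    """Stand-in for the original, unwrapped streaming response."""
    def __iter__(self):
        return iter(["chunk-a", "chunk-b"])

class FallbackStreamWrapper:
    """Stand-in for litellm's mid-stream-fallback wrapper (litellm>=1.74.15)."""
    def __init__(self, handler):
        self.handler = handler

def traced_stream(response):
    # Probe for the wrapper's handler with getattr instead of direct
    # attribute access, so unwrapped responses no longer raise AttributeError.
    handler = getattr(response, "handler", None)
    if handler is not None:
        return list(handler)   # wrapped response: trace the inner stream
    return list(response)      # original format: fall back to wrapping it directly
```

Either shape of response then yields the same traced chunks, which is the point of the fallback.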
- profiling
- A bug where `asyncio` task stacks could contain duplicated frames when the task was on-CPU is now fixed. The stack now correctly shows each frame only once.
- This fix resolves an issue where memory profiler flamegraphs were upside down.
- A bug where