New Features
- azure-api-management: Add inferred proxy support for Azure API Management.
- Stats computation: Enable stats computation by default for Python 3.14 and above.
- LLM Observability: Adds `LLMObs.publish_evaluator()` to sync a locally defined `LLMJudge` evaluator to the Datadog UI as a custom LLM-as-Judge evaluation.
- LLM Observability: Experiments now report their execution status to the backend. The status transitions to `running` when execution starts, `completed` on success, `failed` when tasks or evaluators error with `raise_errors=False`, and `interrupted` when the experiment is stopped by an exception. #16713
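The lifecycle above can be sketched as a small state model. This is an illustrative sketch only; the names `ExperimentStatus` and `final_status` are assumed here and are not ddtrace's actual implementation.

```python
from enum import Enum


# Hypothetical status values mirroring the transitions described above.
class ExperimentStatus(Enum):
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"
    INTERRUPTED = "interrupted"


def final_status(task_errored: bool, interrupted: bool) -> ExperimentStatus:
    """Pick the terminal status: interruption wins over task/evaluator errors."""
    if interrupted:
        return ExperimentStatus.INTERRUPTED
    if task_errored:
        return ExperimentStatus.FAILED
    return ExperimentStatus.COMPLETED


print(final_status(task_errored=False, interrupted=False))
```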
Bug Fixes
- celery: Propagate distributed tracing headers for tasks that are not registered locally so traces link correctly across workers. #16662
- profiling: Fix an issue where the lock profiler's wrapper class did not support PEP 604 type union syntax (e.g., `asyncio.Condition | None`). This caused a `TypeError` at import time for libraries such as kopf that use union type annotations at class definition time.
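The failure mode can be reproduced with a minimal proxy class. This is a hedged sketch, not ddtrace's actual wrapper: a wrapper object that does not implement `__or__` cannot participate in a PEP 604 union like `asyncio.Condition | None`, while delegating `__or__`/`__ror__` to the wrapped class restores it.

```python
import asyncio
import typing


# A naive proxy around a class: the | operator is not forwarded.
class NaiveWrapper:
    def __init__(self, wrapped):
        self.wrapped = wrapped


wrapped = NaiveWrapper(asyncio.Condition)
try:
    wrapped | None  # PEP 604 union syntax fails on the proxy
    union_failed = False
except TypeError:
    union_failed = True  # unsupported operand type(s) for |


# Delegating __or__/__ror__ to the wrapped class restores union support.
class FixedWrapper(NaiveWrapper):
    def __or__(self, other):
        return typing.Union[self.wrapped, other]

    def __ror__(self, other):
        return typing.Union[other, self.wrapped]


union = FixedWrapper(asyncio.Condition) | None
print(union)
```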
- Fix for a potential race condition affecting internal periodic worker threads that could have caused a `RuntimeError` during forks.
- profiling: Fix lock contention in the profiler's greenlet stack sampler that could cause connection pool exhaustion in gevent-based applications (e.g. gunicorn + gevent + psycopg2). #16657
- Add a timeout to Unix socket connections to prevent thread I/O hangs during pre-fork shutdown.
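The idea behind the timeout fix can be shown with a minimal, self-contained sketch (plain stdlib sockets, not ddtrace's transport code): without a timeout, a `recv()` on a Unix-domain socket whose peer never responds blocks the calling thread forever; `settimeout()` bounds every connect/send/recv on that socket.

```python
import os
import socket
import tempfile

# A listening Unix socket that accepts the connection into its backlog
# but never sends anything, simulating an unresponsive peer.
path = os.path.join(tempfile.mkdtemp(), "agent.sock")
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(path)
server.listen(1)

client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
client.settimeout(0.2)  # bound all blocking I/O on this socket to 0.2s
client.connect(path)
try:
    client.recv(1024)  # nothing is ever sent, so this would block forever
    timed_out = False
except socket.timeout:
    timed_out = True  # the timeout lets the thread proceed instead of hanging

print("timed out:", timed_out)
client.close()
server.close()
os.unlink(path)
```

Requires a platform with `AF_UNIX` support (Linux/macOS).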
Other Changes
- LLM Observability: Exports `LLMJudge`, `BooleanStructuredOutput`, `ScoreStructuredOutput`, and `CategoricalStructuredOutput` at the public `ddtrace.llmobs` module level.