Estimated end-of-life date, accurate to within three months: 05-2027
See the support level definitions for more information.
This is a major-version release that contains many backwards-incompatible changes to public APIs. To find which of these your code relies on, follow the "deprecation warnings" instructions here.
Breaking Changes
- Support for ddtrace with Python 3.8 is removed after being deprecated in the 3.0 release line. Use ddtrace 4.x with Python 3.9 or newer.
- mongoengine
  - Drops support for the `ddtrace.Pin` object with mongoengine. With this change, the ddtrace library no longer directly supports mongoengine; mongoengine is instead supported through the `pymongo` integration (see the sketch below).
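A minimal sketch of tracing mongoengine through the `pymongo` integration; the connection settings are placeholders, and running under `ddtrace-run` with the integration enabled works as well.

```python
from ddtrace import patch

# Enable the pymongo integration before importing the client libraries;
# mongoengine traffic is then traced at the pymongo layer.
patch(pymongo=True)

import mongoengine

# Placeholder connection settings for illustration.
mongoengine.connect("mydb", host="mongodb://localhost:27017")
```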
- CI Visibility
  - Removed deprecated entry points for the `pytest_benchmark` and `pytest_bdd` integrations. These plugins are now supported by the regular `pytest` integration.
- dynamic instrumentation
  - Removed the deprecated `DD_DYNAMIC_INSTRUMENTATION_UPLOAD_FLUSH_INTERVAL` variable.
- exception replay
  - Removed the deprecated `DD_EXCEPTION_DEBUGGING_ENABLED` variable.
- tracing
  - Deprecated methods have been removed (a short migration sketch follows this list):
    - `Span.set_tag_str` has been removed; use `Span.set_tag` instead.
    - `Span.set_struct_tag` has been removed.
    - `Span.get_struct_tag` has been removed.
    - `Span._pprint` has been removed.
    - The `Span.finished` setter has been removed; use the `Span.finish()` method instead.
    - The `Tracer.on_start_span` method has been removed.
    - The `Tracer.deregister_on_start_span` method has been removed.
    - `ddtrace.trace.Pin` has been removed.
    - `Span.finish_with_ancestors` has been removed with no replacement.
  - Some methods have had their type signatures changed:
    - `Span.set_tag` typing is now `set_tag(key: str, value: Optional[str] = None) -> None`
    - `Span.get_tag` typing is now `get_tag(key: str) -> Optional[str]`
    - `Span.set_tags` typing is now `set_tags(tags: dict[str, str]) -> None`
    - `Span.get_tags` typing is now `get_tags() -> dict[str, str]`
    - `Span.set_metric` typing is now `set_metric(key: str, value: int | float) -> None`
    - `Span.get_metric` typing is now `get_metric(key: str) -> Optional[int | float]`
    - `Span.set_metrics` typing is now `set_metrics(metrics: Dict[str, int | float]) -> None`
    - `Span.get_metrics` typing is now `get_metrics() -> dict[str, int | float]`
  - `Span.record_exception`'s `timestamp` and `escaped` parameters are removed.
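A brief migration sketch under the updated `Span` API, using a manually created span for illustration (the operation and service names and the tag/metric keys are placeholders):

```python
from ddtrace.trace import tracer

with tracer.trace("example.operation", service="example-service") as span:
    # Span.set_tag_str is gone; set_tag now takes an optional string value.
    span.set_tag("customer.tier", "enterprise")

    # Numeric values go through the metric APIs.
    span.set_metric("cart.items", 3)

    tier = span.get_tag("customer.tier")    # Optional[str]
    items = span.get_metric("cart.items")   # Optional[int | float]
```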
- LLM Observability
  - Manual instrumentation methods, including `LLMObs.annotate()`, `LLMObs.export_span()`, `LLMObs.submit_evaluation()`, `LLMObs.inject_distributed_headers()`, and `LLMObs.activate_distributed_headers()`, now raise exceptions instead of logging. LLM Observability auto-instrumentation is not affected.
  - `LLMObs.submit_evaluation_for()` has been removed. Please use `LLMObs.submit_evaluation()` instead for submitting evaluations. To migrate (as sketched below):
    - `LLMObs.submit_evaluation_for(...)` users: rename calls to `LLMObs.submit_evaluation(...)`.
    - `LLMObs.submit_evaluation(...)` users: rename the `span_context` argument to `span`, i.e. change `LLMObs.submit_evaluation(span_context={"span_id": ..., "trace_id": ...}, ...)` to `LLMObs.submit_evaluation(span={"span_id": ..., "trace_id": ...}, ...)`.
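A hedged migration sketch; the label, metric type, and value are illustrative, and the span reference follows the `{"span_id": ..., "trace_id": ...}` format from the note above.

```python
from ddtrace.llmobs import LLMObs

# Before (removed in 4.0): LLMObs.submit_evaluation_for(...), and the
# span_context keyword on LLMObs.submit_evaluation(...).
# After: submit_evaluation with a span reference.
LLMObs.submit_evaluation(
    span={"span_id": "<span_id>", "trace_id": "<trace_id>"},  # previously span_context=...
    label="accuracy",      # illustrative evaluation label
    metric_type="score",   # illustrative metric type
    value=0.9,             # illustrative value
)
```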
- profiling
  - Updates echion (the Python stack sampler) to the latest version, which introduces an experimental faster memory copy function.
  - The V1 stack profiler is removed; V2 has been enabled by default since v2.20.0. The `DD_PROFILING_STACK_V2_ENABLED` variable is now removed.
- freezegun
  - The deprecated `freezegun` integration is now removed.
- opentracer
  - This change removes the deprecated `opentracer` package.
- google_generativeai
  - The `google_generativeai` integration has been removed, as the `google_generativeai` library has reached end-of-life. As an alternative, you can use the recommended `google_genai` library and its corresponding integration instead.
- openai
  - Streamed chat/completions will no longer have token counts computed using the `tiktoken` library; instead, their token counts will be estimated by default if not explicitly provided in the OpenAI response object. To guarantee accurate streamed token metrics, set `stream_options={"include_usage": True}` in the OpenAI request, as sketched below.
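A minimal sketch of requesting usage data on a streamed completion so that token metrics are exact rather than estimated (the model and prompt are placeholders):

```python
from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
    stream_options={"include_usage": True},  # the final chunk then carries a usage object
)
for chunk in stream:
    pass  # consume the stream; accurate token counts come from the final usage chunk
```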
- django
  - This upgrades the default tracing behavior to enable minimal tracing mode by default (`DD_DJANGO_TRACING_MINIMAL` now defaults to `true`). Django ORM, cache, and template instrumentation are disabled by default to eliminate duplicate span creation, since library integrations for database drivers (psycopg, MySQLdb, sqlite3), cache clients (redis, memcached), template renderers (Jinja2), and other supported libraries continue to be traced. This reduces performance overhead by removing redundant Django-layer instrumentation. To restore all Django instrumentation, set `DD_DJANGO_TRACING_MINIMAL=false`, or enable individual features using `DD_DJANGO_INSTRUMENT_DATABASES=true`, `DD_DJANGO_INSTRUMENT_CACHES=true`, and `DD_DJANGO_INSTRUMENT_TEMPLATES=true` (a configuration sketch follows this list).
  - When `DD_DJANGO_INSTRUMENT_DATABASES=true` (default `false`), database instrumentation now merges Django-specific tags into database driver spans created by supported integrations (psycopg, sqlite3, MySQLdb, etc.) instead of creating duplicate Django database spans. If the database cursor is not already wrapped by a supported integration, Django wraps it and creates a span. This change reduces overhead and duplicate spans while preserving visibility into database operations.
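A sketch of restoring non-minimal Django tracing. These variables are normally set in the service's environment (shell, container, or systemd unit); setting them in Python as shown only works if it happens before ddtrace configures the Django integration.

```python
import os

# Restore all Django-layer instrumentation at once...
os.environ["DD_DJANGO_TRACING_MINIMAL"] = "false"

# ...or opt back into individual features instead.
os.environ["DD_DJANGO_INSTRUMENT_DATABASES"] = "true"
os.environ["DD_DJANGO_INSTRUMENT_CACHES"] = "true"
os.environ["DD_DJANGO_INSTRUMENT_TEMPLATES"] = "true"
```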
- Other
  - This change removes the `ddtrace.settings` package. Environment variables should be used to adjust settings.
  - This change removes the deprecated `non_active_span` parameter to `HttpPropagator.inject` (see the sketch after this list).
  - This change removes the deprecated environment variable `DEFAULT_RUNTIME_METRICS_INTERVAL`.
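A minimal sketch of injecting trace context into outgoing request headers without the removed `non_active_span` parameter, assuming the documented `HTTPPropagator` API (the span name and header dict are illustrative):

```python
from ddtrace.propagation.http import HTTPPropagator
from ddtrace.trace import tracer

headers = {}
with tracer.trace("downstream.request") as span:
    # Explicitly pass the context of the span you want to propagate.
    HTTPPropagator.inject(span.context, headers)
    # `headers` now carries the distributed tracing headers for the outgoing call.
```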
Deprecation Notes
- Support for ddtrace with Python 3.9 is deprecated after Python 3.9 reached its end-of-life.
New Features
- AAP
  - This introduces a security response id for easy identification of blocking responses.
  - API Security schema collection is now supported in AWS Lambda behind an Application Load Balancer or the Lambda Function URL service, where the endpoint cannot be reliably known. To perform sampling, API Security reuses the endpoint inferred by the trace resource renaming feature, or recomputes it when it is not available.
  - AppSec instrumentation for downstream requests is now enabled by default for `urllib3` and `requests`. It no longer requires enabling APM instrumentation for `urllib3`.
- profiling
  - Adds support for `threading.RLock` (reentrant lock) profiling. The Lock profiler now tracks both `threading.Lock` and `threading.RLock` usage, providing comprehensive lock contention visibility for Python applications (see the sketch below).
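A small sketch of reentrant-lock usage that the Lock profiler can now attribute, assuming profiling is enabled (for example by running under `ddtrace-run` with `DD_PROFILING_ENABLED=true`):

```python
import threading

counters = {"hits": 0}
lock = threading.RLock()  # reentrant locks are now tracked alongside threading.Lock

def bump():
    with lock:
        counters["hits"] += 1

def record_hit():
    with lock:
        bump()  # re-acquiring the same RLock from the same thread is allowed

threads = [threading.Thread(target=record_hit) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```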
- LLM Observability
  - Previous dataset versions can optionally be pulled by passing the `version` argument to `LLMObs.pull_dataset` (see the sketch after this list).
  - Datasets have new properties, `version` and `latest_version`, which provide the version of the dataset being worked with and the latest global version of the dataset, respectively.
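A hedged sketch of pulling a specific dataset version; the dataset name and version number are placeholders, LLM Observability is assumed to be configured via environment variables (e.g. `DD_LLMOBS_ML_APP` and API credentials), and identifying the dataset by a positional name here is an assumption rather than the documented signature:

```python
from ddtrace.llmobs import LLMObs

LLMObs.enable()  # assumes LLM Observability settings are provided via the environment

# Pull a historical version of a dataset (name and version are placeholders).
dataset = LLMObs.pull_dataset("my-eval-dataset", version=3)

# The new properties report which version was pulled and the latest global version.
print(dataset.version, dataset.latest_version)
```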
Bug Fixes
- CI Visibility
  - This fix resolves an issue where repo tags would be fetched while unshallowing to extract commit metadata, causing performance issues for repos with a large number of tags.
  - This fix resolves a performance issue affecting coverage collection for Python 3.12+.
- data_streams
  - This fix resolves an issue where payload size statistics were not being sent to the backend for Data Streams Monitoring (DSM).
- core
  - This fix resolves an issue where forksafe locks used patched threading primitives from the profiling module, causing performance issues. The forksafe module now uses unpatched threading primitives (`Lock`, `RLock`, `Event`).
- LLM Observability
  - Add support for `HTTPS_PROXY`.
  - Resolves an issue in the bedrock integration where invoking cohere rerank models would result in missing spans due to output formatting index errors.
  - Corrected the description of the `assessment` argument in `submit_evaluation()`: `assessment` now refers to whether the evaluation itself passes or fails according to your application, rather than the validity of the evaluation result.
  - Resolves an issue where the `langchain` integration would incorrectly mark Azure OpenAI calls as duplicate llm operations even if the `openai` integration was enabled. The `langchain` integration will trace Azure OpenAI spans as workflow spans if there is an equivalent llm span from the `openai` integration.
- Error Tracking
  - Modifies the way exception events are stored such that the exception id is stored instead of the exception object, to prevent TypeErrors with custom exception objects.
- profiling
  - This fix resolves an issue where importing the profiler module after an asyncio event loop had been started would make the profiler blind to the existing event loop and its tasks.
  - `DD_PROFILING_API_TIMEOUT` no longer has any effect and is marked for removal in the upcoming 4.0 release. A new environment variable, `DD_PROFILING_API_TIMEOUT_MS`, is introduced to configure the timeout for uploading profiles to the backend. The default value is 10000 ms (10 seconds).
  - Upgrades echion to resolve an issue where the stack profiler could allocate a large amount of memory unnecessarily. Resolves another issue where the profiler could loop infinitely on Python 3.13.
  - This fix resolves an issue where AssertionError exceptions were silently suppressed in the `_acquire` method of the Lock profiler (note: this only occurs when assertions are enabled).
- kafka
  - This fix resolves an issue where only the first message in a batch was dispatched to Data Streams Monitoring (DSM) when consuming multiple Kafka messages.
- langchain
  - This fix resolves an issue where auto instrumented prompt templates incorrectly included a `version` field. The version field is now omitted unless explicitly set by the user.
  - Fixes an issue where streamed responses that end before the first chunk is received would result in an `IndexError`.
- openai
  - This fix resolves an issue where using async iteration with paginated methods (e.g., `async for model in client.models.list()`) caused a `TypeError: 'async for' requires an object with __aiter__ method, got coroutine`. See issue #14574; a usage sketch follows below.
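For reference, a minimal sketch of the async auto-pagination pattern from this note, which no longer raises under the openai integration (the attribute printed here is illustrative):

```python
import asyncio
from openai import AsyncOpenAI

async def main():
    client = AsyncOpenAI()
    # Auto-pagination: iterate the paginated listing directly; no explicit await is needed.
    async for model in client.models.list():
        print(model.id)

asyncio.run(main())
```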
- opentelemetry
  - Fixed circular import when enabling multiple OpenTelemetry signals (metrics + logs) simultaneously.
  - Prevents OpenTelemetry OTLP exporter connections from being traced by ddtrace. ddtrace internal connections (gRPC and HTTP) are now excluded from tracing to prevent circular instrumentation.
- pytest plugin
  - Fix for potential `KeyError` exceptions in test runs when gevent is detected within the environment.
- code origin
  - Ensures that code location information is added to entry spans when Code Origin is enabled remotely.
- ray
  - This fix resolves an issue where the tracer raised an error when submitting Ray tasks without explicitly calling `ray.init()`.
  - This fix resolves an issue where exceptions raised in Ray child spans were not properly recorded in the trace.
  - This fix stops instrumenting internal Ray actors (those starting with an underscore) that were causing excessive noise, and adds `ray.data._internal` to the module denylist.
- IAST
  - Fixed an issue where using weak hashing or cipher algorithms outside of a request context (e.g., during application startup) could raise an unhandled exception. The fix ensures proper error handling when IAST operations are performed without an active request context.
- tracer
  - This fix resolves an issue where an application instrumented by ddtrace could crash at start, restoring compatibility with `zope.event==6.0`.
  - This fix ensures compatibility with wrapt 2.0.0.
- logging
  - Fixed ddtrace internal logging when trace-log correlation is disabled. Prevents `ValueError: Formatting field not found in record: 'dd.service'`.
- Other
  - Fix a potential race condition in the tracer.
  - Fix the Python Detector regular expression so it also detects paths ending with only the major version number.
  - Prevent a potential `ResourceWarning` in multiprocess scenarios.
  - Prevent startup failure when a temporary directory is not available.
Other Changes
- profiling
  - This removes the `wrapt` library dependency from the Lock Profiler implementation, improving performance and reducing overhead during lock instrumentation.