open-telemetry/opentelemetry-python-contrib: opentelemetry-instrumentation-openai-v2 2.4b0

  • Migrate experimental path from deprecated LLMInvocation to InferenceInvocation, using handler.start_inference() and invocation.stop()/invocation.fail() directly (#4502)
  • Use create_duration_histogram and create_token_histogram from opentelemetry-util-genai instead of defining bucket boundaries locally (#4501)
  • Import OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT from opentelemetry.util.genai.environment_variables instead of re-defining it locally, making opentelemetry-util-genai the single source of truth for this constant (#4455)
  • Fix compatibility with wrapt 2.x by using positional arguments in wrap_function_wrapper() calls (#4445)
  • Fix ChoiceBuffer crash on streaming tool-call deltas with arguments=None (#4350)
  • Fix StreamWrapper missing .headers and other attributes when using with_raw_response streaming (#4113)
  • Add opt-in support for the latest experimental semantic conventions (v1.37.0): set OTEL_SEMCONV_STABILITY_OPT_IN to gen_ai_latest_experimental to enable. Adds a dependency on the opentelemetry-util-genai PyPI package (#3715)
  • Add wrappers for OpenAI Responses API streams and response stream managers (#4280)
  • Add async wrappers for OpenAI Responses API streams and response stream managers (#4325)
  • Add strongly typed Responses API extractors with validation and content extraction improvements (#4337)
  • Add completion hook support (#4315)
  • Fix response_format handling: map json_object/json_schema to the json output type (#4315)
  • Skip attribute values equal to openai.Omit (#4315)
  • Default the gen_ai.request.model attribute to an empty string when the model is missing (#4494)
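
Several of the entries above are driven by environment variables. A minimal sketch of opting in, using the variable names from the changelog (the "true" value for content capture is an illustrative setting, not a documented default):

```python
import os

# Opt in to the latest experimental GenAI semantic conventions (v1.37.0),
# per the changelog entry for #3715.
os.environ["OTEL_SEMCONV_STABILITY_OPT_IN"] = "gen_ai_latest_experimental"

# Message-content capture is controlled by this variable; its constant now
# lives in opentelemetry.util.genai.environment_variables (see #4455).
# "true" here is an illustrative value, not a documented default.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

# With the environment set, instrumentation is enabled as usual, e.g.:
#   from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
#   OpenAIInstrumentor().instrument()
```

Setting the variables before instrumenting matters: the stability opt-in is read when the instrumentor initializes.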
