github BerriAI/litellm v1.77.7.dev3


What's Changed

  • 1-79-0 docs by @ishaan-jaff in #15936
  • feat(lasso): Upgrade to Lasso API v3 and fix ULID generation by @oroxenberg in #15941
  • Enable OpenTelemetry context propagation by external tracers by @eycjur in #15940
  • Fix documentation for videos by @Sameerlite in #15937
  • Fix duplicate trace in langfuse_otel by @eycjur in #15931
  • [Feat] add support for dynamic client registration by @uc4w6c in #15921
  • Update IBM Guardrails to correctly use SSL Verify argument by @RobGeada in #15975
  • feat: support during_call for model armor guardrails by @bjornjee in #15970
  • docs(openrouter): add base_url config with environment variables by @shanto12 in #15946
  • [Bug fix] - Azure OpenAI, fix ContextWindowExceededError not being mapped from Azure OpenAI errors by @ishaan-jaff in #15981
  • [Fix] DD logging - ensure key's metadata + guardrail is logged on DD by @ishaan-jaff in #15980
  • [Feat] OTEL - Ensure error information is logged on OTEL by @ishaan-jaff in #15978
  • [Fix] Minor proxy fix - ensure missing user API key, team ID, and user ID in custom callbacks do not misfire by @ishaan-jaff in #15982
  • [Fix] Azure OpenAI - Add handling for v1 under azure api versions by @ishaan-jaff in #15984
  • Fix: Respect LiteLLM-Disable-Message-Redaction header for Responses API by @Sameerlite in #15966
  • [Feat] UI - Changed API Base from Select to Input in New LLM Credentials by @yuneng-jiang in #15987
  • [Bug Fix] Remove limit from admin UI numerical input fix by @yuneng-jiang in #15991
  • [Feature] UI - Key Already Exists Error Notification by @yuneng-jiang in #15993
  • [Fix] - Responses API - add /openai routes for responses API. (Azure OpenAI SDK Compatibility) by @ishaan-jaff in #15988
  • Add deprecation dates for models by @dima-hx430 in #15976
  • docs(guardrails/ibm_guardrails): add additional detail to ibm_guardrails.md by @m-misiura in #15971
  • Perf speed up pytest by @uc4w6c in #15951
  • fix: Preserve Bedrock inference profile IDs in health checks by @ylgibby in #15947
  • Fix: Support tool usage messages with Langfuse OTEL integration by @eycjur in #15932
  • Add Haiku 4.5 pricing for OpenRouter by @Somtom in #15909
  • fix(opik): enhance requester metadata retrieval from API key auth by @Thomas-Mildner in #15897
  • [feat]: graceful degradation for pillar service when using litellm by @afogel in #15857
  • Add GitlabPromptCache and enable subfolder access by @deepanshululla in #15712
  • Add OpenAI client usage documentation for videos and fix navigation visibility by @Sameerlite in #15996
  • [Feature] Config Models should not be editable by @yuneng-jiang in #16020
  • [Fix] Guardrails - Ensure Key Guardrails are applied by @ishaan-jaff in #16025
  • [UI] Feature - Add Apply Guardrail Testing Playground by @ishaan-jaff in #16030
  • [Fix] SQS Logger - Add Base64 handling by @ishaan-jaff in #16028
  • Fix mutation of original request for gemini request by @Sameerlite in #16002
  • Fix: Redact reasoning summaries in ResponsesAPI output when message logging is disabled by @Sameerlite in #15965
  • fix: Support text.format parameter in Responses API for providers without native ResponsesAPIConfig by @rodolfo-nobrega in #16023
  • Remove unnecessary model variable assignment by @Mte90 in #16008
  • Add license metadata to health/readiness endpoint. by @bernata in #15997
  • chore(deps): bump hono from 4.9.7 to 4.10.3 in /litellm-js/spend-logs by @dependabot[bot] in #15915
  • docs: improve Grayswan guardrail documentation by @TeddyAmkie in #15875
  • fix(apscheduler): prevent memory leaks from jitter and frequent job intervals by @jatorre in #15846
  • Python entry-point for CustomLLM subclasses by @AlbertDeFusco in #15881
  • Allow using ARNs when generating images via Bedrock by @komarovd95 in #15789
  • Added fallback logic for detecting file content-type when S3 returns a generic one by @langpingxue in #15635
  • fix: prevent httpx DeprecationWarning memory leak in AsyncHTTPHandler by @AlexsanderHamir in #16024
  • [Feat] Add FAL AI Image Generations on LiteLLM by @ishaan-jaff in #16067
  • Feat: Mistral API - add codestral-embed-2505 by @ishaan-jaff in #16071
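Several of the changes above (e.g. the OpenRouter `base_url` docs in #15946) touch the proxy `config.yaml`. As a rough illustration only — the model alias and model id below are placeholders, not taken from the PRs — an OpenRouter entry that sets its API base via environment variables typically looks like:

```yaml
model_list:
  - model_name: my-openrouter-model                    # alias callers use; placeholder
    litellm_params:
      model: openrouter/anthropic/claude-3.5-sonnet    # placeholder OpenRouter model id
      api_key: os.environ/OPENROUTER_API_KEY           # litellm's env-var reference syntax
      api_base: os.environ/OPENROUTER_API_BASE         # base_url resolved from the environment
```

See #15946 for the documented details.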

New Contributors

Full Changelog: v1.79.0-nightly...v1.77.7.dev3
