litellm v1.60.2


What's Changed

  • Control Model Access by IDP 'groups' by @krrishdholakia in #8164
  • build(schema.prisma): add new sso_user_id to LiteLLM_UserTable by @krrishdholakia in #8167
  • LiteLLM dev contributor PRs (01/31/2025) by @krrishdholakia in #8168
  • Improved O3 + Azure O3 support by @krrishdholakia in #8181
  • test: add more unit testing for team member endpoints by @krrishdholakia in #8170
  • Add azure/deepseek-r1 by @Klohto in #8177
  • [Bug Fix] - /vertex_ai/ was not detected as an llm_api_route on pass-through, but vertex-ai was, by @ishaan-jaff in #8186
  • (UI + SpendLogs) - Store SpendLogs in UTC Timezone, Fix filtering logs by start/end time by @ishaan-jaff in #8190
  • Azure AI Foundry - Deepseek R1 by @elabbarw in #8188
  • fix(main.py): fix passing openrouter specific params by @krrishdholakia in #8184
  • Complete o3 model support by @krrishdholakia in #8183
  • Easier user onboarding via SSO by @krrishdholakia in #8187
  • LiteLLM Minor Fixes & Improvements (01/16/2025) - p2 by @krrishdholakia in #7828
  • Added deprecation date for gemini-1.5 models by @yurchik11 in #8210
  • docs: Updating the available VoyageAI models in the docs by @fzowl in #8215
  • build: ui updates by @krrishdholakia in #8206
  • Fix tokens for deepseek by @SmartManoj in #8207
  • (UI Fixes for add new model flow) by @ishaan-jaff in #8216
  • Update xAI provider and fix some old model config by @zhaohan-dong in #8218
  • Support guardrails mode as list, fix valid keys error in pydantic, add more testing by @krrishdholakia in #8224
  • docs: fix typo in lm_studio.md by @foreign-sub in #8222
  • (Feat) - Add AssemblyAI pass-through endpoints by @ishaan-jaff in #8220
  • fix(openai/): allows 'reasoning_effort' param to be passed correctly by @krrishdholakia in #8227 (see the sketch after this list)
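
As an illustration of the 'reasoning_effort' fix in #8227, here is a minimal sketch (not taken from the release notes; the model name and key setup are assumptions) of passing the parameter through litellm.completion:

import os
import litellm

# Assumes a valid OpenAI API key is available; replace with your own.
os.environ["OPENAI_API_KEY"] = "sk-..."

# `reasoning_effort` is forwarded to the underlying OpenAI o-series model.
response = litellm.completion(
    model="o3-mini",  # hypothetical o3-family model name
    messages=[{"role": "user", "content": "Summarize the CAP theorem in one sentence."}],
    reasoning_effort="low",
)
print(response.choices[0].message.content)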

New Contributors

Full Changelog: v1.60.0...v1.60.2

Docker Run LiteLLM Proxy

docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.60.2
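
Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of calling it with the openai Python SDK (the model name and key below are placeholders for whatever you have configured on your proxy):

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # port published by the docker run above
    api_key="sk-1234",                 # placeholder proxy key
)

resp = client.chat.completions.create(
    model="gpt-4o",  # hypothetical model name registered on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(resp.choices[0].message.content)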

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 170.0 | 187.78 | 6.37 | 0.0 | 1905 | 0 | 135.55 | 3644.02 |
| Aggregated | Passed ✅ | 170.0 | 187.78 | 6.37 | 0.0 | 1905 | 0 | 135.55 | 3644.02 |
