🚨 Relevant Changes:
- LiteLLM Proxy Virtual Keys: Unique Key Aliases will be enforced on /key/generate and /key/update requests
- The Datadog integration now uses StandardLoggingPayload (from LiteLLM v1.53.0+) and also supports logging failures (#6929)

If you need to use v1 of the payload (not recommended), you can set this in your config:

```yaml
litellm_settings:
  datadog_use_v1: True
```
Benefits of using StandardLoggingPayload for Datadog:
- It's a standard logging object, so it stays consistent over time and across our logging integrations
- Added support for logging LLM failures
- Includes additional info such as `cache_hit` and `request_tags`. The full payload is documented here: https://docs.litellm.ai/docs/proxy/logging#what-gets-logged
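The unique key-alias enforcement on /key/generate and /key/update can be illustrated with a small sketch. This is illustrative only; `KeyStore`, its methods, and the error type are assumptions, not LiteLLM's actual implementation (the real check runs in the proxy against the DB):

```python
# Illustrative sketch of unique key-alias enforcement on /key/generate
# and /key/update. KeyStore is hypothetical; LiteLLM's real
# implementation lives in the proxy and is backed by the database.

class DuplicateAliasError(ValueError):
    pass

class KeyStore:
    def __init__(self):
        self._aliases = {}  # alias -> key id

    def generate_key(self, key_id: str, alias: str):
        # Reject a /key/generate request whose alias is already taken
        if alias in self._aliases:
            raise DuplicateAliasError(f"key alias '{alias}' already exists")
        self._aliases[alias] = key_id

    def update_key(self, key_id: str, new_alias: str):
        # Reject a /key/update that would collide with another key's alias
        owner = self._aliases.get(new_alias)
        if owner is not None and owner != key_id:
            raise DuplicateAliasError(f"key alias '{new_alias}' already exists")
        # Drop the key's old alias, then claim the new one
        for alias, kid in list(self._aliases.items()):
            if kid == key_id:
                del self._aliases[alias]
        self._aliases[new_alias] = key_id
```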
What's Changed
- LiteLLM Minor Fixes & Improvements (11/24/2024) by @krrishdholakia in #6890
- (feat) pass through llm endpoints - add `PATCH` support (vertex context caching requires it for update ops) by @ishaan-jaff in #6924
- sonnet supports pdf, haiku does not by @paul-gauthier in #6928
- (feat) DataDog Logger - Add Failure logging + use Standard Logging payload by @ishaan-jaff in #6929
- (feat) log proxy auth errors on datadog by @ishaan-jaff in #6931
- (feat) Allow using include to include external YAML files in a config.yaml by @ishaan-jaff in #6922
- (feat) dd logger - set tags according to the values set by those env vars by @ishaan-jaff in #6933
- LiteLLM Minor Fixes & Improvements (11/26/2024) by @krrishdholakia in #6913
- LiteLLM Minor Fixes & Improvements (11/27/2024) by @krrishdholakia in #6943
- Update Argilla integration documentation by @sdiazlor in #6923
- (bug fix) /key/update was not storing `budget_duration` in the DB by @ishaan-jaff in #6941
- (fix) handle json decode errors for DD exception logging by @ishaan-jaff in #6934
- (docs + fix) Add docs on Moderations endpoint, Text Completion by @ishaan-jaff in #6947
- (feat) add enforcement for unique key aliases on /key/update and /key/generate by @ishaan-jaff in #6944
- (fix) tag merging / aggregation logic by @ishaan-jaff in #6932
- (feat) Allow disabling ErrorLogs written to the DB by @ishaan-jaff in #6940
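The `include` feature (#6922) lets a config.yaml pull in external YAML files. A minimal sketch, assuming an `include` key that takes a list of file paths; the file name below is hypothetical, so verify the exact syntax in the LiteLLM docs:

```yaml
# config.yaml - sketch only; model_config.yaml is a hypothetical
# external file (e.g. holding the model_list)
include:
  - model_config.yaml

litellm_settings:
  datadog_use_v1: False
```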
Full Changelog: v1.52.16...v1.53.1
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.53.1
```
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 200.0 | 215.7709455547284 | 6.292082946554957 | 0.0 | 1882 | 0 | 178.3981389999667 | 2851.1550680000255 |
| Aggregated | Passed ✅ | 200.0 | 215.7709455547284 | 6.292082946554957 | 0.0 | 1882 | 0 | 178.3981389999667 | 2851.1550680000255 |