What's Changed
- Add empower-functions integration to litellm by @liuyl in #3955
- [Fix] Authentication on /thread endpoints on Proxy by @ishaan-jaff in #4627
- [Feat] Add support for `litellm.create_assistants` by @ishaan-jaff in #4624
- ui - allow setting allowed ip addresses by @ishaan-jaff in #4632
- build(deps): bump zipp from 3.18.2 to 3.19.1 by @dependabot in #4628
- [Feat-Proxy] Add DELETE /assistants by @ishaan-jaff in #4645
- [Feat] Add `litellm.delete_assistant` for OpenAI by @ishaan-jaff in #4643
- [Feat] Add LIST, DELETE, GET `/files` by @ishaan-jaff in #4648
- [Feat] Add GET /files endpoint by @ishaan-jaff in #4646
- [fix] slack alerting reports - add validation for safe access into attributes by @ishaan-jaff in #4642
New Contributors
Full Changelog: v1.41.14...v1.41.15
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.41.15
```
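Once the container is running, the proxy listens on port 4000. A minimal smoke test might look like the following sketch; the model name `gpt-3.5-turbo` and the `sk-1234` key are placeholders, and assume a model and virtual key have already been configured on your deployment:

```shell
# Hypothetical smoke test against a locally running LiteLLM proxy.
# Substitute a model and API key configured on your own instance.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "hello"}]
  }'
```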
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 120.0 | 136.09 | 6.37 | 0.0 | 1907 | 0 | 97.52 | 604.28 |
| Aggregated | Passed ✅ | 120.0 | 136.09 | 6.37 | 0.0 | 1907 | 0 | 97.52 | 604.28 |