BerriAI/litellm v1.43.13


We're launching Day 0 support for Anthropic Prompt Caching on LiteLLM 👉 Start here: https://docs.litellm.ai/docs/providers/anthropic#prompt-caching

📖 Cut costs and latency by using Anthropic prompt caching for requests that reuse large, repeated context; a usage sketch follows below.
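For reference, here is a minimal sketch of what a cached request can look like through the LiteLLM Python SDK. The model name, beta header value, and message contents below are illustrative placeholders; see the linked docs for the exact supported format.

import litellm

# Mark a large, reusable block (e.g. a long system prompt) with `cache_control`
# so Anthropic can cache it and reuse it on subsequent requests.
response = litellm.completion(
    model="anthropic/claude-3-5-sonnet-20240620",  # illustrative model name
    messages=[
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    "text": "<large reusable context, e.g. a long document>",
                    "cache_control": {"type": "ephemeral"},
                }
            ],
        },
        {"role": "user", "content": "Summarize the key points."},
    ],
    # Anthropic's prompt-caching beta header; confirm the current value in the docs.
    extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},
)
print(response.choices[0].message.content)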

πŸ› οΈ [Fix-Proxy] Allow running docker, docker-database as non-root user (h/t Oz Elhassid)

📈 [Fix] Prometheus: use the 'litellm_' prefix for new deployment metrics (h/t Filipe Andujar)

✅ [Feat-Proxy] Add failure logging for GCS bucket logging: https://docs.litellm.ai/docs/proxy/bucket

What's Changed

New Contributors

Full Changelog: v1.43.12...v1.43.13

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.43.13
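Once the container is running, any OpenAI-compatible client can point at it. A minimal sketch follows; the port matches the -p 4000:4000 mapping above, and the api_key and model name are placeholders for whatever your proxy is configured with.

from openai import OpenAI

# Point the standard OpenAI client at the local LiteLLM proxy.
client = OpenAI(
    base_url="http://localhost:4000",  # matches the -p 4000:4000 mapping above
    api_key="sk-1234",                 # placeholder; use your proxy / virtual key
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any model configured on the proxy
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)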

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 84 | 97.96550381346324 | 6.506952562817539 | 0.0 | 1946 | 0 | 66.09550899997885 | 1639.4581249999192 |
| Aggregated | Passed ✅ | 84 | 97.96550381346324 | 6.506952562817539 | 0.0 | 1946 | 0 | 66.09550899997885 | 1639.4581249999192 |
