BerriAI/litellm v1.18.4


What's Changed

[Feat] Proxy - Add Spend tracking logs by @ishaan-jaff in #1498

New SpendTable when using LiteLLM virtual keys - logs the API key, created-at date and time, model, spend, messages, and response
Docs to get started: https://docs.litellm.ai/docs/proxy/virtual_keys
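
A minimal sketch of the flow, assuming a proxy on 0.0.0.0:8000, a placeholder master key sk-1234, and a gpt-3.5-turbo deployment configured: generate a virtual key, then call /chat/completions with it so the request is logged to the spend table.

# 1. Generate a virtual key (authorize with your proxy master key; sk-1234 is a placeholder)
curl --location 'http://0.0.0.0:8000/key/generate' \
    --header 'Authorization: Bearer sk-1234' \
    --header 'Content-Type: application/json' \
    --data '{"models": ["gpt-3.5-turbo"], "duration": "30d"}'

# 2. Use the returned key - the proxy logs the key, created-at time, model, spend,
#    messages, and response to the spend table
curl --location 'http://0.0.0.0:8000/chat/completions' \
    --header 'Authorization: Bearer <generated-key>' \
    --header 'Content-Type: application/json' \
    --data '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]}'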


[Feat] Proxy - Track Cost Per User (Using user passed to requests) by @ishaan-jaff in #1509

  • The proxy server now tracks cost per user. Example request:
curl --location 'http://0.0.0.0:8000/chat/completions' \
    --header 'Content-Type: application/json' \
    --header 'Authorization: Bearer sk-RwPq' \
    --data '{
        "model": "BEDROCK_GROUP",
        "user": "litellm-is-awesome-user",
        "messages": [
            {
                "role": "user",
                "content": "what llm are you-444"
            }
        ]
    }'

Cost Tracked in LiteLLM Spend Tracking DB


Notes:

  • If a user is passed in the request, the proxy tracks spend for that user
  • If the user does not exist in the user table, a new user is created with the tracked spend (a sketch of reading a user's spend back follows below)
feat(parallel_request_limiter.py): add support for tpm/rpm rate limits for keys by @krrishdholakia in #1501
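
A hedged sketch of setting the new key-level limits: the tpm_limit / rpm_limit fields on /key/generate follow the current LiteLLM docs and are assumed to apply to this release.

# Create a key capped at 1000 tokens/min and 10 requests/min
# (tpm_limit / rpm_limit field names assumed; sk-1234 is a placeholder master key)
curl --location 'http://0.0.0.0:8000/key/generate' \
    --header 'Authorization: Bearer sk-1234' \
    --header 'Content-Type: application/json' \
    --data '{"models": ["gpt-3.5-turbo"], "tpm_limit": 1000, "rpm_limit": 10}'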

Full Changelog: v1.18.3...v1.18.4
