github BerriAI/litellm v1.18.0


What's Changed

https://docs.litellm.ai/docs/simple_proxy

  • [Feat] Proxy - Access Key metadata in callbacks by @ishaan-jaff in #1484
    • Access Proxy Key metadata in callbacks
    • Access the endpoint URL in callbacks - you can see whether /chat/completions, /embeddings, /image/generation, etc. was called
    • Support for Langfuse tags - we log request metadata as Langfuse tags
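As a sketch of what this enables, a callback can read the per-request metadata the proxy attaches. The exact `kwargs` layout below (a `litellm_params.metadata` dict carrying the key metadata and endpoint) is an assumption for illustration, not something specified in this release note:

```python
# Hypothetical sketch: a success callback that pulls the proxy key
# metadata and the called endpoint out of the kwargs a callback receives.
# The kwargs structure shown here is an illustrative assumption.

def log_success_event(kwargs, response_obj):
    litellm_params = kwargs.get("litellm_params", {})
    metadata = litellm_params.get("metadata") or {}

    user_api_key = metadata.get("user_api_key")  # proxy key metadata
    endpoint = metadata.get("endpoint")          # e.g. "/chat/completions"

    print(f"key={user_api_key} endpoint={endpoint}")
    return {"key": user_api_key, "endpoint": endpoint}


# Example invocation with a fabricated payload:
result = log_success_event(
    {"litellm_params": {"metadata": {
        "user_api_key": "sk-local-123",
        "endpoint": "/chat/completions",
    }}},
    response_obj=None,
)
```

In a real deployment you would register such a hook with the proxy's callback mechanism; here it is just called directly to show the data flow.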

PS. no keys leaked - these are keys to my local proxy
[Screenshot: 2024-01-17 at 6:10:10 PM]

Support for model access groups

Use this if you have keys with access to specific models and you want to give them all access to a new model.

You can now assign keys access to model groups, and add new models to that group via the config.yaml - https://docs.litellm.ai/docs/proxy/users#grant-access-to-new-model

```shell
# "beta-models" is a model access group, not an individual model
curl --location 'http://localhost:8000/key/generate' \
-H 'Authorization: Bearer <your-master-key>' \
-H 'Content-Type: application/json' \
-d '{"models": ["beta-models"], "max_budget": 0}'
```
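On the config side, a model joins an access group via its entry in config.yaml. The fragment below is a minimal sketch assuming the `access_groups` field described in the linked docs; the model names are illustrative:

```yaml
model_list:
  - model_name: gpt-4-beta            # illustrative model name
    litellm_params:
      model: gpt-4
    model_info:
      access_groups: ["beta-models"]  # keys granted "beta-models" can call this model
```

Adding another model with the same `access_groups` entry would grant every existing "beta-models" key access to it without regenerating keys.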

Langfuse Tags logged:

[Screenshot: 2024-01-17 at 6:11:36 PM]

  • feat(proxy_server.py): support model access groups by @krrishdholakia in #1483

Full Changelog: v1.17.18...v1.18.0

