huggingface/huggingface_hub v1.1.5
[v1.1.5] Welcoming OVHcloud AI Endpoints as a new Inference Provider & More

⚡️ New Inference Provider: OVHcloud AI Endpoints

OVHcloud AI Endpoints is now an official Inference Provider on Hugging Face! 🎉
OVHcloud delivers fast, production-ready inference on secure, sovereign, fully 🇪🇺 European infrastructure, combining advanced features with competitive pricing.

import os
from huggingface_hub import InferenceClient

# Authenticate with your Hugging Face token
client = InferenceClient(
    api_key=os.environ["HF_TOKEN"],
)

# The ":ovhcloud" suffix routes the request to OVHcloud AI Endpoints
completion = client.chat.completions.create(
    model="openai/gpt-oss-20b:ovhcloud",
    messages=[
        {
            "role": "user",
            "content": "What is the capital of France?"
        }
    ],
)

print(completion.choices[0].message)
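
As an alternative to suffixing the model id, the provider can also be pinned when constructing the client. A minimal sketch, assuming the provider slug is "ovhcloud" (inferred from the ":ovhcloud" suffix in the snippet above):

import os
from huggingface_hub import InferenceClient

# Pin the provider on the client instead of suffixing the model id
client = InferenceClient(
    provider="ovhcloud",  # assumed slug, matching the ":ovhcloud" suffix above
    api_key=os.environ["HF_TOKEN"],
)

completion = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)

print(completion.choices[0].message)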

More snippet examples are available in the provider documentation 👉 here.

  • Add OVHcloud AI Endpoints as an Inference Provider in #3541 by @eliasto

QoL Improvements

Installing the CLI is now much faster: @Boulaouaney added uv support to the installation scripts, speeding up package installation.

  • Add uv support to installation scripts for faster package installation in #3486 by @Boulaouaney

Bug Fixes

This release also includes a number of bug fixes.
