v0.18.0: Collection API, translated documentation and more!


Collection API 🎉

Collection API is now fully supported in huggingface_hub!

A collection is a group of related items on the Hub (models, datasets, Spaces, papers) that are organized together on the same page. Collections are useful for creating your own portfolio, bookmarking content in categories, or presenting a curated list of items you want to share. Check out this guide to understand in more detail what collections are and this guide to learn how to build them programmatically.

Create/get/update/delete collection:

  • get_collection
  • create_collection: title, description, namespace, private
  • update_collection_metadata: title, description, position, private, theme
  • delete_collection

Add/update/remove item from collection:

  • add_collection_item: item id, item type, note
  • update_collection_item: note, position
  • delete_collection_item

Usage

>>> from huggingface_hub import get_collection
>>> collection = get_collection("TheBloke/recent-models-64f9a55bb3115b4f513ec026")
>>> collection.title
'Recent models'
>>> len(collection.items)
37
>>> collection.items[0]
CollectionItem: {
    {'_id': '6507f6d5423b46492ee1413e',
    'id': 'TheBloke/TigerBot-70B-Chat-GPTQ',
    'author': 'TheBloke',
    'item_type': 'model',
    'lastModified': '2023-09-19T12:55:21.000Z',
    (...)
}}
>>> from huggingface_hub import create_collection, add_collection_item

# Create collection
>>> collection = create_collection(
...     title="ICCV 2023",
...     description="Portfolio of models, papers and demos I presented at ICCV 2023",
... )

# Add item with a note
>>> add_collection_item(
...     collection_slug=collection.slug,  # e.g. "davanstrien/climate-64f99dc2a5067f6b65531bab"
...     item_id="datasets/climate_fever",
...     item_type="dataset",
...     note="This dataset adopts the FEVER methodology that consists of 1,535 real-world claims regarding climate-change collected on the internet."
... )
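
For illustration, the update and delete helpers follow the same pattern. Here is a minimal sketch (the titles, notes and missing_ok flag are illustrative placeholders, not values from this release):

>>> from huggingface_hub import (
...     get_collection,
...     update_collection_metadata,
...     update_collection_item,
...     delete_collection_item,
...     delete_collection,
... )

# Refresh the collection created above to retrieve its items
>>> collection = get_collection(collection.slug)

# Rename the collection and make it private
>>> update_collection_metadata(collection_slug=collection.slug, title="ICCV 2023 (archive)", private=True)

# Update the note of the first item and move it to the top
>>> item = collection.items[0]
>>> update_collection_item(
...     collection_slug=collection.slug,
...     item_object_id=item.item_object_id,
...     note="Presented as a poster.",
...     position=0,
... )

# Remove the item, then delete the collection itself
>>> delete_collection_item(collection_slug=collection.slug, item_object_id=item.item_object_id)
>>> delete_collection(collection_slug=collection.slug, missing_ok=True)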

📚 Translated documentation

Documentation is now available in both German and Korean thanks to community contributions! This is an important milestone for Hugging Face in its mission to democratize good machine learning.

Preupload files before committing

(Disclaimer: this is a power-user feature. It is not expected to be used directly by end users.)

When using create_commit (or upload_file/upload_folder), the internal workflow has 3 main steps:

  1. List the files to upload and check whether each one is a regular file (text) or an LFS file (binary or very large)
  2. Upload the LFS files to S3
  3. Create a commit on the Hub (upload the regular files and reference the S3 URLs in a single call). Uploading LFS files beforehand is important to avoid large payloads during the commit call.

In this release, we introduce preupload_lfs_files to perform step 2 independently of step 3. This is useful for libraries like datasets that generate huge files "on-the-fly" and want to preupload them one by one before making one commit with all the files. For more details, please read this guide.
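
As a minimal sketch of the pattern (the repo name and the generate_shard helper are placeholders): preupload each LFS file as soon as it is generated, then create a single commit that references all of them.

>>> from huggingface_hub import CommitOperationAdd, create_commit, create_repo, preupload_lfs_files

>>> repo_id = create_repo("test-preupload", exist_ok=True).repo_id

>>> operations = []  # all CommitOperationAdd objects to be committed at the end
>>> for i in range(5):
...     content = generate_shard(i)  # placeholder: returns the shard content as bytes
...     addition = CommitOperationAdd(path_in_repo=f"shard_{i}_of_5.bin", path_or_fileobj=content)
...     preupload_lfs_files(repo_id, additions=[addition])  # step 2: upload the LFS file right away
...     operations.append(addition)

# Step 3: a single commit referencing the 5 preuploaded shards
>>> create_commit(repo_id, operations=operations, commit_message="Upload 5 shards")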

Miscellaneous improvements

❤️ List repo likers

Similarly to list_liked_repos (which lists all repos liked by a given user), we now introduce list_repo_likers to list all users who liked a given repo - thanks to @issamarabi.

>>> from huggingface_hub import list_repo_likers
>>> likers = list_repo_likers("gpt2")
>>> len(likers)
204
>>> likers
[User(username=..., fullname=..., avatar_url=...), ...]

Refactored Dataset Card template

The Dataset Card template has been updated to align more closely with the Model Card template.

QOL improvements

This release also adds a few QOL improvements for users:

  • Suggest to check firewall/proxy settings + default to local file by @Wauplin in #1670
  • Move debug logs to debug level by @Wauplin (direct commit on main)
  • Change TimeoutError => asyncio.TimeoutError by @matthewgrossman in #1666
  • Handle refs/convert/parquet and PR revision correctly in hffs by @Wauplin in #1712
  • Document hf_transfer more prominently by @Wauplin in #1714

Breaking change

A breaking change has been introduced in CommitOperationAdd in order to implement preupload_lfs_files in a way that is convenient for users. The main change is that CommitOperationAdd is no longer a static object: it is modified in place by preupload_lfs_files and create_commit. This means that a CommitOperationAdd object cannot be reused once it has been committed to the Hub; doing so raises an explicit exception. You can still reuse the operation objects if a commit call failed and you retry it. We hope this will not affect any users, but please open an issue if you encounter any problems.
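
In practice (the repo id and file path below are placeholders), this means creating a fresh CommitOperationAdd object for every new commit:

>>> from huggingface_hub import CommitOperationAdd, create_commit

>>> op = CommitOperationAdd(path_in_repo="model.safetensors", path_or_fileobj="./model.safetensors")
>>> create_commit(repo_id="my-user/my-model", operations=[op], commit_message="Upload model")

# Reusing the same operation after a successful commit now raises an explicit exception.
# (Retrying with the same object after a *failed* create_commit call is still allowed.)
>>> op = CommitOperationAdd(path_in_repo="model.safetensors", path_or_fileobj="./model.safetensors")
>>> create_commit(repo_id="my-user/my-model", operations=[op], commit_message="Upload model again")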

⚙️ Small fixes and maintenance

Docs fixes

Misc fixes

Internal

  • bump version to 0.18.0.dev0 by @Wauplin in #1658
  • sudo apt update in CI by @Wauplin (direct commit on main)
  • fix CI tests by @Wauplin (direct commit on main)
  • Skip flaky InferenceAPI test by @Wauplin (direct commit on main)
  • Respect HTTPError spec by @Wauplin in #1693
  • skip flaky test by @Wauplin (direct commit on main)
  • Fix LFS tests after password auth deprecation by @Wauplin in #1713

🤗 Significant community contributions

The following contributors have made significant changes to the library over the last release:

  • @martinbrose
    • Correct typo in upload guide (#1677)
    • 🌐 [i18n-DE] Translate docs to German (#1646)
    • Fixes filtering by tags with list_models and adds test case (#1673)
    • Add German concepts guide (#1686)
    • Address failing _check_disk_space() when path doesn't exist yet (#1692)
  • @wonhyeongseo
    • 🌐 [i18n-KO] Translated README, landing docs to Korean (#1667)
