v4.8.0 Integration with the Hub and Flax/JAX support

Integration with the Hub

Our example scripts and Trainer are now optimized for publishing your model on the Hugging Face Hub, with Tensorboard training metrics and an automatically authored model card containing all the relevant metadata, including evaluation results.

Trainer Hub integration

Use --push_to_hub to create a model repo for your training; the model will be saved to it with all relevant metadata at the end of training.

Other flags are (see the sketch after this list):

  • push_to_hub_model_id to control the repo name
  • push_to_hub_organization to specify an organization
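
As a minimal sketch, the same options can be set directly on TrainingArguments when using the Trainer (the output directory, repo and organization names below are placeholders):

```python
from transformers import TrainingArguments

# Sketch: repo and organization names are placeholders.
training_args = TrainingArguments(
    output_dir="my-finetuned-bert",
    push_to_hub=True,                          # create a repo and push at the end of training
    push_to_hub_model_id="my-finetuned-bert",  # control the repo name
    push_to_hub_organization="my-org",         # publish under an organization
)
```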

Visualizing training metrics on huggingface.co (based on Tensorboard)

By default, if you have tensorboard installed, the training scripts will use it for logging, and the logging traces folder is conveniently located inside your model output directory, so the traces are pushed to your model repo by default.
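
For instance (a sketch, assuming tensorboard is installed; report_to simply makes the default logger explicit):

```python
from transformers import TrainingArguments

# Sketch: with tensorboard installed it is used by default; report_to
# makes that explicit. Per the description above, the logging traces
# land inside output_dir, so pushing the repo also pushes them.
training_args = TrainingArguments(
    output_dir="my-finetuned-bert",
    report_to=["tensorboard"],
    push_to_hub=True,
)
```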

Any model repo that contains Tensorboard traces will spawn a Tensorboard server, which makes it very convenient to see how the training went! This Hub feature is in beta, so let us know if anything looks weird :)

See this model repo for an example.

Model card generation

The model card contains info about the datasets used, the eval results, and more.

Many users were already adding their eval results to their model cards in markdown format, but this is a more structured way of adding them, which will make them easier to parse and, for example, to surface in leaderboards such as the ones on Papers With Code!

We use a format specified in collaboration with [PaperswithCode](https://github.com/huggingface/huggingface_hub/blame/main/modelcard.md); see also this repo.

Models, tokenizers and configurations

All models, tokenizers and configurations now have a revamped push_to_hub() method, as well as a push_to_hub argument in their save_pretrained() method. The workflow of this method has changed a bit to be more like git, with a local clone of the repo kept in a folder of the working directory, which makes it easier to apply patches (use use_temp_dir=True to clone into a temporary folder instead, for the same behavior as the experimental API).
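
A quick sketch of both entry points (model and repo names are placeholders; pushing assumes you are logged in, e.g. via huggingface-cli login):

```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("bert-base-cased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Push explicitly; the repo is cloned into a folder of the working directory.
model.push_to_hub("my-finetuned-bert")

# Same, but clone into a temporary folder instead.
tokenizer.push_to_hub("my-finetuned-bert", use_temp_dir=True)

# Or push as part of saving.
model.save_pretrained("my-finetuned-bert", push_to_hub=True)
```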

Flax/JAX support

Flax/JAX is becoming a fully supported backend of the Transformers library, with more and more models having an implementation in it. BART, CLIP and T5 join the already existing models; find the whole list here.
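
As an example, here is a minimal sketch of a forward pass through the new Flax T5 port (assuming the t5-small checkpoint can be loaded in Flax):

```python
from transformers import FlaxT5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = FlaxT5ForConditionalGeneration.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: Hello!", return_tensors="np")
# Flax models consume NumPy/JAX arrays; we feed the encoder input as
# decoder input too, just to exercise a full forward pass.
outputs = model(input_ids=inputs["input_ids"], decoder_input_ids=inputs["input_ids"])
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```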

General improvements and bug fixes
