huggingface/transformers v4.11.3: Patch release

This patch release fixes a few issues encountered since the release of v4.11.2:

  • [DPR] Correct init (#13796)
  • Fix warning situation: "UserWarning: max_length is ignored when padding=True" (#13829); see the tokenizer sketch after this list
  • Bart: check if decoder_inputs_embeds is set (#13800)
  • include megatron_gpt2 in installed modules (#13834)
  • Fixing 1-length special tokens cut. (#13862)
  • Fixing empty prompts for text-generation when BOS exists (#13859); see the generation sketch after this list
  • Fixing question-answering with long contexts (#13873)
  • Fixing GPU for token-classification in a better way. (#13856)
  • Fixing backward compatibility for zero-shot (#13855)
  • Fix hp search for non sigopt backends (#13897)
  • Fix trainer logging_nan_inf_filter in torch_xla mode (#13896) (@ymwangg)
  • [Trainer] Fix nan-loss condition (#13911) (@anton-l)
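
For context on the padding fix (#13829), the minimal sketch below, assuming a standard `bert-base-uncased` checkpoint, shows the kind of call that combines `padding=True`, `truncation=True`, and `max_length`, the pattern associated with the spurious "max_length is ignored when padding=True" UserWarning. The checkpoint and inputs are illustrative, not taken from the patch itself.

```python
from transformers import AutoTokenizer

# Minimal sketch (assumed checkpoint: bert-base-uncased) of a batch-encoding
# call combining padding, truncation, and max_length. Calls along these lines
# were associated with the UserWarning addressed in #13829.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

batch = tokenizer(
    ["a short sentence", "a somewhat longer example sentence"],
    padding=True,        # pad to the longest sequence in the batch
    truncation=True,     # truncate anything longer than max_length
    max_length=32,
    return_tensors="pt",
)
print(batch["input_ids"].shape)
```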

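The text-generation fix (#13859) concerns prompts that are empty strings, where generation has to start from the model's BOS token. A minimal sketch, assuming a `gpt2` checkpoint (not specified in the patch notes):

```python
from transformers import pipeline

# Minimal sketch (assumed checkpoint: gpt2) of generating from an empty
# prompt; the pipeline falls back to the model's BOS token as the starting
# input, the case addressed in #13859.
generator = pipeline("text-generation", model="gpt2")
output = generator("", max_length=20, num_return_sequences=1)
print(output[0]["generated_text"])
```
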