github huggingface/transformers v4.52.4
Patch release: v4.52.4


The following commits are included in this patch release:

  • [qwen-vl] Look for vocab size in text config (#38372)
  • Fix convert to original state dict for VLMs (#38385)
  • [video utils] group and reorder by number of frames (#38374)
  • [paligemma] fix processor with suffix (#38365)
  • Protect get_default_device for torch<2.3 (#38376) — see the sketch after this list
  • [OPT] Fix attention scaling (#38290)
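
For context on #38376: `torch.get_default_device()` only exists in PyTorch 2.3 and later, so code paths that call it need a guard on older versions. Below is a minimal, hypothetical sketch of such a guard — not the actual transformers implementation — with the empty-tensor probe being an assumed fallback:

```python
# Minimal sketch of version-guarding torch.get_default_device(),
# which is only available in PyTorch >= 2.3. Not the actual transformers code.
import torch
from packaging import version


def safe_get_default_device() -> "torch.device":
    """Return the current default device, with a fallback for torch < 2.3."""
    if version.parse(torch.__version__) >= version.parse("2.3"):
        # Available from PyTorch 2.3 onward.
        return torch.get_default_device()
    # Assumed fallback: allocating an empty tensor reveals the
    # default device on older PyTorch versions.
    return torch.empty(0).device


device = safe_get_default_device()
```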
