Patch release: v4.36.2


Patch release to resolve several critical issues related to the recent cache refactor, the flash attention refactor, and training in multi-GPU and multi-node settings:

  • Resolve training bug with PEFT + gradient checkpointing #28031
  • Resolve cache issue when generating beyond the context window with Mistral/Mixtral Flash Attention 2 #28037 (see the sketch after this list)
  • Re-enable passing a config to from_pretrained together with Flash Attention #28043 (see the sketch after this list)
  • Fix resuming from checkpoint when using FSDP with FULL_STATE_DICT #27891
  • Resolve bug when saving a checkpoint in the multi-node setting #28078
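
As a quick illustration of the two loading-related items above, here is a minimal sketch (the checkpoint id and generation length are illustrative, not taken from the release notes) of loading Mistral with Flash Attention 2 while passing an explicit config, then generating past the sliding-window length, which is the path exercised by #28037 and #28043:

```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # assumed example checkpoint
config = AutoConfig.from_pretrained(model_id)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,                            # explicit config alongside Flash Attention (re-enabled in this patch)
    attn_implementation="flash_attention_2",
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("A long prompt ...", return_tensors="pt").to(model.device)
# Generating more tokens than config.sliding_window previously hit the FA2 cache bug fixed in #28037.
out = model.generate(**inputs, max_new_tokens=config.sliding_window + 128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```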
