huggingface/optimum-intel v1.20.1: Patch release

  • Fix LoRA unscaling in diffusion pipelines by @eaidova in #937
  • Fix compatibility with diffusers < 0.25.0 by @eaidova in #952
  • Allow using SDPA in CLIP models by @eaidova in #941
  • Updated OVPipelinePart to have a separate ov_config (see the sketch after this list) by @e-ddykim in #957
  • Fix typo in symbol use in optimum by @jane-intel in #948
  • Fix temporary directory saving by @eaidova in #959
  • Disable warning about tokenizers version for OpenVINO tokenizers >= 2024.5 by @eaidova in #962
  • Restore original model_index.json after save_pretrained call by @eaidova in #961
  • Add v4.46 transformers support by @echarlaix in #960

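For context on the ov_config change in #957: the snippet below is a minimal sketch of how an OpenVINO config is typically passed when loading a diffusion pipeline with optimum-intel; the per-part ov_config handling itself is internal to OVPipelinePart. The model ID and config keys are illustrative and not taken from this release.

    from optimum.intel import OVStableDiffusionPipeline

    # Illustrative checkpoint and OpenVINO options; adjust for your own setup.
    pipeline = OVStableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",           # assumed example model
        export=True,                                # convert to OpenVINO IR on load
        ov_config={"PERFORMANCE_HINT": "LATENCY"},  # OpenVINO compile-time properties
    )
    image = pipeline("a photo of an astronaut riding a horse").images[0]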