github huggingface/diffusers v0.10.2
v0.10.2: Patch release


This patch removes the hard requirement for transformers>=4.25.1 in case external libraries were downgrading the library upon startup in a non-controllable way.
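Since the hard pin is gone, downstream code that depends on a newer transformers can check the installed version itself. A minimal standard-library sketch (the `meets_minimum` helper and its simple dotted-version comparison are illustrative, not part of diffusers, and do not handle pre-release suffixes):

```python
from importlib.metadata import PackageNotFoundError, version


def meets_minimum(installed: str, minimum: str = "4.25.1") -> bool:
    """Compare plain dotted version strings numerically (no pre-release handling)."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)


try:
    installed = version("transformers")
except PackageNotFoundError:
    installed = None  # transformers is not installed at all

if installed is not None and not meets_minimum(installed):
    print(f"transformers {installed} is older than 4.25.1; some pipelines may fail")
```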

🚨🚨🚨 Note that xformers is not automatically enabled anymore 🚨🚨🚨

The reasons for this are given here: #1640 (comment):

We should not automatically enable xformers for three reasons:

1. It's not a PyTorch-like API. PyTorch doesn't enable all the fastest options available by default.
2. We allocate GPU memory before the user even calls `.to("cuda")`.
3. The behavior would not be consistent with cases where xformers is not installed.

=> This means: If you were used to having xformers automatically enabled, please make sure to add the following now:

```python
import logging

from diffusers.utils.import_utils import is_xformers_available

logger = logging.getLogger(__name__)

unet = ...  # load unet

if is_xformers_available():
    try:
        unet.enable_xformers_memory_efficient_attention()
    except Exception as e:
        logger.warning(
            "Could not enable memory efficient attention. Make sure xformers is installed"
            f" correctly and a GPU is available: {e}"
        )
```

for the UNet (e.g. in dreambooth) or for the pipeline:

```python
import logging

from diffusers.utils.import_utils import is_xformers_available

logger = logging.getLogger(__name__)

pipe = ...  # load pipeline

if is_xformers_available():
    try:
        pipe.enable_xformers_memory_efficient_attention()
    except Exception as e:
        logger.warning(
            "Could not enable memory efficient attention. Make sure xformers is installed"
            f" correctly and a GPU is available: {e}"
        )
```
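The two snippets above share the same shape, so they can be factored into one helper that works for either a UNet or a pipeline. A sketch under stated assumptions: `maybe_enable_xformers` is a hypothetical helper, not a diffusers API, and the `available` parameter exists only so the function can be exercised without xformers or a GPU present:

```python
import logging
from typing import Optional

logger = logging.getLogger(__name__)


def maybe_enable_xformers(model, available: Optional[bool] = None) -> bool:
    """Try to enable xformers attention on a UNet or pipeline; report success.

    Hypothetical helper, not part of diffusers. When `available` is None,
    diffusers' own availability check is used.
    """
    if available is None:
        from diffusers.utils.import_utils import is_xformers_available

        available = is_xformers_available()
    if not available:
        return False
    try:
        model.enable_xformers_memory_efficient_attention()
        return True
    except Exception as e:
        logger.warning(
            "Could not enable memory efficient attention. Make sure xformers is"
            " installed correctly and a GPU is available: %s",
            e,
        )
        return False
```

Passing `available` explicitly also makes the helper trivial to unit-test on machines without a GPU.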
