github huggingface/diffusers v0.5.0
v0.5.0: JAX/Flax and TPU support


🌾 JAX/Flax integration for super fast Stable Diffusion on TPUs.

We added JAX support for Stable Diffusion! You can now run Stable Diffusion on Colab TPUs (and GPUs too!) for faster inference.

Check out the TPU-ready Colab notebook for a Stable Diffusion pipeline.
And a detailed blog post on Stable Diffusion and parallelism in JAX / Flax 🤗 https://huggingface.co/blog/stable_diffusion_jax

The most-used models, schedulers, and pipelines have been ported to JAX/Flax:

  • Models: FlaxAutoencoderKL, FlaxUNet2DConditionModel
  • Schedulers: FlaxDDIMScheduler, FlaxPNDMScheduler
  • Pipelines: FlaxStableDiffusionPipeline
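
The ported pipeline can be used much like its PyTorch counterpart, with JAX-style explicit parameter and PRNG handling. A minimal inference sketch, assuming `jax`, `flax`, and `diffusers>=0.5.0` are installed and you have access to the `CompVis/stable-diffusion-v1-4` weights on the Hub (the prompt and seed are illustrative):

```python
# Sketch: parallel Stable Diffusion inference with FlaxStableDiffusionPipeline.
# Requires jax, flax, diffusers, and downloaded model weights; intended for TPU
# (works on GPU/CPU too, just slower).
import jax
import numpy as np
from flax.jax_utils import replicate
from flax.training.common_utils import shard
from diffusers import FlaxStableDiffusionPipeline

# Load the weights in bfloat16, which is well suited to TPUs.
pipeline, params = FlaxStableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    revision="bf16",
    dtype=jax.numpy.bfloat16,
)

num_devices = jax.device_count()
prompt = "a photograph of an astronaut riding a horse"  # example prompt

# One prompt per device: replicate the params, shard the inputs and PRNG keys.
prompt_ids = shard(pipeline.prepare_inputs([prompt] * num_devices))
params = replicate(params)
prng_seed = jax.random.split(jax.random.PRNGKey(0), num_devices)

# jit=True compiles the sampling loop and runs it in parallel on all devices.
images = pipeline(prompt_ids, params, prng_seed,
                  num_inference_steps=50, jit=True).images

# Collapse the (devices, batch, H, W, C) output into a flat list of PIL images.
images = np.asarray(images)
images = images.reshape((images.shape[0] * images.shape[1],) + images.shape[-3:])
pil_images = pipeline.numpy_to_pil(images)
```

With `jit=True`, the first call includes compilation time; subsequent calls with the same shapes reuse the compiled function, which is where the TPU speedup comes from.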

🔥 DeepSpeed low-memory training

Thanks to the 🤗 accelerate integration with DeepSpeed, several of our training examples now use significantly less VRAM and run faster.
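
DeepSpeed itself is enabled outside the training script (for example by opting into it when running `accelerate config` and then launching with `accelerate launch`); the script only needs the standard Accelerator pattern. A minimal sketch with a placeholder model and synthetic data, not actual diffusers example code:

```python
# Sketch: the accelerate training-loop pattern that picks up DeepSpeed (or DDP,
# mixed precision, ...) from the launcher configuration. Requires torch and
# accelerate. The model, data, and loss below are placeholders.
import torch
from accelerate import Accelerator

accelerator = Accelerator()

model = torch.nn.Linear(16, 1)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
data = [(torch.randn(8, 16), torch.randn(8, 1)) for _ in range(10)]

# prepare() wraps the objects for whatever backend was configured.
model, optimizer, data = accelerator.prepare(model, optimizer, data)

for batch, target in data:
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(batch), target)
    # accelerator.backward() lets DeepSpeed manage scaling and partitioning.
    accelerator.backward(loss)
    optimizer.step()
```

The same loop runs unchanged whether DeepSpeed is enabled or not, which is why the existing examples could gain the memory savings with minimal code changes.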

✏️ Changelog
