## [0.0.31] - 2025-06-25
Pre-built binary wheels are available for PyTorch 2.7.1.
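A quick way to confirm that the installed wheel was picked up alongside this PyTorch release (a minimal sketch; it assumes xFormers is already installed from the pre-built wheels and uses only the standard version attributes):

```python
# Sketch: check that the installed xFormers wheel matches PyTorch 2.7.1.
import torch
import xformers

print("PyTorch:", torch.__version__)      # expected to start with "2.7.1"
print("xFormers:", xformers.__version__)  # expected to be "0.0.31"
```

Running `python -m xformers.info` also reports the build configuration and which attention operators are available.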
### Added
- xFormers wheels are now Python-version agnostic: the same wheel can be used for Python 3.9 through 3.13
- Added support for Flash-Attention 3 on Ampere GPUs
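With Flash-Attention 3 now covering Ampere, a caller can either let xFormers pick a backend or pin the FA3 operators explicitly. The sketch below assumes the `fmha.flash3` operator names from recent xFormers releases; they may differ across versions.

```python
# Sketch: requesting the Flash-Attention 3 backend through memory_efficient_attention.
import torch
from xformers.ops import fmha, memory_efficient_attention

# Inputs are [batch, seq_len, num_heads, head_dim].
q = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.bfloat16)
k = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.bfloat16)
v = torch.randn(1, 1024, 8, 64, device="cuda", dtype=torch.bfloat16)

# Default call: xFormers dispatches to the best available backend.
out = memory_efficient_attention(q, k, v)

# Pin the FA3 operators explicitly; this fails if they are unsupported on this GPU/build.
out_fa3 = memory_efficient_attention(q, k, v, op=(fmha.flash3.FwOp, fmha.flash3.BwOp))
```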
### Removed
- We will no longer support V100 or older GPUs, following PyTorch (pytorch/pytorch#147607)
- Deprecated support for building Flash-Attention 2 as part of xFormers. On Ampere GPUs, xFormers now uses Flash-Attention 3 on Windows, while Flash-Attention 2 remains available through PyTorch on Linux.
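For the Linux path above, Flash-Attention 2 can still be reached through PyTorch's scaled dot-product attention rather than an xFormers build. A minimal sketch using PyTorch's public backend-selection API (PyTorch ≥ 2.3):

```python
# Sketch: using the Flash-Attention backend through PyTorch SDPA.
import torch
from torch.nn.attention import SDPBackend, sdpa_kernel

# SDPA expects [batch, num_heads, seq_len, head_dim].
q, k, v = (torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16) for _ in range(3))

# Restrict SDPA to the Flash-Attention backend; the call fails if no
# enabled backend supports these inputs.
with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v)
```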