TorchCodec 0.12 is out! This is a small release that focuses on completing the stable ABI migration and switching our default CUDA backend to the faster backend. In 0.12, we are also aligning with the PyTorch repo's CUDA support by dropping CUDA 12.8 and adding support for CUDA 13.2.
Faster CUDA backend is the new default
Starting in TorchCodec 0.12, the faster CUDA backend (previously known as 'beta') becomes the default backend. This change is transparent and backward-compatible.
# Previously, this used the slower 'FFmpeg' backend.
# Now this uses the faster backend by default.
decoder = VideoDecoder(..., device="cuda")

Users who want to stay on the less efficient FFmpeg backend should explicitly use set_cuda_backend:
with set_cuda_backend("ffmpeg"):
    decoder = VideoDecoder(..., device="cuda")

ABI Stability
TorchCodec 0.12 will be ABI-stable starting from torch 2.11 (yes, 2.11)! Previously, each new version of torch required a corresponding version of TorchCodec, which made dependency management complex for users. From 0.12 on, TorchCodec should be largely forward-compatible with future versions of torch, simplifying installation and dependency management.
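In practice, this means a project no longer has to pin TorchCodec to one exact torch release. A minimal sketch of what a requirements file could look like under this compatibility model (the version bounds below are illustrative assumptions, not official pinning guidance):

# requirements.txt
# Illustrative only: with the stable ABI, torchcodec 0.12+ is expected
# to keep working as torch moves past 2.11, so no per-torch-version
# pin of torchcodec should be needed.
torch>=2.11
torchcodec>=0.12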