Releases around vllm-project/vllm-omni v0.16.0 on GitHub

You may find something of interest in this list:
hexpm/elixir on Docker Hub: 1.19.5-erlang-26.2.4-debian-trixie-20260223, 1.19.4-erlang-26.2.4-debian-trixie-20260223
hexpm/elixir on Docker Hub: 1.20.0-rc.1-erlang-26.2.4-debian-trixie-20260223, 1.20.0-rc.0-erlang-26.2.4-debian-trixie-20260223
pytorch/pytorch on GitHub: trunk/02f1f58a4a0945f64b671fc6028dc3b0252e2efd
stackrox-io/scanner on Quay: 2.39.5-12-g0656f530ca-ppc64le
stackrox-io/scanner on Quay: 2.39.5-12-g0656f530ca-arm64
pytorch/pytorch on GitHub: trunk/e91d055606145a979ea53fab1daf405b0c62bde2
stackrox-io/scanner on Quay: 2.39.5-12-g0656f530ca-amd64
localai/localai on Docker Hub: sha-42e580b-gpu-hipblas, sha-42e580b-gpu-nvidia-cuda-12
go-skynet/local-ai on Quay: sha-42e580b-gpu-hipblas
go-skynet/local-ai on Quay: sha-42e580b-gpu-nvidia-cuda-12
localai/localai on Docker Hub: sha-42e580b-gpu-intel
go-skynet/local-ai on Quay: sha-42e580b-gpu-intel
stackrox-io/scanner on Quay: 2.39.x-59-gb71b7ab138
localai/localai on Docker Hub: sha-42e580b-nvidia-l4t-arm64-cuda-13
