vLLM on RTX5090: Working GPU setup with torch 2.9.0 cu128

You're correct: PyTorch 2.9.0 nightly is new enough that many ecosystem packages (xformers, torchvision, and others used by ComfyUI) do not yet ship compatible wheels, especially for CUDA 12.8 and sm_120. This leads to frequent dependency conflicts and missing features, and uv (the package manager) can sometimes make version resolution even trickier. Until the ecosystem catches up, mixing vLLM with other frameworks in the same environment will be challenging.
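
As a quick way to confirm whether a given environment's torch build actually targets the RTX 5090's sm_120 architecture, a minimal check like the following can help (device index 0 is assumed; this is a diagnostic sketch, not part of any official setup):

```python
# Sanity check: does the installed PyTorch build support this GPU's architecture?
import torch

print(f"torch version: {torch.__version__}")   # e.g. a 2.9.0 nightly build
print(f"CUDA runtime : {torch.version.cuda}")  # should report 12.8 for cu128 wheels

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    arch_list = torch.cuda.get_arch_list()     # architectures compiled into this build
    print(f"device capability : sm_{major}{minor}")
    print(f"compiled arch list: {arch_list}")
    if f"sm_{major}{minor}" not in arch_list:
        print("WARNING: this torch build was not compiled for your GPU architecture")
else:
    print("CUDA is not available in this environment")
```

If sm_120 is missing from the compiled arch list, the package may install cleanly but still fail at runtime with "no kernel image available" errors, so this check catches the mismatch early.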

If you need both vLLM and ComfyUI, consider using separate virtual environments for each, or wait for official releases of xformers and other dependencies with PyTorch 2.9/cu128 support. Keeping an eye on upstream releases and GitHub issues is the best way to know when compatibility improves.
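
If you go the separate-environments route, a small sketch like the one below keeps the two stacks isolated using only the standard library. The environment names and package lists are illustrative assumptions, not a tested recipe; substitute your own pins (and note that uv's `uv venv` / `uv pip install` commands can do the same thing from the shell):

```python
# Minimal sketch: create two isolated virtual environments so vLLM's
# torch 2.9.0 nightly never conflicts with ComfyUI's dependency pins.
# Environment names and package specs here are illustrative, not a known-good setup.
import subprocess
import venv
from pathlib import Path

ENVS = {
    "vllm-env": ["vllm"],                     # pulls its own torch build
    "comfyui-env": ["torch", "torchvision"],  # stable stack for ComfyUI
}

for name, packages in ENVS.items():
    env_dir = Path(name)
    if not env_dir.exists():
        venv.create(env_dir, with_pip=True)   # stdlib venv, no uv required
    pip = env_dir / "bin" / "pip"             # use Scripts\pip.exe on Windows
    subprocess.run([str(pip), "install", *packages], check=True)
```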

Would you like tips on managing multiple environments or tracking dependency updates?
