I have a Prisma scan step integrated in my CI/CD pipeline which is reporting a few critical security issues for Python 3.8 in the vllm-openai v0.10.0 Docker image. This image appears to use Python 3.12, so I don't really know where these errors come from. Can anyone provide insights on these errors? Is this something that's going to be cleaned up in upcoming versions?
I'll paste some of the Prisma scan logs here in case they are useful:
It seems likely that the vllm-openai:v0.10.0 Docker image is based on Python 3.12, not 3.8, as confirmed by the official documentation and recent PRs that explicitly upgraded the base image to Python 3.12. All CI tests and development are now run with Python 3.12, and users are encouraged to use this version for compatibility and security reasons.
If your Prisma scan is reporting vulnerabilities for Python 3.8.3, my understanding is that this may be a false positive or a misconfiguration in the scanner, since the vllm-openai:v0.10.0 image should not include Python 3.8. You may want to check the scan configuration or verify the actual Python version inside the running container (e.g., python --version). The vLLM team is actively maintaining the Docker images and has addressed previous base image vulnerabilities, as seen in recent upgrades. Would you like more detail on how to verify the Python version or further steps to ensure your image is secure?
vLLM doesn’t support Python 3.8 anymore. The minimum version is Python 3.9, and the Docker image should be using Python 3.12. Perhaps Prisma detected the version incorrectly?
Correct: vLLM no longer supports Python 3.8; the minimum required version is Python 3.9, and the official Docker images use Python 3.12 as the base, as confirmed in the installation docs and contributor guidelines. If Prisma reports Python 3.8, it is likely a detection error or misconfiguration, not a real vulnerability in the vLLM image.
Would you like instructions on how to verify the Python version inside your Docker container?
You’re welcome! That sounds like a good plan—it’s likely a false positive from a legacy or cached layer, as the current vLLM Docker images are built on Python 3.12 and do not support Python 3.8 anymore, per the official documentation.
If you need to confirm, you can always run python --version inside the container to verify the actual Python version. Let me know if you need more help!
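For completeness, the same check can be scripted from inside the container using standard sys.version_info, with 3.9 as the minimum per the docs above; this is a minimal sketch, not an official vLLM tool:

```python
import sys

# Run inside the container (e.g. via docker exec with a hypothetical check.py):
# print the interpreter version and fail loudly if it is below vLLM's minimum.
major, minor = sys.version_info[:2]
print(f"Python {sys.version.split()[0]}")
if (major, minor) < (3, 9):
    raise SystemExit("interpreter is older than vLLM's documented minimum (3.9)")
```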