vLLM benchmark host with self-signed certificate

Hi,

I am trying to benchmark with benchmark_serving.py for example:

python3 benchmark_serving.py --backend openai --base-url https://mydomain.org/ --dataset-name sonnet --dataset-path ./sonnet.txt --model /data/.cache/meta-llama/Llama-3.3-70B-Instruct
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain

Unfortunately my self-signed cert is not accepted. I could not find a CLI switch to trust or ignore my cert. Is it possible to trust self-signed certs?

Thanks!

It seems that vLLM's benchmark_serving.py does not provide a CLI switch to trust or ignore self-signed SSL certificates. The available arguments and examples for the benchmarking scripts show no option such as --insecure or --no-verify-ssl, and this limitation does not appear to be documented anywhere [benchmark_serving.py].

You may need to handle SSL verification at the HTTP client layer instead, either by modifying the script to skip verification or by pointing Python at your CA bundle (for example via the SSL_CERT_FILE or REQUESTS_CA_BUNDLE environment variables, depending on which HTTP library the script uses). Neither workaround is officially supported or documented in vLLM's CLI or docs [benchmark_serving.py]. Would you like more detail on how to patch the script or work around this limitation?
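If you do end up patching the script, the usual approach is to build a custom ssl.SSLContext and hand it to whichever HTTP client the script uses. A minimal sketch of the two options (the "my-ca.pem" path is a placeholder, and which client the script uses is an assumption, not something stated in the vLLM docs):

```python
import ssl

# Option A: trust your self-signed CA explicitly (preferred).
# Replace "my-ca.pem" with the actual path to your certificate.
trusted_ctx = ssl.create_default_context()
# trusted_ctx.load_verify_locations(cafile="my-ca.pem")  # uncomment with a real path

# Option B: disable verification entirely (benchmarking only, never production).
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE
```

If the script's requests go through aiohttp, such a context can be passed as aiohttp.TCPConnector(ssl=insecure_ctx); with the requests library, the verify parameter (verify="my-ca.pem" or verify=False) plays the same role. Disabling verification should stay strictly a local benchmarking workaround.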
