500 Internal Server Error when using WebP

source code string cannot contain null bytes

Model: Llama-4 Maverick FP8

vLLM version: 0.14.1

The error "source code string cannot contain null bytes" is not explicitly documented in the vLLM data sources for Llama-4 Maverick FP8. It is a Python `ValueError` raised when Python compiles source text that contains NUL bytes, which usually points to model or tokenizer files that are corrupted or not in the expected format, or to binary data being passed where text is expected. Similar issues have been reported when loading models or tokenizers with malformed files. Users have reported successful loading and inference of Llama-4 Maverick FP8 with vLLM 0.11.0 and the correct Hugging Face checkpoint, so:

- verify your model and tokenizer files are intact, and re-download them if in doubt;
- make sure you are not passing binary data where text is expected, such as in prompts or configuration overrides (e.g., a WebP image should be base64-encoded into the request, not embedded as raw bytes);
- use the latest vLLM version.
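To rule out corrupted files, one quick check is to scan the text-like files in the downloaded checkpoint for NUL bytes, since a NUL byte in a `.py` or `.json` file is exactly what triggers this error. This is a minimal sketch; the cache path and the set of suffixes are assumptions you may need to adjust for your setup:

```python
from pathlib import Path

# Assumed default Hugging Face cache location; point this at your checkpoint.
MODEL_DIR = Path("~/.cache/huggingface/hub").expanduser()

# Files vLLM reads as text; binary formats like .safetensors or the
# sentencepiece tokenizer.model legitimately contain NUL bytes, so they
# are deliberately excluded.
TEXT_SUFFIXES = {".py", ".json", ".txt"}

def find_null_bytes(root: Path) -> list[Path]:
    """Return text-like files under `root` that contain NUL bytes."""
    bad = []
    for path in root.rglob("*"):
        if path.is_file() and path.suffix in TEXT_SUFFIXES:
            if b"\x00" in path.read_bytes():
                bad.append(path)
    return bad

if __name__ == "__main__":
    if MODEL_DIR.exists():
        for path in find_null_bytes(MODEL_DIR):
            print(f"corrupted (contains NUL bytes): {path}")
```

Any file this flags should be deleted and re-downloaded rather than patched by hand.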

Would you like more detail on troubleshooting steps or specific configuration examples for Llama-4 Maverick FP8?
