ComfyUI and Diffusers use their own LoRA format, typically a single .safetensors file with framework-specific key naming and structure. vLLM-Omni instead requires LoRA adapters in the PEFT format, which consists of an adapter_config.json plus an adapter_model.safetensors and follows a different internal structure and naming convention. vLLM-Omni relies on the PEFT standard for compatibility and dynamic loading, so it is not directly compatible with the format used by ComfyUI or Diffusers.
This difference exists because each framework expects LoRA weights to be organized and referenced in a way that matches its own loading and patching logic. As a result, LoRA files cannot be exchanged directly between ComfyUI/Diffusers and vLLM-Omni; the weights must be converted first.
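To make the difference concrete, here is a minimal sketch of what such a conversion involves: remapping per-tensor key names and emitting a PEFT-style adapter_config.json. The specific key patterns (`lora_down`/`lora_up` suffixes, the `base_model.model.` prefix) reflect common conventions, but the full module-path mapping is model-specific; the key names and config values below are illustrative assumptions, not the exact vLLM-Omni or ComfyUI schema.

```python
# Hedged sketch of a LoRA key-name conversion. Real adapters also require
# loading/saving the tensors (e.g. with safetensors) and a model-specific
# module-path mapping; only the naming logic is shown here.

# Common suffix difference: down/up projections vs. PEFT's lora_A/lora_B.
SUFFIX_MAP = {
    ".lora_down.weight": ".lora_A.weight",
    ".lora_up.weight": ".lora_B.weight",
}

def diffusers_key_to_peft(key: str) -> str:
    """Rename one dot-separated LoRA key to a PEFT-style key (illustrative).

    Kohya/ComfyUI-style keys flatten module dots into underscores, so they
    additionally need a model-specific reverse mapping not shown here.
    """
    for old, new in SUFFIX_MAP.items():
        if key.endswith(old):
            key = key[: -len(old)] + new
            break
    # PEFT prefixes every adapter key with 'base_model.model.'
    return "base_model.model." + key

def minimal_adapter_config(rank: int, alpha: float, targets: list) -> dict:
    # Field names follow PEFT's LoraConfig JSON; the values are examples.
    return {
        "peft_type": "LORA",
        "r": rank,
        "lora_alpha": alpha,
        "target_modules": targets,
    }

print(diffusers_key_to_peft("unet.down_blocks.0.attn1.to_q.lora_down.weight"))
# -> base_model.model.unet.down_blocks.0.attn1.to_q.lora_A.weight
```

A real converter would apply this renaming to every tensor in the source .safetensors file, write the result as adapter_model.safetensors, and serialize the config dict as adapter_config.json alongside it.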