Given a completion text for a fixed prompt, how do I calculate the log-prob of the completion?

I have a completion text for a fixed prompt.
Now I want to compute log_prob(completion | prompt) at two temperature values.
Is there a way in the current API to get this quantity?

No, vLLM V1's current API does not directly support computing the log probability of a fixed completion under different temperature values. In V1, logprobs are always computed from the model's raw output, before any temperature scaling or other sampling adjustments, so the returned logprobs are independent of the temperature parameter. Support for post-sampling logprobs (i.e., computed after temperature scaling) is planned but not yet available. See: vLLM V1 User Guide - Logprobs, vLLM V1 User Guide (latest).
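One practical workaround, assuming you can obtain the *full* per-position logprob vector (not just a top-k slice): because the returned logprobs differ from the raw logits only by a per-position additive constant, re-applying log-softmax to `logprobs / T` yields exactly the distribution you would get from `log_softmax(logits / T)`. A minimal NumPy sketch of that identity (the function name and the toy logits are illustrative, not part of the vLLM API):

```python
import numpy as np

def retemperature_logprobs(logprobs, temperature):
    """Rescale full-vocabulary logprobs (computed at T=1) to a new temperature.

    Works because logprobs = logits - logsumexp(logits), so dividing by T and
    re-normalizing gives the same result as log_softmax(logits / T).
    """
    scaled = np.asarray(logprobs, dtype=np.float64) / temperature
    return scaled - np.logaddexp.reduce(scaled)

# Toy example: raw logits for a 4-token vocabulary at one position.
logits = np.array([2.0, 1.0, 0.5, -1.0])
lp_t1 = logits - np.logaddexp.reduce(logits)   # logprobs at T = 1
lp_t2 = retemperature_logprobs(lp_t1, 2.0)      # logprobs at T = 2
```

Summing the re-temperatured logprob of each completion token over the sequence then gives log_prob(completion | prompt) at the new temperature. Note this only works with the full distribution at each position; with a truncated top-k logprob list, the normalization constant cannot be recovered exactly.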

Would you like more detail on possible workarounds or the internal reasoning?

Hint: Mention @RunLLM in the post for followups.