The OpenAI endpoint doesn't support function strict setting

Hi, I’m trying to use Microsoft Semantic Kernel with vLLM; however, I keep getting an error saying the request failed validation because of the strict option in the function definitions.

The error I am getting is:

message=2 validation errors for ChatCompletionRequest
tools.0.function.strict
  Extra inputs are not permitted [type=extra_forbidden, input_value=False, input_type=bool]
For further information visit the Pydantic extra_forbidden error documentation

The request JSON is as follows. Notice the "strict": false fields. I think this is a new OpenAI protocol feature. Is there a version of vLLM that will support this?

{
  "tools": [
    {
      "function": {
        "description": "Gets a list of lights and their current state",
        "name": "Lights-get_lights",
        "strict": false,
        "parameters": {
          "type": "object",
          "required": [],
          "properties": {}
        }
      },
      "type": "function"
    },
    {
      "function": {
        "description": "Gets a specified light by name",
        "name": "Lights-get_light",
        "strict": false,
        "parameters": {
          "type": "object",
          "required": [ "lightName" ],
          "properties": { "lightName": { "type": "string" } }
        }
      },
      "type": "function"
    },
    {
      "function": {
        "description": "Changes the state of the light",
        "name": "Lights-change_state",
        "strict": false,
        "parameters": {
          "type": "object",
          "required": [ "id", "isOn" ],
          "properties": {
            "id": { "type": "integer" },
            "isOn": { "type": "boolean" }
          }
        }
      },
      "type": "function"
    }
  ],
  "messages": [
    {
      "role": "system",
      "content": "You are an AI assistant that can call functions when appropriate. When needed respond in a plain text format"
    },
    {
      "role": "user",
      "content": "What is the lights state?"
    }
  ],
  "model": "mistralai/Mistral-7B-Instruct-v0.3",
  "tool_choice": "auto"
}

Thanks,

Kevin

Currently, vLLM does not support the strict field in function definitions within the tools parameter. If you include "strict": false (or true), vLLM returns the validation error you observed: "Extra inputs are not permitted" for the strict field. This is because vLLM's OpenAI-compatible server does not yet recognize or allow the strict parameter, which OpenAI introduced (to enforce structured outputs in function calling) after vLLM's function-calling support was added. As of now, no released version of vLLM supports this field; it is tracked as a known limitation in open issues on the vLLM repository.
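
The error itself hints at the mechanism: vLLM validates incoming requests against Pydantic models, and a Pydantic v2 model configured with extra="forbid" rejects any field its schema does not declare. The sketch below reproduces the same error; the class is illustrative only, not vLLM's actual ChatCompletionRequest schema.

```python
# Sketch: how a Pydantic v2 model with extra="forbid" produces the
# "extra_forbidden" error above. Illustrative only, not vLLM's real schema.
from pydantic import BaseModel, ConfigDict, ValidationError

class FunctionDefinition(BaseModel):
    model_config = ConfigDict(extra="forbid")  # unknown keys are rejected
    name: str
    description: str | None = None
    parameters: dict | None = None

try:
    FunctionDefinition(
        name="Lights-get_lights",
        description="Gets a list of lights and their current state",
        strict=False,  # not declared in the schema -> extra_forbidden
    )
except ValidationError as e:
    print(e)  # Extra inputs are not permitted [type=extra_forbidden, ...]
```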

As a workaround, remove the strict field from your request payload before it reaches vLLM. If you are using a client library that automatically adds strict, as Semantic Kernel does here, you may need to patch or override its behavior until vLLM adds support for this field. Would you like more detail on the current status or possible workarounds?
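
For example, if you can call the vLLM endpoint through the OpenAI Python client instead of (or alongside) Semantic Kernel, a small helper can strip the field from the tool definitions before the request is sent. This is a minimal sketch; the base URL, API key, and tool list are placeholders for your deployment.

```python
# Workaround sketch: drop the unsupported "strict" key from each tool
# definition before sending the request to a vLLM OpenAI-compatible server.
from openai import OpenAI

# Placeholder endpoint and key for a local vLLM deployment.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

def strip_strict(tools: list[dict]) -> list[dict]:
    """Return copies of the tool definitions without the 'strict' field."""
    cleaned = []
    for tool in tools:
        tool = dict(tool)
        fn = dict(tool.get("function", {}))
        fn.pop("strict", None)  # drop the key vLLM rejects
        tool["function"] = fn
        cleaned.append(tool)
    return cleaned

tools = [
    {
        "type": "function",
        "function": {
            "name": "Lights-get_lights",
            "description": "Gets a list of lights and their current state",
            "strict": False,  # the field vLLM rejects
            "parameters": {"type": "object", "required": [], "properties": {}},
        },
    },
]

response = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.3",
    messages=[{"role": "user", "content": "What is the lights state?"}],
    tools=strip_strict(tools),
    tool_choice="auto",
)
print(response.choices[0].message)
```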
