Hi, I'm trying to use Microsoft Semantic Kernel with vLLM, but I keep getting an error saying that request validation failed because of the strict option in the request.
The error I am getting is:

```
message=2 validation errors for ChatCompletionRequest
tools.0.function.strict
  Extra inputs are not permitted [type=extra_forbidden, input_value=False, input_type=bool]
```
(The link in the error points to the pydantic documentation for the extra_forbidden error.)
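From what I can tell, vLLM's request model is a pydantic model that forbids unknown fields, so any extra key inside the tool's function object fails validation. A minimal illustration of the same error below (this is not vLLM's actual model, just a made-up model with extra="forbid" to show where the message comes from):

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class FunctionDefinition(BaseModel):
    # Illustration only: a model that forbids unknown fields,
    # which is what produces the "extra_forbidden" error above.
    model_config = ConfigDict(extra="forbid")

    name: str
    description: str = ""
    parameters: dict = {}


try:
    FunctionDefinition(
        name="Lights-get_lights",
        description="Gets a list of lights and their current state",
        strict=False,  # unexpected field -> "Extra inputs are not permitted"
    )
except ValidationError as e:
    print(e)
```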
The request JSON is as follows. Notice the "strict": false field on each function; I believe this is a newer addition to the OpenAI chat completions protocol. Is there a version of vLLM that will support this?
```json
{
  "tools": [
    {
      "function": {
        "description": "Gets a list of lights and their current state",
        "name": "Lights-get_lights",
        "strict": false,
        "parameters": {
          "type": "object",
          "required": [],
          "properties": {}
        }
      },
      "type": "function"
    },
    {
      "function": {
        "description": "Gets a specified light by name",
        "name": "Lights-get_light",
        "strict": false,
        "parameters": {
          "type": "object",
          "required": [ "lightName" ],
          "properties": { "lightName": { "type": "string" } }
        }
      },
      "type": "function"
    },
    {
      "function": {
        "description": "Changes the state of the light",
        "name": "Lights-change_state",
        "strict": false,
        "parameters": {
          "type": "object",
          "required": [ "id", "isOn" ],
          "properties": {
            "id": { "type": "integer" },
            "isOn": { "type": "boolean" }
          }
        }
      },
      "type": "function"
    }
  ],
  "messages": [
    {
      "role": "system",
      "content": "You are an AI assistant that can call functions when appropriate. When needed respond in a plain text format"
    },
    {
      "role": "user",
      "content": "What is the lights state?"
    }
  ],
  "model": "mistralai/Mistral-7B-Instruct-v0.3",
  "tool_choice": "auto"
}
```
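For reference, this is roughly how I can send the same request outside Semantic Kernel with the plain openai Python client (the base URL and API key below are placeholders for my local vLLM server). I am assuming that if the strict key is simply left out of each function definition the request would validate, since that is the field the error complains about, but I would rather not have to rewrite the request by hand:

```python
# Rough reproduction of the request above, with "strict" omitted from the
# function definitions. base_url / api_key are placeholders for a local
# vLLM OpenAI-compatible server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [
    {
        "type": "function",
        "function": {
            "name": "Lights-get_lights",
            "description": "Gets a list of lights and their current state",
            "parameters": {"type": "object", "required": [], "properties": {}},
            # no "strict" key here -- this is the field vLLM rejects
        },
    },
]

response = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.3",
    messages=[
        {"role": "system", "content": "You are an AI assistant that can call functions when appropriate."},
        {"role": "user", "content": "What is the lights state?"},
    ],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message)
```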
Thanks,
Kevin