Model Support
| Topic | Replies | Views | Activity |
|---|---|---|---|
| About the Model Support category | 0 | 32 | March 20, 2025 |
| Gemma 3 prefix caching in case of multimodal prompts | 4 | 35 | May 22, 2025 |
| Will vLLM follow up on DeepSeek's inference system? | 3 | 177 | May 13, 2025 |
| Add support for the GLM-4 series of models | 1 | 70 | April 16, 2025 |
| Why does the phi3 implementation in vLLM inherit from llama? | 1 | 22 | April 14, 2025 |
| Does the latest version support deepseek-v3 tool calls? | 0 | 50 | April 12, 2025 |