Cachetools Version

Which cachetools version is needed to use vLLM?

I'm trying to train my model with LoRA enabled (set to true), but I'm getting the following error:

ERROR 11-04 08:27:34 [core.py:390] AttributeError: 'LoRALRUCache' object has no attribute '_LRUCache__update'

The error you're seeing, AttributeError: 'LoRALRUCache' object has no attribute '_LRUCache__update', is likely not caused directly by the cachetools version, but by an implementation detail in vLLM's LoRALRUCache class. According to the discussion in the vLLM repository, the error occurs because LoRALRUCache does not provide a _LRUCache__update method, and that method cannot be found anywhere in the repository. The available data sources do not mention a required cachetools version for vLLM, nor do they document a fix based on changing the cachetools version. The issue appears to be internal to vLLM's codebase rather than a dependency mismatch with cachetools itself.
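
For background, the attribute name in that traceback comes from Python's name mangling: inside a class body named LRUCache, a call to self.__update is rewritten to self._LRUCache__update, so the error appears whenever that mangled attribute is missing on the instance (which is also why the literal string _LRUCache__update won't be found anywhere in the source). The sketch below is purely illustrative, with hypothetical class bodies rather than vLLM's or cachetools' actual code; it only shows how this exact AttributeError can arise.

```python
# Hypothetical sketch (not vLLM's or cachetools' actual code) showing how
# Python name mangling produces this exact AttributeError.

class LRUCache:
    # Suppose an earlier version of this class also defined
    #     def __update(self, key): ...
    # and a later refactor removed it.
    def __setitem__(self, key, value):
        # Inside a class named LRUCache, `self.__update` is compiled to
        # `self._LRUCache__update`, so exactly that attribute must exist.
        self.__update(key)


class LoRALRUCache(LRUCache):
    # Inherits __setitem__, and with it the mangled attribute lookup.
    pass


cache = LoRALRUCache()
try:
    cache["adapter-1"] = object()
except AttributeError as exc:
    print(exc)  # 'LoRALRUCache' object has no attribute '_LRUCache__update'
```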

Though I'm not entirely certain, updating vLLM to the latest version or checking the vLLM repository for related bugfixes may resolve the issue, since the error has been acknowledged by the maintainers. If you need more detail or want to troubleshoot further, let me know.
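
Though not documented as an official fix, a reasonable first step is to confirm which versions you actually have installed before upgrading (for example with pip install -U vllm). A quick check, assuming only the standard PyPI package names:

```python
# Print the currently installed versions of vllm and cachetools so they can be
# compared against a known-good environment or the projects' changelogs.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("vllm", "cachetools"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```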


@RunLLM Troubleshoot further: what should I do to work around the LoRALRUCache issue?

The error 'LoRALRUCache' object has no attribute '_LRUCache__update' is due to a missing method in vLLM's LoRALRUCache implementation, not a cachetools version mismatch. According to this vLLM issue, the maintainers indicate that this is a code bug, not a user error. Updating to the latest vLLM main branch may resolve the issue, as the example code on the latest main does not reproduce the error.
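
After upgrading, a minimal LoRA smoke test along the lines of vLLM's LoRA examples can confirm whether your environment still hits the bug. This is only a sketch: the base model name and adapter path below are placeholders, not values from this thread, so substitute ones you actually have access to.

```python
# Minimal LoRA smoke test (a sketch, not an official repro). Replace the
# placeholder model and adapter path with your own before running.
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)  # placeholder base model

outputs = llm.generate(
    ["Hello, my name is"],
    SamplingParams(max_tokens=16),
    lora_request=LoRARequest("my-adapter", 1, "/path/to/lora_adapter"),  # placeholder adapter
)
print(outputs[0].outputs[0].text)
```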

If updating does not help, check that you are not using a custom or outdated LRUCache implementation, and make sure your vLLM version and its dependencies match those used in the working example. If the problem persists, consider opening a GitHub issue with your package versions and a reproducing code snippet.
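
One way to check the first point is to inspect where the installed cachetools LRUCache is actually loaded from (to rule out a shadowed or vendored copy) and whether it still exposes the privately-mangled attribute named in the traceback. This is just a diagnostic sketch and assumes nothing beyond cachetools being importable:

```python
# Diagnostic: where is cachetools.LRUCache coming from, and does it still
# define the name-mangled helper mentioned in the traceback?
import inspect
import cachetools

cls = cachetools.LRUCache
print("LRUCache loaded from:", inspect.getfile(cls))
print("has _LRUCache__update:", hasattr(cls, "_LRUCache__update"))
```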

Would you like a step-by-step guide for updating or checking your environment?
