### Your current environment

### How would you like to use vLLM?
I want to run inference of OpenBMB/MiniCPM-Llama3-V-2_5, but I don't know how to integrate it with vLLM. Loading the model fails with:
```
OSError: /u01/xxx/.cache/modelscope/hub/OpenBMB/MiniCPM-Llama3-V-2_5 does not appear to have a file named preprocessor_config.json. Checkout
```
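The error means the ModelScope snapshot directory is missing `preprocessor_config.json`, which the Hugging Face processor loader expects for a multimodal model. Before pointing vLLM at a local directory, it can help to verify which expected files are actually present. A minimal sketch (the file list is an assumption for illustration; the exact set varies by model):

```python
from pathlib import Path

# Files a multimodal model directory typically needs for processor/tokenizer
# loading. This list is illustrative, not exhaustive.
EXPECTED_FILES = [
    "config.json",
    "tokenizer_config.json",
    "preprocessor_config.json",
]

def missing_files(model_dir: str) -> list[str]:
    """Return the expected files that are absent from model_dir."""
    root = Path(model_dir)
    return [name for name in EXPECTED_FILES if not (root / name).is_file()]
```

If `preprocessor_config.json` shows up as missing, the download was likely incomplete or the revision lacks the file; re-downloading the snapshot (or fetching the file from the model repo) is usually the fix before retrying vLLM.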