I am trying to run https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct on serverless, but it throws an error:
2024-10-09 18:06:06.525 [my1jfmdyltrax5] [error] worker exited with exit code 1
2024-10-09 18:05:50.528 [my1jfmdyltrax5] [error] worker exited with exit code 1
2024-10-09 18:05:34.515 [my1jfmdyltrax5] [error] worker exited with exit code 1
2024-10-09 18:05:19.851 [my1jfmdyltrax5] [error] worker exited with exit code 1
2024-10-09 18:04:18.168 [eau4o046d9uc4d] [error] worker exited with exit code 1
2024-10-09 18:04:01.663 [eau4o046d9uc4d] [error] worker exited with exit code 1
2024-10-09 18:03:45.158 [eau4o046d9uc4d] [error] worker exited with exit code 1
2024-10-09 18:03:28.734 [eau4o046d9uc4d] [error] worker exited with exit code 1
2024-10-09 18:03:12.261 [eau4o046d9uc4d] [error] worker exited with exit code 1
2024-10-09 18:02:55.743 [eau4o046d9uc4d] [error] worker exited with exit code 1
2024-10-09 18:02:39.224 [eau4o046d9uc4d] [error] worker exited with exit code 1
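For context, the original report does not show how the endpoint was called. Below is a minimal, untested sketch of the kind of serverless request involved, assuming the RunPod Python SDK; the endpoint ID, API key, and payload shape are placeholders, not taken from this issue.

```python
# Hypothetical sketch of calling a serverless endpoint with the RunPod Python SDK.
# ENDPOINT_ID, the API key, and the payload layout are assumptions for illustration.
import runpod

runpod.api_key = "YOUR_RUNPOD_API_KEY"        # placeholder credential
endpoint = runpod.Endpoint("ENDPOINT_ID")     # placeholder serverless endpoint ID

# Send a chat-style request; payloads are nested under "input" by convention here.
result = endpoint.run_sync(
    {
        "input": {
            "messages": [{"role": "user", "content": "Describe this image."}],
            "sampling_params": {"max_tokens": 128},
        }
    },
    timeout=120,
)
print(result)
```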
Multi-modal models like Qwen2-VL are supported in pods, but apparently not in serverless endpoints. See my comment here: #114 (comment). A sketch of the pod-based approach is below.
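A minimal sketch of that pod-based setup, assuming the pod is already running vLLM's OpenAI-compatible server (e.g. started with `vllm serve Qwen/Qwen2-VL-7B-Instruct`); the pod URL and image link are placeholders, not taken from this thread.

```python
# Query a Qwen2-VL model served by vLLM's OpenAI-compatible API on a pod.
# POD_URL and the image URL are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(base_url="https://POD_URL/v1", api_key="EMPTY")  # vLLM ignores the key by default

response = client.chat.completions.create(
    model="Qwen/Qwen2-VL-7B-Instruct",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/sample.jpg"}},
            ],
        }
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```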
Yes, this is fixed in the dev branch of vLLM; once the next version is released, it should work for serverless endpoints as well.
Hi there! At the moment we do not support visual LLMs, but we are working on adding support in the future.
Hey! Any update on visual-LLM support?