ultravox 0.4.1 doesn't work with vllm #272
@petersalas I am not sure if the vllm example needs to be updated to work with the latest ultravox model. Please take a look.
I think the issue is that, for some reason, the UltravoxProcessor isn't registered as an AutoProcessor in our recent HF models (vLLM recently changed to depend on UltravoxProcessor). I'll take a look this week.
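In the meantime, a quick way to check whether the processor resolves through AutoProcessor (a sketch; the checkpoint name is the one from this issue, and `trust_remote_code=True` is assumed since the repo ships custom code):

```python
from transformers import AutoProcessor

# If UltravoxProcessor is properly registered (e.g. via the repo's processor
# config / auto_map), this should return an UltravoxProcessor instance;
# otherwise it falls back to something else or raises an error.
proc = AutoProcessor.from_pretrained(
    "fixie-ai/ultravox-v0_4_1-llama-3_1-8b",
    trust_remote_code=True,
)
print(type(proc).__name__)
```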
Any update on this?
Still working through the issue; it seems like there were some changes to UltravoxProcessor relative to 0.3 that are incompatible with the latest vLLM. (A workaround in the short term is to use an older version of vLLM -- I think 0.6.4 should work.)
Thanks! I can confirm 0.6.4 works, except that
With the dependency:

Following this example (but with "fixie-ai/ultravox-v0_3" changed to "fixie-ai/ultravox-v0_4_1-llama-3_1-8b"):
https://github.com/vllm-project/vllm/blob/0794e7446efca1fd7b8ea1cde96777897660cdea/examples/offline_inference/audio_language.py#L27-L45
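For reference, the adapted snippet looks roughly like this (a sketch based on the linked example; the prompt format, AudioAsset usage, and sampling parameters are carried over from that file as assumptions, not the exact code I ran):

```python
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams
from vllm.assets.audio import AudioAsset

MODEL_NAME = "fixie-ai/ultravox-v0_4_1-llama-3_1-8b"

# Build the prompt with the model's chat template; "<|audio|>" is the audio
# placeholder used in the linked example.
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
messages = [{"role": "user", "content": "<|audio|>\nWhat is recorded in the audio?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Sample audio bundled with vLLM, returned as a (waveform, sample_rate) tuple.
audio_and_rate = AudioAsset("mary_had_lamb").audio_and_sample_rate

# Tested against vllm==0.6.4, per the workaround above.
llm = LLM(model=MODEL_NAME)
outputs = llm.generate(
    {
        "prompt": prompt,
        "multi_modal_data": {"audio": audio_and_rate},
    },
    sampling_params=SamplingParams(temperature=0.2, max_tokens=64),
)
print(outputs[0].outputs[0].text)
```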
The error I got: