
Using xprobe/xinference:v1.2.0-cpu, an error occurred while performing inference with speech1.5. #829

Open
6 tasks done
zhudemiao opened this issue Jan 16, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@zhudemiao

Self Checks

  • This template is only for bug reports. For questions, please visit Discussions.
  • I have thoroughly reviewed the project documentation (installation, training, inference) but couldn't find information to solve my problem.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template and fill in all required fields.

Cloud or Self Hosted

Self Hosted (Docker)

Environment Details

Using xprobe/xinference:v1.2.0-cpu, an error occurred while performing inference with speech1.5.

Steps to Reproduce

Using xprobe/xinference:v1.2.0-cpu, an error occurred while performing inference with speech1.5.

✔️ Expected Behavior

Success

❌ Actual Behavior

2025-01-16 10:15:30,686 xinference.core.model 88 DEBUG After request speech, current serve request count: 0 for the model FishSpeech-1.5-0
2025-01-16 10:15:30,703 xinference.core.model 88 ERROR stream encountered an error.
Traceback (most recent call last):
File "/opt/conda/lib/python3.11/site-packages/xinference/core/model.py", line 602, in _handle_pending_requests
r = await asyncio.to_thread(_wrapper, gen)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/asyncio/threads.py", line 25, in to_thread
return await loop.run_in_executor(None, func_call)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xinference/core/model.py", line 595, in _wrapper
return next(_gen)
^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/xinference/core/model.py", line 507, in _to_generator
for v in gen:
File "/opt/conda/lib/python3.11/site-packages/xinference/model/audio/fish_speech.py", line 158, in _stream_generator
writer = torchaudio.io.StreamWriter(out, format=response_format)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torio/io/_streaming_media_encoder.py", line 197, in init
self._s = ffmpeg_ext.StreamingMediaEncoderFileObj(dst, format, buffer_size)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torio/_extension/utils.py", line 25, in getattr
self._import_once()
File "/opt/conda/lib/python3.11/site-packages/torio/_extension/utils.py", line 39, in _import_once
self.module = self.import_func()
^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torio/_extension/utils.py", line 143, in _init_ffmpeg
ext = _find_ffmpeg_extension(ffmpeg_vers)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torio/_extension/utils.py", line 122, in _find_ffmpeg_extension
raise ImportError(
ImportError: Failed to intialize FFmpeg extension. Tried versions: ['6', '5', '4', '']. Enable DEBUG logging to see more details about the error.
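
The traceback shows torchaudio.io.StreamWriter failing because torio could not find any FFmpeg shared libraries in the container (it tries major versions 6, 5 and 4 before giving up). One quick way to confirm this from inside the running container (a minimal sketch; <xinference-container> is a placeholder for your container name):

docker exec -it <xinference-container> bash -c "ldconfig -p | grep -E 'libavcodec|libavformat|libavutil'"

If the command prints nothing, the FFmpeg shared libraries are not installed in the image.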

zhudemiao added the bug label on Jan 16, 2025
@Stardust-minus
Member

Run apt install ffmpeg
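
For example, to apply this inside the running container, or to bake it into a derived image so it survives restarts (a minimal sketch; it assumes the xprobe/xinference:v1.2.0-cpu image is Debian/Ubuntu based with apt available, and <xinference-container> is a placeholder for your container name):

docker exec -it <xinference-container> bash -c "apt-get update && apt-get install -y ffmpeg"

Or as a derived image:

FROM xprobe/xinference:v1.2.0-cpu
RUN apt-get update && apt-get install -y ffmpeg && rm -rf /var/lib/apt/lists/*

After installing, you may need to relaunch the FishSpeech-1.5 model (or restart the container) so torchaudio can pick up the FFmpeg libraries.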
