Question: How to set up stream_pipeline_online for inference? #7
Comments
**@nguyenphuvinhtoan:** @digital-avatar I also tried implementing a server to handle the streaming response, but I couldn't get it to work successfully. Could you suggest some solutions for managing the streaming response results effectively? Thank you very much!
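
For illustration, a minimal version of the kind of server being described could look like the sketch below. It assumes the online pipeline can be driven as a Python generator that yields encoded frames one at a time; `run_online_pipeline` is a hypothetical placeholder for that, not the repository's actual API.

```python
# Minimal streaming-server sketch (not the project's official API).
# Assumes a hypothetical generator `run_online_pipeline(audio_bytes)` that
# yields encoded video frames (JPEG bytes) as they are produced.
from flask import Flask, Response, request

app = Flask(__name__)

def run_online_pipeline(audio_bytes):
    """Hypothetical placeholder: wrap the online pipeline here so it yields
    one encoded frame at a time instead of writing a file at the end."""
    raise NotImplementedError("wire this up to the actual online pipeline")

@app.route("/stream", methods=["POST"])
def stream():
    audio_bytes = request.get_data()  # raw audio chunk from the client

    def generate():
        # Yield each frame as soon as the pipeline produces it, using the
        # multipart/x-mixed-replace convention so a browser can display it.
        for frame_jpeg in run_online_pipeline(audio_bytes):
            yield (b"--frame\r\n"
                   b"Content-Type: image/jpeg\r\n\r\n" + frame_jpeg + b"\r\n")

    return Response(generate(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    # threaded=True lets Flask keep streaming one response while accepting others
    app.run(host="0.0.0.0", port=8000, threaded=True)
```

A client would then POST audio and read frames off the response incrementally; whether this maps cleanly onto `stream_pipeline_online` depends on how the repo exposes per-frame output.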
I'm also curious about this. I think we need to modify …
**@nguyenphuvinhtoan** (original issue):
**Description**

I'm trying to implement real-time inference using `stream_pipeline_online`, but I'm unclear about the proper setup process. I've successfully run the offline pipeline using `stream_pipeline_offline`, but need guidance on the online version.

**Current Setup**

- Successfully running `stream_pipeline_offline` for inference
- Want to switch to `stream_pipeline_online` for real-time use

**Questions**

- How do I properly set up `stream_pipeline_online` for inference?

**Additional Context**

If there's any documentation or examples specifically for `stream_pipeline_online`, please point me in the right direction. Thank you!
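
In the absence of an official example, the sketch below shows roughly the shape of the difference I would expect between the two modes: offline hands over the whole audio file and waits for a finished video, while online keeps per-chunk state and consumes results as they arrive. Every name here (`HypotheticalOnlinePipeline`, `setup`, `process_chunk`, the chunk size) is an assumption for illustration, not the repository's actual interface.

```python
# Sketch of how the offline and online modes might differ
# (all names are assumptions, not the repository's actual API).
import numpy as np

CHUNK_SAMPLES = 16000  # assume 1-second chunks of 16 kHz mono audio

def fake_microphone(total_seconds=5):
    """Stand-in audio source: yields silent chunks to simulate a live feed."""
    for _ in range(total_seconds):
        yield np.zeros(CHUNK_SAMPLES, dtype=np.float32)

class HypotheticalOnlinePipeline:
    """Placeholder for whatever object the online pipeline sets up."""
    def setup(self, source_image_path):
        pass  # load models, preprocess the source image, allocate state

    def process_chunk(self, audio_chunk):
        # In the real pipeline this would return the frames generated for
        # this chunk; here it just returns an empty list.
        return []

# Offline: pass the full audio in one call and wait for the finished video,
# e.g. pipeline.run(source_image, full_audio_path, output_video_path).

# Online: initialize once, then feed audio chunk by chunk and consume frames
# as soon as they are produced.
pipeline = HypotheticalOnlinePipeline()
pipeline.setup("source.png")
for chunk in fake_microphone():
    for frame in pipeline.process_chunk(chunk):
        pass  # display the frame or push it to a streaming server immediately
```

If the maintainers can confirm which of these responsibilities `stream_pipeline_online` already handles (chunking, state, output buffering), that would clear up most of the questions above.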