
How to set the number of buffers to read from the camera? #138

Open
wangxiaochuTHU opened this issue Aug 11, 2023 · 2 comments

Comments

@wangxiaochuTHU

Hello, I am currently using this crate in a cross-platform project.

I would like to be able to set the number of buffers to read from the camera, just as V4L2 allows, to avoid dropped frames.

However, I could not work out from the examples how to set this. Could you give any guidance? Thank you very much.

@wangxiaochuTHU
Author

I'm sorry, but in my use case frames are being lost/dropped even though the stream runs as slow as 5 fps and only about 4 μs of work is done on each arriving frame. So I suspect the bottleneck might be in this library.

Since most drivers deliver a captured image frame-by-frame (rather than byte-by-byte or line-by-line), I think we should minimize the handling time after a frame is received and resume capturing as soon as possible. Here is an idea based on that rule; what do you think of it? (The code pattern is given below.)

The library API would look like:

/// Set the stream on, and return `rx` to the user for receiving `Buffer`s.
pub fn stream_on(&mut self) -> mpsc::Receiver<Buffer> {
    /*
        set the stream on
    */

    let (tx, rx) = mpsc::channel();
    self.tx = Some(tx);
    rx
}

/// Infinite loop requesting the next sample; a separate thread handles each sample.
pub fn frames(&mut self) -> Result<(), NokhwaError> {
    loop {
        let imf_sample: IMFSample = match unsafe { MFCreateSample() } {
            Ok(sample) => sample,
            Err(why) => {
                return Err(NokhwaError::ReadFrameError(why.to_string()));
            }
        };

        /*
            Here, can we handle imf_sample in a background thread?
            e.g. handle_tx.send(Wrapper(imf_sample));
            That way we can wait for the next frame as soon as possible.
        */
    }
}

/// The handler thread: convert each sample into a `Buffer` and forward it.
pub fn sample_handle_thread(handle_rx: mpsc::Receiver<Wrapper>, tx: mpsc::Sender<Buffer>) {
    while let Ok(Wrapper(imf_sample)) = handle_rx.recv() {
        /*
            handle the imf_sample and produce a `Buffer`
        */
        let _ = tx.send(buffer);
    }
}

This way, users obtain each `Buffer` by receiving from `rx`.
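To make the idea concrete, here is a minimal, runnable sketch of the same pattern using only the standard library. The types `RawSample` and `Buffer` are stand-ins I made up for the real `IMFSample` and nokhwa `Buffer`; the point is only to show the capture thread handing work off through channels so it can return to waiting for the next frame immediately:

```rust
use std::sync::mpsc;
use std::thread;

// Stand-ins for the real types (IMFSample, nokhwa's Buffer) -- assumptions
// for illustration only.
struct RawSample(u32);
struct Buffer(u32);

// Capture loop: grab samples as fast as possible and hand them off
// immediately, so per-frame work on this thread stays minimal.
fn capture(raw_tx: mpsc::Sender<RawSample>, frames: u32) {
    for i in 0..frames {
        // In the real code this would be the MFCreateSample / read path.
        raw_tx.send(RawSample(i)).expect("handler thread gone");
    }
}

// Handler thread: does the (comparatively slow) conversion to `Buffer`
// off the capture thread, then forwards to the user.
fn sample_handle_thread(raw_rx: mpsc::Receiver<RawSample>, tx: mpsc::Sender<Buffer>) {
    while let Ok(sample) = raw_rx.recv() {
        tx.send(Buffer(sample.0)).expect("user receiver gone");
    }
}

fn main() {
    let (raw_tx, raw_rx) = mpsc::channel();
    let (tx, rx) = mpsc::channel(); // what `stream_on` would hand back to the user

    let handler = thread::spawn(move || sample_handle_thread(raw_rx, tx));
    let capturer = thread::spawn(move || capture(raw_tx, 5));

    // User side: receive decoded `Buffer`s from `rx`.
    let got: Vec<u32> = rx.iter().map(|b| b.0).collect();
    println!("received {} buffers", got.len());

    capturer.join().unwrap();
    handler.join().unwrap();
}
```

When the capture sender is dropped the handler loop ends, which in turn closes `rx` on the user side, so shutdown falls out of the channel semantics for free.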

@wangxiaochuTHU
Author

Hello, the frame-drop problem in my case was finally resolved by setting the priority of the capture thread to "Time Critical" (this is on Windows). With this change, the drop ratio fell from about 500 out of 10,000 frames to roughly 25 out of 10,000.

Actually, my use case was to capture 5000x5000 video at 5 Hz and then display, process, and save the images. I found the Windows camera-capture process/WinAPI hard to understand; thanks to your crate, I achieved this goal much more easily and didn't need to switch to Linux. Thank you very much, it is really a great project.
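For anyone landing here with the same problem: the priority change described above can be sketched roughly like this. This is an assumption about how one might do it (nokhwa itself does not expose thread priority), using the `winapi` crate's `SetThreadPriority`, called from whichever thread runs the capture loop:

```rust
// Sketch: raise the current (capture) thread to TIME_CRITICAL on Windows.
// Assumes the `winapi` crate with the `processthreadsapi` and `winbase`
// features enabled; this is illustrative, not nokhwa API.
#[cfg(windows)]
fn make_capture_thread_time_critical() -> bool {
    use winapi::um::processthreadsapi::{GetCurrentThread, SetThreadPriority};
    use winapi::um::winbase::THREAD_PRIORITY_TIME_CRITICAL;
    // SetThreadPriority returns nonzero on success.
    unsafe { SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_TIME_CRITICAL as i32) != 0 }
}

#[cfg(not(windows))]
fn make_capture_thread_time_critical() -> bool {
    // No-op off Windows; a cross-platform port might adjust pthread
    // scheduling policy instead.
    false
}

fn main() {
    let ok = make_capture_thread_time_critical();
    println!("time-critical priority requested, success: {}", ok);
}
```

Note that TIME_CRITICAL is a heavy hammer; it keeps the scheduler from preempting the capture thread between frames, which is why it helps here, but it can starve other threads if the capture loop ever does more than a few microseconds of work per frame.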
