SRT latency: does it need to be set on one side or both? #3106

Open
salvaaod opened this issue Jan 15, 2025 · 3 comments
Labels
Type: Question Questions or things that require clarification

Comments

@salvaaod

I have not yet found a definitive answer to this. The general advice is to set the latency on both sides, but I have a system in which the SRT listener, a hardware IP camera, does not offer a latency setting, while the caller, a software receiver, does.

Is the SRT latency applied if it is set only on the caller side?

Thanks for your help.

Salva

@ethouris
Collaborator

The setting on both sides matters. The effective latency for the connection is the maximum value of the latency settings on both sides (although separately per direction). The default value is 120ms, so if you set a bigger value on one side (doesn't matter which one), it will usually suffice.

@salvaaod
Author

> The setting on both sides matters. The effective latency for the connection is the maximum value of the latency settings on both sides (although separately per direction). The default value is 120ms, so if you set a bigger value on one side (doesn't matter which one), it will usually suffice.

So, if in this case I cannot set a latency on the TX side (camera, listener) but I can set a latency target on the RX side (monitor, caller), what will the behavior be?

Salva

@ethouris
Collaborator

Meaning, the only device you have access to for applying the setting is the monitor (which is the receiver)?

If you set, say, a 250 ms latency on the monitor, you'll get 250 ms of latency (since 250 > 120). If you set less than 120 ms, you'll get 120 ms. Note that this holds only as long as the camera's SRT application doesn't somehow override the 120 ms default.

The latency can be set differently per direction. If you want to set it in only one direction, use SRTO_RCVLATENCY on the receiver and SRTO_PEERLATENCY on the sender. Usually you use only one direction, and therefore don't care which one you configure, so you can set both directions at once with the SRTO_LATENCY option.
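A sketch of how these three options relate, based on the description above. This models the option semantics only (again not the real C API, where you would call `srt_setsockopt` on a socket):

```python
class SrtSocketLatencyModel:
    """Model of an SRT socket's latency options, in milliseconds.

    SRTO_RCVLATENCY governs the direction this socket receives,
    SRTO_PEERLATENCY governs the direction the peer receives, and
    SRTO_LATENCY is shorthand that sets both to the same value."""

    def __init__(self):
        self.rcvlatency = 120  # SRTO_RCVLATENCY default
        self.peerlatency = 0   # SRTO_PEERLATENCY default

    def set_rcvlatency(self, ms):
        self.rcvlatency = ms

    def set_peerlatency(self, ms):
        self.peerlatency = ms

    def set_latency(self, ms):
        """Models SRTO_LATENCY: configures both directions at once."""
        self.rcvlatency = ms
        self.peerlatency = ms

# On the monitor (receiver), one call covers both directions:
monitor = SrtSocketLatencyModel()
monitor.set_latency(250)
print(monitor.rcvlatency, monitor.peerlatency)  # 250 250
```

Since the effective latency per direction is the maximum of the two sides' settings, setting 250 ms via SRTO_LATENCY on the monitor alone raises the camera-to-monitor latency to 250 ms even though the camera stays at its defaults.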
