
What is Low Latency Video Streaming?

Published on Apr 27, 2021 | Video Streaming - Webcasting

Latency in video streaming is the delay between the images you capture with your camera and what appears on your audience’s screens. This delay interferes with the live streaming experience, which can negatively impact viewership and ratings. Here we look at low latency in video streaming and why it is the gold standard for live video streaming.

Latency in Video Streaming

As mentioned above, latency refers to the timing gap between the video you capture and the images your audience sees. The higher the latency on your live video streams, the less authentic and “live” your audience’s experience feels. Delays such as those often seen on live news interviews can be confusing not only for the viewers but also for the interviewer and interviewee.

As well, when watching live sports events, for example, viewers experience the excitement at a time lag, which can interfere with their perception of the event. Even 20 seconds can make a huge impact on the viewer experience. Imagine a sportscaster in the studio commenting on a tackle while, on the field, the ref is calling a foul. Or, in the case of a live call-in show, the broadcaster is receiving questions but answering one question behind. These examples show how easily latency in video streaming can interfere with the quality of broadcast you provide.

Low Latency in Video Streaming

Low latency in video streaming reduces the lag between capturing live video and when it is viewed. This is why low latency is the ultimate goal for live stream productions: the stream is truly live, with little to no delay.

Why Does Latency Occur?

Latency can occur due to many contributing factors including:

  • Encoder configuration and poor output signal quality that require media to be processed in small chunks before sending it
  • The type of connection such as shared wireless versus a leased line set-up from your own studio
  • Propagating content between different caches
  • Audience connection quality or location
  • Poorly optimized buffer configuration

Experiencing just one of these issues can contribute to high latency. A combination of issues can prove devastating to the quality of your video feed.
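One of the causes above, audience location, comes down to simple physics: signals in optical fiber travel at roughly 200,000 km/s (about two-thirds the speed of light in a vacuum). A minimal sketch of this baseline delay, using illustrative distances and ignoring routing hops, queuing, and cache propagation:

```python
# Physical distance alone adds latency: light in optical fiber travels at
# roughly 200,000 km/s. This estimates one-way propagation delay only,
# ignoring routing, queuing, and CDN hops (illustrative figures).

FIBER_KM_PER_S = 200_000

def propagation_ms(distance_km: float) -> float:
    """Approximate one-way propagation delay in milliseconds."""
    return distance_km / FIBER_KM_PER_S * 1000

print(propagation_ms(100))     # same-region viewer: ~0.5 ms
print(propagation_ms(10_000))  # intercontinental viewer: ~50 ms
```

Propagation delay is a floor you cannot tune away, which is one reason CDNs cache content close to the audience.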

Different Industry Approaches to Latency

There are a number of things to consider when trying to reduce latency. The main consideration is how much latency is acceptable for your particular set-up. Some common industry approaches include:

  • Typical latency: If you have a linear broadcast that is in no way time-sensitive, you probably only need a standard HLS or MPEG-DASH setup. As long as there is zero interaction required, this should suffice.
  • Reduced latency: This would be your typical live streaming set-up for news and sports events. Here, an acceptable latency reduction can be achieved by tuning your HLS and MPEG-DASH streams, reducing segment size and scaling up your delivery infrastructure.
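To see why segment size matters for HLS and MPEG-DASH, consider a common rule of thumb: players buffer roughly three segments before playback, so end-to-end latency scales with segment duration. A minimal sketch, assuming a three-segment buffer and an illustrative two seconds of encode/delivery overhead (both numbers are assumptions, not protocol constants):

```python
# Rough glass-to-glass latency estimate for segmented streaming (HLS/MPEG-DASH).
# Assumption: the player buffers about three segments before starting playback,
# plus a fixed encode-and-delivery overhead. Figures are illustrative only.

def estimate_latency(segment_seconds: float,
                     buffered_segments: int = 3,
                     encode_and_cdn_overhead: float = 2.0) -> float:
    """Return an approximate end-to-end latency in seconds."""
    return segment_seconds * buffered_segments + encode_and_cdn_overhead

print(estimate_latency(6.0))  # 6 s segments -> ~20 s ("typical latency" tier)
print(estimate_latency(2.0))  # 2 s segments -> ~8 s ("reduced latency" tier)
```

Under these assumptions, cutting segments from six seconds to two takes latency from around 20 seconds down to single digits, which is exactly the tuning the reduced-latency tier describes.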

The main challenges lie in situations with heavy interaction, where things must stay as close to real-time as possible. This includes interviews, question-and-answer sessions, and other situations involving two-way conferencing and communication.

How to Achieve Low Latency During Video Streaming

To achieve low latency in live interaction scenarios, you must consider the following:

  • Bandwidth: You need adequate bandwidth to avoid latency issues. The more data you need to capture and send, the more bandwidth you need. If your available bandwidth is less than 10 Gbps, you can either increase it to 10 Gbps or compress the video signal.
  • Distance: To minimize latency, reduce the distance between the transmitter and the receiver, so the user cannot perceive a delay between the input device and the output device.
  • Hardware and Software: It is never enough to rely on compression software alone. Instead, to avoid increased latency caused by CPU loads and memory transfers, you need the right combination of software and hardware to reduce buffering time and minimize latency.
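The bandwidth point above can be made concrete with back-of-the-envelope arithmetic for uncompressed video, which is where figures like 10 Gbps come from. A minimal sketch, assuming 10-bit 4:2:2 sampling (an average of 20 bits per pixel) and ignoring blanking and protocol overhead:

```python
# Back-of-the-envelope bandwidth for uncompressed video, illustrating why
# links below 10 Gbps may force you to compress the signal. Assumes 10-bit
# 4:2:2 sampling (~20 bits per pixel); ignores blanking/protocol overhead.

def uncompressed_gbps(width: int, height: int, fps: int,
                      bits_per_pixel: int = 20) -> float:
    """Approximate uncompressed video bitrate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"1080p60: {uncompressed_gbps(1920, 1080, 60):.2f} Gbps")  # ~2.49
print(f"2160p60: {uncompressed_gbps(3840, 2160, 60):.2f} Gbps")  # ~9.95
```

Uncompressed 4K60 sits just under 10 Gbps, so on anything smaller than a 10 Gbps link the only option is to compress the signal, trading some encode latency for a fraction of the bandwidth.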

Regardless of the type of video streaming scenario you face, the bottom line is that achieving low latency is the gold standard all broadcasters should aim for. It improves the quality of the viewer experience and avoids video that quickly loses its potency in live stream situations. When real-time is the goal, low latency is the only option for providing quality broadcasts that meet the expectations of your viewers.

For a quote, questions, or other inquiries, contact VidOvation today.
