Delivering High Quality, Low Latency Video
Across the Internet

Video latency can have a huge impact on how viewers
experience live video.

A broadcaster’s cord-cutting customer watches their
favorite sports team live online, but learns what happened
from social media before seeing it in the supposedly “live”
video stream. Talk about spoilers!
Still, this situation is all too common. Broadcasting live video
to consumers uses satellites, the public internet, streaming
formats such as DASH and HLS, and CDNs, all of which
contribute to latency levels that can exceed several minutes.
Even in an end-to-end dedicated IP network, low-latency live
video delivery via peer-to-peer, unicast, or multicast can be
challenging if systems are not designed correctly.
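To see where those minutes come from, consider segmented streaming: a player typically buffers several segments before playback begins, so glass-to-glass delay scales with segment duration. The sketch below is a rough back-of-the-envelope estimate; the segment duration, buffer depth, and encode/CDN figures are illustrative assumptions, not measurements from any real deployment.

```python
def segmented_streaming_latency(segment_s, buffered_segments, encode_s, cdn_s):
    """Rough glass-to-glass latency estimate for segmented streaming
    (HLS/DASH). Player-side delay is dominated by segment duration
    multiplied by the number of segments buffered before playback."""
    return encode_s + cdn_s + segment_s * buffered_segments

# Illustrative: 6-second segments, a player that buffers 3 segments,
# ~2 s for encoding/packaging, ~1 s through the CDN edge.
latency = segmented_streaming_latency(6.0, 3, 2.0, 1.0)
print(f"{latency:.0f} s")  # prints "21 s"
```

Even with these optimistic assumptions the delay lands around 20 seconds, which is why segment-based delivery alone cannot meet interactive targets.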
Video system designers need to overcome these challenges
in many different real-world scenarios, including the following:
• For broadcast contribution, broadcasters want backhaul
solutions that can use public internet connections to
transport live remote video and on-the-ground coverage
into studio environments, where it can be seamlessly
integrated into any production, including live interviews,
commentary, and online content.
• Executives with global organizations want to deliver
smooth, seamless interactive “live” internal all-hands
and town hall presentations with Q&A to every employee
within their offices and at home.
• For gaming, off-track betting, and other private
broadcasts, low video streaming latency is critical to
ensure interactivity and real-time delivery.
In each of these scenarios, end-to-end latency must remain
very low to ensure that interactions are fluid and video
content is delivered in a timely manner. Remote conversations
require less than 150 milliseconds of end-to-end latency;
real-time broadcasts such as off-track betting typically
demand 4–6 seconds or less.
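The 150-millisecond conversational target leaves very little headroom once each pipeline stage takes its share. The budget below is purely hypothetical; every stage value is an illustrative assumption for a tuned low-latency pipeline, not a benchmark.

```python
# Hypothetical glass-to-glass latency budget, in milliseconds.
# All stage values are illustrative assumptions.
BUDGET_MS = {
    "capture": 17,        # roughly one frame at 60 fps
    "encode": 50,         # low-latency encoder settings
    "transport": 40,      # one-way network path
    "jitter_buffer": 30,  # receiver-side smoothing
    "decode_render": 20,  # decode plus display
}

total_ms = sum(BUDGET_MS.values())
print(total_ms)            # prints 157
print(total_ms <= 150)     # conversational threshold: just missed
print(total_ms <= 4000)    # real-time broadcast threshold: easily met
```

Even this aggressive hypothetical budget slightly overshoots the conversational threshold, while clearing the broadcast threshold with room to spare, which is why conversational use cases demand scrutiny of every stage.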
Neither benchmark can be met with today’s segmented
streaming workflows used for general-purpose internet
streaming. Therefore, video system designers must consider
all the factors that contribute to the total delay from when an
event actually happens and is captured on camera to when a
viewer can see it on their screen (glass-to-glass).
In this article, we examine the key factors to consider when
delivering video over the public internet. These factors are
inherent to the stages of delivering low-latency video
glass-to-glass: encoding, decoding, and transport.
We’ll start by examining encoding and decoding, then
look at how the transport process can impact video
delivery, as it has its own complexities that should be
considered.