
What’s Holding Back Low-Latency Streaming?

(Image credit: Ateme)

You’re watching a major soccer final, live streamed to your main screen, when your neighbors cheer for a goal 40 seconds before you see it. This is the classic, borderline-clichéd conundrum, routinely trotted out to illustrate how the user experience is spoiled when viewing live content over the top (OTT).

The real question is, why has this problem not yet been tackled?

Despite the growing popularity of streaming services, reducing OTT latency remains a challenge for content providers, broadcasters and service providers, and a major source of frustration for audiences. Is there a way to efficiently deliver lower latency at a large scale while preserving visual quality? Is low latency realistically achievable today in production?

The answer is an emphatic yes! Low-latency OTT is no longer theoretical: it is real.

The trouble with latency
Latency is introduced by the different stages required to process and deliver video content within the streaming pipeline. Some components are already operating very close to their physical limit, i.e., the speed of light, with sub-millisecond processing delays at the sending and the receiving end. Other components, such as the video encoder, come in different versions, many of which include low-latency options to reduce the delay between the last input pixel and the first encoded byte. 

Typical broadcast linear stream delay ranges anywhere from five to ten seconds. In contrast, OTT multi-screen video delivery has historically been anywhere from 30 seconds to over 60 seconds, depending on the viewing device and video workflow used. 
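The gap comes mostly from buffering full segments at each stage rather than from any single slow component. A back-of-the-envelope budget makes this concrete (all stage figures below are illustrative assumptions, not measurements; real values vary widely with encoder settings, segment length, CDN, and player policy):

```python
# Illustrative glass-to-glass latency budgets for a streaming pipeline.
# Every number here is an assumption for illustration only.

TRADITIONAL_OTT = {
    "encode": 2.0,           # encoder look-ahead and GOP buffering (seconds)
    "package": 6.0,          # wait for a complete 6-second segment
    "origin_and_cdn": 2.0,   # upload, cache fill, edge delivery
    "player_buffer": 18.0,   # players often buffer ~3 full segments
}

LOW_LATENCY_OTT = {
    "encode": 0.5,           # low-latency encoder mode
    "package": 1.0,          # CMAF chunks pushed before the segment completes
    "origin_and_cdn": 0.5,   # chunked transfer through the CDN
    "player_buffer": 2.0,    # much smaller player buffer
}

def end_to_end(stages):
    """Total delay is simply the sum of the per-stage delays."""
    return sum(stages.values())

print(f"traditional OTT: ~{end_to_end(TRADITIONAL_OTT):.0f} s")
print(f"low-latency OTT: ~{end_to_end(LOW_LATENCY_OTT):.0f} s")
```

The point of the sketch is that the player buffer and segment-sized packaging dominate the budget, which is why chunked delivery and smaller buffers, not faster encoders alone, close the gap with broadcast.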

Latency is a critical issue, both for the viewer in terms of Quality of Experience (QoE) and for the streaming provider in terms of monetizing the investment. The challenge for the industry is to reduce this latency to a range closer to linear broadcast latency; however, achieving very low latency at the expense of picture quality won’t work either.

The aim must be to achieve low-latency streaming on par with broadcast while advancing the audio and picture quality to exceed standard broadcast.

Low latency for both DASH and HLS using existing workflows
Achieving low latency traditionally required investing in separate dedicated workflows: one for traditional OTT and another for low latency, both requiring outputs for HLS (principally for iOS devices) and MPEG-DASH. This meant a considerable upfront investment and additional complexity.

It has taken several years and several iterations for the chief players in each camp (DASH-IF and Apple) to devise a low-latency solution that would be interoperable among devices of both parties.

In 2020, Apple’s release of a new version of LL-HLS finally simplified the integration of the technology over a wide range of video players. It meant that the entire market could, for the first time, take advantage of Just-in-Time (JIT) packaging for both LL-DASH and LL-HLS, making it possible for operators to achieve low latency using existing workflows. 
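For a sense of what this looks like on the wire, an LL-HLS media playlist advertises sub-segment “parts” alongside complete segments. The tag names below come from Apple’s published LL-HLS specification; the durations, sequence numbers, and file names are purely illustrative:

```
#EXTM3U
#EXT-X-TARGETDURATION:4
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
#EXT-X-PART-INF:PART-TARGET=0.333
#EXT-X-MEDIA-SEQUENCE:266
#EXTINF:4.0,
segment266.m4s
#EXT-X-PART:DURATION=0.333,URI="segment267.0.m4s"
#EXT-X-PART:DURATION=0.333,URI="segment267.1.m4s"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment267.2.m4s"
```

Because the player can request sub-second parts (and block on the reload until the next one is ready) instead of waiting for a full 4-second segment, its buffer, and therefore its latency, shrinks accordingly.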

Where operators were previously reluctant to invest in technologies that would only serve one-half of the market, now with a JIT Packager, low-latency streaming is ready to take off. Since content is only packaged at the request of the subscriber, processing and storage requirements are reduced. This equates to fewer servers and lower energy consumption, leading to lower operational costs and an improved environmental footprint.

To make the JIT Packager even more efficient in this workflow, the Common Media Application Format, otherwise known as CMAF, can be used by the packager, both for input from the encoder and for output to the CDN. The advantage of using CMAF is that it maintains the highest quality while also optimizing the streaming workflow, since both DASH and HLS can take the same format of video and audio chunks from the packager.
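The storage saving follows directly from this: one stored copy of each CMAF chunk can be wrapped for either protocol at request time. The sketch below is a deliberately minimal, hypothetical model of that idea; a real packager deals with fMP4 boxes, DRM, and manifest timing, none of which is modeled here:

```python
# Minimal sketch of just-in-time packaging over shared CMAF chunks.
# Class and method names are hypothetical, for illustration only.

class JITPackager:
    def __init__(self):
        self.chunks = {}    # one copy of each CMAF chunk, shared by all formats
        self.packaged = 0   # count of on-demand packaging operations

    def ingest(self, chunk_id, data):
        """Store a CMAF chunk exactly once, regardless of output format."""
        self.chunks[chunk_id] = data

    def serve(self, chunk_id, fmt):
        """Package a chunk only when a subscriber actually requests it."""
        if fmt not in ("dash", "hls"):
            raise ValueError(f"unknown format: {fmt}")
        self.packaged += 1
        # Both DASH and HLS reference the same underlying media bytes,
        # so storage is not duplicated per format.
        return {"format": fmt, "media": self.chunks[chunk_id]}

p = JITPackager()
p.ingest("seg1-chunk0", b"...cmaf bytes...")
dash = p.serve("seg1-chunk0", "dash")
hls = p.serve("seg1-chunk0", "hls")
assert dash["media"] is hls["media"]   # one stored copy serves both formats
```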

Moreover, streaming service providers can leverage the same workflow for more typical OTT services—including time-shifted TV, start-over capabilities, and VOD—on any device. Viewers can therefore enjoy a range of high-quality experiences.

Every step matters
Packaging is critical to achieving low latency, but it is not the only component in the video delivery chain that can introduce latency. Each step needs to be considered for further optimization, from encoders to CDN, and finally the player at the consumer device. Driving down OTT latency requires an end-to-end strategy, with every stage playing its role.

The aim must be to minimize latency while maximizing video quality, even with 4K HDR content. For example, the encoder needs to ingest source streams and produce the right size of chunks and segments that can then be uploaded to the origin server for delivery.

Scalability
It is also necessary that the CDN can scale easily and quickly. This is not just a matter of bandwidth capacity but of having the functionality to manage a high volume of simultaneous streaming sessions.

Operators need to increase streaming capacity quickly during prominent events that generate traffic peaks. This is where efficient CDNs often use lightweight caching applications that can spin up quickly to add capacity and be turned off to free the resources when no longer needed.
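A toy scaling rule illustrates the shape of that elasticity. The capacity figure and the rule itself are illustrative assumptions; real CDNs scale on richer signals such as egress bandwidth, CPU, and cache hit rate:

```python
# Toy autoscaling rule for lightweight edge caches (illustrative only).

SESSIONS_PER_CACHE = 10_000   # assumed capacity of one cache instance
MIN_CACHES = 2                # keep a floor for redundancy

def desired_caches(active_sessions):
    """Scale out for traffic peaks; scale in (freeing resources) afterwards."""
    needed = -(-active_sessions // SESSIONS_PER_CACHE)  # ceiling division
    return max(MIN_CACHES, needed)

print(desired_caches(3_000))     # quiet period: stays at the floor of 2
print(desired_caches(250_000))   # major live event: scales out to 25
```

Because each cache is a lightweight, stateless application rather than a provisioned appliance, the fleet can follow a kick-off-to-final-whistle traffic curve and release the resources, and their energy cost, in between.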

There are solutions that support all these functionalities at a level of performance that enables low-latency deployment at scale, relying on fully virtualized software technology that provides the elasticity needed to scale rapidly.

Improving monetization and user experience
Reducing latency for viewers opens several avenues for the streaming service to increase revenue.

Managing Dynamic Ad Insertion (DAI) with low-latency streams is one important challenge that many operators are interested in solving.

It’s easy to see why. Events where low latency is highly relevant, such as live sports, attract large audiences and, consequently, high-value ad inventory. Being able to insert advertising on a geographic, demographic, postcode or personalized basis on the fly would generate even greater monetization opportunities for the broadcaster.

The key here is Manifest Conditioning—the ability in the Origin packager to indicate where an ad break starts and finishes. Enabling Manifest Conditioning for low latency channels is another active R&D focus of industry leaders today.
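In HLS, conditioning commonly means inserting ad-break marker tags around the affected segments. The sketch below uses the widely deployed (de facto) `EXT-X-CUE-OUT`/`EXT-X-CUE-IN` tags; exact signaling varies by ecosystem (SCTE-35 markers carried in `EXT-X-DATERANGE`, for example), and the function and playlist here are simplified illustrations, not a production conditioner:

```python
# Sketch of manifest conditioning: mark where an ad break starts and ends
# in an HLS media playlist so a downstream DAI system can splice ads.

def condition_manifest(lines, first_ad_seg, last_ad_seg, duration):
    """Insert cue-out/cue-in markers around the named segment URIs."""
    out = []
    for line in lines:
        if line == first_ad_seg:
            out.append(f"#EXT-X-CUE-OUT:DURATION={duration}")  # break starts
        out.append(line)
        if line == last_ad_seg:
            out.append("#EXT-X-CUE-IN")                        # break ends
    return out

playlist = ["#EXTM3U", "seg1.m4s", "seg2.m4s", "seg3.m4s"]
conditioned = condition_manifest(playlist, "seg2.m4s", "seg2.m4s", 30)
```

The low-latency complication is timing: with sub-second parts and blocking playlist reloads, these markers must land in the manifest far enough ahead of the splice point for the ad decision server to respond, which is precisely what makes conditioning low-latency channels an open engineering focus.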

Advertising also offers the ability to insert product placements directly into streams: this can serve a pure advertising angle, promoting brands to gain ad revenue, but it can also serve a shopping angle, turning placements into new revenue opportunities.

Another path for monetization comes in the form of sports betting, which is seen as a major growth segment. Here, latency is critical, and being able to achieve delays close to five seconds should allow streaming providers to address this market.

Conclusion
Delivering low-latency live streams is essential for high-value content such as live sports. The technology is primed and ready for OTT providers to offer audiences the quality of experience and services they expect from broadcast TV—at scale.

Dave Brass is VP NA Strategy & Market Development at ATEME