
Combatting Streaming's Latency Problem

(Image: watching sports. Credit: iStock)

What’s the biggest concern among streaming providers today? In a word: latency.

Simply defined, latency is the time it takes to get from here to there. In the context of streaming, it is the time between the moment content leaves the source and the moment it is played out for the intended audience. For sports, low latency is both desirable and necessary; nobody wants their neighbor to know about a play before they do, even if we are talking minutes.

(Image: The Sting. Credit: Universal)

An example of very bad latency appears in the 1973 film The Sting: the gambling house knew the results of the horse race before the bets were placed, which is neither good nor desirable.

Latency is the result of content having to pass through an array of devices on its way to the viewer. Consider a simple audio stream: first it is played out from the source, then the audio is processed and encoded along with its metadata. From there it travels through network switches and routers out to the internet.

Depending on your connection, the packets may make several more stops before reaching the CDN, which transcodes them and streams them over the audience’s network connections to their players. YES, THIS TAKES TIME!
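To see how those hops add up, here is a minimal sketch of a latency budget. The stage names follow the chain described above, but the millisecond values are purely illustrative assumptions, not measurements from any real deployment:

```python
# Illustrative end-to-end latency budget for a stream.
# Every per-stage delay below is an assumed example value.
STAGES_MS = {
    "capture_and_processing": 50,
    "encode_with_metadata": 300,
    "local_switches_and_routers": 20,
    "internet_transit_to_cdn": 100,
    "cdn_transcode_and_delivery": 2000,
    "player_buffer": 3000,
}

def total_latency_ms(stages: dict) -> int:
    """End-to-end latency is simply the sum of every hop's delay."""
    return sum(stages.values())

if __name__ == "__main__":
    for name, ms in STAGES_MS.items():
        print(f"{name:28s} {ms:>6d} ms")
    print(f"{'total':28s} {total_latency_ms(STAGES_MS):>6d} ms")
```

The point of the exercise is that no single stage dominates by accident: shaving latency means attacking the biggest line items, which in segmented streaming are usually the CDN packaging and the player buffer.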

The audience can hear the delay and will notice it, especially when comparing the stream to over-the-air content. The trick is to get latency down to an acceptable level.

Softvelum Low Delay Protocol
To lessen this inherent delay you can use SLDP (Softvelum Low Delay Protocol), a “last mile” delivery protocol. Whether the source is an RTMP, SRT, RTSP, NDI, MPEG-TS, HLS, Icecast or SHOUTcast stream, SLDP on the player side delivers the content to the audience with sub-second delay.

SLDP is supported by modern browsers that implement MSE (Media Source Extensions). The protocol is proprietary and must be decoded with the free HTML5 player or a dedicated mobile application; a custom mobile app experience can be created by subscribing to a mobile-specific SDK.
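Much of the delay a last-mile protocol like SLDP avoids comes from segmented delivery, where the player typically holds several full segments before starting playback. A rough comparison, using conventional segmented-streaming defaults and an assumed sub-second figure (neither is a published Softvelum number):

```python
def segmented_startup_latency_s(segment_duration_s: float,
                                buffered_segments: int) -> float:
    """Minimum delay contributed by a player that buffers
    `buffered_segments` full segments before starting playback."""
    return segment_duration_s * buffered_segments

# Classic segmented-streaming defaults: 6-second segments,
# three segments buffered before playback begins.
hls_delay = segmented_startup_latency_s(6.0, 3)

# A low-delay protocol streams frames as they arrive, so its
# delivery delay is sub-second (value assumed for illustration).
sldp_delay = 0.5

print(f"Segmented-style buffering: ~{hls_delay:.0f} s")
print(f"Low-delay-style delivery:  ~{sldp_delay:.1f} s")
```

Even before network transit is counted, the segmented player starts tens of seconds behind, which is why a purpose-built last-mile protocol can make such a large difference.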

The SLDP protocol also allows for synchronized playback across devices, ensuring that every member of your audience is viewing the same media at the same time. This can be incredibly important for second-screen usage at live events, or for any kind of real-time broadcast where both low latency and a consistent experience matter.

With sports as a key example again, imagine two viewers in a room together watching on their own devices, both seeing 1-2 seconds of delay, but with one about half a second ahead of the other. Each exciting play or devastating mistake is spoiled for the other viewer as the quicker of the two reacts first.

Synchronized low-latency delivery not only gives your audience a great experience compared to traditional over-the-air broadcast, but also preserves the shared experience that would otherwise be lost when viewing streamed content.
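One common way to implement this kind of synchronization (a general sketch, not Softvelum’s actual mechanism) is to have every player schedule each media timestamp against a shared wall clock, with a fixed delay large enough to absorb each client’s network jitter. All names and values below are hypothetical:

```python
# Sketch of wall-clock-based playback synchronization. Every
# client computes the same target play-out instant for a given
# media timestamp, so two viewers in one room stay aligned.
def scheduled_playout(media_ts_s: float,
                      stream_epoch_s: float,
                      fixed_delay_s: float) -> float:
    """Play media timestamp `media_ts_s` at: stream start +
    timestamp + a fixed delay shared by all clients."""
    return stream_epoch_s + media_ts_s + fixed_delay_s

# Two viewers with different network paths still converge on the
# identical play-out instant for the same frame.
epoch = 1_700_000_000.0   # hypothetical stream start (Unix time)
viewer_a = scheduled_playout(12.0, epoch, 1.5)
viewer_b = scheduled_playout(12.0, epoch, 1.5)
assert viewer_a == viewer_b   # same frame, same instant
```

The trade-off is that the fixed delay must cover the slowest client’s jitter, so perfect synchrony is bought with a small amount of added latency for everyone.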

Web Real-Time Communications
Another option is WebRTC (Web Real-Time Communications), which operates very similarly to SLDP. The issue with this Google-developed open-source solution, however, is that there is no standard implementation, so different services deploy it in different ways.

But WebRTC is fast; real-world latency can be below 500 ms. WebRTC is also supported by many browsers and is native to iOS.

According to Eduardo Martinez, Director of Technology for StreamGuys, the advantage of SLDP is the standardization of deployment.

"When you use a purpose-built protocol for ultra low latency streaming you can significantly cut down on the delay inherent in traditional segmented streaming protocols,” he said.

When it comes to streaming live events (mainly sports and breaking news), the audience will not tolerate high latency. In this world of multiple streams, the streamer does not want to be slower than an over-the-air broadcast. To quote Tom Petty, “the waiting is the hardest part.”

David Bialik

David Bialik is president of David K. Bialik & Associates and an AES Fellow.