Greener Pastures: Greater Sustainability via Just-in-Time and On-Demand Video Streaming


Most of us are taught to turn off the lights when we leave a room to save energy—and money, for that matter. We should apply the same thinking to many other aspects of our lives, including video streaming. Why, you might ask? Because video will make up an eye-watering 82% of all internet traffic this year, according to Cisco’s Visual Networking Index. With streaming services carrying a far higher carbon footprint than linear delivery, it’s time to find a more sustainable way to stream.

As with most matters related to sustainability, there is no silver bullet, but we believe that innovation and creative thinking addressing the various elements of the streaming ecosystem will bring great improvements.

One approach that is gaining traction is architecting a completely new model of video delivery. The current just-in-case model is past its sell-by date. It processes video for every possible flavor of an OTT stream—multiple profiles, packaging and encryption—and then transports it to sit in the CDN regardless of whether anyone will ever watch it. It’s just like leaving the lights on in every room at home, regardless of where they might be needed.

To address this waste, we need to turn the traditional push model on its head. Instead of "just-in-case," we need to pivot to a "just-in-time" approach, which provisions only the resources required to deliver a specific piece of video content when there is actual demand for it. So, if no one is watching a channel, the system completely frees up its resources—across CPU, network and storage—reducing energy usage significantly.
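To make the contrast concrete, here is a minimal sketch of the pull model in Python. The names (JustInTimePackager, transcode_segment, spin_up and so on) are hypothetical rather than any vendor's API; the point is simply that work happens only when a viewer actually requests a segment, and channels with no demand hold no resources.

```python
import time

class JustInTimePackager:
    """Toy model of pull-based, just-in-time delivery.

    Nothing is transcoded or packaged until a viewer asks for it, and
    channels with no recent demand release their resources entirely.
    """

    IDLE_TIMEOUT = 60  # seconds without requests before a channel is torn down

    def __init__(self):
        self.active_channels = {}  # channel_id -> timestamp of last request

    def handle_request(self, channel_id, profile):
        # Work happens only on demand: spin the channel up on its first request.
        if channel_id not in self.active_channels:
            self.spin_up(channel_id)
        self.active_channels[channel_id] = time.time()
        return self.transcode_segment(channel_id, profile)

    def reap_idle_channels(self):
        # Free CPU, network and storage for channels nobody is watching.
        now = time.time()
        for channel_id, last_seen in list(self.active_channels.items()):
            if now - last_seen > self.IDLE_TIMEOUT:
                self.tear_down(channel_id)
                del self.active_channels[channel_id]

    # Placeholders standing in for real transcoding and infrastructure calls.
    def spin_up(self, channel_id): ...
    def tear_down(self, channel_id): ...
    def transcode_segment(self, channel_id, profile):
        return f"segment for {channel_id} ({profile})"
```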

As well as cutting CO2, just-in-time technology provides time-to-market and cost advantages over existing cloud approaches by ensuring that every deployed resource has a purpose. It also uses spot instances that take advantage of spare cloud capacity at a fraction of the typical operational cost, load-balancing the current workload to an over-provisioned instance while the preemptible instance is drained. For long-tail content, for example, just-in-time technology reduces cloud costs by up to 67%, according to Quortex.
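To illustrate the spot-instance idea, the sketch below polls for a preemption notice and, once one arrives, hands the current workload over to an already-provisioned fallback instance before this one is reclaimed. The migrate_workload and shutdown callables are hypothetical placeholders, and while the metadata URL mirrors the one AWS uses for spot interruption notices, treat the whole thing as a rough sketch rather than production drain logic.

```python
import time
import urllib.request
import urllib.error

# AWS advertises a pending spot interruption at this metadata path; other clouds
# offer equivalents. Treat the exact URL and timings as illustrative.
SPOT_NOTICE_URL = "http://169.254.169.254/latest/meta-data/spot/instance-action"

def preemption_pending():
    """Return True if the cloud has scheduled this spot instance for reclaim."""
    try:
        with urllib.request.urlopen(SPOT_NOTICE_URL, timeout=1) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False  # no notice yet, or not running on a spot instance

def run_on_spot(migrate_workload, shutdown, poll_seconds=5):
    """Keep working on cheap spot capacity and drain gracefully when preempted."""
    while not preemption_pending():
        time.sleep(poll_seconds)  # normal transcoding/packaging work happens here
    # Interruption notice received: load-balance the live channels onto an
    # already-provisioned on-demand instance, then let this instance drain.
    migrate_workload()
    shutdown()
```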

Seems like a no-brainer, but how is just-in-time streaming possible?
Synamedia Quortex’s patented multi-tenant SaaS technology builds video streams on the fly, based on end users’ requirements and matched to their locations, devices and time zones. It quickly adapts to fluctuating audience demand, unpredictable network traffic, and infrastructure context and limitations. It also automatically scales cloud resources up and down while maintaining the best quality of experience.

This means that video content is delivered only when and how a viewer wants it, and in premium quality.
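The scaling side can be pictured with an equally simple control loop. This is a generic sketch, not Quortex’s actual implementation: it derives a target number of transcoding workers from the number of active viewer sessions and nudges a hypothetical worker pool toward that target.

```python
import math

SESSIONS_PER_WORKER = 200  # illustrative capacity assumption, not a real figure

def target_worker_count(active_sessions_by_rendition):
    """Derive how many transcoding workers current demand actually needs."""
    total_sessions = sum(active_sessions_by_rendition.values())
    if total_sessions == 0:
        return 0  # nobody watching: run nothing at all
    return math.ceil(total_sessions / SESSIONS_PER_WORKER)

def reconcile(pool, active_sessions_by_rendition):
    """Scale the worker pool toward demand (scale_up/scale_down are hypothetical)."""
    target = target_worker_count(active_sessions_by_rendition)
    if target > pool.size:
        pool.scale_up(target - pool.size)
    elif target < pool.size:
        pool.scale_down(pool.size - target)
```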

Can you measure video streaming sustainability today?
The CO2 measurement of streaming has to consider the impact of each element required to deliver the stream, including the CDN, the player, the viewing device, transcoding, the cloud provider and others. This presents many challenges. Cloud providers, for example, are committed to using a significant amount of renewable energy in the future, but they do not provide exact figures on their current consumption.
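Even without exact vendor figures, a first-order estimate can be structured as a simple sum over the delivery chain: the energy attributed to each component, multiplied by the carbon intensity of the electricity that powers it. The figures below are placeholders chosen only to show the shape of the calculation, not measurements.

```python
# First-order CO2 estimate for delivering one hour of streaming to one viewer.
# All figures are illustrative placeholders, not measured values.
components_kwh = {
    "transcoding": 0.010,     # cloud encode share per viewer-hour
    "cdn": 0.015,             # edge caches and origin servers
    "network": 0.030,         # fixed/mobile access network
    "viewing_device": 0.060,  # TV, phone or laptop
}

GRID_INTENSITY_KG_PER_KWH = 0.4  # depends heavily on the local energy mix

total_kwh = sum(components_kwh.values())
total_co2_kg = total_kwh * GRID_INTENSITY_KG_PER_KWH

print(f"~{total_kwh:.3f} kWh and ~{total_co2_kg * 1000:.0f} g CO2 per viewer-hour")
```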

At the same time, they want all available resources operating efficiently 24/7. That may sound like the opposite of saving energy, but it is precisely this drive for maximum efficiency that has led cloud providers to build custom cooling systems, automate the scaling of physical resources and make other technology improvements that result in greater energy conservation.

For a deeper dive, let’s examine one part of the ecosystem—the CDN. There are several challenges when it comes to quantifying CDN energy usage. One of the factors to consider is how energy consumption of different technologies is measured, as there is no standard in the industry. 

Another factor is that in traditional environments, servers could be turned off to lower energy costs. However, now that servers are shared internally (multi-tenancy) and externally (connected over the internet), they rely on a network that cannot be shut down.

Lastly, some content is considered more essential than other content and has to be prioritized. Natural disaster warnings, for example, are traditionally stored on an edge server. Other content can be served from the cache rather than downloaded every time it is needed, reducing infrastructure, energy and costs, although the cache consumes energy as well.
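That trade-off can be pictured with a toy cache: every hit avoids another trip to the origin (and the network energy behind it), while the cache itself still has to stay powered. The sketch below is purely illustrative.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU edge cache that counts how many origin fetches it avoids."""

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, key, fetch_from_origin):
        if key in self.store:
            self.store.move_to_end(key)    # refresh LRU position
            self.hits += 1                 # origin fetch (and its energy) avoided
            return self.store[key]
        self.misses += 1
        value = fetch_from_origin(key)     # costly path: back to the origin
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False) # evict the least-recently-used item
        return value
```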

A greener future
“The cheapest and greenest energy is the energy we don’t use,” according to Jürgen Fischer, President of Danfoss Climate Solutions. By moving from the traditional push model to a pull model with just-in-time processing, video delivery has the potential to consume less energy and cost less. Luckily, there are great organizations out there, such as Greening of Streaming, that are focused on helping create unified thinking around end-to-end energy efficiency in the streaming supply chain.

As we carry on turning off the lights when we leave a room, let’s continue to think of ways we can apply this philosophy to other areas of our lives—like on-demand and just-in-time video streaming—to conserve energy and cut costs, for a greener, brighter future.

If you would like to read more about the facts and figures on energy for streaming, then check out this white paper.

Marc Baillavoine