
How Has the Global Lockdown Impacted Video Production Workflows?

"American Idol" went remote to finish its 18th season. (Image credit: LTN Global)

A number of major video productions have managed to continue successfully despite having their workflows disrupted by COVID-19 and the restrictions that lockdown measures enforce. One of the main reasons some producers have endured is that their teams have been able to adapt to working in the “new normal” by combining emerging technologies with time-tested IP transport solutions.

Yet these are not simply band-aid fixes; they are examples of an accelerating evolution in production workflows across the media and sports industries. This transition to IP infrastructure has exposed new capabilities not available through traditional satellite, and it is proving to be a framework on which to build solutions to the challenges facing the media industry today and in the future.

Reliably transporting video over the internet using intelligent networks, harnessing cloud solutions to enrich video acquisition and working with machine learning applications to curate content are just some of the ways media companies have adapted their production workflows.

IP-ENABLED AGILE PRODUCTIONS AND OVERCOMING THE LIMITATIONS OF THE PUBLIC INTERNET

This year, Fremantle North America, the producer behind “American Idol,” had to get very creative to keep season 18 afloat after suspending filming amid the most challenging production conditions in the show’s history. To comply with social distancing rules, American Idol Productions set up the remaining contestants with home production packages that used iPhones to film and transport video to a production facility over the internet.

The internet is increasingly becoming a prominent method of video transmission. However, IP-based transmissions can pose several challenges for media organizations because of the networks within which they operate. This is especially true for live and real-time video, which must be received reliably within a very low latency window. Handling real-time applications over the internet requires more processing than the internet's simple, best-effort core provides.


By using intelligent network architectures, media companies like Fremantle can gain universally accessible, secure and highly reliable video transport services. When combined with a fully managed service offering and technology to work around inevitable and frequent internet choke points, live video can be transmitted over an IP network to any location in the world with the same reliability and broadcast quality as satellite service.

ADVANCED CONTENT ACQUISITION CAPABILITIES

Over the years, the NFL Draft has become one of the NFL’s most fan-facing events. However, with the COVID-19 lockdown in place, it was no longer possible for the NFL to have its hyper-engaged fans physically present at the show. So the NFL staged this year’s Draft virtually, bringing in nearly 500 user-generated fan feeds.

As both “American Idol” and the NFL Draft demonstrated, consumer products like smartphones now carry cameras capable of filming high-quality content with automatic white balance and autofocus. Experts agree that user-generated content (UGC) will continue to become a more prominent part of programming. In fact, analysts from IHS Markit estimate that the number of citizen journalists will increase by 145% each year from now until 2025, and with the aggressive rollout of 5G, live video transmissions will continue to grow exponentially.

Cloud-based footage ingest gives producers of news, sports, esports or entertainment shows the capability to acquire content from unlimited concurrent live feeds from multiple sources. These can include professional cameras, encoders, drones, mobile phones and online sources using protocols such as RIST, RTMP, RTSP, MPEG-TS, WebRTC, SRT, HLS and MPEG-DASH.
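To make the multi-protocol point concrete, here is a minimal sketch of how an ingest front end might classify an incoming contribution URL by its scheme and hand it to the right protocol handler. The scheme-to-protocol mapping and the example URLs are illustrative assumptions, not any specific vendor's API.

```python
from urllib.parse import urlparse

# Illustrative mapping of URL schemes to ingest protocol families.
SCHEME_TO_PROTOCOL = {
    "rist": "RIST",
    "rtmp": "RTMP",
    "rtmps": "RTMP",
    "rtsp": "RTSP",
    "udp": "MPEG-TS",      # raw transport stream over UDP
    "srt": "SRT",
    "http": "HLS/DASH",    # manifest-based pull sources
    "https": "HLS/DASH",
}

def classify_source(url: str) -> str:
    """Return the ingest protocol family for a contribution URL."""
    scheme = urlparse(url).scheme.lower()
    try:
        return SCHEME_TO_PROTOCOL[scheme]
    except KeyError:
        raise ValueError(f"unsupported contribution scheme: {scheme!r}")

feeds = [
    "srt://contrib.example.com:9000?streamid=fan42",
    "rtmp://ingest.example.com/live/reporter1",
    "https://cdn.example.com/event/master.m3u8",
]
print([classify_source(f) for f in feeds])
# ['SRT', 'RTMP', 'HLS/DASH']
```

A real ingest platform would of course do far more (authentication, transcoding, jitter buffering), but routing by protocol is the first step in accepting feeds from such heterogeneous sources.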

WebRTC applications are particularly important because they remove app dependency, enabling anyone to contribute live video to production without the need to install an app—an aspect that is crucial in urgent and time-critical situations. When combined with talk-back/IFB functionality, this feature can be used for remote interviews, call-ins, getting voices live from the scene by switching the program to the on-site mobile reporter, remote talent, fans in a football stadium or even a citizen who is in the vicinity of an incident.

OPTIMIZED PRODUCTION WITH MACHINE LEARNING

As broadcasters continue this transition to live video over IP networks, especially when combined with cloud-based live signal acquisition to unlock an unlimited number of sources, they will find a treasure trove of content. However, when managing several hundred live fan feeds, as in the NFL Draft, or thousands of citizen journalist feeds, the sheer volume can become overwhelming for any production team.

Luckily, machine learning applications running on cloud-based infrastructure can simplify mass content acquisition by automating manual tasks in the video workflow. These APIs can be leveraged for signal searches, for scaling ingestion and simplifying operation during live production, and for exception-based monitoring and resolution.

As streams from multiple inputs are ingested, they are collated into a browser-based master control room with a continuous-playback multiview. In addition to being recorded, every frame can be indexed, making live sources searchable and allowing producers to surface the best material from the flood.

Using location or other ML-inserted metadata (such as objects detected in frames), content creation is simplified for editorial teams, who can filter contributors and explore sources by configured inputs such as professional cameras, mobile contributions and geolocations. Production managers could, for example, use such technology to zero in on team-specific or even athlete-specific feeds from a raft of incoming contributions, both professional and user-generated.
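The filtering described above can be sketched in a few lines. This is a hypothetical example: the feed record fields (source type, location, detected object labels) and the sample data are assumptions made for illustration, not a real product schema.

```python
from dataclasses import dataclass, field

@dataclass
class Feed:
    """A live contribution enriched with ML-inserted metadata."""
    feed_id: str
    source_type: str                          # e.g. "pro_camera", "mobile", "drone"
    location: str                             # coarse geolocation tag
    objects: set = field(default_factory=set) # ML-detected object labels

def filter_feeds(feeds, source_type=None, location=None, must_contain=()):
    """Return feeds matching the requested source type, location,
    and set of ML-detected object labels."""
    matches = []
    for f in feeds:
        if source_type and f.source_type != source_type:
            continue
        if location and f.location != location:
            continue
        if not set(must_contain) <= f.objects:  # all labels must be present
            continue
        matches.append(f)
    return matches

feeds = [
    Feed("fan-017", "mobile", "Kansas City", {"crowd", "jersey_15"}),
    Feed("cam-02", "pro_camera", "Kansas City", {"field", "jersey_15"}),
    Feed("fan-101", "mobile", "Seattle", {"crowd"}),
]

# Narrow down to mobile fan feeds in one city showing a specific player.
picks = filter_feeds(feeds, source_type="mobile", location="Kansas City",
                     must_contain={"jersey_15"})
print([f.feed_id for f in picks])  # ['fan-017']
```

In practice the object labels would come from a vision model running in the cloud ingest pipeline, but the editorial workflow reduces to exactly this kind of metadata query.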

Joseph Hopkins is senior vice president of Business & Corporate Development at LTN Global. Joseph is a streaming media and broadcast technology entrepreneur and executive who has spent the past 16 years bringing advanced technologies to market, including the first live-streamed Olympics and Super Bowl. He helped launch Verizon Digital Media Services in 2014 and has had several successful technology startup exits, the most recent being Make.TV, which was acquired by LTN Global in 2019.