Once upon a time, you probably owned a physical alarm clock. Maybe you even traveled with it. And a Rolodex. And an answering machine, a desk telephone, a wall calendar. A stopwatch, a calculator, a bookshelf. “Bespoke” things, to lift a term from haberdashery that seems to be making the rounds in tech-talk lately.
Now look at your phone. All of those things are in it, plus a travel agent, television, mirror, document scanner, cookbook, pager, and whatever other handy apps you carry with you, every day, that used to be standalone things.
The same thing is happening in broadcast video facilities, to single out one (of many) industrial instances of the “bespoke-to-multi-purpose device” phenomenon. Banks of switchers, servers, broadcast-specific production/playout hardware (all of the individual components that carry TV signals from where they start to your screen) are ultimately tilting toward the same destiny as your formerly physical stuff: absorption into a smartphone-style doppelganger.
It’s one of the things broadcast-side technologists are talking about during the run-up to the NAB’s annual gathering in April. So far, I’ve spared you the very Dilbert-esque language of it, but I can spare you no further: Virtualization. Orchestration. Automation. Three words, fourteen syllables, lots of room for interpretation, in the overall language of “the cloud.”
VIRTUALIZATION, ORCHESTRATION, AUTOMATION, HUH?
Here’s the short translation: Virtualization is what happens when a big, standalone device becomes the “app” version of itself. Point to any traditional piece of equipment in a broadcast TV control room, and ask the nearest engineer what percentage of the time it’s actively used. Chances are high that it’s a low number.
Chances are also high that when it’s idle, a lot of compute, connectivity and storage resources are essentially twiddling their thumbs. That’s what makes virtualization attractive, especially to the efficiency-minded.
Orchestration is what makes those previously standalone broadcast “appliances” easily navigable, all in one place, with a deliberately “web-like” user experience (UX). Log in, pick a desired event (sporting event, newscast, comedy), drag-and-drop, voila! Good to go.
Automation, in a broadcast video sense, is what makes it so that a TV show (and everything about it, behind the scenes) can be set up and torn down in seconds. It’s the work that automates a series of processes, across a pile of gear, which used to take much longer.
Setting up and tearing down a TV show used to take much longer (hours, or days, depending on the material) because studios and production data centers are largely hard-wired. Control rooms are connected to studios by lots and lots of SDI (Serial Digital Interface) cables, some still managed by physical patch panels. It’s not unusual for dozens of devices to be physically patched to multiple distribution channels via a patch panel.
Plus, today’s “bespoke” broadcast appliances each come with their own management interfaces. Usually, they’re ganged together in some fashion of back-office management tool, but even so, a typical day in the life of a broadcast engineer still involves a lot of manual configuration, especially when a broadcast endeavor spans multiple facilities.
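As a toy illustration of what “set up and torn down in seconds” means in practice: an automated show setup is essentially a script that declares the virtual resources a production needs and flips them on or off as a unit, instead of an engineer patching and configuring each box by hand. The sketch below is purely hypothetical — the class and resource names are invented for illustration, not any vendor’s actual API.

```python
# Hypothetical sketch: automating setup/teardown of a show's virtual resources.
# None of these class or resource names come from a real product.
from dataclasses import dataclass, field

@dataclass
class VirtualResource:
    name: str          # e.g. "switcher", "graphics", "playout-server"
    running: bool = False

@dataclass
class Show:
    title: str
    resources: list = field(default_factory=list)

    def setup(self):
        # A real orchestrator would call provisioning APIs here;
        # flipping a flag stands in for "seconds, not hours."
        for r in self.resources:
            r.running = True

    def teardown(self):
        for r in self.resources:
            r.running = False

newscast = Show("6pm Newscast", [VirtualResource("switcher"),
                                 VirtualResource("graphics"),
                                 VirtualResource("playout-server")])
newscast.setup()
print(all(r.running for r in newscast.resources))   # True
newscast.teardown()
print(any(r.running for r in newscast.resources))   # False
```

The point of the sketch is the shape of the workflow: one declarative description of the show, one call to set up, one call to tear down, repeated identically across however many facilities are involved.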
“For lots of us, the transition to IP starts with location independence,” said a European broadcast engineer. “When you have a handful of locations separated by hundreds of miles, unifying acquisition is very desirable. But to get on the ‘IP train,’ it has to be cost-effective.”
THE FINGERPRINTS OF IP ARE ALL OVER THIS
At the heart of all of this bespoke-to-virtualized activity is the steady progression of IP (Internet Protocol)-based technologies, permeating all industries, as the goods and services all around us become more “Internet-like.”
In the case of video, it started (as technological advances usually do) with consumer technologies: Internet-connected TVs, smart phones, tablets.
It’s happening now, in video distribution, as satellite providers and the industry we used to call “cable” shift more and more of their total available capacity to IP-delivered video and services.
Broadcasters and studios are next, as the transition to all-IP marches further “up” the video food chain. If the global transition to “all-IP” could be illustrated, it’d look like one of those maps that follows a moving progression, like spring: a bloom of green to the South, edging ever northward. Except in this case, it’s a progression from what you’re seeing on your screen, all the way “back” to the camera that captured it.
The IP transition for broadcasters didn’t happen earlier because Ethernet networks and interfaces weren’t fast enough to carry uncompressed video. Virtualization didn’t happen earlier because data centers lacked GPUs (Graphics Processing Units), and because they were originally built to support “web-based,” non-real-time applications. Professional media applications, like broadcast video, are necessarily real-time animals, and need GPU heft to operate well. Add in the absence of network scheduling techniques needed to avoid packet loss: “no dead air” isn’t a “nice to have” when it comes to TV broadcasts. It’s a basic necessity.
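To put rough numbers on “fast enough”: an uncompressed 1080p60 signal at 10-bit 4:2:2 sampling carries about 20 bits per pixel of active picture, which works out to roughly 2.5 Gbps. That’s well beyond what Gigabit Ethernet could move, but comfortable on a 10 GbE link — which is part of why the door only opened once faster Ethernet became routine. A back-of-the-envelope check (frame size, rate, and bit depth are stated assumptions; real transport adds blanking and packet overhead on top):

```python
# Back-of-the-envelope: uncompressed 1080p60, 10-bit 4:2:2 active video.
# 4:2:2 sampling averages 2 samples per pixel (1 luma + 1 chroma),
# so 10 bits/sample -> 20 bits per pixel.
width, height = 1920, 1080
frames_per_sec = 60
bits_per_pixel = 20

bitrate_bps = width * height * frames_per_sec * bits_per_pixel
bitrate_gbps = bitrate_bps / 1e9
print(round(bitrate_gbps, 2))   # 2.49 (Gbps)
print(bitrate_gbps > 1.0)       # True: too big for Gigabit Ethernet
print(bitrate_gbps < 10.0)      # True: fits on a 10 GbE link
```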
Those restrictions are lifting, rapidly, as the exhibits and sessions at the 2018 NAB Show will assuredly demonstrate. Again, this is as close as I’ll ever get to an endorsement, but from the Department of Full Disclosure, Cisco commissioned this series and provided the engineering contacts and background that informed this piece.
So if “getting virtualized” is on your “figure it out” list, and you long for a web-like, drag-and-drop way to see and manipulate video workflows, or even better, if you need to convince your IT-side colleagues of the changes needed in the data centers for broadcast video, that’d be a good whistle-stop for your NAB wanderings.
Bottom line is, the triad of virtualization, orchestration and automation is both inevitable and a plausible way to optimize your gear. That alone brings cost savings. But like any other technological advance, it’s as much about shaving costs as it is the “gotta have it!” capabilities that come along for the ride.
This is the second in a six-part blog series preceding the 2018 NAB Show.
About the Author: Leslie Ellis is a respected “technology translator,” well known in cable and telecom circles for her award-winning, 20+ year “Translation Please” column in Multichannel News. She took on this Cisco-sponsored pre-NAB series to point out common and frustrating obstacles, for anyone on the sliding transition toward “being more Internet-like.” It is less of a comprehensive representation of available options and more a glimpse into what’s worrisome, on a day-to-day basis for engineers and IT people who work in media and entertainment.