NextGen TV: ‘We Need To Execute Now’ In Race To Deliver Video, Data To Cars


VENICE, Fla.—There is no time to waste waiting for a Broadcast Core Network to emerge, be deployed and become widely adopted if broadcasters hope to tap into the lucrative emerging market for data and entertainment delivery to vehicles.

That’s the message of Lynn Rowe, founder, president and COO of Alchemedia. His company provided an innovative solution for the recent coast-to-coast ATSC 3.0 road test in Michigan based on the addition of Application Layer Forward Error Correction (AL-FEC) to the NextGen TV transmissions of participating stations.

Lynn Rowe, founder, president and COO of Alchemedia. (Image credit: Alchemedia)

As Rowe sees it, the U.S. is at an inflection point in how data will be delivered to cars and trucks as automakers start with a clean slate and design a new generation of electric vehicles (EVs) to propel society into the future. 

In this interview, Rowe discusses the data delivery needs of the auto industry, the 3.0 awareness level of carmakers, the trillion dollar potential of an integrated IP delivery network, his company’s software-defined network alternative to a Broadcast Core Network and why there’s no time to waste.  

(This is an edited transcript.)

TVTech: A session at the NextGen Broadcast Conference in June focused on the coast-to-coast delivery of video, audio and data via 3.0 in Michigan. Could you discuss the aim of the test?

Lynn Rowe: The test was part of a wider initiative organized by Scripps. It involved a collaboration with Fox, Graham, CBS and [Scripps’] WMYD in Detroit for the test track. A variety of tests were done with Sony, in collaboration with Heartland Video. Everybody, rightfully so, gives them kudos for connecting the bits and pieces to allow the test. We were aiming at the auto industry. 

TVT: That’s a huge target. What specifically?

LR: In particular, what the auto industry wants and how we give it to them. They want to deliver massive files, not files of a few hundred kilobits. They're looking at tens or even hundreds of gigabytes of software and OS upgrades, delivered regularly over the life of the fleet. 

They have planning cycles of 10 to 15 years. They have to maintain a fleet with a wide variety of functionality as a singular ecosystem. Those are, roughly, their baseline requirements for bringing a plug-and-play piece of equipment into a platform like that.

So, you have to have an approach that's going to be manageable in an environment of flexible communication capabilities and compute capabilities over that period of life. Try to look at it from the automobile guys’ point of view.

The other portion of it is how you make use of an environment that consists of a variety of station group owners and independent station operators with differing points of view on how to interact with the market, a variety of different vendors, a variety of Master Control configurations for air operations, downstream muxes, multiplatform distribution and a variety of expertise across their operations. That's our ecosystem. Those are, more or less, the bottom-line requirements. How do you interface with that world to enable the product the auto industry guys are looking for?

TVT: What was your company’s involvement?

LR: I've operated in the industry for decades, just as Lynn, otherwise known as One World Technologies, though I'm getting a little gray. There's also a long-standing axe I've had to grind about how CDNs have taken live linear content and distributed it over the last quarter century, more or less.

In the beginning, no big deal. Nobody noticed. But when you start hitting wireless and fixed frequency-allocation assets that are being deployed in a way that replicates this live linear signal, that is not going to work. That screams for a reinvention of the entire framework for distributing live linear content and common file types.

TVT: Let’s circle back to the auto industry and its data delivery needs. How would you characterize the current interest level and awareness of 3.0 among automakers?

LR: Not as high as I would like to hope. We've had some conversations. My observation is they're interested; they're aware. They hear the so-called common wisdom that seeps out through normal means. 

But I don't believe they have a deep understanding of what the real value proposition of ATSC 3.0 is. Much more substantial discussion with the people who are actually managing the design, build and deployment of these systems needs to happen.

TVT: During the conference, one panelist pointed out that the auto industry is starting with a blank slate as it begins to design EV fleets. As a result, this is a good time for broadcasters to make the case for 3.0 to distribute the data you described. Isn’t that happening?

LR: Well. It's a collective consciousness experiment. That’s the way it appears to me. It's not like the old days when I started. Everything was very much the three networks, and the direction of technology development was driven by a handful of visionaries. A handful of people literally drove the process—Julie [Barnathan at ABC], [Joe] Flaherty [at CBS] and [Michael] Sherlock [at NBC], and then there was the BBC.

Today, there are station groups leading the 3.0 deployment. It’s been slow but steady. By the end of the year, 75% of U.S. TV households will be covered. That's nothing to be sneezed at. There's been great effort in terms of consumer electronics and television deployments. That's all wonderful and key foundational work. It's the traditional model. 

If you listen to the likes of McKinsey [Global Institute], the growing generation of communication solutions providers is a trillion dollar market. That’s all of the bits and pieces. Up to this moment, you have this presumption that a telco is going to do it, or AWS is going to do it.

No, the larger possibility, that trillion dollar possibility, is mixing and matching Wi-Fi, CBRS [Citizens Broadband Radio Service], LEO [low earth orbit] and MEO [medium earth orbit] satellite and ATSC 3.0 to create an integrated IP fabric that addresses all of the use cases: some telco, some telco bypass. It's a new generation of solutions created by this market of communication solutions providers. That's Alchemedia SmartGrid.
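To make the mix-and-match idea concrete, here is a minimal sketch of how a delivery path might be chosen from such a fabric. The link names, cost figures and selection rule are illustrative assumptions, not figures from Rowe or Alchemedia; the point it captures is simply that a one-to-many broadcast pipe wins when the same large payload has to reach a large fleet.

```python
# A minimal sketch of the "integrated IP fabric" idea: pick a delivery path
# per job from a pool of dissimilar links (Wi-Fi, CBRS, LEO/MEO satellite,
# ATSC 3.0 broadcast). The link figures and selection rule are illustrative
# assumptions, not measurements or a product specification.

from dataclasses import dataclass

@dataclass
class Link:
    name: str
    broadcast: bool      # one-to-many (ATSC 3.0) vs. unicast
    cost_per_gb: float   # assumed relative cost, not real pricing

def pick_path(links, receivers, payload_gb):
    """Cheapest total cost: broadcast pays once, unicast pays per receiver."""
    def total(link):
        copies = 1 if link.broadcast else receivers
        return link.cost_per_gb * payload_gb * copies
    return min(links, key=total)

fabric = [
    Link("ATSC 3.0", broadcast=True,  cost_per_gb=5.00),
    Link("Wi-Fi",    broadcast=False, cost_per_gb=0.01),
    Link("LEO sat",  broadcast=False, cost_per_gb=1.50),
]

# A 50 GB OS update to 100,000 vehicles favors the one-to-many pipe;
# a one-off 50 GB transfer to a single vehicle favors cheap unicast.
print(pick_path(fabric, receivers=100_000, payload_gb=50).name)  # ATSC 3.0
print(pick_path(fabric, receivers=1, payload_gb=50).name)        # Wi-Fi
```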

TVT: You see ATSC 3.0 as one of the components of that fabric? 

LR: Yes. A huge part. If you look at the internet, how much of that traffic is used for distribution of linear video? Around 80%. It’s a huge chunk.

TVT: If ATSC 3.0 is a thread in this fabric, especially for vehicles, it will need to hand off data for transmission to those traveling from one market to another, which was demonstrated in the Michigan coast-to-coast test. The Advanced Television Systems Committee has its eyes set on a nationwide Broadcast Core Network to make that handoff happen and has issued an RFP to get the ball rolling. What are your thoughts about this core network concept?

LR: It all goes back to the initial question: how do I actually do a service for the auto industry today? My belief is that, in the near future, day-of-air operations and integration into this type of world have to be manual, or a human-agreed-upon approach like Airbnb.

When you're doing Airbnb, you're making a decision about what you're doing with your house on that platform. There's human intervention in terms of input and scale of relationship. 

It's not a case of my having control over stations' master control and access to their bandwidth. That's not a near-term possibility, given the realities of regulation and owners' wariness about their licenses, their operations and what gets on air. Broadcasters want that control.

Alchemedia’s proposition is based on how much bandwidth a broadcaster wants to put in the kitty. I'll book and schedule it, and then I'll put money in your bank account. You don't have to know anything; you don't have to do anything. It works with any of the ATSC 3.0 gateways. Frankly, it could work with ATSC 1.0 through an encapsulator, as well. So, this process is backward-compatible, unlike the AV mux portion of the standard.
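As a rough illustration of that booking-and-settlement loop, here is a sketch of a station listing spare capacity, a customer booking a window and the payout flowing back to the station. The class names, rates and figures are hypothetical and do not reflect Alchemedia's actual software or pricing.

```python
# Hypothetical sketch of the Airbnb-style booking model described above:
# a station lists capacity it has put "in the kitty," a customer books a
# window, and settlement flows back to the station. All names and numbers
# are illustrative assumptions, not a real product API.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Listing:
    station: str
    kbps: int            # capacity the broadcaster makes available
    price_per_gb: float  # rate the station sets for that capacity

@dataclass
class Booking:
    listing: Listing
    start: datetime      # booked window, kept for record-keeping
    hours: int

    def delivered_gb(self) -> float:
        # kilobits/sec * seconds -> kilobits; divide by 8e6 for gigabytes
        return self.listing.kbps * self.hours * 3600 / 8e6

    def payout(self) -> float:
        return round(self.delivered_gb() * self.listing.price_per_gb, 2)

# Hypothetical numbers: a 1 Mb/s overnight window on a Detroit station.
wmyd = Listing(station="WMYD", kbps=1000, price_per_gb=2.50)
job = Booking(listing=wmyd, start=datetime(2025, 1, 15, 2), hours=4)
print(f"{job.delivered_gb():.1f} GB delivered, ${job.payout()} owed to the station")
```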

TVT: So the end goal is the same, but this model will be more familiar to broadcasters. Is that it?

LR: I'm trying to address the realities of the operational environment. This is how I book bandwidth. 

Outside of that, there is this traditional ATSC 1.0 plug-and-play environment for TV sets and transmitters and encoders, muxes and different vendors. This is all an appliance-transmission-reception chain devised to be plug and play. It all just works.

ATSC 3.0 is not like that. It is really complicated by comparison; an order of magnitude more complex, maybe more. Set and forget was what we did in ATSC 1.0.

With this one, there's a core, and we're going to slice and dice it. It can do this; it can do that; it can sing; it can dance. Wow.

But broadcasters have been doing what they've been doing with personnel and master control for decades. Suddenly, there's this new product, which broadcasters haven't really been trained to understand. How's that going to go? Then we're talking 3GPP. This, that and the other. Wow. How do you operate?

I don't expect that. I'm just making a deal. I plug into a gateway. I have allocated bits, and I book and schedule it, and then hand them back money. 

Our point in the Michigan test is that there are software-definable network approaches that can help us quick-start real, commercializable businesses now. How did we do that? By applying a multipath application-layer FEC [forward error correction] approach to what's there. 

[Editor’s note: For more on AL-FEC, see “ATSC 3.0 Set To Roll As Road To Use In Vehicles Becomes Clearer.”]
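For a sense of what application-layer FEC buys, the toy example below uses a single XOR parity symbol per block, which is far simpler than the codes (such as RaptorQ) typically used in real ATSC 3.0 deployments, to show the basic idea: a receiver with no return channel can still reconstruct a lost packet from the redundancy carried in the stream.

```python
# Toy illustration of application-layer FEC (AL-FEC). Real deployments
# typically use stronger codes such as RaptorQ; this sketch adds a single
# XOR parity per block purely to show the principle: the receiver can
# rebuild one missing source symbol per block without a return channel.
# All names here are illustrative, not Alchemedia's API.

from functools import reduce

SYMBOL_SIZE = 1024   # bytes per source symbol
BLOCK_K = 4          # source symbols per FEC block

def encode_block(symbols):
    """Return the source symbols plus one XOR parity symbol."""
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), symbols)
    return list(symbols) + [parity]

def decode_block(received):
    """received: (index, symbol) pairs; index BLOCK_K is the parity.
    Recovers at most one missing source symbol per block."""
    present = dict(received)
    missing = [i for i in range(BLOCK_K) if i not in present]
    if not missing:
        return [present[i] for i in range(BLOCK_K)]
    if len(missing) == 1 and BLOCK_K in present:
        # XOR of the parity with all received source symbols restores the gap.
        acc = present[BLOCK_K]
        for i in range(BLOCK_K):
            if i in present:
                acc = bytes(x ^ y for x, y in zip(acc, present[i]))
        present[missing[0]] = acc
        return [present[i] for i in range(BLOCK_K)]
    raise ValueError("too many erasures for this toy code")

# Example: symbol 2 is lost over the air, but the parity fills the gap.
src = [bytes([i]) * SYMBOL_SIZE for i in range(BLOCK_K)]
coded = encode_block(src)
received = [(i, s) for i, s in enumerate(coded) if i != 2]
assert decode_block(received) == src
```

Production AL-FEC codes extend the same principle so that many erasures per block can be recovered across long, one-way file deliveries.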

I just treat what's there like a pipe. Then I take an application-oriented network capability approach and slap it on that pipe, and it gives me all of the other functionality, including the ability to scale output across transmitters. If I have 10 transmitters in the market with 1 Mb/s apiece, I get 10 Mb/s.

If I have all the receive chains and I want to stream a 720p [video], I've got 1 Mb/s allocated across all of the networks. I can do that because it's a common cloud server with a common receive server that treats the entire infrastructure just like pipes. All of the intelligence is handled by the application. It's software-defined networking that Alchemedia integrates with the ATSC 3.0 standard.
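A minimal sketch of that "pipes" arithmetic, assuming hypothetical station names and capacities: a cloud-side scheduler splits the coded symbols of one delivery across whatever transmitters have been booked, so throughput adds up roughly with the number of pipes.

```python
# Sketch of treating booked transmitters as dumb pipes: a scheduler splits
# the coded symbols of a delivery across pipes in proportion to each one's
# booked bandwidth. Names and capacities are hypothetical.

from dataclasses import dataclass

@dataclass
class Pipe:
    station: str
    kbps: int   # bandwidth the station has put "in the kitty"

def schedule(symbols, pipes):
    """Split coded symbols across pipes in proportion to booked bandwidth."""
    total = sum(p.kbps for p in pipes)
    plan, start = {}, 0
    for p in pipes:
        share = round(len(symbols) * p.kbps / total)
        plan[p.station] = symbols[start:start + share]
        start += share
    # Any rounding remainder rides on the last pipe.
    plan[pipes[-1].station] += symbols[start:]
    return plan

# Ten hypothetical 1 Mb/s transmitters in one market: ten times the pipe.
pipes = [Pipe(f"TX-{n}", 1000) for n in range(10)]
symbols = list(range(10_000))          # coded symbols for one delivery
plan = schedule(symbols, pipes)
print(sum(len(chunk) for chunk in plan.values()))   # 10000: every symbol placed
```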

Right now, we're sitting in a place where we have, essentially, bandwidth-constrained ATSC 3.0 lighthouses, and a lot of them; 75% of the market by year's end. Quite frankly, with stat mux you could carve out 1 Mb/s across the entire footprint if broadcasters had the desire to do so and set a price for it. We're right at that sort of market moment, that Steve Jobs 99-cents moment.

TVT: Will your software-defined network support the traditional transactional model of broadcasting when multiple markets with multiple ownership groups are supplying the bandwidth for a nationwide IP distribution network?

LR: It will work if a variety of users book in and provide capacity. It's like Airbnb. They don't have anything to do with each other except that they have each decided to list their house on Airbnb. I don't have to know any of these people. I just want a place to rent in New York. The common platform is neutral to the end users and to the lessors of the facility. They're just an enablement factor. 

It's a neutral host operation in the same sense that cell phones require a common operation to enable a cell phone number to get through to any carrier. There is a common neutral function that enables them. 

The alternative is to give up now. That’s the other strategy [the Broadcast Core Network strategy]. Perhaps it's a Don Quixote ask, but I'm a dinosaur with broadcaster roots, and I would love to preserve the work and value that has been generated over decades—and port it into the future.

TVT: Is your software-defined network an interim solution to bridge the industry from where it is today to the ultimate destination of a Broadcast Core Network?

LR: The way I look at this is there's a time issue—time-to-market criticality. The spider is saying to all the flies, “Come into the web.” They're building out a fabulous web. It's going to happen if nothing else happens, telcos being what they are. 

The idea is to enable a quick start. Using a software-definable network means being able to get a viable product out now, which is indifferent to ownership and different competitive vendors. I don't care. This is a viable service that works with everybody now, and it solves the objectives of the auto industry. 

I am just making the argument that ATSC 3.0 has a wonderful value proposition for many use cases, and it should be included in the fabric that makes up a Digital Commons. 

That does not say you cannot proceed with the live, real-time plug-and-playable objectives of the existing ATSC membership and products to get that hardened plug-and-play equipment chain out there.

But if we wait for that to be able to enable a mobile product, that market is gone. In my estimation, we are at the inflection point. We need to execute now.

More information is available online.

Phil Kurz

Phil Kurz is a contributing editor to TV Tech. He has written about TV and video technology for more than 30 years and served as editor of three leading industry magazines. He earned a Bachelor of Journalism and a Master’s Degree in Journalism from the University of Missouri-Columbia School of Journalism.