Television in the Teens: Mobility Reigns

When I was asked to review what had happened in television media over the past decade, the breadth and depth of changes in this industry reminded me of the changes I saw in my daughter as she moved from an adorable, wide-eyed 10-year-old to a more sophisticated young woman attending college and plotting out the course of her life. The analogy is further enhanced as I spend time with my 14-year-old grandson and try to comprehend the tumultuous times and “interesting” choices he makes as he navigates adolescence. So let me welcome you to the end of the adolescent years of the first century of the new millennium. But where to start?

DIGITAL TELEVISION

While digital OTA broadcasting technically began in the first decade of this century in the United States and a few other countries, it is really in this decade that we have seen terrestrial digital television become the primary standard and analog television service cease operation. The conversion to digital has enabled some very obvious benefits in the quality and quantity of services offered. Widescreen HD has become the baseline standard for content delivered to home receivers with pristine quality. Multicasts abound; even at my own Iowa Public Television we broadcast four independent program streams, two in high definition and two in standard definition, all with surround sound, descriptive audio and closed captioning. I would venture that the majority of OTA television platforms are now multicasting, offering over-the-air consumers as wide a variety of content choices as they may find in the basic subscription tier of the available MVPDs.

But digital is not without its more subtle challenges. There are those who viewed the transition from analog to digital as a journey from one location to another. However, digital was never a destination on a map but a continuum whose ultimate end is still beyond the perceivable horizon. Recognizing this is crucial for everyone in the industry, because we have started down the path and we are accelerating. Case in point: NEXTGEN TV.

In 2009, as we joyfully shut off our aging analog transmitters (some here at Iowa Public Television were approaching 30 years old), how many of us were thinking that in 10 years we’d be contemplating shutting off our 15-year-old digital television transmitters? Not to mention convincing the entire viewing population to purchase new televisions because their “old sets” can’t receive the new OTA services. I’m not a big fan of quoting Moore’s Law because I believe the underlying principle is an unsustainable growth pattern. However, I do believe there are corollaries that should be added if we are going to use it. One of these corollaries is that nothing digital is ever finished.

INFINITE CHANNEL UNIVERSE

While the internet certainly isn’t a product of the last decade, some of what it has enabled is. The internet has transitioned from an optional service to a utility, not unlike power and water. Technology has increased the speed and capacity of internet connectivity, both wired and wireless, enabling a plethora of new over-the-top, streaming and on-demand services alongside the over-the-air and MVPD subscription-based services already available. The 500-channel universe that John Malone envisioned in the 1990s has morphed into an infinite selection.

Let’s not overlook another change, probably the most significant one of all: mobility! I started my career as talent working in radio, and back then we knew that the audience was mobile. I was never a good enough disc jockey to land one of the coveted drive-time slots, but we knew that the vast majority of our audience was listening while on the go in their cars or lying on the beach on the weekends. The audience was taking the receiver with them, and with an audio service that was pretty easy. Video services obviously presented a real challenge, since one of the traditional driving forces in consumer television sales is increasing screen size and the belief that the audience will opt to wait and watch content when they have access to the larger fixed screen.

In many cases, however, convenience trumps quality. Just as people will skip seeing a movie in a theater and wait for it to come out on Netflix, they will likewise watch content on their smartphones while riding, and sometimes even driving, rather than wait to get home. The industry as a whole now has to come to grips with creating content that will be, not may be, consumed on whatever device is most convenient for the audience. In my opinion, this nomadic consumption model is probably one of the most significant changes in our industry, not only bringing challenges to creating content but also new competition as wireless providers evolved, bringing 4G to market in the early part of this decade and the promise of 5G in the next.

COMMODITY CONTENT

Looking internally at the industry, the angst of the adolescence analogy still holds up as we adjust our business models, workflows and long-range plans to create content for audiences that have little knowledge of, or loyalty to, the scheduled, appointment-based, channel-centric infrastructure of the past. An important concept to grasp is that the vast majority of consumption of preproduced content is still basically streaming in real time. What has fundamentally changed is that the scheduling of the stream has migrated from the content distributor to the content consumer.

Add to this environment the ease of distribution and the growth of user-generated content. We saw this in its infancy when shows that took funny, cute and/or embarrassing home movies from VHS tapes became hit television programs watched by mass audiences. Commoditized technology now allows anyone to purchase a 4K camera for under $100. While there are still “hit” shows based on user-generated content easily found with a Google search, there are also genuinely talented storytellers creating compelling content and making it available for free or by donation. This commoditization of the very content being created presents a whole raft of trials for the industry.

My friend and colleague Fred Baumgartner, director of NextGen TV Implementation at ONE Media, once suggested I read the book “The Innovator’s Dilemma” by Clayton Christensen. It is a book I have actually read a few times because I want to be reminded that my mindset needs to be open to all possibilities.

One of the fundamental takeaways from the book is that when dealing with disruptive technologies, you cannot use the same metrics for success that are used to measure the success of the existing technology. The proof of that in our industry is apparent when you look at who is driving the market and who is struggling to survive. Companies that were barely blips on the radar of content creation and distribution at the beginning of this decade, like Facebook, Apple, Amazon, Netflix and Google (FAANG), are now the big dogs, and the long-established leaders are consolidating to try and stay relevant.

VIRTUALLY EVERYWHERE

Digital’s entrance into the world of television began decades ago with the first time-base correctors, character generators and still stores, which allowed analog tape to replace film chains, paper graphics superimposed over video and 35mm slide projectors. Analog tape became digital tape, character generators became graphics systems and still stores became clip stores.

So the transition was from proprietary, on-premise analog hardware running no software, to proprietary, on-premise digital hardware running proprietary software, to commodity off-the-shelf (COTS), on-premise digital hardware running proprietary software (exhale). These systems still require content creators to make a significant investment in hardware with a significantly shorter useful life, and in software that is never finished, undergoing constant updates and improvements. Those improvements, while increasing the capabilities of the system, simultaneously accelerate the end of life of the very systems they run on.

Enter the cloud and virtualization. The cloud is basically remote hardware, and virtualization is running software on that remote hardware. Initially the low-hanging fruit in virtualization was storage. Who didn’t love the idea of having their content available from anywhere, at any time, as long as they had an internet connection?

Initially, concerns about security, heightened by some well-publicized hacks at the major film studios, moderated the charge forward. However, we are currently seeing significant improvements in securing access, and the idea of more permanent cloud storage is picking up speed.

At IBC, MovieLabs discussed their 10-year vision for the future of production, post-production and creative technologies. One of the key elements is that their pristine content lives in the cloud and the applications and software they use work on the content in that virtualized space. So clearly some of the largest creators of content are committed to the concept of virtualized workflows.

My colleague Chris Lennon, CEO of MediAnswers, has been talking about microservices for a while. Microservices are, for me, somewhat reminiscent of my early days writing programs, when computers had limited memory and the largest program you could execute was 64 kilobytes. Pathetically small by modern standards, it required us to write code efficiently to fit the program into the space available, and we also had to learn to modularize our programs so we could seamlessly run a module and then use its results to execute the next module, and so on.

This is how complex programs were designed. The commoditization of memory allowed programs to grow into single large complexes of code rather than separate modules. A microservice architecture harkens back to that original modular programming, and therefore allows some modules to run on user equipment and others to run on virtualized equipment.
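The modular style described above can be sketched in a few lines of code. This is a minimal illustration (the function names and the media-processing stages are invented for the example, not drawn from any real SMPTE or broadcast API): each stage is a small, independent module with a narrow input/output contract, and a pipeline simply feeds the result of one module into the next. Whether a given module runs on user equipment or on virtualized equipment becomes an implementation detail hidden behind that contract.

```python
from typing import Callable, Dict, List

# A stand-in for a piece of media plus its accumulated metadata.
Item = Dict[str, object]

def ingest(item: Item) -> Item:
    """Module 1: mark the content as received (hypothetical stage)."""
    return {**item, "ingested": True}

def transcode(item: Item) -> Item:
    """Module 2: pretend to convert to a delivery format (hypothetical stage)."""
    return {**item, "codec": "h264"}

def caption(item: Item) -> Item:
    """Module 3: attach closed-caption data (hypothetical stage)."""
    return {**item, "captions": "enabled"}

def run_pipeline(stages: List[Callable[[Item], Item]], item: Item) -> Item:
    """Run each module in turn, using its result to execute the next."""
    for stage in stages:
        item = stage(item)
    return item

result = run_pipeline([ingest, transcode, caption], {"title": "promo"})
print(result)
```

In a real microservice deployment each stage would sit behind a network call rather than a local function call, but the composition pattern, small modules chained by their results, is the same one the old memory-constrained programming forced on us.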

While this is a complex arrangement, if properly managed and executed it brings some of the most astounding and compelling tools into the hands of anyone who wants to use them in a “pay by the drink” model. Chris and other colleagues at SMPTE are putting together a working group to develop standards and recommended practices for the development and application of microservices.

THE ENEMY OF GOOD

One of the other key concepts that I picked up from “The Innovator’s Dilemma” is that within a business there is often a push to develop the quality of a product or service far past the expectations and pocketbooks of the end users.

In some ways I think we are seeing this in the home television display marketplace, where 4K and 8K displays are being marketed as the next big thing. I discussed this shift, in what content creators are looking for and what manufacturers are working on, with another friend and colleague, John Ive, director of Strategic Insight for the International Association of Broadcast Manufacturers.

“With technology I’m noticing that content image quality is maturing with small incremental improvements because we’ve reached the limits of perception,” he said. “So now more money is spent on the management of content and the business processes with image quality and content increasingly taken for granted.”

To an engineer or scientist who may believe that every improvement is worth the resource investment, this may sound a little like heresy, but in the real world “good enough” is a valid metric, and that determination is made by the end user. The technologies needed to create and distribute content are mature, and while they will improve and become even more affordable, that simply means the sea of content will grow.

Success will require a focus on making content discoverable and available, and this is why, as the adolescent decade of this century comes to an end, we are seeing a focus on business processes and content management.

So let’s raise a glass to a decade of change and upheaval. I hope that in 2029, I am asked and still capable of writing another decade in review. I wonder if I’ll even be able to recognize the business.

Bill Hayes is the director of engineering for Iowa Public Television.
