A new perspective on interoperability?

There's a hot new topic in the broadcast industry: digital workflow. If you are planning to attend NAB2004, it might be a good idea to spend some time familiarizing yourself with digital-asset-management terminology before you hit the show floor. Perhaps then you will be able to grasp the digital-workflow concepts that vendors will be pushing. Better yet, you might be able to determine which vendors embrace the spirit behind the development of emerging digital-workflow standards and which see this as yet another opportunity to lock you into their proprietary solutions.

This column will examine the history behind efforts to develop digital workflow solutions for broadcasters and other creators of digital media content. In March we will examine the implications of the emerging digital workflow standards covered here, especially as they relate to multi-channel operations. Consider this column, and the Web resources that accompany it, as your introduction to digital workflow.

If you take the time to understand the fundamentals, you just might discover that broadcasters share the responsibility, with equipment vendors, for the glacial pace of progress in the development of an appropriate digital workflow for the future of digital television. By no small coincidence, this parallels the slow pace of progress with the transition to terrestrial digital broadcasting.

Why? Because most broadcasters think that the forced march from analog to digital transmission will not change the underlying business model of TV broadcasting: one linear program stream delivered from one big stick. Multicasting is beginning to garner some attention; however, most broadcasters are still focused on capturing the largest possible audience with one program, rather than fragmenting their audience via multicasts.

Most broadcasters who have built new facilities to support the digital transition have built digital clones of the old analog plant. The most dramatic change has been the shift from tape machines to disk-based servers for commercial insertion and, for a few, program playout. Downstream of the server, things look the same: everything is converted to digital baseband, integrated via a master control switcher, and then fed to an MPEG encoder for emission coding. Most facilities have chosen house formats for SD and HD, and they convert everything to those formats.

It would be easy to blame the customer for the lack of innovation and misunderstanding of the opportunity at hand. However, even those broadcasters who do see the opportunity to transform and revitalize terrestrial broadcasting via new digital-broadcasting techniques have had no means by which they could pursue them. Products that support emerging digital workflow concepts and standards simply do not exist … yet.

Historical perspectives

As a starting point, it may be helpful to consider when digital-workflow concepts first appeared on the radar screen. As a journalist, I started writing about these concepts more than a decade ago. As a consultant and participant in the development of digital-workflow standards, I have been bringing vendors and end-users together to facilitate progress since the early ‘90s through OpenStudio and OpenDTV conferences and forums.

It is not surprising that the stimulus for change came from the world of information technology, which was revolutionized in the '80s by the personal computer and in the '90s by the proliferation of TCP/IP networks and their interconnection through the Internet. Desktop audio begot desktop video and transformed the tools used for video production. This created the opportunity and the need to manage digital assets, as opposed to rooms filled with source tapes, EDLs and program masters.

By 1997, the industry had begun to address the need for digital-workflow standards. Much of this work has taken place within the Society of Motion Picture and Television Engineers (SMPTE), in collaboration with the European Broadcasting Union (EBU). The Pro-MPEG Forum has also played a key role in the development and promotion of digital-workflow standards. At NAB2004, these organizations will be promoting the Material eXchange Format (MXF), which became an SMPTE standard late last year, and the Advanced Authoring Format (AAF), which is tightly linked with MXF.

Vendors of computer-based video-production tools have leveraged IT-based solutions, creating a parallel digital workflow with limited connectivity to traditional broadcast equipment and signal-distribution infrastructures. Many vendors are turning to Extensible Markup Language (XML), a standard developed for the Internet, to store information that describes the actual media content. This descriptive information is called metadata, while the media itself is called essence.
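To make the metadata/essence distinction concrete, descriptive metadata for an asset might be carried as a small XML record alongside the essence files. The element names below are hypothetical, chosen for illustration only; real systems would follow a defined schema. A minimal Python sketch:

```python
import xml.etree.ElementTree as ET

# Build a small descriptive-metadata record for one media asset.
# Element and attribute names here are hypothetical examples --
# a production system would use a standardized metadata schema.
asset = ET.Element("asset", id="TBS-0001")
ET.SubElement(asset, "title").text = "Station promo"
ET.SubElement(asset, "format").text = "MPEG-2 SD"
ET.SubElement(asset, "duration").text = "00:00:30:00"

# Serialize the record; this XML travels with (or points to) the essence.
xml_text = ET.tostring(asset, encoding="unicode")
print(xml_text)
```

Because the record is plain text, it can be exchanged between the IT infrastructure and broadcast systems without regard to how the essence itself is compressed or stored.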

Meanwhile, a new category of broadcaster has emerged. For the most part, the content delivered by these broadcasters cannot be received via analog- or digital-terrestrial broadcasts. These broadcasters operate networks delivered by cable and DBS in the United States and around the world. They all share a common problem: the need to manage assets that will be distributed via multiple network feeds from a network operations center (NOC). In many cases, multiple versions of these assets must be maintained in order to deal with differences in what is considered acceptable content in various global markets, as well as different languages and differences in formats (PAL, NTSC and, now, HD).

In parallel, these organizations typically have an extended presence through Internet portal sites, so many assets must also be repurposed for the Internet. Increasingly, these organizations are using their IT infrastructure to manage the parallel digital workflows in their in-house video networks.

One of the most advanced NOCs in the world is in Atlanta, the home of the Turner Broadcasting System (TBS). Turner also operates a separate NOC for CNN in Atlanta. Ron Tarasoff provided an extensive preview of the new Turner Entertainment Group NOC in the March 2003 issue of Broadcast Engineering (a link to that story is included in the Web resources listed in this column). The center became operational in August 2003 and now feeds 22 channels, half of which reach international audiences. By the time this article is published, another international network will be operational, and in May 2004 Turner's first HD network (TNT-HD) will begin operations.

Planning for the center began about the same time that the SMPTE began work on digital-workflow standards. Turner engineers have been active participants in the development of these standards, but they did not stop there. Together with other organizations such as Discovery Networks, Turner turned the tables on manufacturers and hosted a series of “Perspectives” conferences to educate vendors about their requirements and larger market opportunities.

One of the biggest hurdles Turner had to overcome was the perception that the market opportunity was too small for most vendors to embrace. Traditional broadcast customers were not looking for comparable solutions; most still are not. But several companies, including Snell & Wilcox and Pinnacle, paid close attention and have become key suppliers to Turner. Engineers from both companies played a major role in the development of the MXF standard. Links to two Snell & Wilcox documents about MXF and AAF are included in the Web resources for this column.

Despite this extensive effort by Turner, it was not possible to procure products that supported these emerging standards in time to build the new NOC. Turner developed its own asset-management software and chose vendors who were committed to supporting these standards in the future through product upgrades. Thus, the new facility has parallel IT network and digital audio/video-network infrastructures that do not interoperate well.

All content for the NOC is processed through ingest facilities where assets from Turner studios and outside vendors are digitized to the house MPEG-2 formats for SD and HD. In addition to the high-quality essence media, proxy videos are created that can be streamed over the IT infrastructure. These proxies are used to maintain metadata and to manage the digital-media assets stored on the MPEG-2 servers.

What can MXF do?

According to a paper written by Dave Monshaw of IBM Digital Media, MXF is a versatile file format that can help the broadcaster perform a number of tasks. For example, MXF can wrap itself around any compression format, store cuts-only EDLs and the material they act upon, and wrap up a playlist of files and store the synchronization information. It also can store files in a streamable format, which allows viewing while transferring between heterogeneous equipment, and store simple finished works with metadata.
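Under the hood, an MXF file is a sequence of KLV (Key-Length-Value) triplets, coded per SMPTE 336M: a 16-byte SMPTE Universal Label identifies each item, a BER-encoded length follows, and then the value itself. This is what lets MXF wrap any compression format and remain streamable. As a minimal illustrative sketch (not a full MXF parser), one KLV packet can be decoded like this:

```python
def parse_klv(buf, offset=0):
    """Parse one KLV triplet, as used in MXF (SMPTE 336M KLV coding).

    Returns (key, value, next_offset). Sketch only: a real MXF reader
    must also interpret partition packs, header metadata, index tables, etc.
    """
    key = buf[offset:offset + 16]        # 16-byte SMPTE Universal Label
    if len(key) < 16:
        raise ValueError("truncated key")
    pos = offset + 16
    first = buf[pos]
    pos += 1
    if first < 0x80:
        length = first                   # BER short form: length fits in 7 bits
    else:
        n = first & 0x7F                 # BER long form: low 7 bits give the
        length = int.from_bytes(buf[pos:pos + n], "big")  # count of length bytes
        pos += n
    value = buf[pos:pos + length]
    return key, value, pos + length
```

Because every item is self-describing in this way, equipment that does not understand a particular key can skip over it by its length, which is central to MXF's interoperability story.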

The perceived differences between the IT and digital video network infrastructures that existed when work began on the MXF standard have narrowed as both have evolved. Protocols now exist to handle file transfers over SMPTE 259M networks, and IT networks now can handle real-time streaming of compressed sources. And new hybrid solutions, such as IEEE 1394, have been designed to support real-time streaming, as well as direct attachment of hard disks and transfers of streams/files from camcorders and 1394-equipped VCRs. Typically, 1394 connections are localized to a workstation, but the next-generation IEEE 1394b spec will support higher transfer speeds and the longer cable runs needed for broadcast facilities.

Craig Birkmaier is a technology consultant at Pcube Labs, and he hosts and moderates the OpenDTV Forum.

Web links

An introduction to MXF and AAF

Metadata, MXF and other AV terms for digital media

A metadata dictionary

Turner Entertainment's Network Operations Center

Send questions and comments to: cbirkmaier@primediabusiness.com