Moving toward a dual-infrastructure facility

In today's multiple-stream transmission environment, broadcast designers and system integrators must find the best way to integrate the processing, distribution and transmission of SD and HD content.

In preparing for this integration, broadcasters must first consider bandwidth. Baseband transmission to the home isn't practical because of the storage and bandwidth requirements. Fortunately, today's technology allows broadcasters to handle broadcast-quality programming at significantly reduced bandwidths, and compression can be visually lossless.

But, as the HD rollout continues, stations will discover that quality is the critical issue — to both viewer and advertiser. With a more visually educated audience, the assumption that the average viewer cannot tell one compression scheme from another may no longer be valid. Artifacts happen, and the new generation of home monitors can display them with alarming precision.

These factors combine to challenge engineers to handle compressed streams in an environment that is intrinsically baseband-oriented. How can broadcasters maintain SD and HD quality through multiple encode/decode cycles along the signal path? And how can a facility do so in the most economic manner, i.e., without adding staff and operational complexity?

Rather than integrating baseband and compressed signals, perhaps broadcasters should consider creating two types of infrastructures: one for each of these signal types. The chief engineer's task is then to strengthen each processing path and build the necessary bridges between the two.

Coping with quality issues

Meeting the bandwidth mandate without delivering the associated quality would be pointless. So, as the use of compressed content has increased, the chief engineer has become both a bandwidth manager and a quality-of-service (QoS) manager.

Facilities migrating to an integrated SD/HD environment are upconverting some or all of their programming, purchasing (or originating) HD content, or receiving it from the parent network.

Within each group, the methods of receiving SD and HD content run the gamut from analog to digital to compressed — with compressed taking the forefront. At some point, content reaches uniformity, allowing local switching and insertion. Currently, that uniformity is baseband. The entire content path (from compressed network transmission to local decompression, production, re-encoding and retransmission) presents multiple cycles of encoding and decoding, which have a detrimental effect on image quality.

Facilities today (and for the foreseeable future) need to ingest and support content in multiple formats. If iterative encode/decode cycles are required, consistent codec behavior throughout a facility's infrastructure can lessen any degradation. But the best approach to eliminating the quality loss from these iterative encode/decode cycles is to avoid them altogether. And the right workflow can preclude them.

The uncompressed workflow, centered on the tools for live production, is still a necessity.

The second workflow is clearly a compressed one. There is a general impression that ASI streams (and the programs within) are untouchable and that, once they've been created, you can't do anything with them — except point them to the transmitter. Not true. Provided that you recognize some caveats, remarkable new technologies enable a facility to work within the compressed domain, which offers great advantages in terms of quality and economy for both HD and SD content.

Inside the compressed workflow

Figure 1 compares compressed and uncompressed workflows, showing the equipment and processes associated with each. Until recently, the only feasible way to mix and switch has been in the baseband domain using traditional tools such as routers, master control switchers, DVEs, logo inserters, etc. Unfortunately, when a facility moves content in and out of the compressed domain (for example, for spot insertion and branding), the process can degrade the signals and introduce artifacts and noise.

By using a compressed workflow, a facility can keep content in compressed form from ingest to output and maintain image quality. In a multicast environment, this idea has great appeal, particularly when only a few local processes are required. Compressed workflow can help a facility manage its bandwidth to a much higher degree (perhaps to the point where the facility could launch new channels or services).

Functions offered by new compressed workflow tools include the ability to open ASI streams and perform image manipulation entirely in the compressed domain without decoding and re-encoding. This means that, with both HD and SD formats, users can switch streams, insert logos and crawls, and display emergency alerts. Several manufacturers also offer servers that can ingest and play out in the ASI domain, while some provide more sophisticated mux, trim and remux functionality. Together, these processing and server tools provide a powerful compressed backbone. The savings in equipment, operational complexity and multiple decode/encode passes can be significant.

When considering a compressed workflow, there are two important caveats:

  1. As a prerequisite, the MPEG-2 processing equipment requires emission-level coding formats such as MPEG 4:2:0 encoded video and Dolby AC-3 encoded audio. There are now several devices on the market that provide this type of processing functionality. If your facility has 4:2:2 encoded video, MPEG Layer 2-encoded audio or Dolby E-encoded audio, then baseband transcoding is the way to go.
  2. If the production requires voice-overs that mix new and/or existing audio, or if it requires over-the-shoulder effects, baseband is the correct path.
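The two caveats amount to a routing decision: content stays on the compressed path only if it is already in emission-level coding and the production needs no baseband-only audio or video mixing. A minimal sketch of that decision rule (field names are illustrative assumptions, not a vendor API):

```python
# Decision helper capturing the two caveats above. Content qualifies for
# the compressed path only if it is already emission-level coded (MPEG-2
# 4:2:0 video, Dolby AC-3 audio) and needs no baseband-only production.
from dataclasses import dataclass

@dataclass
class Program:
    video_coding: str       # e.g. "MPEG-2 4:2:0" or "MPEG-2 4:2:2"
    audio_coding: str       # e.g. "AC-3", "MPEG Layer 2", "Dolby E"
    needs_audio_mix: bool   # voice-overs mixing new and/or existing audio
    needs_ots_effects: bool # over-the-shoulder effects

def signal_path(p: Program) -> str:
    """Return 'compressed' when both caveats are satisfied, else 'baseband'."""
    emission_level = (p.video_coding == "MPEG-2 4:2:0"
                      and p.audio_coding == "AC-3")
    if not emission_level or p.needs_audio_mix or p.needs_ots_effects:
        return "baseband"
    return "compressed"

print(signal_path(Program("MPEG-2 4:2:0", "AC-3", False, False)))    # compressed
print(signal_path(Program("MPEG-2 4:2:2", "Dolby E", False, False))) # baseband
```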

Caveats aside, by organizing processes according to compressed and uncompressed requirements and adapting to the trade-offs, facilities can open new opportunities to preserve quality across the transmission path and reduce the amount of equipment required.

Infrastructure metrics

Linking the compressed and uncompressed infrastructures through a signal environment comprising shared storage, metadata and format conversion can optimize the interoperability of the two cores. This linked environment benefits the distribution and management of SD and HD content.

When considering this kind of environment, keep in mind three fundamental metrics: storage, metadata and codecs. Attention to these areas will lay the foundation for current technologies and future developments.

Storage — Determining storage requirements is an exercise in bandwidth and capacity management, with many points to consider:

  • Ensure that the facility's storage fabric supports the maximum simultaneous bandwidth requirement, and that it can store an appropriate amount of online content. This metric is complicated by multiple compression standards and formats, including HD and proxy quality.
  • Consider ASI as a storage fabric component, requiring the same real-time QoS as baseband.
  • Reserve an appropriate amount of bandwidth for non-real-time activities such as IP-based file interchange.
  • Consider multiple compressed formats used for different purposes within the same storage (e.g., DVC25 for news editing and MPEG-2 for transmission).
  • Ensure that there is a method to track and catalog assets. This ties in directly with the next metric.
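The bandwidth and capacity exercise above reduces to straightforward arithmetic. A back-of-the-envelope sizing sketch follows; all bitrates and channel counts are illustrative assumptions, not recommendations, so substitute your facility's real figures:

```python
# Back-of-the-envelope storage sizing for a mixed-format facility.
# Bitrates below are assumed, typical values for each format.
FORMATS = {                     # name: bitrate in Mb/s
    "HD MPEG-2 (ASI)": 19.4,    # ATSC emission rate
    "SD MPEG-2 (ASI)": 8.0,
    "DV25 (news editing)": 25.0,
    "Proxy (browse)": 1.5,
}

def storage_tb(bitrate_mbps: float, hours: float) -> float:
    """Capacity in terabytes to keep a stream at bitrate_mbps online for `hours`."""
    bits = bitrate_mbps * 1e6 * hours * 3600
    return bits / 8 / 1e12

def peak_bandwidth_mbps(streams: dict) -> float:
    """Worst-case simultaneous bandwidth if every listed stream is active."""
    return sum(FORMATS[name] * count for name, count in streams.items())

# Example: 4 HD channels, 6 SD channels, 8 concurrent DV25 edit sessions
active = {"HD MPEG-2 (ASI)": 4, "SD MPEG-2 (ASI)": 6, "DV25 (news editing)": 8}
print(f"Peak bandwidth: {peak_bandwidth_mbps(active):.1f} Mb/s")
print(f"100 h of HD online: {storage_tb(19.4, 100):.2f} TB")
```

The same two functions let you test "what if" scenarios, such as whether the fabric has headroom to launch an additional channel or to reserve bandwidth for file interchange.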

Metadata — Now is the time to consider using metadata on a facilitywide scale. The facility must identify content (SD, HD and compressed) at ingest and normalize it across the entire organization. By bringing those threads together under a common metadata umbrella through the use of unique material identifiers (UMIDs), facilities can realize new cataloging, searching and reporting functions. Yes, this represents a lot of up-front work, but understanding the metadata workflow is a prerequisite to managing assets successfully. Some newer MPEG-2 processing devices actually require a set level of metadata to properly identify and manipulate the program streams.
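The UMID-keyed catalog described above can be sketched in a few lines. Field names and the in-memory store are assumptions for illustration; a real system would persist this in an asset database:

```python
# Minimal sketch of a facility-wide catalog keyed by a unique material
# identifier (UMID), assigned at ingest. UMID values here are made up.
from dataclasses import dataclass, field

@dataclass
class Asset:
    umid: str                 # unique material identifier, assigned at ingest
    title: str
    fmt: str                  # e.g. "HD MPEG-2", "SD MPEG-2", "DV25"
    compressed: bool
    keywords: list = field(default_factory=list)

catalog = {}                  # umid -> Asset

def ingest(asset: Asset) -> None:
    """Normalize and register content under its UMID at ingest time."""
    catalog[asset.umid] = asset

def search(keyword: str) -> list:
    """Facility-wide search across every format, SD or HD, compressed or not."""
    return [a for a in catalog.values() if keyword in a.keywords]

ingest(Asset("UMID-0001", "Evening News open", "HD MPEG-2", True, ["news", "open"]))
ingest(Asset("UMID-0002", "Weather bumper", "SD MPEG-2", True, ["weather"]))
print([a.title for a in search("news")])
```

Because every thread hangs off the same identifier, cataloging, searching and reporting work uniformly across SD, HD and compressed content.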

Codecs — Instead of second-guessing the industry in terms of what compression standard or video format will dominate, consider migrating toward an infrastructure that is codec-aware. The graphics community has worked in such an environment for years using programs that easily convert files between the various formats. Why not incorporate that same flexibility into your facility's video infrastructure? If an architecture supports plug-in codec capability, users can handle anything that arrives at the ingest port. Within the industry, there is movement in this direction. And if Moore's law has anything to do with it, the required processing speed will be there.
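The plug-in codec capability described above is essentially a registry that new decoders join at install time, so the ingest port can handle whatever arrives. A minimal sketch, with made-up codec names and a simplified dispatch scheme:

```python
# Sketch of a codec-aware ingest port: codec plug-ins register themselves
# in a table, and incoming content is dispatched to the matching decoder.
DECODERS = {}                 # format tag -> decoder function

def register_codec(tag: str):
    """Plug-in decorator: make a decoder available facility-wide."""
    def wrap(fn):
        DECODERS[tag] = fn
        return fn
    return wrap

@register_codec("mpg2")
def decode_mpeg2(payload: bytes) -> str:
    return f"decoded {len(payload)} bytes as MPEG-2"

@register_codec("dv25")
def decode_dv25(payload: bytes) -> str:
    return f"decoded {len(payload)} bytes as DV25"

def ingest(tag: str, payload: bytes) -> str:
    """Handle whatever arrives at the port, or report the missing plug-in."""
    if tag not in DECODERS:
        raise ValueError(f"no codec plug-in for {tag!r}")
    return DECODERS[tag](payload)

print(ingest("mpg2", b"\x00" * 188))   # one MPEG transport packet's worth
```

Adding support for a new format then means installing one more plug-in, not reworking the infrastructure.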

Equipment decision points

To gain the advantages of a compressed infrastructure, consider using several categories of equipment that operate in parallel, each with value-added support potential for compressed content.

Routers — Consider a scalable router with a modular architecture in which ASI signals can work side-by-side with standard SDI and baseband signals. This is a cost-effective way to route multiple program streams on single crosspoints.

Servers — Consider a server architecture that enables the facility to store content in the same compressed format in which it will be transmitted. Ensure that the architecture supports distributed storage and simultaneous access from multiple devices. As the compressed workflow increases in scope, more peripherals will be requesting ASI or file-oriented content.

Master control — Consider a master control switcher that can function as a program manager for multiformat routing switcher ports (HD, SD and ASI). An ideal and economical approach is a modular architecture in which all of the input processing is offloaded to the upstream switching fabric. The industry is moving in this direction and, here again, metadata will play a vital role. With this kind of sophistication, content and data awareness, automation can properly orchestrate master control functions.

Editing — For both news and production, consider editing in the same format in which the material is ingested. Remaining in the native acquisition format avoids additional decode/encode passes prior to transmission. To minimize artifacts, a facility that ingests in DV should avoid editing in MPEG.

Following the trends

Several industry trends underscore the importance of the compressed workflow.

Networks are now providing member stations with multiple SD and HD programs in ASI format. The affiliates have a choice: convert to baseband and risk adding artifacts, perform a pass-through in the compressed domain without branding, or use MPEG-2 processing devices for real-time manipulation within the ASI stream.

Program assembly is occurring later in the transmission chain, even to the point where intelligent set-top boxes are compiling the final user experience. These new STBs enable a more personalized experience that includes the program's soundtrack, commercials based on demographics, extended data services, selectable camera angles and storage. Compression and multistream delivery make this environment possible.

This is where the technology is going, and baseband offers no viable path to a comparable scheme. Thus, the driving force behind a move to strengthen a facility's compressed infrastructure is not simply economic, and not exclusively to comply with an integrated SD/HD mandate. Facilities that follow this path can realize tangible quality benefits and will be positioned for technology's next advance.

Stan Moote, Steve Sulte and Todd Roth lead Leitch's CTO group. Nabil Kanaan is a product manager at Leitch.