The Hybrid World of SDI and IP Compression

Television broadcasters have been using compression and multiplexing since the late 1990s. Compression and multiplexing were primarily used in satellite transmission for backhaul, contribution and internal distribution, to optimize the use of transponders, enabling service providers to put multiple signals on a single transponder in a cost-effective way.

In today’s production and distribution environment, compression is an integral part of the IP workflow from ingest to distribution. There are a considerable number of compression standards and formats used in the creative production process and a different set of compression standards and formats used in distribution and delivery. In creating programming from multiple sources for distribution and delivery, the content is typically compressed and packaged into containers for distribution.

For ATSC, this container is the MPEG-2 Transport Stream (MPEG-2 TS). Program channels and program guides are multiplexed into a single stream for transmission. ATSC tuners and set-top boxes demultiplex this container back into the individual channels and channel information. The complete MPEG-2 TS is a transport stream that carries audio, video and data (i.e., PSIP and EPG).
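The demultiplexing step a tuner performs can be sketched in a few lines. This is a minimal, illustrative Python sketch (the function name `demux_by_pid` is my own, not from any standard API): each 188-byte TS packet begins with a 0x47 sync byte, and the 13-bit Packet Identifier (PID) in the header tells the receiver which elementary stream (video, audio or PSIP data) the packet belongs to.

```python
# Minimal sketch of the demux stage in an ATSC tuner: group MPEG-2 TS
# packets by their PID. A real demux also handles resynchronization,
# adaptation fields and PSI table parsing, all omitted here.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def demux_by_pid(ts_bytes):
    """Group fixed-size TS packets into per-PID lists."""
    streams = {}
    for i in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_bytes[i:i + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue  # lost sync; a real demux would hunt for the next 0x47
        # 13-bit PID: low 5 bits of byte 1, all 8 bits of byte 2
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        streams.setdefault(pid, []).append(packet)
    return streams
```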

Multiplexing

Multiplexing is the process of aggregating more than one signal onto a single carrier path. There are three technologies used in multiplexing:

  • Time Division Multiplexing – TDM – interleaving packets from multiple streams into time slots on a single stream.
  • Frequency Division Multiplexing – FDM – different signals modulated onto different frequencies of a single carrier.
  • Wavelength Division Multiplexing – WDM – different optical wavelengths (colors) on a single fiber.
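Of the three, TDM maps most directly onto packet multiplexing: each source stream takes a turn in a repeating time slot. A minimal Python sketch of round-robin slotting (the helper name `tdm_interleave` is hypothetical):

```python
from itertools import chain, zip_longest

def tdm_interleave(streams):
    """Round-robin TDM: one packet per stream per frame, skipping
    empty slots when a stream has run out of packets."""
    FILL = object()  # sentinel marking an unused time slot
    slots = chain.from_iterable(zip_longest(*streams, fillvalue=FILL))
    return [p for p in slots if p is not FILL]
```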

In application, there are two primary methods:

  1. Static or Fixed Key. Designating a specific number of channels within the allocated spectrum and assigning a fixed amount of bandwidth to each channel. The program content is encoded based on the fixed channel bandwidth allocation.
  2. Stat Mux. Statistical multiplexing, or dynamic allocation, uses algorithms and bitrate monitoring to vary each channel's bandwidth based on the instantaneous demand of its stream.

All digital content for distribution is compressed; therefore, the bandwidth it requires constantly varies. It is much more efficient to transmit several channels together and use multiplexing to let them share the same bandwidth. In Stat Mux, each channel gets the instantaneous bandwidth it needs, with the total allocation dynamically distributed across the configured channels. In the case of ATSC, this is what determines the number of HD, SD and Mobile channels that can be placed in the ATSC spectrum.

In digital broadcasting, a statistical multiplexer is a content aggregating device that allows broadcasters to provide the greatest number of programs in a given amount of bandwidth by sharing this bandwidth among multiple streams of varying bitrates. Stat Mux is a more efficient use of bandwidth, as it fits more channels into a fixed spectrum.
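The proportional scale-back at the heart of statistical multiplexing can be illustrated with a short sketch. This is a simplification that assumes per-instant demand is known and ignores the encoder rate-control feedback a real stat mux uses; the function name is my own, and 19.39 Mbps is the ATSC payload figure discussed below.

```python
def stat_mux_allocate(demands_mbps, total_mbps=19.39):
    """Share a fixed pipe among channels in proportion to instantaneous demand.

    If total demand fits, every channel gets what it asked for; otherwise
    each allocation is scaled back proportionally (a real stat mux would
    also steer encoder quantization to meet the reduced budget).
    """
    total_demand = sum(demands_mbps)
    if total_demand <= total_mbps:
        return list(demands_mbps)
    scale = total_mbps / total_demand
    return [d * scale for d in demands_mbps]
```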

With the transition to ATSC and ATSC Mobile DTV (ATSC MDTV), broadcasters now face both technical and operational challenges in delivering multiple program streams (multiplexing) into their Over the Air systems.

With multiplexing and the transition to IP video, terminology has also changed. Instead of fixed key, we speak of constant bitrate (CBR); stat mux is now variable bitrate (VBR). When we multiplex channels, we call it IP grooming.

Broadcasters’ multichannel offerings come from a combination of live and pre-recorded SD/HD-SDI sources, played from tape, optical disc, files and IP streams. Broadcasters are always looking for ways to maximize and optimize their use of the ATSC spectrum, and for more flexibility to offer a broader mix of HD and SD channels.

One of the many technical challenges is handling multiple encoding processes and managing audio and video quality, including conforming to regulatory mandates (i.e., the CALM Act), through automation and consolidated technology tools. Another is the large number of unique devices and media handling processes required to achieve this.

The 19.39 Mbps ATSC 8-VSB payload was originally framed solely around MPEG-2. A typical HD channel uses between 10 and 11.5 Mb/s and an SD channel 1.5 to 3 Mb/s. The ATSC MDTV specification now enables a broadcaster to carry multiple television channels alongside multiple mobile channels, with each mobile channel using 917 kbit/s of the total ATSC bandwidth. The primary ATSC specification is based on MPEG-2; the ATSC Mobile specification is based on MPEG-4. Under ATSC MDTV, both MPEG-2 and MPEG-4 streams can be transmitted simultaneously.

While accepted as a standard for ATSC MDTV in 2008, MPEG-4 has not yet been adopted across the entire ATSC transmission system. That change would bring greater efficiency, allowing additional high-bandwidth (HD) channels to be broadcast.

To meet the FCC ATSC mandate, broadcasters only need a single HD and SD channel for Over the Air television. Once that requirement is satisfied, the broadcaster can divide the spectrum into many different channel configurations between HD, SD and mobile ATSC. Compression and multiplexing are used to optimize the use of the spectrum and offer the maximum amount of programming.
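Checking whether a proposed channel mix fits the spectrum is simple arithmetic. The sketch below uses the per-channel rates quoted above (roughly 11 Mb/s for HD, 1.5-3 Mb/s for SD, 917 kbit/s for mobile); the function name and the exact mid-range SD figure are assumptions for illustration.

```python
ATSC_PAYLOAD_MBPS = 19.39  # total 8-VSB payload

# Representative per-channel rates from the figures cited in this article;
# the 2.5 Mb/s SD value is an assumed midpoint of the 1.5-3 Mb/s range.
RATES_MBPS = {"HD": 11.0, "SD": 2.5, "Mobile": 0.917}

def fits(lineup):
    """Return (total_mbps, fits_in_spectrum) for a channel mix,
    e.g. lineup = {"HD": 1, "SD": 3}."""
    total = sum(RATES_MBPS[kind] * count for kind, count in lineup.items())
    return total, total <= ATSC_PAYLOAD_MBPS
```

For example, one HD plus three SD channels totals about 18.5 Mb/s and fits; two full-rate HD channels would not.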

Quality Control

Broadcasters are always looking at ways to streamline these operations with technology and automation. The fewer devices that handle media, the less opportunity there is for introducing problems.

There are still a number of issues being sorted out in compression; latency is one, particularly audio-to-video latency (better known as lip-sync error). Other issues that occur in transmission can result in video pixelization, freezing and blocking.
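Lip-sync error is typically quantified by comparing audio and video presentation timestamps (PTS), which in an MPEG transport stream run on a 90 kHz clock. A minimal sketch of that measurement (the function name is my own):

```python
PTS_CLOCK_HZ = 90_000  # MPEG PTS values tick at 90 kHz

def av_offset_ms(audio_pts, video_pts):
    """Audio-to-video offset in milliseconds for samples meant to be
    presented together; positive means audio lags (arrives late)."""
    return (audio_pts - video_pts) * 1000 / PTS_CLOCK_HZ
```

A monitoring tool would flag offsets beyond a configured tolerance; 9,000 ticks of drift, for instance, corresponds to 100 ms.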

Signal processing and quality control occur continually throughout the production chain. In the baseband transmission path, the transmission or delivery stage is the final place where signal processing assures conformance to broadcast specifications and overall quality. In SDI baseband, the final step before transmission is the audio and video processors and legalizers that ensure compliance and quality of the viewer experience. Now, that process takes place just before the signal enters the encoding and multiplexing chain, where multiple converters, transcoders and encoders normalize the signal to an ASI or IP stream for the transmitter. Once the encoding process begins, however, QC concerns itself with the digital data attributes of the program content and its quality of service to the viewer at home.

One of the challenges broadcasters face today is signal processing in the IP ecosystem. Currently, signal processing occurs in digital baseband; the test and measurement and processing tools that allow contouring and/or correcting audio and video are SDI baseband tools. In the IP production environment, craft editing and graphic tools have built-in tools to contour media for file delivery; however, QC for ASI/IP streams in a signal transmission system requires both stream monitoring and file-based tools to acquire measurement and analysis data.

Air checks are the common method for evaluating the transmission path for Over the Air distribution. In the analog world, the only devices in the RF chain were the modulator, demodulator and transmitter. Now, in ATSC with multiplexed and compressed programming, the “air check” encompasses many more processes and is integral to QC at the program origination point. After final processing and legalizing, the programming is encoded (compressed), multiplexed (groomed) and then transmitted. On the monitoring side, the ATSC signal is received, demodulated, de-multiplexed and decoded. Tracing a problem is more complicated this way, since “off-air” involves more components between master control and the transmitter, and the signal is also decoded at the receive device; it is encoded, transcoded, multiplexed and decoded in many places.

The migration to a complete IP ecosystem is still evolving, and compression is now one of its mainstays. Successful delivery of multiple multiplexed channels depends on quality control and management.