Compression in a Hybrid World

The first video compression standard was introduced in 1984: the CCITT (now ITU-T) H.120 Recommendation, a low-bit-rate, black-and-white codec used for video conferencing. Compression and digital broadcasting have come a long way since then. There are a considerable number of compression standards and formats, and they differ depending on whether the compression targets a file or a stream. When a stream is created for over-the-air (OTA) distribution, only a few standards are typically used: MPEG-2 and MPEG-4 AVC/H.264. Others are used for contribution and for delivery to online and mobile platforms.

Compression is now the accepted norm for program content distribution. SD/HD-SDI is primarily used in live production and original acquisition. However, recording or ingesting is now typically an encoding process that uses one of the compression formats. When a file is being created, literally hundreds of codec options are available. Once the audio and video are encoded, they are packaged into containers or wrappers.

Program producers will request a specific format or bitrate for the encoding, ranging from Avid DNxHD 220 (220Mb/s) to H.264 and everything in between. The finished program can range from 25Mb/s to 220Mb/s, all of which requires a further, high level of compression to fit into the OTA spectrum. Once the production and craft work are completed, the program audio, video and metadata are compressed and wrapped into a container for distribution. A number of container formats are used in distribution (MPEG-TS, MXF, GXF, LXF, QuickTime), all of them carrying compressed essence.

The container for ATSC is the MPEG-2 transport stream (MPEG-2 TS). The complete transport stream carries audio, video and data (i.e., PSIP and EPG). The program channels and program guides are compressed and multiplexed into a single stream for transmission. ATSC tuners and set-top boxes demultiplex and decode this container into the individual channels and channel information.
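The transport-stream packaging is simple at the packet level: the stream is a sequence of fixed 188-byte packets, each starting with the sync byte 0x47 and carrying a 13-bit packet identifier (PID) that tells the demultiplexer which program or table the packet belongs to. As an illustrative sketch (not broadcast-grade code), the header fields can be pulled out with a few bit operations:

```python
def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte header of a 188-byte MPEG-2 transport stream packet."""
    if len(packet) != 188 or packet[0] != 0x47:
        raise ValueError("not a valid TS packet (expected 188 bytes, sync byte 0x47)")
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        # PID is 13 bits: the low 5 bits of byte 1 plus all 8 bits of byte 2
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],
        "continuity_counter": packet[3] & 0x0F,
    }

# A synthetic packet: sync byte, then PID 0x0000 (the Program Association Table)
pat_packet = bytes([0x47, 0x40, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pat_packet)["pid"])  # → 0
```

A demultiplexer walks the stream packet by packet, routing each PID to the correct decoder; PID 0 carries the Program Association Table that maps program numbers to their PIDs.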

Getting more channels into the ATSC payload of 19.39Mb/s (8-VSB) is just one of the challenges facing broadcasters. The current standard is based on MPEG-2; a typical HD channel uses between 10Mb/s and 11.5Mb/s, and an SD channel between 1.5Mb/s and 3Mb/s. A considerable amount of compression is needed to fit channels into the over-the-air spectrum.
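The channel math behind that squeeze is straightforward arithmetic. A back-of-the-envelope sketch using the figures above; the channel lineup and the 0.5Mb/s allowance for PSIP/EPG and null packets are assumed values for illustration, not standard numbers:

```python
# Rough ATSC multiplex budget check using the article's figures.
TOTAL_MBPS = 19.39  # 8-VSB payload rate

# Assumed example lineup: one HD main channel plus three SD subchannels
channels = {"HD main": 11.0, "SD sub 1": 2.5, "SD sub 2": 2.5, "SD sub 3": 2.5}
overhead = 0.5      # illustrative allowance for PSIP/EPG tables and null packets

used = sum(channels.values()) + overhead
headroom = TOTAL_MBPS - used
print(f"used {used:.2f} Mb/s of {TOTAL_MBPS}, headroom {headroom:.2f} Mb/s")
```

In practice a statistical multiplexer varies each channel's bitrate moment to moment, but the total can never exceed the fixed 8-VSB payload.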

The transmission stage of compression can introduce new artifacts that cannot be monitored until the OTA signal is received. Before compression, a broadcast engineer or master control operator could perform a final QC before sending the signal into the modulator. Now, once the program enters the transmission encode or transcode process, signal processing, legalizing and QC monitoring are no longer possible; they all occur upstream of the encoding process.

To prepare programs for multi-channel ATSC transmission, the broadcaster needs a dedicated device (an encoder) to compress each program stream. This may also involve transcoding. Handling multiple processes on a large number of discrete devices poses many technical challenges, including managing the quality of the audio and video while ensuring compliance with regulatory mandates (i.e., the CALM Act).

Compression raises issues similar to those in multiplexing, so broadcasters look for more efficient methods to handle their transmission processes. This may be accomplished by a combination of technology and automation. The transition or conversion between different compression formats can introduce errors or artifacts; the fewer processes and devices in the transmission chain, the less chance of introducing artifacts or other problems.

Transcoding presents further technical challenges: latency and signal degradation. Although transcoding is not a complete decode-and-encode cycle, compression formats are not the same, so a transcode still requires a partial decode and re-encode.

The transmission chain for a broadcaster begins with an HD-SDI baseband signal leaving a master control switcher or router, processed through legalizers and then fed into an encoder. There it is both compressed and processed for distribution. For OTA distribution, the 1.5Gb/s HD-SDI baseband signal is compressed to roughly 10Mb/s to fit into the ATSC spectrum. That is a considerable amount of compression, and it is important to manage the encoding to protect the integrity of the signal. As we discussed in our multiplexing article, multiple compressed streams are multiplexed together for transmission.
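To put a number on "a considerable amount of compression," the ratio implied by those figures is a one-line calculation (using the nominal 1.485Gb/s HD-SDI interface rate behind the rounded "1.5Gb/s"):

```python
# Compression ratio implied by the article's figures: a nominal 1.485 Gb/s
# HD-SDI signal squeezed to roughly 10 Mb/s for the ATSC multiplex.
hd_sdi_mbps = 1_485   # HD-SDI interface rate in Mb/s (nominally "1.5 Gb/s")
ota_mbps = 10         # target OTA rate for one HD channel
ratio = hd_sdi_mbps / ota_mbps
print(f"roughly {ratio:.1f}:1 compression")
```

With the rounded 1.5Gb/s figure the ratio is 150:1; with the nominal 1.485Gb/s rate it is closer to 148:1. Either way, well over two orders of magnitude of data reduction rides on the encoder doing its job well.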

Quality Control in Compression

Quality control in the world of compressed media differs from QC in baseband SDI. In the complete media chain, quality control happens in a number of places. Programs need to be checked during the ingest process, which is the initial stage of compression. In a typical workflow, the media continues its journey until it must be processed for transmission. As we have discussed, the transmission process is a sequence of decoding, signal processing, legalizing and encoding.

One common process for quality assurance is restoring the compressed media to SDI in order to check signal levels and parameters. When the media is compressed, checking the file or stream is different. One way to check the compression process is to decode concurrently and monitor the output on SDI test and measurement equipment. In the days of videotape, the equivalent was monitoring the confidence head or the E-to-E path.

There is one set of tools for checking the integrity of compressed files, and another for checking a compressed stream. When checking a compressed file, the analysis tools examine the audio and video parameters. For video, these include freeze detection, macroblocking, encoding standard, MPEG profile, frame size, frame rate and aspect ratio. For audio, they include audio/video mismatch, MPEG PCM 48kHz, peak audio level, audio phase, loss of sound and Dolby® format change.
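To make one of those checks concrete, here is a toy freeze-detection sketch. It represents each frame as a flat list of luma values and flags runs of near-identical consecutive frames; the function name, threshold and run length are my own illustrative choices, and real file-QC analyzers work on decoded video with far more robust metrics:

```python
def detect_freeze(frames, threshold=0.001, min_run=3):
    """Return start indices of runs of `min_run` near-identical frames.

    `frames` is a sequence of equal-length luma-value sequences (0-255);
    `threshold` is the mean absolute pixel difference below which two
    consecutive frames are treated as identical.
    """
    run, freezes = 1, []
    for i in range(1, len(frames)):
        diff = sum(abs(a - b) for a, b in zip(frames[i], frames[i - 1])) / len(frames[i])
        run = run + 1 if diff < threshold else 1
        if run == min_run:                 # report each freeze once, at its start
            freezes.append(i - min_run + 1)
    return freezes

frames = [[10] * 16] * 5 + [[200] * 16]    # five identical frames, then a change
print(detect_freeze(frames))               # → [0]
```

Black-frame detection works the same way, except the test is against an absolute luma threshold rather than the previous frame.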

The quality checks on compressed streams are different. These typically apply to the video and audio plus the transport layer, including captioning and program guides (EPG, PSIP). This analysis checks the video stream information, picture graph and bitrate profile, conformance tests, black video detection and frozen video detection. For the audio component, some of the checks are audio type, bitrate, sampling frequency, level, silence detection and digital clipping detection.
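Silence detection, one of the audio checks above, can be sketched as a windowed RMS level test. This is a simplified illustration with assumed names and thresholds (a -60dBFS floor over 100ms windows); production monitors use standardized loudness measurement and far more configurable gating:

```python
import math

def rms_dbfs(samples):
    """RMS level of float samples (-1.0..1.0) in dB relative to full scale."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def find_silence(samples, rate, threshold_db=-60.0, window_sec=0.1):
    """Return start times (seconds) of windows quieter than threshold_db."""
    win = int(rate * window_sec)
    return [i / rate for i in range(0, len(samples) - win + 1, win)
            if rms_dbfs(samples[i:i + win]) < threshold_db]

rate = 48_000
tone = [0.5 * math.sin(2 * math.pi * 1000 * t / rate) for t in range(rate)]
silence = [0.0] * rate
times = find_silence(tone + silence, rate)  # one second of tone, one of silence
print(times[0])  # → 1.0 (silence begins one second in)
```

A real stream monitor would raise an alarm only after the silent windows persist past a configured duration, to avoid flagging intentional pauses.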

What this means is that many parameters in both compressed files and compressed streams can pose problems. This puts a burden on broadcasters to have the necessary tools available to monitor the integrity of the content once it is compressed.

For today’s broadcaster, the media-preparation process for transmission in a compressed world is complex. The programming gets encoded, transcoded, compressed again and multiplexed into containers for transport. In quality control, the media needs both baseband SDI and IP monitoring tools to ensure signal integrity. The SDI layer in quality control assures that the decompressed program meets standards; the IP layer in quality control ensures the integrity of the media transport.

The entire media workflow and process chain is still a hybrid of SDI and IP. The right tools are needed to manage all these processes in a streamlined manner.