The last newsletter covered some of the most common video file formats in use. This tutorial will explain the functions of containers and wrappers in the modern broadcast facility.
Containers and wrappers
Special file formats called containers are used to combine or hold the audio and video elements (files) within one file for convenient storage and transport. Some video servers store the audio and video elements separately on their storage systems, while others use container formats to keep these separate elements together. All video servers store the audio and video clips as well as a database with information about those clips. When the audio/video files are transferred to another system, the data about them also needs to be transferred; that is where metadata comes in. Metadata is the data about the data that makes up the audio and video elements. To join the container (holding the audio and video elements) with the metadata, a wrapper is used. A wrapper is a type of container used in professional video to combine the elements (audio and video files) with the metadata.
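The relationships just described can be pictured as nested data structures. The sketch below is purely illustrative; the class and field names are inventions for this example, not part of any real container or wrapper specification.

```python
from dataclasses import dataclass

# Illustrative model only: elements sit inside a container,
# and a wrapper joins the container with the metadata.

@dataclass
class Element:
    kind: str          # "video" or "audio"
    codec: str         # e.g. "MPEG-2", "AAC"
    data: bytes = b""

@dataclass
class Container:
    elements: list     # the audio/video elements held together

@dataclass
class Wrapper:
    container: Container
    metadata: dict     # the data about the data

clip = Wrapper(
    container=Container(elements=[
        Element("video", "MPEG-2"),
        Element("audio", "AAC"),
    ]),
    metadata={"title": "Spot 1042", "start_date": "2024-01-01"},
)

# A receiving system reads inward: first the metadata,
# then the container, then the codecs it must decode.
codecs = [e.codec for e in clip.container.elements]
print(codecs)  # ['MPEG-2', 'AAC']
```

The point of the model is the layering: a system that only understands the outermost layer (the wrapper) cannot play the clip until it has worked its way in to the individual elements.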
The differences among elements, containers and wrappers can become confusing because some elements and containers share the same name. For example, MPEG-2 is a compression codec for digital video but is also a container when audio is combined with it; the difference is in the file extension used. And once the files are placed inside a wrapper, the question of whether it is an MPEG-2 video element with a separate AIFF audio element or an MPEG-2 stream (container) with the AAC audio element combined can only be answered by the metadata, because all that is displayed of the file is the wrapper and its file extension. This type of information becomes more important as we combine and wrap the basic elements for easier storage and transport.
Containers have been around for many years and are in use every day worldwide. Just about every video clip played on the Internet comes in a container. Containers can hold many different types of elements or codecs. Because of this, one of the important functions of the container is to inform the playback device what type of codec is required to decode the elements within.
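Identifying the container is the first step a playback device takes. As a concrete example, an AVI file announces itself in its first 12 bytes: the ASCII tag "RIFF", a little-endian 32-bit size, and the form type "AVI ". The sketch below checks that signature; it is a minimal example, not a full AVI parser.

```python
import struct

def sniff_riff_avi(header: bytes) -> bool:
    """Return True if the first 12 bytes look like an AVI (RIFF) container."""
    if len(header) < 12:
        return False
    tag, _size, form = struct.unpack("<4sI4s", header[:12])
    return tag == b"RIFF" and form == b"AVI "

# A fabricated 12-byte header for demonstration:
sample = b"RIFF" + struct.pack("<I", 4) + b"AVI "
print(sniff_riff_avi(sample))  # True
```

A real player would continue past this signature into the container's stream headers to learn which codecs the elements inside actually require.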
For a wrapped file to be usable, the receiving device must be able to understand and decode the wrapper and separate out the container and the metadata. Then it must be able to decode the container from information stored in the metadata and separate out the audio and video elements. Next, the receiving device must have the correct codec to decode the audio and video elements for playback or processing.
Wrappers have been developed to make it easier to exchange video files and the information associated with them between various systems such as nonlinear editing systems (NLEs), video servers and some digital VTRs. The metadata carries different information depending on where it is being used. Within a production environment, the metadata can carry information about the various scenes, director notes, scripts and so on. In a broadcast facility, the metadata could carry the start and end dates for the spot, the contract number as well as other information that would normally be stored in the traffic system. Without wrappers, exchanging this information would require manual data entry, where errors and omissions can occur.
What follows is a list of the most commonly used containers and wrappers today.
AVI (Audio Video Interleaved) — Developed by Microsoft for Video for Windows in 1992, AVI is a container for many types of audio and video elements. Some consider it an outmoded format, but it is still used in many systems, including several NLEs.
QT (QuickTime) — Developed by Apple in 1991, QuickTime is a container format well suited for editing purposes. QuickTime can hold a wide range of elements, including video and audio codecs.
MPEG-2 — When audio is combined with MPEG-2 video, it becomes a container. There are two types of MPEG-2 containers. Transport Stream (TS) can carry several different video and audio elements, as is found in ATSC DTV transmissions. The MPEG-2 Program Stream is similar to MPEG-1 in that it carries only one video element and associated audio.
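The transport stream's ability to carry several audio and video elements at once comes from its packet structure: a TS is a sequence of 188-byte packets, each starting with the sync byte 0x47 and carrying a 13-bit packet identifier (PID) that says which elementary stream the packet belongs to. The sketch below pulls the PIDs out of a buffer; the sample packets are fabricated for demonstration.

```python
def ts_pids(data: bytes) -> set:
    """Collect the PIDs present in a buffer of 188-byte MPEG-2 TS packets."""
    pids = set()
    for off in range(0, len(data) - 187, 188):
        pkt = data[off:off + 188]
        if pkt[0] != 0x47:          # every TS packet begins with sync byte 0x47
            continue
        # The 13-bit PID spans the low 5 bits of byte 1 and all of byte 2.
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        pids.add(pid)
    return pids

# Two fabricated packets: one on PID 0x100, one on PID 0x101.
pkt_v = bytes([0x47, 0x01, 0x00, 0x10]) + bytes(184)
pkt_a = bytes([0x47, 0x01, 0x01, 0x10]) + bytes(184)
print(sorted(ts_pids(pkt_v + pkt_a)))  # [256, 257]
```

A program stream, by contrast, has no such multiplex of independent PIDs, which is why it carries only one video element and its associated audio.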
MPEG-4 — When audio is combined with MPEG-4 video, it becomes a container, formally defined as MPEG-4 Part 14 (the MP4 file format).
AAF (Advanced Authoring Format) — AAF is a highly complex format that contains information (metadata) about the script, effects and editing, as well as other data concerning the production of a TV program. AAF is meant to replace all the paper notes and other media, containing them within the AAF format so everything is kept together in one place. Basically, it is a production-oriented wrapper format.
MXF (Material eXchange Format — SMPTE 377M) — MXF allows user data and metadata to be encapsulated with the audio and video in the same file. MXF actually uses a subset of the AAF metadata format: because of the complexity of the AAF metadata, a smaller but related version was created and made part of MXF. This subset works much better for exchanging files between servers and storage systems, where the full AAF metadata is not needed and could be a burden to the systems using it. MXF is intended for finished programs that will be stored and then aired or streamed.
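At the byte level, MXF carries its essence and metadata as KLV triplets (key-length-value, per SMPTE 336M): a 16-byte key, a BER-encoded length, then the value. The sketch below reads one triplet from a buffer; it is a minimal illustration of the coding, not an MXF parser, and the sample key is a dummy.

```python
def read_klv(buf: bytes, off: int = 0):
    """Read one KLV triplet (16-byte key, BER length, value) from buf.

    Returns (key, value, next_offset).
    """
    key = buf[off:off + 16]
    off += 16
    first = buf[off]
    off += 1
    if first < 0x80:                 # short form: length fits in 7 bits
        length = first
    else:                            # long form: next (first & 0x7F) bytes hold it
        n = first & 0x7F
        length = int.from_bytes(buf[off:off + n], "big")
        off += n
    value = buf[off:off + length]
    return key, value, off + length

# A fabricated triplet: dummy all-zero 16-byte key, short-form length 3.
packet = bytes(16) + bytes([3]) + b"abc"
key, value, nxt = read_klv(packet)
print(value, nxt)  # b'abc' 20
```

Because every item in the file is a self-describing triplet, a system that does not understand a particular key can still skip over it cleanly, which is part of what makes MXF practical for exchange between dissimilar servers.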
GXF (General eXchange Format — SMPTE 360M) — Originally created by Grass Valley Group for transporting compressed video files over Fibre Channel networks using FTP, GXF at first supported only JPEG video and uncompressed audio, but it has been enhanced to carry MPEG and DV, as well as HD. MXF and GXF are similar in function in that they are both intended for the daily operations of a TV station or network. Although GXF is used in many systems, MXF has gained wide acceptance and is slowly replacing GXF in newer systems.
OMF (Open Media Framework) — OMF was developed by Avid for use in its products. OMF is widely used around the world but has never been officially adopted as an industry standard. It too is a wrapper format with its main use in post production. Because of its widespread use, OMF has become a de facto standard and likely will be in use for years to come.
As the industry develops, more video formats will be created to meet the needs of future transmission and storage systems. Listed here are just the first methods that have been created to deal with content, content compression and transportation as we enter the digital age. Just as computing power and storage capacity have grown well beyond what was first thought to be their limits, so too will file compression and containers grow as the need arises.