Facilities like those at ESPN and Turner Entertainment employ HD infrastructures and offer HD production values comparable to those that, just a year or two ago, were available only to SD producers. Photo by Andy Washnik, courtesy Thomson Grass Valley.
Sitting in my living room, watching an absorbing HDTV presentation of my favorite primetime program, I am mesmerized by the quality of what is presented to me. The production process that created this experience is of little concern to me; I'm just interested in the results. For a broadcast systems engineer, however, a single philosophy should direct the playout infrastructure: keep the media as identical as possible to the original.
Although not a production standard, the ATSC presentation layer affects the design and operation of a broadcast operations center (BOC). SMPTE has authored numerous standards that define production processes. In the September 2004 issue of Broadcast Engineering, John Luff explains many of the details behind the standards that make HD broadcasting possible.
The life of a clip
A video clip has a defined lifecycle. The essence is captured by a camera or mic, then digitized and compressed to a file format. It is distributed over a media network (MN) and stored in a media asset management (MAM) system. Shortly before broadcast, the clip is identified in the MAM system and transferred over the network to a playout server. It is then converted to SDI and switched through the production chain to air.
File size matters
An infrastructure design for production depends on the desired audio and video workflows. For instance, SDI video with embedded audio requires 270Mb/s for SD or 1.485Gb/s for HD, both of which are too large to store on anything but tape. Transfers must then be done in real time, and the material transported to the BOC via sneakernet.
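To see why uncompressed SDI is impractical to store as files, it helps to convert those wire rates into per-minute storage. This is a minimal arithmetic sketch using the rates given above; the conversion factor (8 bits per byte, decimal megabytes) is an assumption of convention, not from the article.

```python
# Sketch: per-minute storage demands of uncompressed SDI,
# using the rates from the text (270 Mb/s SD, 1.485 Gb/s HD).

def sdi_mb_per_minute(rate_mbps: float) -> float:
    """Megabytes (decimal) needed to store one minute at a given Mb/s rate."""
    return rate_mbps * 60 / 8  # megabits -> megabytes

sd = sdi_mb_per_minute(270)    # about 2 GB per minute of SD-SDI
hd = sdi_mb_per_minute(1485)   # about 11 GB per minute of HD-SDI
print(f"SD-SDI: {sd:.0f} MB/min, HD-SDI: {hd:.0f} MB/min")
```

At roughly 11GB per minute of HD, a 1TB server would hold only about 90 minutes of uncompressed material, which is why compression is essential for file-based workflows.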
Fortunately, I-frame-only compression permits frame-accurate editing and withstands generations of encoding and decoding without noticeable video quality loss. One minute of 40Mb/s MPEG-2 SD occupies 300MB; one minute of 100Mb/s DV HD occupies 750MB.
Audio typically is not compressed and is digitized to AES standards. One minute of a single channel of audio sampled at 48kHz with 24-bit words occupies 8.64MB, and eight channels take 69.12MB of storage.
Twenty-two hours of compressed HD material, such as DV100 with eight channels of AES audio, can be stored on a 1TB server. With 40Mb/s MPEG-2 SD, the same server holds up to 55 hours of audio and video.
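The capacity figures above follow directly from the per-minute sizes. This sketch reproduces the arithmetic, assuming 1TB means 10^12 bytes and counting video only (the eight channels of AES audio add roughly another 69MB per minute, trimming the totals slightly).

```python
# Sketch of the capacity arithmetic behind the 22-hour and 55-hour figures.

def mb_per_minute(rate_mbps: float) -> float:
    """Megabytes of storage per minute of video at a given Mb/s rate."""
    return rate_mbps * 60 / 8

def hours_per_terabyte(rate_mbps: float) -> float:
    """Hours of video a 1 TB (10^12 byte) server holds at a given rate."""
    return 1_000_000 / mb_per_minute(rate_mbps) / 60

print(f"DV100 HD (100 Mb/s): {hours_per_terabyte(100):.1f} hours")
print(f"MPEG-2 SD (40 Mb/s): {hours_per_terabyte(40):.1f} hours")
```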
One minute of HD video compressed to 100Mb/s would take six seconds to transfer at a full 1Gb/s. In reality, transfers run at about 2.5 times real time, so a one-minute clip would take at best 24 seconds at an effective 250Mb/s. And if a 1TB playout server holds 22 hours of HD material, that material must first be copied onto the server before it can play to air. At the 2.5 factor, 22 hours of programming would take nearly nine hours just to load onto the server.
With the exception of a few pure HD channels, production infrastructures will have to support both HD and SD formats for the foreseeable future. And even if a system can handle multiple data rates, there are important aesthetic issues to consider. The first is aspect ratio. For example, how do you properly frame a 50-yard-line shot for both the 16:9 HD and 4:3 SD pictures? If the camera operator frames for SD, once upconverted, there would be a lot of extra grass in the frame. If the camera operator frames for HD, SD viewers may miss important action that occurs in the side panels.
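To put a number on how much picture the SD viewer loses, consider a 4:3 center cut from a 16:9 raster. This is an illustrative calculation of my own, not from the article; the 1920x1080 raster is just a convenient example.

```python
# Illustration: how many columns of a 16:9 frame a 4:3 center cut keeps.

def center_cut_width(width: int, height: int, target_ratio: float = 4 / 3) -> int:
    """Width in pixels of a center cut at the target aspect ratio."""
    return int(height * target_ratio)

kept = center_cut_width(1920, 1080)   # 1440 of 1920 columns survive
panels = 1920 - kept                  # 480 columns lost, 240 per side
print(f"Center cut keeps {kept} columns; side panels total {panels}")
```

A quarter of the HD frame's width sits in the side panels, which is exactly the region where a shot framed for HD can hide action from 4:3 viewers.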
Starz Encore uses an OmniBus Systems Colossus in its master control center to control 72 channels of video.
In the Feb. 2 issue of HD Technology Update, OmniBus Systems Vice President of Technology John Wadle discussed his vision of a simplified infrastructure for SD/HD coexistence and introduced the concept of a mix server. With this approach, rather than switching SDI in a production-centered control room, a mix server performs the operation with a file-based methodology.
When the program reaches the consumer, audio and video must be converted to the proper receive format. A 720p source may have to be converted to 1080i for a display, and 5.1 surround may have to be downmixed to stereo. While this is a consumer issue, keep in mind that the results depend entirely on what's being done "upstream."
In every scenario, it’s difficult for the programming originator to accomplish quality control. However, invest a few dollars and sufficient space to build a living-room-like environment in your BOC, and watch all of your broadcasts as you would at home. Only then will you be able to evaluate the subjective impact of your programming on your viewers.
John Luff, “HD systems,” Broadcast Engineering, Sept. 1, 2004. http://broadcastengineering.com/mag/broadcasting_hd_systems/index.html
“SD and HD: Coexisting in a multi-resolution environment,” HD Technology Update e-newsletter, Feb. 2, 2005. http://broadcastengineering.com/newsletters/hd_tech/20050202/automation-sd-hd-20050202/index.html