A recording conundrum

The goal is achieving a standardized method of interchanging content between recording media.

The history of video recording has a remarkably short timeline, at least in the commercial sense. The first generally available solution was the Ampex VR-1000, developed by a storied team in the mid-1950s. When the National Academy of Television Arts & Sciences belatedly awarded a Technical Emmy a few years ago, Charlie Anderson and Ray Dolby represented the rest of their colleagues (many deceased) and shared some wonderful stories. It is interesting to note that Ampex was an audio recording company that approached the problem philosophically as a scale-up from the technologies used in analog audio recorders. However, in some fundamental ways, the company had to forget what it knew about audio and attack the wide-bandwidth, tape-swallowing monster with innovation in order to get to a viable solution. One innovation was rotating heads; another was FM recording on magnetic tape.

A few other celebrated recorders were approached with similar innovation, perhaps none more fundamentally different than the first commercial digital recorder, the Sony DVR-1000 D-1, introduced about 30 years later. Digital video was not developed for the recorder; rather, the opposite is true. The ITU-R BT.601 standard needed a recording system, and the natural extension was to assume tape was the appropriate medium. Indeed, primitive digital disks — both metal platters like those in current hard drives and oxide-coated arrays of platters — were in common use, but with very limited storage capacity. Tape offered much higher storage density and commercially viable economics. Digital tape offered uncompressed performance that didn't degrade from generation to generation — a pretty radical change in product concept.

In order to understand modern recording technologies, it's critical to know that commercially viable solutions can be made using many approaches, but the most successful ones are no longer financially viable unless the underlying technology was researched for other purposes. For example, the success of DVCPRO in the news marketplace was fueled by the research done by a consortium, including both Panasonic and Sony, into a DV-based consumer recorder standard. DVCPRO adopts the fundamental technology and adapts it to the rigors of broadcast usage. Essentially every recorder, server and replay system available today leverages IT or consumer electronics research to achieve economics that can succeed.

This applies to every recording technology I looked at in researching this article. Hard drives are IT hardware repurposed. Memory card recording is IT hardware, with applications in consumer camcorders. Optical disks are developed for both consumer delivery of packaged media and IT archival storage. It is important to know why this has become a universal theme in our industry, applicable to everything from cameras to displays. We are simply too small of a marketplace in the global economy to be of much interest to large-scale industrial development without multiple ways to use a product or component. In total, the broadcast hardware marketplace is smaller than HP's printer business. (HP's 2009 Q4 report showed that the company's printers and imaging revenue was ~$24 billion in 2009 and ~$29 billion in 2008.) Broadcast worldwide is a fraction of one company's revenue in the IT sector.

If you want an efficient way to develop and market a technological product, don't spend a fortune on primary research applicable only to broadcast. Here's the best example: MPEG is decoder-centric. Encoders are expensive and serve a relatively small market. Decoders show up in orders of magnitude more places and are intended to be cheap to deploy. The broadcast industry adapted MPEG to video recording and developed the MXF standard based on technology created to deliver video to the home cheaply. As my math professors used to say: Q.E.D. MPEG development was practical, and chip deployment commercially viable, because the market for consumer electronics swamped the potential for a broadcast product.

So it is not a surprise that using inexpensive components, we can build sophisticated and relatively expensive video recording systems — many of them. Charlie Jablonski, a former NBC engineering executive and SMPTE president, has often said NBC “…never met a tape format they didn't like.” This is not surprising. Each one has attributes that make it well suited to specific tasks, and certainly one of those attributes is cost. Each one has a dedicated marketing team trying to fight through a thicket of competing commercial solutions. I find it interesting that with the exception of the large networks around the world (the big three in the United States, BBC, NHK, ZDF and a few others), no one delivers specifications to manufacturers and says “develop this product, and we will buy it in quantity.” Increasingly, even those important voices play a smaller role in defining recording products in advance. It has become more generally accepted that major users review rough designs and make suggestions about modifications to the packaging and performance that will make it more acceptable for their individual use.

I remember when Julius Barnathan, a longtime ABC executive, told every manufacturer that he was not buying a new tape format until someone got the tape size down to 1/4in. Bosch responded with what it cleverly called "Quartercam" (6.25mm in metric terms), which was probably a blatant play to ABC at the time. Today, that would be an unlikely story, but I know of one network that specifically told a manufacturer its news recording format would be acceptable if it also had an option for recording to removable memory cards, which was coincidentally already in development for consumer products at the same company.

Interchange wars

It is also important to point out one other parallel to early recording technology deployment. We used to have format wars, with some companies insisting they would only purchase a format if it was supported by more than one manufacturer. The result was industry cooperation, with SMPTE acting as the standards body that facilitated the exchange of design information necessary to achieve a unified standard.

Today the battle is joined around file interchange as we move away from removable media for storage, and again SMPTE leads part of the charge to establish the standards necessary to allow interchange of content. MXF is a central part of that interchange. But owing to the complexity of the problem, and the sheer number of pages of technology the MXF standard contains, we still have not achieved a simple method for interchange. The Advanced Media Workflow Association picks up where SMPTE stopped with MXF and has created application specifications, which constrain MXF options with the goal of making content interchange more predictable.

It is worth noting that MXF does not standardize the content, the essence itself, but rather standardizes the method of delivering a file that can be read if the receiver has a codec appropriate to the essence. Therein lies part of the conundrum in modern recording systems. If we could all agree on coding standards for the essence, it would be easy to specify a standardized method of interchanging the content between various recording media (hard disks, memory cards, optical media, etc.). I hope we get there in my lifetime.
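To make the wrapper-versus-essence distinction concrete: MXF packages everything — metadata and essence alike — as KLV (Key-Length-Value) triplets per SMPTE ST 336, which is why a receiver can walk the file structure without being able to decode the essence inside. The sketch below parses one KLV triplet; the key and value bytes are synthetic stand-ins, not real MXF labels.

```python
# Minimal sketch of parsing one KLV (Key-Length-Value) triplet, the
# byte-level packaging MXF uses (SMPTE ST 336). The key and value bytes
# in the example are synthetic, not real MXF universal labels.

def parse_klv(data: bytes, offset: int = 0):
    """Return (key, value, next_offset) for the KLV triplet at `offset`."""
    key = data[offset:offset + 16]          # 16-byte SMPTE Universal Label
    pos = offset + 16
    first = data[pos]                       # first byte of BER-encoded length
    pos += 1
    if first < 0x80:                        # short form: length fits in 7 bits
        length = first
    else:                                   # long form: low 7 bits = byte count
        n = first & 0x7F
        length = int.from_bytes(data[pos:pos + n], "big")
        pos += n
    value = data[pos:pos + length]
    return key, value, pos + length

# Synthetic triplet: 16-byte key, short-form length 0x05, 5-byte value
packet = bytes(range(16)) + b"\x05" + b"hello"
key, value, nxt = parse_klv(packet)
print(value)  # b'hello'
```

The point of the exercise is that the parser never needs to understand what the value bytes mean — exactly the property that lets MXF carry any essence, and exactly why agreeing on the wrapper alone does not solve interchange.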

John Luff is a broadcast technology consultant.

Send questions and comments to: john.luff@penton.com