The evolution of encoding

It seems like we have had compression technology forever. We haven’t, unless you include the adoption of interlace scanning, the original analog compression system. Digital compression has existed for essentially as long as we have had digitized video (and audio). I distinctly remember, as a much younger person, attending SMPTE conferences at which a major topic was “bit-rate reduction” technology, another term for what we now call compression. Context is everything, so let’s venture back a bit to see why this was so complicated.

Before CCIR 601

Before ITU-R BT.601 (or CCIR 601, as it was originally and popularly known), there was little agreement on the sampling format for moving images. The research into how to sample and store video was centered in fine research institutes on several continents. Of course, video was what we would today call standard definition, a distinction that meant little before NHK showed HDTV to the world in the late ’80s. Nor was there any common agreement on component video as the basis for imaging, storage and transmission.

So sampling grids did not need to be locked to any other conventions, and in fact there was no requirement that they be rectangular at all. Some quite popular proposals aligned the samples at 45 degrees to the line structure.

Opening the barn door

Before researchers could look into how to compress images, work first had to be done to define some basic parameters, like how many samples per second it took to reasonably represent and transmit quality images. When SMPTE and the EBU did the heavy lifting of defining a sampling standard (601), it made a huge difference to the advancement of digital imaging.
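
As a back-of-the-envelope illustration of what those parameters imply, here is a short Python sketch that works out the raw data rate of 601-style 4:2:2 sampling. The figures it assumes (13.5MHz luma sampling, 6.75MHz for each of two chroma channels, 8 or 10 bits per sample) are the familiar ones; the point is simply how quickly an uncompressed stream adds up.

# Rough illustration: raw data rate implied by 601-style 4:2:2 sampling.
# Assumed figures: 13.5MHz luma, 6.75MHz per chroma channel, 8 or 10 bits/sample.

LUMA_HZ = 13_500_000          # Y samples per second
CHROMA_HZ = 6_750_000         # Cb or Cr samples per second (4:2:2)

def raw_rate_bps(bits_per_sample):
    samples_per_second = LUMA_HZ + 2 * CHROMA_HZ   # Y + Cb + Cr
    return samples_per_second * bits_per_sample

for bits in (8, 10):
    print(f"{bits}-bit 4:2:2: {raw_rate_bps(bits) / 1e6:.0f} Mb/s")
# Prints 216 Mb/s and 270 Mb/s, the kind of uncompressed rates that made
# "bit-rate reduction" such an urgent topic.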

It led directly to the development of the first practical digital recorder from Sony, which recorded the signal uncompressed. Picture quality was never approximate; it was a full and exact reproduction of the sampled image. One might argue that sampling itself threw away valuable content, and some at the time no doubt did; indeed, for some applications, that was certainly true. However, the adoption of the 601 standard opened the barn door and let research proceed on bit-rate reduction of standardized streams.

DCT-based compression was not invented for 601 sampled images, but work quickly centered on using the DCT as the basis for compression. Two international standards groups were established to harmonize the work on compression worldwide: JPEG (Joint Photographic Experts Group) and MPEG (Moving Picture Experts Group) created the base standards we use today in due-process bodies from which all manufacturers and users would benefit. Other work preceded MPEG and JPEG, including work that created standards adopted widely in Europe but never very successful in North America, for reasons I have never fully understood.
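
For readers who have never looked under the hood, here is a minimal Python sketch of the 8x8 DCT at the heart of those codecs. It is not the JPEG or MPEG pipeline itself, just the transform plus a crude quantizer chosen for illustration: the transform alone discards nothing, and the bit-rate reduction comes from quantizing coefficients that, for typical picture content, are mostly near zero.

import numpy as np

# Minimal sketch of the 8x8 DCT used (in various forms) by JPEG and MPEG.
# The transform is lossless by itself and concentrates picture energy in a few
# low-frequency coefficients; the compression comes from quantizing them.

N = 8

def dct_matrix(n=N):
    # Orthonormal DCT-II basis: C[k, x] = a(k) * cos(pi * (2x + 1) * k / (2n))
    k = np.arange(n).reshape(-1, 1)
    x = np.arange(n).reshape(1, -1)
    c = np.cos(np.pi * (2 * x + 1) * k / (2 * n))
    c[0, :] *= 1 / np.sqrt(2)
    return c * np.sqrt(2 / n)

C = dct_matrix()

def dct2(block):              # forward 2-D DCT of an 8x8 block
    return C @ block @ C.T

def idct2(coeffs):            # inverse transform
    return C.T @ coeffs @ C

block = np.random.randint(0, 256, (N, N)).astype(float)
coeffs = dct2(block - 128)                # level-shift, then transform
quantized = np.round(coeffs / 16) * 16    # crude uniform quantizer (the lossy step)
restored = idct2(quantized) + 128         # close to, but not exactly, the original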

The ETSI (European Telecommunications Standards Institute) compression standard (ETS 300 174) used rates based on integer fractions of the European data transmission hierarchy, especially the 34Mb/s E-3 rate. One-half and one-quarter of that rate yielded roughly 17Mb/s and 8Mb/s services, which the EBU deployed widely on its satellite network in the ’90s. But with the rapid advances in MPEG compression, it became clear that MPEG offered higher quality for the same bit rate, and thus better economy for interconnection. Eventually, broadcasters worldwide adopted MPEG-2, most based on the DVB specifications, which facilitated interoperability.

Other forces at work

At work at the same time, of course, were other forces that affected how compression products would be designed and deployed. It is important to recognize that while MPEG was always seen as a one-to-many approach, with the expense of processing put on the single encoder and cheap decoders making the economics viable, the first deployments were exclusively in the backhaul market.

DVDs brought digital compression to consumer products only after several years of successful and increasingly high-quality deployments in the backhaul market. PBS’s use of General Instrument’s DigiCipher compression systems for network distribution, beginning in 1994, is a good example of a successful early deployment. But a DigiCipher encoder was expensive, and cost was a major barrier to widespread use of compression in, for instance, satellite newsgathering.

The development of specialized chips that enabled the complex calculations needed for MPEG compression to be done on single boards changed the dynamic completely. As solutions became more compact and less expensive, people began to experiment with more marketplace uses, which drove up volume and drove down prices. Early MPEG encoders cost an order of magnitude more than decoders, sometimes as much as 25 times the price.

Over the last two decades, that ratio has declined to perhaps 3:1 for professional encoders and decoders, and sometimes even less. Appliances for streaming content to the Internet can be purchased for less than a single decoder cost in the ’90s. We have also seen decoder implementations in finished consumer devices sold at discount for a fraction of what one encoding chip cost 20 years ago.

HEVC

As always seems to be the case, the dynamics are not much different today. This year, we will see the first HEVC (High Efficiency Video Coding) systems, which should lower bit rates by 30 percent to 50 percent. The complexity of HEVC is astounding, and implementations in silicon are perhaps the only way it can be produced economically.
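
To put that 30 percent to 50 percent figure in concrete terms, the short Python snippet below applies it to an assumed 8Mb/s H.264 HD service; the 8Mb/s starting point is my own illustrative assumption, not a measurement.

# Illustration only: apply the claimed 30-50 percent HEVC savings to an
# assumed 8 Mb/s H.264 HD service (the starting rate is an assumption).

h264_rate_mbps = 8.0

for savings in (0.30, 0.50):
    hevc_rate_mbps = h264_rate_mbps * (1 - savings)
    print(f"{savings:.0%} savings: {hevc_rate_mbps:.1f} Mb/s per service")
# 30% savings leaves 5.6 Mb/s and 50% leaves 4.0 Mb/s, which is where the case
# for paying for new encoder silicon begins to be made.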

To gain the advantages HEVC can offer, users will see the cost of an encoder/decoder pair jump, which makes the backhaul market, once again, the most likely place for early deployments. The cycle is likely to repeat in a few years, driven by the impact of consumer-scale deployments, and HEVC may well be the next compression technology to arrive in consumers’ homes. MPEG-2 has lived for two decades, H.264 is already approaching its second decade, and the early deployments of HEVC will certainly begin this year.

This dynamic repeats in many parts of our industry. Display technology intended for consumer deployment shows up first in professional uses, like flat-panel monitors on set for productions, and later in consumers’ homes as increased quantities drive prices down. Widespread deployments then push new technologies to the edge, where they mature and follow the same cycle.

We have become an industry dependent on the infusion of development dollars into potential consumer technology, which then trickles into our small corner of the economic puzzle.

John Luff is a television technology consultant.