
I always seem to be harping about video, but what's happening in the world of audio? Are we, in fact, making any progress for the future of what we hear in our homes and in theaters?

Way back in the early 1960s, stereo audio recording was in its infancy, and the equipment all still used tubes. The setup at EMI's Abbey Road recording studios, for example, looked like something pulled together from World War II surplus. The fact that reasonable-quality recordings were made has always astounded me. At the BBC, every audio recording had to be optimized for the particular reel of tape that we loaded; the bias setting was different every time!

Noise-reduction systems

In 1965, Dolby Laboratories arrived on the audio scene. Built on patents that the BBC could easily have challenged at the time over prior art, the first single-channel A-type noise-reduction system, the A-301, was launched in 1966. It fought for a place in the world's recording studios, and traction finally came with multitrack recording, which would not have been practical without such a system. A 16-channel A-type system, the M-series, hit studios in 1972.

Meanwhile, Henry Kloss, the legendary proprietor of KLH, badgered Dolby into developing a noise-reduction system for consumer equipment. The result was the simplified Dolby B-type noise-reduction system. The corporation decided to keep control of manufacturing and leasing the professional recording encoders while licensing the decoder technology to OEM vendors. The tiny per-unit licensing fee on sold product quickly encouraged widespread adoption.
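The encode/decode split behind these systems rests on complementary companding: quiet signals are boosted above the tape's noise floor before recording, then cut back by exactly the inverse amount on playback, taking the hiss down with them. The toy sketch below shows only that broad principle, not Dolby's actual band-split, level-dependent processing; the exponent and noise amplitude are arbitrary illustration values.

```python
import math
import random

random.seed(0)

def encode(x, exp=0.5):
    # Compress dynamic range: quiet samples are boosted above the noise floor.
    return math.copysign(abs(x) ** exp, x)

def decode(y, exp=0.5):
    # Complementary expansion restores the original level and pushes the
    # boosted hiss back down with it.
    return math.copysign(abs(y) ** (1 / exp), y)

NOISE = 0.01  # simulated tape-hiss amplitude (arbitrary)

def through_tape(samples, companded):
    out = []
    for x in samples:
        y = encode(x) if companded else x
        y += random.uniform(-NOISE, NOISE)  # the "tape" adds hiss
        out.append(decode(y) if companded else y)
    return out

def rms_error(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# A quiet passage, where tape hiss is most audible.
quiet = [0.01 * math.sin(2 * math.pi * i / 32) for i in range(256)]

plain = rms_error(quiet, through_tape(quiet, companded=False))
nr = rms_error(quiet, through_tape(quiet, companded=True))
print(plain, nr)  # residual noise is lower with companding
```

Run on the same quiet test tone, the companded path leaves markedly less residual noise than the straight-through path, which is the whole argument for putting matched processing on both sides of the tape.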

B-type also rescued the Philips Compact Cassette from its dictation-machine origins and turned it into a worldwide audiotape format. The first players were made by Nakamichi, with various vendors' names glued to the outside, in 1970.

(Signetics Semiconductor subsequently developed a decoder IC that made implementation of the standard even easier. Incorporating the IC also earned a new product a fast blessing from Dolby Licensing. Philips later bought Signetics.)

The remainder of Dolby's audio improvements came about because of the then lousy mono audio quality of the Academy standard used in movie theaters. Dolby Stereo in 1976 and Dolby Surround in 1982 completely changed the movie experience — the former being just in time for “Star Wars” and “Close Encounters of the Third Kind.”

Digital standards started popping up in 1984 with AC-1, which was adopted for DBS the following year. AC-2 came out in 1989 and quickly became the standard for exchanging studio-grade recordings and mixes over ISDN, both domestically and internationally. AC-3, the final consumer delivery standard, arrived in 1992. It came to be known as Dolby Digital rather than by a complex string of words and acronyms that consumers would not understand.

5.1-channel Dolby Digital is in just about every audio delivery channel we have: recorded media, satellite delivery, terrestrial broadcasts in both SD and HD formats, video games and, of course, the movie theater. Eventually, when the film medium goes away, as it must, the electronic projector's audio will, by present-day expectations, also be Dolby Digital.

7.1 and beyond

But where does it go from here? More and more sources are pushing for 7.1 systems, even for your gaming experience, but those just involve hanging a couple of extra speakers around the room. We have pseudo-surround headphone systems, but no radical developments seem to be in the works.

It has been 14 long years since Dolby Digital emerged. Prior to that, developments came fast; developments are still coming fast in the video world. How can a medium that needs so many bits for effective resolution be so slow to change?

Paul McGoldrick is a consultant based on the West Coast.

Send questions and comments to: paul.mcgoldrick@penton.com