Numbers, Fingers, Toes and Quality

You might not have noticed that golden "CD-quality" audio has a dynamic range of 98 dB. Okay, so maybe you've noticed that. There are 16 bits, each contributing 20 log 2 (call it 6 dB), and no one ever talks about the remaining 1.7 dB. Far be it from me to tamper with precedent.
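
If you want to see where those numbers come from, here's a minimal sketch, assuming the textbook figure for an ideal quantizer (roughly 6.02 dB per bit, plus a 1.76 dB quantization-noise term for a full-scale sine wave):

```python
import math

BITS = 16

# Each bit doubles the number of levels, worth 20*log10(2) ~ 6.02 dB.
per_bit = 20 * math.log10(2)

# The extra ~1.76 dB is the textbook quantization-noise term,
# 20*log10(sqrt(1.5)), for an ideal full-scale sine wave.
extra = 20 * math.log10(math.sqrt(1.5))

print(f"{BITS} bits x {per_bit:.2f} dB = {BITS * per_bit:.1f} dB")
print(f"plus the remaining {extra:.2f} dB = {BITS * per_bit + extra:.1f} dB")
```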

I'll try again. Have you noticed that there are really only two definitions of the word digital? One means related to fingers or toes. The other means numerical. That's all. So, unless you happen to be a finger, toe or number fetishist, there ain't any definition of digital that happens to mean good.

If you're involved in medical TV, maybe you use the finger/toe meaning of digital once every blue moon or so. For the rest of us, digital means numerical.

Now then, numerical can be a pretty good thing. For one, it means repeatable. If you need to set something to 3,579,545, you can set it to 3,579,545, not "about yea much" or "a smidge higher than a bunch" or some other accurate setting technique.

Numbers are also pretty danged decipherable. Maybe you can't sing, dance or look like Jennifer Lopez, but I'll bet that if she says 601 to someone and you say 601 to the same someone, that person will hear the exact same number from both of you.

Last, numbers are easily controlled. If you want to do something to numbers (other than repeating or deciphering them), that's called computing. Methinks you have heard about computers.

So, let's say you want to record or transmit some audio or video. Ahoy, decipherability! With numbers, your signal might seem to gain some weight (for instance, 705.6 kbps for one of those "CD-quality" audio channels versus 20 kHz for the original analog), but you lose all the noise and interference that multigeneration recording and real-world transmission have to contend with. It's a good deal.
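
That 705.6 kbps is just sample rate times word length. A quick back-of-the-envelope check, assuming CD's standard single-channel numbers (44.1 kHz, 16 bits):

```python
SAMPLE_RATE_HZ = 44_100   # CD sampling rate
BITS_PER_SAMPLE = 16      # CD word length

bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE
print(f"{bps / 1000} kbps per channel")  # 705.6 kbps
```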

"But, Mario, what do you mean 'original analog'? What if you start with a digital signal?"

I'm glad you asked! There surely is a veritable plethora of devices out there calling themselves "digital cameras." Not a one of them is.

They might record digital signals. They might process signals digitally. But to capture pictures, they use a chip with analog sensors.

"But, Mario, the chip is divided into a number of columns and a number of rows. Don't those numbers make it digital?"

Nope. You've got a birth date and a number of eyes and ears. Does that make you digital?

Each of the sensors on the imaging chip squirts out a voltage proportional to the light that hit it. Ergo, the electrical output is an analog of the light (look it up). It's continuous. Add a smidge more light, and you get a smidge more electrical output.

Digital ain't continuous. Whatever the light level is, it gets a number. Add a smidge more light and you get the same number. Add enough smidges and you get the next number. Subtract a smidge and you still have the same next number.

That means that there will usually be an error in a digital signal on account of most folks can't afford analog-to-digital converters with an infinite number of bits. But most of the time, you don't need an infinite number. Ergo, "CD-quality" sound gets away with just 16 (and, if it's done right, the last bit is half noise--but that's a story for another rant).
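
To put numbers on those smidges, here's a toy quantizer sketch (my own illustration, not any particular converter): nearby light levels land on the same code, and the worst-case error is half a step.

```python
BITS = 16
LEVELS = 2 ** BITS
STEP = 1.0 / LEVELS  # light normalized from 0.0 (dark) to 1.0 (full scale)

def quantize(light: float) -> int:
    """Map a continuous light level to one of 2**BITS integer codes."""
    code = round(light / STEP)    # nearest step wins
    return min(code, LEVELS - 1)  # clamp at full scale

smidge = STEP / 10
print(quantize(0.5))                  # 32768
print(quantize(0.5 + smidge))         # a smidge more light: still 32768
print(quantize(0.5 + STEP))           # enough smidges: 32769
print(quantize(0.5 + STEP - smidge))  # subtract a smidge: still 32769

# The unavoidable error: at worst, half a step (half the last bit).
print(f"max error: {STEP / 2:.1e} of full scale")
```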

"But, Mario, what about photons? Couldn't it be said that an imaging chip counts photons and is, therefore, digital?"

Hey, if you're going to go quantum physics on me, I've got just one thing to say. What the bleep do we know?

Ahem. Right. Back in the real world, where was I? Oh, yeah! Things being called digital that ain't. Solid-state imaging chips ain't digital. LCD TVs ain't digital (but I'll give TI's DLP the benefit of the doubt).

Take an old "analog" projection TV. Jazz up the horizontal deflection circuit so it'll do 31.5 kHz instead of just 15.75 kHz, and the Consumer Electronics Association will count it as a "digital television," even though it's still the same old "analog" TV (with jazzed-up horizontal deflection). It doesn't have to have digital-reception circuitry, and it doesn't need to be able to show HDTV. If it's got a way to deal with a signal with 31.5 kHz H-rate, it's "digital." I am not making this up.

Anyhow, I already mentioned the benefits of decipherability. If you're shooting outdoors over the course of a day with temperature variations, then you'll appreciate the repeatability. If you need to squeeze a Gb per second of HDTV into a single broadcast channel, you'll appreciate the computability. So there ain't anything wrong with going digital when it's the right thing to do.
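
For scale on that squeeze: assuming the usual 1.485 Gb/s HD-SDI rate for uncompressed HDTV and ATSC's 19.39 Mb/s channel payload, the computing has to throw away all but about one bit in 77:

```python
UNCOMPRESSED_BPS = 1.485e9  # HD-SDI serial rate for uncompressed HDTV
CHANNEL_BPS = 19.39e6       # ATSC payload of one 6 MHz broadcast channel

print(f"compression needed: about {UNCOMPRESSED_BPS / CHANNEL_BPS:.0f}:1")
# compression needed: about 77:1
```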

Sometimes, it ain't necessarily the right thing to do. Ever turned on a computer? How long did it take before it was all set to do whatever you wanted it to do? Ten minutes?

So, tell me this. Why are folks rushing to buy digital audio consoles for real-time purposes?

"But, Mario, what about decipherability, repeatability and computability?"

Decipherability's great for recording and transmission. Audio-mixing consoles do neither. There might be a wee mite more reason for repeatability and computability, but I ain't yet heard of a digital board that could do something desirable to audio that an analog one couldn't. And then there's that embarrassing 10-minute wait as you reboot the console on a live transmission.

"But, Mario, what about the sound quality?"

Have you ever met a decent analog audio console that couldn't deliver at least 98 dB of dynamic range? That's "CD quality."

This month's rant was inspired by a tale told by an audio mixer whose board hiccupped just before air, leaving nothing feeding the network. To err is human; to really screw up takes a computer.