NTSC’s Problems Were Fixed in Just 54 Years

You might not have noticed that no one complains about excessive blanking anymore. Yes, as America’s documentarian laureate Ken Burns can tell you from painful personal experience, it’s time for yet another rendition of the DTV-Transition Blues.

Stream of consciousness: blues, hues, colors, color television, NTSC. Methinks it was the late, great Clown Prince of TV technology, Joe Roizen, the man who came up with the idea of a TV set that could interfere with your neighbor’s power tools, who first quipped that the letters stood for “Never Twice the Same Color.”

That was in 1967, 40 years ago but also 14 years after the NTSC color standard came out. It was one heck of a document.

DEVIL IN THE DETAIL

In some places, there was plenty of detail. It seems to me there was exactly one model of color TV that actually used phosphors matching those specified in the standard. For years, high-end monitors had a button you could push to make your pictures look ugly but simulate what they might look like on that one TV set. And how many of you ever ensured that your chroma was encoded along I and Q axes, with 1.3 MHz of bandwidth in the I and 600 kHz in the Q?
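For anyone who’s forgotten what that question even meant, here’s a minimal sketch of the arithmetic in Python. The RGB-to-YIQ matrix, the 33-degree axis offset, and the 3.579545 MHz subcarrier are the standard NTSC numbers; the simple Butterworth filters standing in for the 1.3 MHz and 600 kHz band limits (and the 4x-subcarrier sample rate) are my own illustrative assumptions, not anything the standard spelled out.

```python
# Minimal sketch of NTSC I/Q chroma encoding for one scan line of RGB samples.
# Assumes a sample rate of 4x the color subcarrier (about 14.318 MHz); the
# Butterworth low-pass is a stand-in for whatever filters real encoders used.
import numpy as np
from scipy.signal import butter, filtfilt

FSC = 3.579545e6          # NTSC color subcarrier, Hz
FS = 4 * FSC              # sampling rate for this sketch, Hz

def rgb_to_yiq(r, g, b):
    """Rotate gamma-corrected R'G'B' into luma (Y) plus the I and Q color axes."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def bandlimit(signal, cutoff_hz):
    """Crude low-pass; the point is only that I gets ~1.3 MHz and Q ~600 kHz."""
    b, a = butter(3, cutoff_hz / (FS / 2))
    return filtfilt(b, a, signal)

def encode_line(r, g, b):
    """Quadrature-modulate band-limited I and Q onto the subcarrier and add luma."""
    y, i, q = rgb_to_yiq(r, g, b)
    i = bandlimit(i, 1.3e6)   # wideband axis (orange/cyan)
    q = bandlimit(q, 0.6e6)   # narrowband axis (green/magenta)
    t = np.arange(len(y)) / FS
    phase = 2 * np.pi * FSC * t + np.deg2rad(33)   # I/Q axes sit 33 deg from burst
    return y + i * np.cos(phase) + q * np.sin(phase)

# Example: one full line (~63.6 us, 910 samples at 4x subcarrier) fading red to blue.
n = 910
r = np.linspace(1.0, 0.0, n)
g = np.zeros(n)
b = np.linspace(0.0, 1.0, n)
composite = encode_line(r, g, b)
print(composite[:5])
```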

Yes, there was plenty of detail in the NTSC standard (and the adoption thereof by Our Beloved Commish, the FCC) that was quickly ignored. And then there were the details that just didn’t happen to appear (and I ain’t referring to high spatial frequency picture elements).

For instance, the standard called for a color subcarrier. It also called for a horizontal sync pulse. But the small detail of the timing relationship between the two got omitted.

So, every time a color videotape got played back, the recorded subcarrier got matched to the one in the plant, and if that made the sync pulse fall someplace other than where it ought to, no problem. A proc amp would just stick on new sync, expanding the blanking interval in the process.
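If you want to see how that turned into fat blanking, here’s a back-of-the-envelope sketch. The 455/2 subcarrier-to-line-rate ratio is genuine NTSC arithmetic; the half-cycle-of-slip-per-generation worst case is my own illustrative assumption, not a measurement from anybody’s tape.

```python
# Back-of-the-envelope sketch of why undefined subcarrier-to-sync timing
# grew blanking. The 455/2 subcarrier/line-rate ratio is real NTSC math;
# the "half-cycle slip per generation" worst case is illustrative.
F_H = 15_734.264             # NTSC line rate, Hz
F_SC = (455 / 2) * F_H       # subcarrier = 227.5 cycles per line (~3.579545 MHz)

cycle_ns = 1e9 / F_SC        # one subcarrier cycle, ~279 ns
worst_slip_ns = cycle_ns / 2 # re-striped sync can land up to a half cycle off

line_ns = 1e9 / F_H          # ~63,556 ns of total line time
active_ns = line_ns - 10_900 # minus roughly 10.9 us of nominal horizontal blanking

for generation in range(1, 6):
    lost_ns = generation * worst_slip_ns
    print(f"gen {generation}: blanking grows ~{lost_ns:.0f} ns "
          f"({100 * lost_ns / active_ns:.2f}% of the active line)")
```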

After enough generations of re-recording (as might happen in the good old days of linear post), the edges of a picture could become visible on a home TV. But so what? There was nothing anyone could do about it anyway.

Then along came the Vital Squeezoom, and folks could blow up pics to reduce blanking intervals, making everything soft in the process. Our Beloved Commish temporarily lost its mind and caused TV technologists to make lousy-looking video (blown up, picture-framed in color, etc.).

By that time, the nonexistent RS-170A standard had already provided a relationship between subcarrier and horizontal sync (SC/H), new equipment met the standard, SC/H indicators flooded the market, and excessive blanking width was less of a problem than the fixes for it. It’s like I said at the beginning: you don’t hear much about it these days, even in the analog-composite world.

You also don’t hear much about color knobs. Joe Roizen could go on for hours about the four knobs on NTSC TVs, for brightness, contrast, color (saturation), and hue (or tint). He’d tell how PAL TVs eliminated the hue knob and how SECAM eliminated the color knob, somehow bringing French President Charles de Gaulle into the picture.

So, I’ve got a question for you: When’s the last time you saw a new analog TV with four knobs? Does anyone know what a hue knob is anymore? If you say NTSC stands for Never Twice the Same Color, does anyone laugh?

WE’VE COME A LONG WAY, BABY

It’s taken a long time, but, on the verge of being wiped out of existence, NTSC finally seems to be sufficiently standardized. Blanking stays where it’s supposed to. Colors are reasonably appropriately saturated and phased on most TV sets, and they don’t drift. Heck, SMPTE 170M even finally told us where the center of an NTSC picture is, in case anyone still wanted to use a clock wipe.

NTSC is on the verge of being wiped out, at least as a broadcast transmission mechanism in the U.S.A., on account of the nation’s Congress and president passing a law that says such emissions have got to end on the 51st anniversary of Clare of Assisi becoming the patron saint of television. They are to be smoothly replaced by the digital version that started in 1996.

HA!

I ain’t going to bring up here anything about reception issues. Those are ancient history. The same way you don’t need to diddle with a hue knob anymore, you don’t need the antennae of a radio-telescope observatory to pick up DTV broadcasts anymore. But that doesn’t mean we’re looking at a smooth transition.

There are receiver boxes that have been approved by the National Telecommunications and Information Administration for folks who’d like to continue to use their analog TV sets. Have those boxes been tested for AFD, active format description, the bits that tell TVs not to make everyone look too fat or too skinny? Nope.
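For those who ain’t met AFD, here’s roughly the idea, sketched with a few of the commonly cited active_format codes from SMPTE 2016 and ATSC A/53. The table is deliberately incomplete and the decision logic is simplified, so take it as an illustration of what those untested boxes are supposed to do, not as a conformance reference.

```python
# Illustrative sketch of what an AFD-aware box would do, using a few of the
# commonly cited active_format codes (SMPTE 2016 / ATSC A/53). Not a complete
# table, and the decision logic is simplified for illustration.
AFD_ACTIVE_ASPECT = {
    0b1000: None,       # active image fills the coded frame
    0b1001: 4 / 3,      # 4:3 image centered in the coded frame
    0b1010: 16 / 9,     # 16:9 image centered in the coded frame
    0b1011: 14 / 9,     # 14:9 image centered in the coded frame
}

def scale_mode(afd, coded_aspect, display_aspect):
    """Pick letterbox/pillarbox/full so nobody ends up too fat or too skinny."""
    active = AFD_ACTIVE_ASPECT.get(afd) or coded_aspect
    if abs(active - display_aspect) < 0.01:
        return "full screen"
    return "pillarbox" if active < display_aspect else "letterbox"

# A 4:3 picture flagged inside a 16:9 coded frame, shown on a 16:9 display:
print(scale_mode(0b1001, 16 / 9, 16 / 9))   # -> pillarbox (no stretching)
```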

Well, now, that’s OK. At least the sound won’t come blasting out as channels are changed or commercials come on, thanks to “dialnorm” metadata, required by law. That is, it seems to be required by law for everyone except CBS, or anyhow that’s how an alien analyzing their DTV transmissions might see it.
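For the curious, here’s the dialnorm arithmetic in a nutshell: an AC-3 decoder attenuates each program so that its stated dialog level lands at the same -31 dBFS reference, which is the whole reason channel changes and commercials shouldn’t blast. The little function below is my own sketch of that math, not anybody’s decoder code.

```python
# Minimal sketch of what dialnorm buys you: the AC-3 decoder attenuates each
# program so average dialog lands at the same -31 dBFS reference level,
# which is why channel changes and commercials shouldn't blast.
def dialnorm_attenuation_db(dialnorm):
    """dialnorm is the program's stated dialog level, -1 to -31 dBFS."""
    if not -31 <= dialnorm <= -1:
        raise ValueError("dialnorm must be between -31 and -1 dBFS")
    return -31 - dialnorm          # negative result = gain reduction in dB

# Programs stamped with honest dialnorm values end up matched at the output:
for program, dn in [("loud commercial", -14), ("quiet drama", -27), ("reference", -31)]:
    print(f"{program}: dialnorm {dn} dBFS -> decoder applies "
          f"{dialnorm_attenuation_db(dn):+.0f} dB")
```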

Okay, so there’s a flaw or two or three (don’t get me started on dual-stream audio decoding or long event information tables or stuff like that). At least there ain’t any problem with lip sync. There are presentation time stamps for video and presentation time stamps for audio, and receivers bring them together, right?
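For the record, the lip-sync arithmetic a receiver is supposed to do ain’t exactly rocket science; here’s a sketch. The 90 kHz tick and the 33-bit counter come straight from MPEG-2 systems; the function and the sample numbers are mine, for illustration only.

```python
# Sketch of the lip-sync arithmetic a receiver is supposed to do: MPEG-2
# presentation time stamps tick at 90 kHz, so matching audio to video is
# just "present each access unit when the system clock reaches its PTS."
PTS_CLOCK_HZ = 90_000
PTS_WRAP = 1 << 33            # PTS is a 33-bit counter; it rolls over about daily

def pts_diff_ms(video_pts, audio_pts):
    """Signed audio-minus-video skew in milliseconds, wrap-aware."""
    diff = (audio_pts - video_pts) % PTS_WRAP
    if diff > PTS_WRAP // 2:   # take the shorter way around the counter
        diff -= PTS_WRAP
    return 1000 * diff / PTS_CLOCK_HZ

# If both streams are stamped honestly, skew should hover near zero.
# A skew that drifts over time is the "varying" lip-sync complaint.
print(pts_diff_ms(video_pts=900_000, audio_pts=902_700))   # +30 ms: audio late
```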

Not on Ken Burns’s documentary “The War,” they didn’t. I ain’t sure what the heck was going on, but the lip sync wasn’t just off; it was varying.

Now, then, I ain’t too concerned for the long term. DTV reception used to suck colossally, and now it ain’t half bad. And NTSC fixed its blanking, color, and some other problems over the last 54 years.