
Want to Get Real HDTV? Turn Down the Lighting

You might not have noticed that no one has started selling an anti-Seidel deconvolver yet. Welcome to another episode of TV Technology's favorite game show, "What's HDTV Anyhow?"

I know some fans of dialnorm and AFD (active format description) who might be interested in a device to counter the influence of CBS's Bob Seidel in those areas, but that ain't the Seidel I had in mind this month. I was thinking more on the order of a 19th-century son of a German postal worker, one Philipp Ludwig von Seidel. Or maybe I should say I was thinking more on the third orders of Seidel (but that gets into approximations of sine and cosine functions based on series of powers divided by factorials, so, then again, maybe I shouldn't).


Anyhow, I seem to be getting ahead of myself, which is pretty easy when you're as slow as me. Have you ever heard the expression, "The eyes are the gateway to the soul"? Well, Larry Thorpe at Canon has been saying lately that the lens is the gatekeeper of image quality. And, much as I'd normally sooner believe a politician than someone marketing a product in our business, methinks he's actually right.

Think about it. The very first thing an image hits, before it even enters a camera, is a lens. But there's a problem: Lenses ain't electronic.

No offense, but methinks you ain't exactly an optical engineer. Heck, methinks Larry Thorpe ain't one, either. You might be able to program Trellis codes in your sleep and solder multilayer circuit boards behind your back, but that object hanging off the front end of your camera consists of discs of glass—lots of discs of glass, made from different kinds of glass and in different shapes—with some leaves of metal forming a variable size hole. Electronics ain't going to help you a lot here.

I'm going to start with that hole, on account of it's the simplest part of a lens. I mean, it's just a hole, right?

The trouble is: all holes have edges, and light does funny stuff when it passes an edge. Think about the last time you drove down a multilane highway that was reduced to one lane due to construction. As soon as the restriction ended, the cars fanned out, right?

It turns out that light does the same thing when it passes an edge. It fans out. This process is called "diffraction," maybe on account of it's more difficult to figure out than a simple fraction. Anyhow, I'll try to explain it.


Light can be waves and particles at the same time, but that's too hard for my last remaining neuron, Nellie, so I'm going to stick to waves. Waves come in cycles, and cycles have peaks and troughs.

When light spreads out past an edge, the waves charging straight ahead are going to reach a wall a little earlier than the ones taking a diagonal route. If you take a snapshot of that wall at the moment the straight part hits its peak, you'll see a bright dot in the center, shading towards a dark surround, and then maybe some rings of diminishing brightness beyond that. It's called an "Airy" disc (after the astronomer George Airy), even though there ain't anything airy about it.

Anyhow, that's one pixel. Another pixel will produce another Airy disc. But what happens where the Airy discs overlap? Where the bright center of one falls on the dark surround of another, they'll cancel each other out. Between total cancellation and none whatsoever are all the shades of gray.
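How big is one of those Airy discs? The standard optics result is that the first dark ring sits at a radius of about 1.22 times the wavelength times the f-number. Here's a minimal Python sketch comparing that radius to the pixel pitch of a 2/3-inch chip; the 9.6 mm active width is a nominal figure I'm assuming for illustration:

```python
# Radius of the first dark ring of an Airy disc: r = 1.22 * wavelength * f-number.
# Compare it with the pixel pitch of a 2/3-inch 1920x1080 sensor
# (9.6 mm active width -- an assumed nominal figure).

def airy_radius_um(wavelength_um, f_number):
    """First-null radius of the Airy pattern, in micrometers."""
    return 1.22 * wavelength_um * f_number

pixel_pitch_um = 9600.0 / 1920  # ~5 um per pixel on a 2/3-inch chip

for f in (4, 8, 13, 16):
    r = airy_radius_um(0.65, f)  # 0.65 um: red light
    print(f"F{f}: Airy radius {r:.1f} um vs. pixel pitch {pixel_pitch_um:.1f} um")
```

Once the disc grows much bigger than a pixel, neighboring discs smear across each other, and that's where the contrast goes.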

There's a formula for figuring out contrast loss due to diffraction. It's one of those "one-minus" types of formulas, meaning that anything greater than zero means less than 100 percent contrast. The anythings are the light wavelength, the lens f-stop (there's that hole), and the fineness of the pixels (more resolution or smaller image sensors mean greater fineness). So, the smaller the iris hole (the higher the f-stop number), the greater the contrast loss.
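The full version of that formula is the diffraction-limited contrast (MTF) of a circular aperture, which at low frequencies does collapse into a "one-minus" approximation. Here's a sketch in Python using the standard textbook formula; the 650 nm red wavelength and the 100 line-pairs-per-mm figure (1920 pixels across an assumed 9.6 mm active width) are my choices for illustration:

```python
import math

def diffraction_mtf(freq_lp_per_mm, f_number, wavelength_mm):
    """Diffraction-limited MTF of a circular aperture (incoherent light).

    s = wavelength * f-number * spatial frequency is the frequency
    normalized to the cutoff; contrast reaches zero at s = 1.
    """
    s = wavelength_mm * f_number * freq_lp_per_mm
    if s >= 1.0:
        return 0.0  # past the diffraction cutoff: zero contrast
    return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

# Watch the contrast at 1080-line detail fall as the iris closes:
for f in (2.8, 4, 5.6, 8, 11, 16):
    c = diffraction_mtf(100, f, 0.00065)  # 100 lp/mm, 650 nm red
    print(f"F{f}: {c * 100:.0f}% contrast")
```

Note the asymmetry: stopping down always costs diffraction contrast, no matter how good the glass is.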

At some point, the contrast drops to zero, and you start losing resolution. For red light, 1080-type HDTV, and a 2/3-inch-format camera, that magic number is F13. If you're shooting at F16, you ain't shooting 1920x1080, and I don't care what camera or lens you're using. For a 1/3-inch camera, it's F7. And that's for zero contrast. Heck, you're down to 69 percent contrast at maximum resolution on a 2/3-inch camera even at F4.
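Those magic numbers fall straight out of the cutoff condition: contrast hits zero when wavelength times f-number times spatial frequency equals one, so the limiting f-stop is one over (wavelength times the sensor's finest detail frequency). A quick Python check; the sensor widths and the 760 nm deep-red wavelength are assumptions I've picked to show where figures like F13 and F7 can come from:

```python
def limiting_fnumber(wavelength_mm, sensor_width_mm, horizontal_pixels):
    """F-stop at which diffraction contrast reaches zero at the sensor's
    finest resolvable detail (Nyquist): N = 1 / (wavelength * nyquist)."""
    nyquist_lp_per_mm = horizontal_pixels / (2.0 * sensor_width_mm)
    return 1.0 / (wavelength_mm * nyquist_lp_per_mm)

RED = 0.00076  # 760 nm deep red, in mm -- an assumed wavelength

print(f"2/3-inch, 1920 px: F{limiting_fnumber(RED, 9.6, 1920):.0f}")
print(f"1/3-inch, 1920 px: F{limiting_fnumber(RED, 4.8, 1920):.0f}")
```

Halve the sensor and you halve the limiting f-stop, which is why small-chip cameras run out of resolution so much sooner.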

"So, Mario, are you saying we should always shoot at maximum aperture?"

As the frizzy, tied cord said, "I'm a frayed knot."

Remember Seidel? He catalogued a bunch of lens aberrations that make pictures blurrier, and a lot of them—spherical, comatic, astigmatic and Petzval (don't ask)—get worse as the hole gets bigger (lateral spherical as the cube of the hole size). And those are just the monochromatic aberrations. There are also chromatic ones.


Complicating all of this is the work lens manufacturers do to make their stuff better. Heck, back in the 19th century, Joseph Petzval made a complex lens out of four glass discs (two of them cemented together), and that was for fixed-focal-length, black-and-white still pictures. But sometimes you just can't seem to make a lens good enough.

For NHK's ultra-definition TV camera (16 times more pixels than 1080-line HDTV), they got around the problem by getting Astro to make a digital chromatic-aberration corrector. Panasonic picked up on the idea and put chromatic-aberration correction look-up tables for popular lenses into some of their cameras. But that's just chromatic aberration.

At IBC, Thomson said they're showing a lens-aberration corrector that goes beyond color. I ain't sure they've nailed it, but the idea gives me goose bumps, on account of mathematical image deconvolution ought to be able to fix other stuff, too, like being out of focus or even somewhat shaky.

Here's to the future! Meantime, keep an eye on your iris.