Scaling Mount Ed Display

You might not have noticed that there is still a production aperture and a viewing aperture. Let me put that another way: Safe-action and safe-title areas exist even in the age of plasma panels and LCDs. And that's a problem.

Hey, let me be among the very first to admit that the problem ain't restricted to new-technology displays. Shoot in HDTV or even in 16:9 standard definition, and send it out letterboxed to the "Great Unwide" (the 99-plus percent of TV viewers worldwide who are still watching 4:3 screens), and something interesting happens to the vertical overscan.

It disappears.

So, if you happen to be airing a widescreen Gene Kelly movie, and his feet are touching the edge of the active picture in the letterbox version, they'll be cut off on true 16:9 screens. Put them in the right place on the widescreen TVs, and the Great Unwide watch garbage above and below the good stuff.
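A little back-of-the-envelope arithmetic shows why the vertical overscan vanishes in the letterbox case. This is just a sketch with assumed numbers: 480 visible lines on the 4:3 set and the traditional 5-percent total overscan.

```python
# Letterboxing a 16:9 picture into a 4:3 raster (assumed: 480 visible
# lines and 5 percent total vertical overscan -- illustrative numbers).
ACTIVE_LINES = 480

# Height of the 16:9 image inside the 4:3 frame:
image_lines = round(ACTIVE_LINES * (9 / 16) / (3 / 4))   # 360 lines of picture
bar_lines = (ACTIVE_LINES - image_lines) // 2            # 60 black lines, top and bottom

# Overscan crops 2.5 percent off the top and 2.5 percent off the bottom:
crop_lines = round(ACTIVE_LINES * 0.05 / 2)              # 12 lines at each edge

# On the 4:3 set, the crop lands entirely inside the black bars:
picture_lost_on_4x3 = max(0, crop_lines - bar_lines)     # 0 -- the overscan "disappears"

# On a true 16:9 screen, the picture fills the raster, so the same
# 5 percent comes straight out of the image (Gene Kelly's feet).
```

Twelve cropped lines against 60 lines of black bar: the 4:3 viewer loses nothing but bar, while the widescreen viewer loses real picture.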

"But, Mario, isn't this just a temporary situation until everyone has widescreen TV sets?"

Oh, sure. It's temporary. So are the situations at the divided parts of Ireland, Korea and the Middle East. Hey, Germany wasn't twain forever. Someday all TV sets not in a museum will be widescreen. After all, no one watches a black-and-white TV today, do they?

Anyhow, that's not what I wanted to rant about this month. But what I did want to rant about this month does have something to do with overscan.

First of all, if you think overscan has something to do with picture tubes, kindly run some floss through your ears. Then go to your CD-ROM of SMPTE film standards, or, if you ain't got that handy, pull an "American Cinematographer Manual" out of a library or film bookstore. It doesn't matter what edition.

SIZE MATTERS

Flip to the section showing apertures. Here's a typical camera aperture: 0.868 by 0.631 inches. Here's a typical projector aperture: 0.825 by 0.600 inches.

You have probably noticed that the first aperture is bigger than the second aperture. It turns out that 0.868 is a little more than 5 percent bigger than 0.825. And-surprise!-0.631 is also a little more than 5 percent bigger than 0.600.
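If you'd rather let a computer check the arithmetic, a couple of lines will do it (dimensions straight from the aperture figures quoted above):

```python
# Academy camera (production) and projector (viewing) apertures, in inches.
camera_w, camera_h = 0.868, 0.631
projector_w, projector_h = 0.825, 0.600

# How much bigger the production aperture is, per axis, in percent:
width_overage = (camera_w / projector_w - 1) * 100    # about 5.2 percent
height_overage = (camera_h / projector_h - 1) * 100   # about 5.2 percent
```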

I'll wait here for you while you search through the preceding two paragraphs for words like "video" or "television." Find any? No? That's good, on account of there ain't any.

Yes, Virginia, photochemical motion-picture film shot with a movie camera and shown in a cinema has pretty much always had overscan. That's on account of edges not being a nice place to be.

The ancients thought that if you got to the edge of the world, you'd fall off. Maybe they were right. I don't happen to know where the edge of the world is, but maybe they did. A lot of smarts have been lost over the years.

For instance, cinematographers have always known that there's a tolerance to projection aperture plates and that perforations need to be bigger than sprockets or the latter will tear the former apart. All of that spells jitter and weave, or to put it in words of at least four syllables each, positional instability.

Human vision can put up with a lot, but testing it with image edges dancing with cinema-drapery pleats is maybe not the best idea in the world. Ergo, the camera (production) aperture is bigger than the projection (viewing) aperture, Q.E.D. (which means "quit eating daisies" or something but looks good after a sentence that starts with "ergo").

Anyhow, that's film. In good old analog TV technology, the blanking intervals take the place of the perforations, and the sweep circuitry takes the place of the aperture plate. Or maybe it's vice versa. Whatever it is, picture edges have always had a bad case of the uglies, and overscan kept us from seeing them.

That was in the era of the wise ancients, who understood stuff. Now we are in the digital era, where everything is perfect all the time, right?

I can't think of any other reason how come in most digital standards (but not all-somebody has a clue), there ain't any such thing as a production aperture and a viewing aperture. What's 1080i HDTV? It's 1920x1080 in the video camera, and it's supposedly 1920x1080 on the TV screen.

Someday maybe I'll do another rant about how even analog composite-video cameras have huge amounts of CCD oversampling (maybe 1300 active sensors per scanning line even in an NTSC camera, from which the FCC won't allow more than about 440 lines of TV resolution in that same scanning line). So why do HDTV cameras still have just 1920?

DAT WAS DEN, DIS IS NOW

Yes, I could understand it 10 years ago, when the first chip-based HDTV cameras were just coming out, and we had to applaud even 1920. But then 10 years passed. Didn't they? Hello?

Anyhow, that's not this month's rant. This month's rant is about 1920x1080 in the camera and 1920x1080 on the TV screen.

Now then, there surely ain't a whole heck of a lot of picture tubes out there that can squirt out 1920x1080. I'll second the emotion for projection tubes. But times are changing.

There's some sort of a research organization called iSuppli/Stanford Resources (Stanford's quartermaster, perhaps?). It published a list of shipments of TV sets for the first quarter (aha!) of this year for some portion of the world. The list went something like this:

  • Direct-view CRT TV-11,186,000
  • CRT rear-projection TV-1,001,000
  • Direct-view LCD TV-365,000
  • Plasma TV-356,000
  • Microdisplay projection TV-265,000
  • Home front projector TV-106,000

Unrepentant geezers among you, rejoice! Cathode-ray tubes are still dominating TV screens-about 92 percent, not even counting those front projectors-as recently as this year!

Salivating geeks among you, rejoice! By just the first quarter of this year, non-CRT displays had already taken close to eight percent of the market!

Now then, perchance somewhere in Geekland, folks expect cameras and screens to have identical apertures, but, if so, they're forgetting things like MPEG and rise-times. I mean, if you've got a white picture edge coming out of nothingness, it's going to take it a while to get white, and thanks to our friends the filters, it'll ring all the way.

As for MPEG, macroblocks are 16x16, but 1080 ain't evenly divisible by 16, so ATSC calls for coding 1920x1088, but the camera just puts out 1080, so... Oh, never mind.
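For the curious, the macroblock arithmetic goes like this (a sketch of the rounding, not anybody's actual encoder code):

```python
MACROBLOCK = 16       # MPEG-2 macroblocks are 16x16 pixels
camera_lines = 1080

# 1080 / 16 = 67.5, so the coder rounds up to the next multiple of 16:
coded_lines = -(-camera_lines // MACROBLOCK) * MACROBLOCK   # 1088
pad_lines = coded_lines - camera_lines                      # 8 lines the camera never shot
```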

Elsewhere in Geekland, those employed in the design and manufacture of TV sets know that image edges are still a problem and need to be overscanned. No, scratch that. Some of those employed in the design and manufacture of TV sets know that image edges are still a problem and need to be overscanned. Call them Group A.

A whole bunch more-call them Group B-know only what they read in standards, and the standards say 1920x1080 and don't make any distinction between cameras, recorders, transmission channels, and screens.

The way I see it, Group B designs fixed-pixel display components like plasma panels, direct-view LCDs and projection microdisplays (like DLP and LCoS) with 1920x1080 or 1280x720 (or, worse, 1280x768) pixel counts. Group A designs the driving circuitry that ensures there will still be overscan.

That means that a 1920x1080 TV set showing a 1920x1080 picture is actually showing (if I plug in the old 5-percent overscan) something like the central 1824x1026 of what the camera captured. That ain't a terrible idea. If Group B made 1824x1026 fixed-pixel displays, it would be a downright great idea. But they don't.
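The arithmetic, for those keeping score at home (assuming the 5 percent total per axis is split evenly around the center):

```python
OVERSCAN = 0.05       # the old 5-percent total overscan, per axis
source_w, source_h = 1920, 1080

# What actually survives to the screen:
visible_w = round(source_w * (1 - OVERSCAN))   # 1824
visible_h = round(source_h * (1 - OVERSCAN))   # 1026
```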

They make 1920x1080 displays. That means consumers are watching HDTV through the haze of a scaling processor even when they watch 1920x1080 pictures on 1920x1080 TV sets.

Now, then, the mathematician and theoretical physicist in me would like to point out that, in an ideal world, a scaling engine is perfect, and display resolution doesn't have to have anything to do with source resolution.

AHEM

Are those guys gone? Good. Hi, everybody! It's the TV-watching practical engineer here, and I'd just like to point out that, no matter what anyone says about scaling, you just can't beat a pixel-to-pixel matching relationship between camera and display.

Did you happen to wander into Dalsa's exhibit at the NAB show? Did you see the slightly shrunken image on the LCD monitor? Do you know why it was shrunken? Exactly. It created a perfect pixel-for-pixel match between imager and imagee.

That made the pictures look as good as possible. Having pix look as good as possible is what companies at NAB live for.

Too bad you can't do it at home. If you want the best-looking 1920x1080 on a TV with 5-percent overscan, try watching it at 1824x1026.