Surprise! HDTV Has Higher Definition

You might not have noticed that HDTV has higher definition than standard-definition television. I am not making this up.

"But, Mario, the HD in HDTV stands for high definition. Everyone knows HDTV has more definition than standard definition."

Do they? There have been a bunch of phone surveys showing that something like 30 percent of Americans say they have HDTV. That's maybe three times more homes than the number of HDTVs that have been manufactured, but can you blame them? I can't.

Our Beloved Commish, aka the FCC, never forced anyone to transmit HDTV (unless we're talking cable retransmission, which is pretty restricted). But so many people have confused digital TV with HDTV that it's no wonder the reported numbers are so high. Add digital satellite and digital cable and DVDs, and I'm surprised phone survey results ain't closer to 50 percent. But that ain't what I wanted to rant about this lunar cycle. I started this sorry excuse for a technology column by saying "You might not have noticed," and when I said you, I meant you.

"But, Mario... "

I heard you the last time. HDTV has higher definition. But do you have any concept of what that means?

One of our competitors ran something called "Hands-On-HD" recently, in which they got people from the front lines (meaning you) to talk about dealing with HD. A cinematographer talked about exposure, color, depth-of-field and depth-of-focus. A network engineer talked about focus and aspect ratios. A director talked about "3:2 pulldown" and slow motion. The head of a production-truck company talked about aspect ratios and audio/video sync. An editor and designer talked about aspect ratios, motion-blur and grain.

ASPECT, SCHMASPECT

That last paragraph was getting too long, so this'll be more compact. The other folks talked about blue-screen compositing, aspect ratios, frame rates, aspect ratios, frame rates, layering, lighting, exposure, frame rates, aspect ratios, displays, aspect ratios, file sizes, rendering times and bandwidths. Did I mention frame rates and aspect ratios?

That's all good stuff, and if I wore a hat, I'd take it off to our competition (if we had any) for gathering the folks and their comments. But unless I missed something in the last two paragraphs, the one thing that's missing is that HDTV has higher definition.

OK, I lied. One of the folks talked about pixel density, which is kind of the same thing as definition, but he did it only to emphasize file sizes.

Hello? What does the HD in HDTV stand for again? Allow me to note that 'twasn't ever thus. Back in the days when HDTV was strictly in the future, there were some blithering technidiots who decided that HDTV's additional detail meant that we could use fewer cameras, less panning and less cutting.

Here's the deal: In normal-definition TV, you show an establishing shot, then a close-up of one person talking, and then a close-up of the other person reacting. That's three cameras, three shots and at least two cuts. The HD-technidiot theory was that, on account of you've got all the detail of the standard-def close-ups in the original HD shot, there's no need for the close-ups or the cuts.

"But, Mario, that sounds like your point. Why call them blithering idiots?"

Ahem. Methinks that last month I pointed out the folly of technotypes trying to solve artistic problems. Repeat after me: Engineers are not directors. Got it? No? Exhibit A, Stanley Kubrick's "2001: A Space Odyssey." "2001" was shot in about as high definition a film technology as you can get short of IMAX. It was also about as wide as movies get before they wrap around and turn into a cylinder. HDTV ain't caught up to it yet. But "2001" had some scenes wherein a single human eye filled the screen. How come? On account of what the director wanted.

Directors ain't got resolutions or aspect ratios. If a director cuts from a wide shot to a close-up, it might be to call your attention to the person. Or to get a different angle. Or maybe it's a matter of pacing. Whatever it is, it's probably on account of what the director wanted, not likely on account of the image lacking in definition.

"But, Mario, if the ones who emphasized HDTV's definition in production were wrong, why do you think it's so important to keep in mind?"

That's easy. It's because HDTV ain't all there is. I ain't got any complaints about the cinematographer neglecting to mention HDTV's definition. All movies, since Day One, have been at least HDTV definition (OK, maybe not those 8mm family movies). And, since almost the beginning of television, movies have been shown on a little screen.

There have been a few hiccups, like the time NBC broadcast the CinemaScope version of "How to Marry a Millionaire" and lopped off the body parts of any actress stretched out on a chaise. But once Hollywood figured out that TV and money rhymed, movie directors and cinematographers shot with the boob tube in mind.

So a good cinematographer doesn't need to worry about HDTV's definition. It ain't going to be any higher than film's. But you folks in TV technology are a different story. It's pretty common for you to build an HDTV control room wherein the only line-cut monitor the director looks at is HDTV. That's nice, if all you're shooting is HDTV. But unless you're working on some tradeshow exhibit that ain't ever going to be seen outside the HDTV monitor, you ain't shooting just for HDTV.

Shooting a movie? That movie might be seen on TV -- maybe after a trip through a VHS machine. Shooting for HBO HD? Do you think they're going to prevent their non-HD subscribers from watching it?

Some survey (better than the typical phone survey) put the number of U.S. homes with HDTV displays at about 6 percent at the end of June, and maybe only a third of those -- call it 2 percent of homes -- can actually receive HDTV. Allow me to be wildly optimistic and say that only 90 percent of your audience will see your show in non-HD for the near future.

Now the standard concerns seem to be frame rate and aspect ratio. Frame rate is just TV technology folks coming to grips with putting 24-frames-per-second stuff on TV, which is what film folks have done for years.
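For you code-slinging types, here's a minimal sketch (mine, not anybody's blessed implementation) of how that 2-3 cadence turns 24 film frames per second into 60 video fields. The function name and the A-B-C-D frame labels are just illustration:

```python
# A minimal sketch of 3:2 pulldown: 24 fps film mapped onto 60
# fields/sec interlaced video by alternating 2-field and 3-field repeats.

def three_two_pulldown(film_frames):
    """Expand a list of film frames into interlaced video fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3   # the 2-3 cadence
        fields.extend([frame] * repeats)
    return fields

if __name__ == "__main__":
    # Four film frames (1/6 second of film) become ten fields
    # (1/6 second of video): 2 + 3 + 2 + 3 = 10.
    print(three_two_pulldown(["A", "B", "C", "D"]))
    # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```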

As for aspect ratios -- please! If you've got a quarter-ounce of smarts, just shrink the widescreen to fit the old TV shape -- letterboxing. If you (or your boss) ain't got even that much smarts and you center-cut instead, what's the worst that happens? You lose 12.5 percent of the picture on either side. Heck, the safe-title area is already 10 percent on each side. Yes, it ain't great, but it probably ain't the end of the world.
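Don't take my word on that 12.5 percent; here's the back-of-the-envelope arithmetic as a quick Python sketch (the only inputs are the two aspect ratios):

```python
# Center-cutting a 16:9 image to 4:3 at the same height keeps a
# 12:9 slice of the width; the rest falls off the sides.

wide = 16 / 9        # widescreen aspect ratio
narrow = 4 / 3       # old TV shape, i.e. 12:9

kept = narrow / wide              # fraction of the width that survives
lost_per_side = (1 - kept) / 2

print(f"kept: {kept:.1%}, lost per side: {lost_per_side:.1%}")
# kept: 75.0%, lost per side: 12.5%
```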

But definition is something else altogether. Suppose you shoot 1080i and letterbox for NTSC viewers. You've just gone from 1920 x 1080 to, at best, 440 x 360.
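If you want to check my arithmetic, here's a sketch. The 480 active lines and roughly 440 luma samples across a broadcast NTSC line are ballpark figures (my assumptions, not tablets from the mountain):

```python
# Rough arithmetic behind that "440 x 360": letterboxing a 16:9 image
# into a 4:3 frame leaves the image 3/4 of the frame height tall.

hd_w, hd_h = 1920, 1080      # 1080i source
ntsc_lines = 480             # approximate active NTSC picture lines
ntsc_h_samples = 440         # approximate broadcast luma resolution

letterbox_lines = ntsc_lines * 3 // 4    # 360 lines of actual picture

print(f"{hd_w} x {hd_h} -> {ntsc_h_samples} x {letterbox_lines}")
# 1920 x 1080 -> 440 x 360
```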

If you sat on a wide shot when shooting in HDTV because you saw all the definition you needed in your HDTV monitor, you're probably going to have a ticked-off audience that can't make out a thing. If you made the clue to the mystery just barely visible in HDTV, it's invisible to more than 90 percent of your audience. Hey, if Kubrick could fill an ultrawide, ultra-detailed screen with a single eye, you ain't got any cause to complain about a lack of subtlety.

How do you know in a control room what folks'll see at home? Give the director a non-HD home TV to check. Someday, you'll thank me.