Want Sharper Pictures? Try the Other Hi-Def!

You might not have noticed that TV used to come in slower frame rates. I ain't talking here about 720p versus 1080i. Yes, the frame rate of the former is usually double that of the latter, but the number of pictures per second is exactly the same.
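If you want to check that arithmetic, here's a little Python sketch. The reasoning is simply that a progressive system delivers one picture per frame while an interlaced one delivers two fields per frame; the format names and rates are just the usual ones folks quote, nothing official:

```python
# Quick sanity check on the 720p-versus-1080i claim: 720p delivers one
# picture (a full frame) per frame, while 1080i delivers two pictures
# (fields) per frame, so the pictures-per-second count comes out even.

formats = {
    # name: (frames_per_second, pictures_per_frame)
    "720p/60":  (60, 1),  # progressive: one picture per frame
    "1080i/30": (30, 2),  # interlaced: two fields per frame
}

for name, (fps, pix_per_frame) in formats.items():
    print(f"{name}: {fps} frames/s -> {fps * pix_per_frame} pictures/s")
```

Both lines come out at 60 pictures a second, which is the whole point.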

No, I'm talking about the old days of TV, the really old days, like, for instance, back in the 1920s. Methinks the first video pictures had eight scanning lines at eight frames (or pix) per second, a nice symmetrical arrangement.

I ain't any TV technology historian, so take these next figures with a grain or two of sodium chloride, but methinks there might have been a 12-line 12-fps system before a 30-line 25-fps system. Heck, even the latter ain't terribly asymmetrical. I'm pretty sure there was a 45-line 15-fps system in Chicago that had 3:1 interlace, which would have made it 45 pix per second, another perfect step on the gradient-one stairway of progress. And then the slope changed to cliff-like proportions.

STEEP SLOPE

There were 240-line systems at 25 pix per second, 405-line at 50 pix per second, and 441-line at 60 pix per second. That there last one was transmitted at the 1939 New York World's Fair. Some folks say that was the beginning of mass-market TV. Nellie (my last remaining neuron) and I kind of figure it was the end of progress on the picture-rate front.

In two months, it'll be 2009, 70 years after 1939. Scanning lines per picture have sure as heck advanced in that period. After 441, there were 525, 625, 819 and 1125. You might know that last one better as 1080, which is a number of active lines, whereas numbers like 525 and 1125 refer to total lines, including the vertical blanking interval. And I ain't done yet. Switching the progress list to active lines, you can buy LCDs with 2160 lines, and that ultra-high-definition Japanese system that keeps showing up at NAB conventions and other places has 4320 lines.
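For the number-minded, here's a rough Python cheat sheet on total lines versus active lines. The active-line figures are the commonly quoted digital values, and analog practice varied a little:

```python
# Total scanning lines (including vertical blanking) versus active picture
# lines, using the commonly quoted digital active-line counts. Analog
# practice varied a little (NTSC had roughly 483-486 visible lines).
total_vs_active = {
    525:  480,   # NTSC countries
    625:  576,   # PAL/SECAM countries
    1125: 1080,  # the HDTV system better known by its active-line count
}

for total, active in total_vs_active.items():
    print(f"{total} total lines ~ {active} active + {total - active} blanking")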

There surely has been continuing progress in lines per picture. But in pix per second, progress ended in the 1930s. HDTV has the same 60 pix per second as 1939 World's Fair TV (or maybe even a little less: 59.94 at our latitude and longitude and 50 in some other parts).

"But, Mario, isn't 60 all we need?"

Now, there's an interesting question. Methinks there's more than one answer, depending on the unmentioned words at the end.

If the question goes, "Isn't 60 all we need to turn individual pix into apparent motion?" then Nellie and I answer, "Heck, yes!" The rate you actually need, which some folks call the fusion frequency, seems to be down around 16 per second or so. Don't hold me to an exact figure. Not only ain't I a historian, but I ain't a vision scientist either.

If the question goes, "Isn't 60 all we need to eliminate flicker?" then Nellie and I answer, "It depends." It depends on factors like screen brightness and peripheral vision stimulation and even what you get used to.

In a movie theater, the screen is pretty danged dim compared with a TV set. That doesn't mean 24 pix a second is enough to eliminate flicker (which is how come movies are called flicks), but stick a two-bladed shutter into the projector so each frame gets flashed twice, and 24 fps becomes 48 flashes a second, and that's enough.

Over in foreign climes, like the mother country, folks watch TV at 50 pix a second and think it looks fine. On their first night over, Americans recoil at the flicker; then they get used to it, too.

Meanwhile, on your desktop, where you lean into a computer monitor competing with office lighting, chances are you're using a refresh rate of 72 or 75 or maybe even 90 per second. Anyhow, the range between 48 and 90 is less than an octave; 60 is definitely in the correct ballpark.
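To put some numbers behind all that flicker talk, here's a quick Python sketch. The rates are just the ballpark figures mentioned above, not vision science:

```python
# Ballpark flashes per second for the systems mentioned above. A two-bladed
# projector shutter flashes each film frame twice, so 24 frames/s becomes
# 48 flashes/s, and the whole 48-to-90 span is less than one octave (2x).
flash_rates = {
    "film, 24 fps through a two-bladed shutter": 24 * 2,
    "European TV": 50,
    "American TV": 60,
    "typical desktop refresh": 72,
}

for name, rate in flash_rates.items():
    print(f"{name}: {rate} flashes/s")

print("90 / 48 =", 90 / 48)  # 1.875 -- less than an octave
```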

If the question goes, "Isn't 60 all we need for detail?" then Nellie and I answer, "Heck, no!" I mean, you can see the problem on almost any sports show. The batter swings and connects, and the director calls for slo-mo or a freeze. If the camera wasn't shuttered, the slo-mo shots are all a blur, which means you were watching that blur in the first place. If the camera was shuttered, then maybe the freeze frame is nice and clear, but the original, normal-rate video looked jittery and shuddered (or, as some highfalutin unmasked engineer might say, "It had judder").
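Here's a back-of-the-envelope Python sketch of that shutter trade-off. The ball speed, field of view and shutter times are all invented for illustration; the point is the order of magnitude, not any particular camera or lens:

```python
# Back-of-the-envelope motion-blur estimate. The scene numbers (ball speed,
# field of view) are invented for illustration; the point is the order of
# magnitude of the smear, not any particular camera or lens.

ball_speed_mph = 100        # hypothetical batted ball
field_of_view_ft = 50       # hypothetical horizontal field of view
frame_width_px = 1920       # width of a 1080-line HDTV frame

ball_speed_ft_per_s = ball_speed_mph * 5280 / 3600
px_per_ft = frame_width_px / field_of_view_ft

for exposure_s in (1 / 60, 1 / 500):   # open shutter vs. a fast shutter
    blur_px = ball_speed_ft_per_s * exposure_s * px_per_ft
    print(f"1/{round(1 / exposure_s)} s exposure: ~{blur_px:.0f} pixels of smear")
```

The fast shutter cleans up the smear, but at 60 pictures a second that same ball still hops a couple of feet between frames, which is exactly the judder the unmasked engineers are talking about.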

Now then, there used to be a few good reasons why frame rate didn't increase the way lines did. For one thing, every doubling of frame rate doubled the amount of info that needed to be recorded or transmitted. For another, it meant faster camera scanning. And, for a third, it meant faster display scanning (and LCDs had a hard time dealing with even just 60).

Well, then, welcome to the future. This here is the age of compressed video. If you increase frame rate, there's less change between frames, so it's easier to compress them. It also makes each frame sharper, so it's easier to predict interframe motion. Now then, in addition to not being a historian or a vision scientist, I ain't the world's greatest expert on entropy coding, either, but, the way I figure it, going from 60 pix a second to 120 ain't going to take anywhere near twice the data rate.
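I'm no codec designer either, but here's a toy Python model of that argument. Every constant in it is invented for illustration, not taken from any real encoder: assume each coded frame costs a fixed overhead plus a residual that shrinks as the frames get closer together in time:

```python
# Toy model of the "120 fps shouldn't cost twice the bits" argument.
# Assume each coded frame costs a fixed overhead plus a residual term
# proportional to how much the scene changed since the previous frame.
# All constants here are invented for illustration, not from any codec.

def toy_bits_per_second(fps, change_per_second=1.0,
                        overhead_bits=5_000, bits_per_unit_change=2_000_000):
    change_per_frame = change_per_second / fps      # closer frames differ less
    bits_per_frame = overhead_bits + bits_per_unit_change * change_per_frame
    return fps * bits_per_frame

for fps in (60, 120):
    print(f"{fps} fps: ~{toy_bits_per_second(fps) / 1e6:.2f} Mb/s")

print("120:60 ratio:", round(toy_bits_per_second(120) / toy_bits_per_second(60), 2))
```

In this toy model the doubled frame rate costs about 13 percent more bits, not 100 percent more. A real encoder won't land on those exact numbers, but the shape of the argument holds: the closer together the frames, the smaller the differences the encoder has to pay for.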

As for high-speed cameras, there's one whole heck of a lot of them doing slow-motion stuff. Anything that can shoot 120-fps slow motion can also shoot 120-fps normal motion.

Those LCD displays, I've got to admit, are a problem, but LCDs are getting faster all the time, and they ain't the only display technology in town. At NAB in April, Field Emission Technologies showed some car racing shot and displayed at 240 fps, and it was good.

Yes, time for hi-def: high temporal definition.