The Problem With Plasma

YOU MIGHT NOT HAVE NOTICED that sex isn't everything. And I ain't being Lombardiesque here; it ain't the only thing either. Yes, of course, this lunar cycle I'm ranting about plasma displays. What else could I be referring to?

Right up front, I will be happy to admit that they are sexy as heck (no matter how sexy heck is). They're big. They're flat. They're expensive. They're (for the most part) widescreen. They can hang on a wall. Like "Wow," man.

Geez! If I had twenty grand burning a hole in my purse (is that what made that hole?), I'd probably rush out and buy one, too. Then I'd have it installed in the home of someone I didn't like very much.

There is a brilliant invention out there, and it ain't plasma displays. No, that great leap of technology to which I refer is the conference tape, something that lets you listen to papers you didn't get to hear in the first place.

If you buy the tapes for February's SMPTE Advanced Motion Imaging Conference in Dallas (a lot cheaper than the airfare for most of us), you'll come upon a paper on "Innovations in Consumer Display Technology" that was supposed to be delivered by Panasonic's Phil Livingston, Mr. ATSC Chairman himself. That ain't the paper on the tape.

Instead, it's one on plasma technology by Jim Noecker of Plasmaco, one of the best danged makers of plasma displays in the whole wide world. I don't remember the title of the paper, and it'd be much too much trouble to get it off the tape, so I'll just call it what it sounded like to me: "Plasma Displays: Why Ours Suck Less Than Others."


Hey - ever heard of Weber's Law? I ain't talking about the one that says you should always cover your backyard grill (and replace it at the first sign of wear). No, I'm talking about the one related to human perception.

It'd take too long to get into all of the details, so let's just pretend it says we're sensitive to contrast changes of about one percent. I ain't expecting any arguments here; Weber's dead.

So, draw two parallel vertical lines a few inches apart. Make a mark a little below the middle of each line. Label the mark "100." That'll be some illumination level on some arbitrary scale.

Weber's Law (my Weber's Law, anyhow) says you can just barely perceive something 1 percent different - 99 or 101 on the arbitrary scale. Got it so far? Good.

Now divide the left line into 256 equally spaced levels - what you'd get if you used eight-bit linear processing. Sure enough, there's a level 99, a level 100, and a level 101, just exactly what Weber's Law says you need to avoid contouring.

Pop up to the top of the line for a second. The top level is 255. One percent of that is 2.55, so a change of 1 percent would be located just above level 252. There ain't any problem there. Heck - if anything, you've got too many levels.

Now pop down near the bottom, maybe at level 10. One percent of 10 is 0.1. Whoopsie! The nearest levels to 10 are 9 and 11; they're 10 percent different, not 1 percent. Drop down to level one, and things are even worse. The nearest levels are 0 and 2, 100 percent different. Yipes on a stick! You are looking at some mighty severe video contouring (that means someone's dark face looks more like a contour map full of lines than a dark face).
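If you'd rather let a machine do the arithmetic, here's a quick sketch (Python, purely for illustration) of the relative size of one code-value step at various eight-bit linear levels:

```python
# Relative size of one code-value step at various 8-bit linear levels.
# Weber's Law (our version, anyhow): steps bigger than about 1 percent
# of the level are visible.
for level in (252, 100, 10, 1):
    step_pct = 100.0 / level    # adjacent levels differ by one code value
    print(f"level {level:3d}: adjacent step = {step_pct:.1f}% of the level")
```

At 252 the step is well under a percent; at 100 it's right at 1 percent; at 10 it's 10 percent; at 1 it's 100 percent. Contouring country.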

Time to switch over to the other line. This time, make each mark just 1 percent more than the previous mark. At 100, the 99 and 101 marks will be in almost exactly the same place as they were on the first line. But at the top the marks will be more spaced out, and at the bottom they'll be all bunched together.
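For the curious, counting those 1-percent marks is easy arithmetic: each mark is 1.01 times the one below it, so climbing the same 1-to-255 range takes a lot more than 256 steps. A sketch:

```python
import math

# Marks spaced 1 percent apart: each mark is 1.01x the previous one.
# How many such steps does it take to climb from level 1 to level 255?
steps = math.log(255) / math.log(1.01)
print(f"about {steps:.0f} one-percent steps cover the 1..255 range")
```

That's roughly 557 steps over a 255:1 range - more than eight bits' worth of codes, which is why where you put the levels matters so much.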

Welcome to the cathode ray tube. It does that (or something close enough) all by itself, and our saintly pals, the camera manufacturers, stick gamma "correction" into cameras so they'll complement what the picture tubes do ("Nice pix, tubie." "Thanks, you chip, you.").

This has been going on since the days of Philo T. Farnsworth, and it works. Enough said.


Picture tubes have phosphors; so do plasma panels. Picture tubes have electron beams; plasma panels don't. Some folks think the gamma of picture tubes comes from their phosphors. Nope. It comes out of the cathode.

In one of those there picture tubes, the electron beam hits the phosphor, and it glows. In one of those there plasma panels, a discharge creates a wee bit of light (as in a neon lamp), and the light causes the phosphors to glow. No electron beam, no gamma, eh?

Take a look at a luma ramp on a plasma panel. No, wait! Make sure you have a barf bag handy first.

If it's an early (or cheaper) plasma panel, it'll have contours up the wazoo. If it's a fancier plasma panel, it'll have - how shall I put this delicately? - ugly dots. I've heard them described as "graininess."

Plasma panel designers (who must never look at their products at home) sometimes call those dots "error diffusion." That means the plasma panel gets to look ugly over a broader area than the ones with contours. And I ain't done yet.
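If you're wondering what "error diffusion" actually does, here's a minimal one-dimensional sketch (the panel's level count here is made up for illustration): the quantization error at each pixel gets pushed onto its neighbor, so a smooth dark ramp comes out as a sprinkle of on-and-off dots that merely average to the right level.

```python
# One-dimensional error diffusion: quantize a smooth ramp to coarse
# levels, carrying each pixel's quantization error to the next pixel.
# (Hypothetical panel with only 8 output levels instead of 256.)
def diffuse(row, levels=8):
    step = 255 / (levels - 1)
    out, err = [], 0.0
    for v in row:
        target = v + err
        q = round(target / step) * step
        out.append(int(q))
        err = target - q        # leftover error rides along to the neighbor
    return out

ramp = list(range(32))          # a dark, slowly brightening ramp
print(diffuse(ramp))            # mostly zeros with occasional bright dots
```

The average comes out about right, but up close it's exactly the "graininess" people complain about.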

You probably have noticed that a roll of film ain't the same as a light bulb ("You know, Mario, come to think of it, I have"). One of them does its thing when light hits it; the other wants a shot of electricity.

So, even though we call the light-emitting chemicals on both CRTs and plasma panels phosphors, they sure as heck ain't the same (no matter how sure heck might be). The CRT phosphors are looking for a little stimulation by electron beam; those in the plasma panel are waiting for some of that light (and, yes, I include within the term "light" the visible stuff, the nice warming infra-red, and those ultra-violet rays to which the eyes of our cousins the bees are sensitive).

Now, then, as it turns out, those light-sensitive phosphors are more likely to wear out than the electron-beam phosphors. It's got something to do with more oxygen atoms, methinks.


So, remember that blue-burn issue I've been warning you about with multiple aspect ratios? You know, the one where, if you watch the wrong aspect ratio a lot, the part of the screen you watch wears out (in blue first) and the part you don't watch doesn't, so the next time you watch the right-shaped pix you see a broad yellowish stripe in the center of the screen?

Well, it ain't so bad for direct-view TVs. It's worse for projection TVs. It's worst for plasma panels. I mean big-time worst - network logos burning in and the whole enchilada.

So, in this here era when most programming is 4:3, what shape are most plasma screens? You've got it! The one that pretty much guarantees stripe burn-in.

Let me see now.... Contouring? Check. Graininess? Check. Burn-in? Check. Latency?

Oh, yeah. There's another "feature" on some plasma displays. The pixels stay lit (sort of - more on that in a sec). But most video is interlaced. So it's got to get deinterlaced prior to display. That requires a wee bit of thought. Does the system interpolate between lines or between fields? Fields take a bunch of time. Ergo, the plasma display has latency in it, and that spells "lip-synch error."
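How big a deal is that delay? The back-of-envelope arithmetic is simple (the number of buffered fields below is my guess; real deinterlacers vary):

```python
# Rough latency arithmetic for a field-buffering deinterlacer.
field_period_ms = 1000 / 59.94      # one NTSC field lasts about 16.7 ms
fields_buffered = 3                 # hypothetical: previous/current/next
delay_ms = fields_buffered * field_period_ms
print(f"added display delay: about {delay_ms:.0f} ms")
```

Tens of milliseconds is plenty to make lips stop matching words unless the audio gets delayed to match.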

Have I mentioned motion rendition? In a picture tube, if you want a pixel dim, you send less juice. Bright? More juice.

In a plasma panel, if you want a pixel dim, you don't turn it on very much of the time. If you want it bright, you turn it on a bunch more of the time. This is called pulse-width modulation. On account of my beloved publisher not buying me a blackboard this lunar cycle, I can't draw you a picture (I'm not complaining - all profits go directly to a fund for starving tech writers - or so they tell me), but, suffice to say, many plasma panels introduce their own motion judder.
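Here's a sketch of that pulse-width trick as many panels do it (the binary subfield weights below are the textbook set; real panels juggle fancier ones): the frame is chopped into subfields of different durations, and a pixel's level decides which subfields fire.

```python
# Plasma-style pulse-width modulation via binary-weighted subfields.
# A pixel's 8-bit level picks which subfields light up; perceived
# brightness is the total time lit. (Weights are illustrative.)
WEIGHTS = [1, 2, 4, 8, 16, 32, 64, 128]     # relative lit time per subfield

def subfields(level):
    """Return which subfields fire for a 0-255 level."""
    return [bool(level & w) for w in WEIGHTS]

def lit_time(level):
    return sum(w for w, on in zip(WEIGHTS, subfields(level)) if on)

# Levels 127 and 128 are one step apart in brightness, but every single
# subfield flips between them - so the lit time jumps around within the
# frame. On a moving edge, the eye integrates the wrong subfields: judder.
print(subfields(127))   # the seven shorter subfields fire
print(subfields(128))   # only the one longest subfield fires
```

That all-subfields-flip behavior on certain level transitions is one source of the judder the column complains about.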

Someday, I'm sure, all of this will get fixed. Meantime, producers and directors fall in love with plasma panels and blame the suck-y pictures on us. There ought to be a law.