Following the recent launch of a 3G phone, some critics derided the included camera as being “only 2 megapixels.” Now that happens to be the same pixel count as a 1920 × 1080 broadcast-quality HD camera. TV viewers find that an HD camera produces excellent pictures on a 70in plasma display, so why does a mobile phone need a 5-megapixel camera to display a picture on a screen measuring 320 × 240 pixels or less, with a diagonal of 3in? Not only that, but mobile phone images are highly compressed. I know over-sampling at the capture stage can improve picture quality, but I believe the answer here lies in marketing, not science.
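
As a back-of-the-envelope check, here is the raw pixel arithmetic behind those figures (a quick sketch; the 320 × 240 screen size is the one quoted above):

```python
# Raw pixel counts behind the megapixel claims quoted above.
hd_capture = 1920 * 1080      # broadcast HD frame: ~2.1 megapixels
phone_screen = 320 * 240      # phone display size quoted in the text

print(f"HD capture:   {hd_capture:,} pixels (~{hd_capture / 1e6:.1f} MP)")
print(f"Phone screen: {phone_screen:,} pixels")
print(f"Capture exceeds display by {hd_capture / phone_screen:.0f}x")
```

Even the derided 2-megapixel camera captures roughly 27 times more pixels than that screen can show; a 5-megapixel sensor pushes the ratio past 60.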

The rule seems to be that big numbers must be good, because they impress the public. Such thinking also extends to broadcasting. In the move to HD, many networks chose 1080i over 720p, even though careful tests by bodies such as the EBU showed that 720p pictures looked better, with the further advantage that progressive material compresses more efficiently than interlaced.
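
For a rough sense of the raw numbers involved (a minimal sketch, assuming 50Hz services as in the European tests, and ignoring blanking and chroma subsampling):

```python
# Raw luma pixel rates for the two HD formats, assuming 50 Hz services.
rate_1080i50 = 1920 * 1080 * 25   # 25 full frames/s, delivered as 50 fields/s
rate_720p50  = 1280 * 720 * 50    # 50 full progressive frames/s

print(f"1080i50: {rate_1080i50 / 1e6:.2f} Mpixel/s")
print(f"720p50:  {rate_720p50 / 1e6:.2f} Mpixel/s")
```

The pixel rates are comparable (51.84 versus 46.08 Mpixel/s), but the interlaced signal is the harder of the two to compress per pixel, which was exactly the point of those tests.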

The same obsession with pixel counts can be found in semi-pro HD camcorders. Many sensors have pixel sites smaller than the diffraction-limited spot of the lens, so they cannot physically resolve the full 1080 lines the format offers. These sub-½in sensors also sacrifice sensitivity for pixel count. It's easy to assume Moore's Law applies to everything, but the laws of physics always overrule it.
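
To see why, compare the diffraction spot to the pixel pitch. This sketch uses illustrative assumptions (green light, f/4, a nominal 4.8mm-wide 1/3in sensor), not figures from any particular camcorder:

```python
# Compare the diffraction-limited spot (Airy disk) with the pixel pitch.
# All parameters are illustrative assumptions, not from a specific product.
wavelength = 550e-9                            # green light, metres
f_number = 4.0                                 # assumed aperture
airy_diameter = 2.44 * wavelength * f_number   # first-null Airy disk diameter

sensor_width = 4.8e-3                          # nominal 1/3in active width
pixel_pitch = sensor_width / 1920              # pitch for 1920 pixels across

print(f"Airy disk:   {airy_diameter * 1e6:.1f} um")   # ~5.4 um
print(f"Pixel pitch: {pixel_pitch * 1e6:.1f} um")     # ~2.5 um
```

Once the Airy disk spans two or more pixels, the extra pixels record blur and noise, not detail.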

Moving on to displays, the same weird science applies. Although some picture-monitor manufacturers understand all the issues, others allow their marketing departments to use inexact terminology. Just getting the colorimetry right doesn't make a Grade 1 monitor. The area where it all goes awry is motion rendition. Video shot at a higher shutter speed and viewed on an LCD is never going to look the way it does on a CRT, because of the LCD's sample-and-hold behaviour. But then a CRT never looked like projected film, yet we have watched film on the CRT since the inception of television. Ironically, the LCD is far closer to a shuttered movie projector in its motion rendition.
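
The sample-and-hold problem is easy to model: while the eye tracks a moving object, a frame held static on screen smears across the retina for the whole hold time. A minimal sketch, with assumed speeds and rates:

```python
# Eye-tracked blur on a hold-type display: blur extent ~ speed * hold time.
# Frame rate, object speed and the brief-flash duty cycle are assumptions.
frame_rate = 50.0                 # Hz
object_speed = 600.0              # pixels per second across the screen

hold_lcd = 1.0 / frame_rate       # sample-and-hold LCD: held the full period
hold_crt = 0.1 / frame_rate       # CRT-like brief flash: ~10% of the period

print(f"LCD blur: {object_speed * hold_lcd:.1f} px per frame")   # 12.0 px
print(f"CRT blur: {object_speed * hold_crt:.1f} px per frame")   #  1.2 px
```

On the tracked object, the full-persistence display smears ten times as far as the brief-flash one, even though both show identical frames.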

Going back to the camera at the head of the signal chain: it samples motion in the temporal domain, so any system that reproduces motion is never going to look exactly like the original scene.
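
Temporal aliasing is the clearest symptom: motion faster than half the frame rate folds back, which is why wagon wheels appear to run backwards on film and television alike. A minimal sketch, with an assumed frame rate and wheel speed:

```python
# Temporal aliasing: a rotation faster than half the frame rate appears
# at a folded frequency. Frame rate and spoke rate are assumed values.
frame_rate = 25.0     # frames per second
spoke_rate = 23.0     # true revolutions per second

# Fold the true rate into the Nyquist band (-frame_rate/2, +frame_rate/2].
aliased = ((spoke_rate + frame_rate / 2) % frame_rate) - frame_rate / 2
print(f"{spoke_rate} rev/s on screen looks like {aliased} rev/s")  # -2.0
```

The 23 rev/s wheel appears to turn backwards at 2 rev/s, and no amount of pixels at either end of the chain can fix that.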

It's all neat and tidy for marketing folks to talk about 1080, progressive scan and so on, but the physics and psychovisual effects behind it all are much more complex. We have the advantage that our sensor (the eye) feeds visual stimuli to a powerful parallel processor (the visual cortex) for image analysis and perception, whereas a TV essentially maps pixels at the camera to pixels at the display.

Over the years, product specifications have often caught up with the claims as technology improved. If it were possible, imagine comparing a 1950s broadcast color monitor with the last generation of CRTs: both are capable of displaying NTSC pictures, but what a difference in quality. One could say that early-generation products have aspirational features, and marketing is all about aspiration.