In the UK, there has been much talk in the last week of HD bit rates. During a consumer feedback program, the BBC's head of HD answered a complaint that the BBC had dropped the data rate of its single HD channel to the detriment of picture quality. She confirmed that the broadcaster had indeed reduced the data rate, and then stated, “There is no evidence that reducing the bit rate has an impact on picture quality or that there is an absolute relationship between picture quality and bit rate.” She continued that HD was about “more picture depth.”
I was educated as a scientist and later worked as an engineer, and I find such remarks misleading to the average viewer. They demonstrate the faux science that broadcast engineers come up against as they strive to deliver great pictures to the viewer. If her logic held, we could fit 10 or 20 HD channels into an old analog channel slot, so why don't we? Clearly, her statement was nonsense. As for picture depth, it was used as a subjective term with no real meaning; depth in a picture conventionally refers to the Z-axis in stereoscopic 3-D.
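The absurdity of the "bit rate doesn't matter" position is easy to put in numbers. The sketch below is a back-of-envelope check, assuming a DVB-T multiplex in one old 8 MHz analog slot carries roughly 24 Mbit/s (the real figure depends on modulation and coding); the per-service bit rates are illustrative, not quoted from any broadcaster.

```python
# Assumed capacity of one DVB-T multiplex occupying an old 8 MHz
# analog channel slot (actual capacity varies with modulation).
MUX_CAPACITY_MBPS = 24.0

def channels_per_mux(bitrate_mbps: float) -> int:
    """How many services of a given bit rate fit in one multiplex."""
    return int(MUX_CAPACITY_MBPS // bitrate_mbps)

# Only at absurdly low bit rates do "10 or 20 HD channels" fit;
# at a bit rate that actually looks like HD, you get two or three.
for rate in (1.2, 2.0, 9.7, 16.0):
    print(f"{rate:>5.1f} Mbit/s per service -> {channels_per_mux(rate):>2d} services")
```

If bit rate truly had no bearing on picture quality, nothing would stop a broadcaster choosing the 1.2 Mbit/s row; the fact that no one dares is the evidence.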
Broadcast engineering has followed a long road to provide higher-quality pictures and sound. That journey continues as we look forward to UHDTV. Some broadcasters strive to deliver the best possible picture quality; HBO, Discovery and Sky come to mind. Others fall back on marketing speak, where dubious statistics, numbers and unproven claims reign.
Audio pioneer Peter Walker, founder of Quad, once said of audio bandwidth, “The wider the window, the more dirt blows in.” Mapping this analogy to video, an HD display will make artifacts more visible than an old SD display. This ups the quality requirement for an HD system. Add to that the viewers' expectation of a better picture, and you arrive at the need to improve the delivered picture quality with an HD channel. On what do viewers base their standards? They have a number of references; Blu-ray is one. Another is the increasing number of movie theaters using digital projection.
TV is a business, and in the trade-off between bandwidth costs and picture quality, many broadcasters are tempted to squeeze bit rates until even the uninitiated viewer can see the blocking and motion artifacts.
It has always been this way. When some used 35mm, others used 16mm. When Digital Betacam became a favored acquisition format, others used DVCAM. The drive to cut costs has always put the delivered picture quality at risk. If I scan through my TV guide, I find endless channels mining the archive. I often wonder why so much bandwidth is wasted on NVOD and reruns. A more logical way to deliver the back catalog is via broadband to a smart DVR. Precious spectrum is wasted because marketing the “channel” reigns supreme; the loser in the pressure for more channels is the delivered picture quality.
For many CE manufacturers and broadcasters, HD is just part of a hollow marketing message. It has more lines, so it must be better. Just recently, I was looking for a low-cost camcorder that could shoot great (progressive) pictures for the Web. I was surprised to find that 1080i dominates the consumer space; 720p50 or 60 remains the preserve of the professional camera. Why are consumer cameras sticking with interlace, a 1920s technology? I can only presume that 1080 is a larger number than 720. More illogical still, I found that these cameras use 1/4in sensors that cannot physically resolve 1080 lines. Where is the science in this?
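The sensor claim is simple optics. A minimal sketch, assuming a 1/4in-type sensor with an active width of about 3.6 mm and a typical f/2.8 consumer zoom (both assumed figures; exact sensor dimensions vary by manufacturer): the pixel pitch needed for 1920 horizontal samples is smaller than the diffraction-limited blur spot the lens can deliver.

```python
# Rough optics check: can a 1/4in sensor really resolve 1920x1080?
SENSOR_WIDTH_MM = 3.6   # assumed active width of a 1/4in-type sensor
PIXELS_ACROSS   = 1920
F_NUMBER        = 2.8   # assumed typical consumer zoom aperture
WAVELENGTH_UM   = 0.55  # green light, mid-spectrum

# Pixel pitch required to lay 1920 samples across the sensor width.
pitch_um = SENSOR_WIDTH_MM * 1000 / PIXELS_ACROSS

# Diffraction-limited Airy disk diameter: 2.44 * wavelength * f-number.
airy_um = 2.44 * WAVELENGTH_UM * F_NUMBER

print(f"pixel pitch needed: {pitch_um:.2f} um")
print(f"Airy disk at f/2.8: {airy_um:.2f} um")
print(f"optical blur spans ~{airy_um / pitch_um:.1f} pixels")
```

With the blur spot covering roughly two pixels, the extra 1080-line count is marketing headroom the optics cannot fill.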
DAVID AUSTERBERRY, EDITOR
Send comments to: firstname.lastname@example.org