Dear John Luff:
In the March 2010 edition, you claim that universal cameras can be constructed for different resolutions. You are overlooking the fact that the optical low-pass filter (OLPF) can be optimized for one resolution only.
This is the main problem with all current HDSLRs. It leads to false detail and massive aliasing, such as moiré on in-focus repetitive structures.
Institut für Medienforschung
Hochschule für Bildende Künste Braunschweig
John Luff responds:
First, I concur entirely. The physics of imaging do not allow all things without compromises. The question becomes: At what cost can we avoid compromise today in cameras varying in price by about two orders of magnitude, and with optics that vary by the same amount? The design of a complete system must take into account the effects of sampling, and optical low-pass filters have been in cameras for a generation. DSLRs, the subject of your letter, were not the only topic of my article, but they are of increasing importance in the marketplace.
One would hope that any professional shooting with a camera and lens costing less than $5000 would not expect the performance of digital cinema cameras, but of course you are exactly correct. Not understanding the limitations of the tool could well lead to the assumption that the tool is defective when it is simply not optimized for that use. The truth is that no one is willing to say much about the secret sauce: how the live video in DSLRs is produced from a Bayer sensor with many times more pixels than the output frame, whether any optical filtering is taken into account, or how the conversion from “Bayer space” to 1080p is done. As one expert told me, “It appears they do make final choices on pictures and not test charts,” though lots of threads about aliasing in DSLR video can be found in even a cursory search.
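As a rough illustration of the aliasing discussed above (a sketch of the sampling effect only, not any camera's actual processing pipeline; all numbers are illustrative), the following shows what happens when a fine pattern is decimated without low-pass filtering first, as naive line-skipping from a high-resolution sensor would do:

```python
import numpy as np

# 1-D sketch of why skipping samples without a matching low-pass
# filter aliases: a fine in-focus pattern on the "sensor" turns into
# a coarse false pattern (moire) after naive decimation.

n = 960                    # hypothetical sensor samples along one line
decimate = 4               # keep every 4th sample, as line-skipping would

x = np.arange(n)
# Pattern at 0.30 cycles/sample: resolvable at full sensor resolution
# (Nyquist = 0.5), but above the decimated Nyquist (0.5 / 4 = 0.125).
pattern = np.cos(2 * np.pi * 0.30 * x)

naive = pattern[::decimate]            # decimate with no pre-filtering

# Find the dominant frequency of the decimated signal (in cycles per
# original sensor sample) from the FFT peak.
spec = np.abs(np.fft.rfft(naive))
peak = np.argmax(spec[1:]) + 1         # skip the DC bin
f_alias = peak / len(naive) / decimate

print("original pattern frequency: 0.300 cycles/sample")
print(f"apparent frequency after naive decimation: {f_alias:.3f} cycles/sample")
```

The 0.30-cycle pattern folds down to 0.05 cycles per original sample, a false structure six times coarser than the real one. An optical low-pass filter removes such detail before sampling, which is exactly why its cutoff can be matched to only one output resolution at a time.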
The “Nielsen: Broadcast-only TV households to slip below 10 percent” article in the May 4 “HD Technology Update” newsletter certainly caught readers' attention. Here are two comments about the article.
Sacrilege alert: I really think that Nielsen may be wrong.
As someone who has seen cable subscriptions drop because of the recession in an area utterly under-served and overpriced by cable, satellite and broadband providers, I honestly think Nielsen's methodology is flawed, its results are skewed and the whole survey is, in a word, unbelievable. Isn't this the same organization that swore up and down that America was ready for the DTV transition?
Do any of these research organizations, or the government for that matter, gather information beyond America's city limits? My guess: no. That's why the whole notion of the broadband spectrum crisis is a farce as well. This “crisis” was caused by the FCC and the consumer electronics industry, not broadcasters. I have a feeling that a large part of mainstream America couldn't care less if someone's iPhone runs slowly and they can't update their Facebook page from a local bistro.
America ought to worry more about the promised allocation of spectrum to emergency services, which still has not happened, and ask why it has not, instead of whining about cell phones not being able to access the Internet. When the next Hurricane Katrina happens and the grid goes down, our 3G and 4G networks die with it. Who's left to inform the public about how to save themselves? Broadcasters and emergency services.
Here we go again with flawed Nielsen data. Does anyone remember the DTV mess? Nielsen claimed 10 percent of households weren't ready. The actual number was closer to 40 percent.
For OTA in Los Angeles, 10 percent is simply wrong! In this six-county DMA, the number using antennas is still well over 25 percent. Cable companies have been losing subscriptions because of the recession, and it's the same at DIRECTV and DISH.
It is time for Congress to act and require one seat on the FCC to be held by a practicing engineer (certified by SBE), not D.C. lawyers or their academic buddies. This flawed data will be used by the FCC chairman for his 20-channel wireless spectrum grab. The data is flawed, and so is the FCC's approach.