SMPTE 2014: HPA—More Resolution, Frame Rate or Dynamic Range?

HOLLYWOOD—The foundation of ultra high-def TV is more pixels, but Mark Schubin questions whether that’s the best option for improved pictures. Contrast, color, frame rate, screen brightness and even immersive sound all affect how moving images are perceived, he said. Schubin addressed the topics at the Hollywood Post Alliance Symposium held Monday in the eponymous movie town.

Schubin said in his presentation that higher resolutions, frame rates and visual dynamic ranges cannot always be displayed properly. Source material matters, and viewing conditions matter: distance from the screen, screen size, the display itself and its environs.

“And,” he said, “it’s still 2014.” That matters because perception is learned.

“We’re not born with it, but born with the ability to gain perspective,” he said. He spoke of the first motion picture shown to a paying audience in 1895. It showed an oncoming train. The audience apparently reacted as if it were real.

“People hadn’t seen motion pictures before,” Schubin said.

Physics is another obstacle, particularly for higher frame rates. Schubin said doubling the frame rate halves the exposure time available for each frame.

“It is possible to fix that with processing without affecting temporal resolution,” he said. “But no one’s doing that now.”
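
Schubin’s exposure point is simple arithmetic. As a minimal sketch, assuming a conventional 180-degree shutter (the talk did not specify one), the time each frame can gather light halves every time the frame rate doubles:

```python
# Minimal sketch of the exposure-time point, assuming a 180-degree
# shutter (an illustrative assumption, not a figure from the talk).

def exposure_time(frame_rate_fps, shutter_angle_deg=180.0):
    """Exposure time per frame, in seconds, at a given shutter angle."""
    return (shutter_angle_deg / 360.0) / frame_rate_fps

for fps in (24, 48, 96):
    t = exposure_time(fps)
    print(f"{fps:3d} fps -> 1/{round(1 / t)} s per frame ({t * 1000:.1f} ms)")
```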

A larger image sensor could be employed, as Lockheed Martin did with its 4K prism image-sensor camera, but that camera was huge. Instead of a prism, manufacturers have made on-sensor color filters. But that raises the issue of the “optical low-pass filter.”

“That’s a very good thing to have in a camera, but my question is, ‘How do they do that?’ If you have a single sensor, you have different resolutions for red, green and blue. Color filtering is an issue. Red and blue end up with 2K resolution. If it’s correct for the green, it’s wrong for the red and blue. If it’s correct for the red and blue, it’s wrong for the green.

“If you have a single sensor imager, you get color filter artifacts. Debayering… gets rid of that, but if you filter something, you make it have less. So we’ve gone to more but end up with less.”
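
The single-sensor arithmetic can be made concrete with a rough count of how a color filter array divides up a “4K” photosite grid. The sketch below assumes a standard RGGB Bayer mosaic, which the talk did not tie to any particular camera:

```python
# Rough sketch assuming a standard RGGB Bayer mosaic on a nominal
# 3840 x 2160 photosite grid. These are raw sample counts; debayering
# interpolates the missing values rather than restoring them.

width, height = 3840, 2160

green_sites = width * height // 2  # green occupies 2 of every 4 sites
red_sites = width * height // 4    # red occupies 1 of every 4 sites
blue_sites = width * height // 4   # blue occupies 1 of every 4 sites

# Red and blue are each sampled on a grid half as dense in both axes,
# roughly 1920 x 1080 -- the "2K resolution" mentioned above.
print("red/blue sampling grid  :", width // 2, "x", height // 2)
print("green / red / blue sites:", green_sites, red_sites, blue_sites)
```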

Difficulties of single-sensor 4K cameras also include lens issues: there are no long-range zooms, for example, and adaptors from a 2/3-inch format lens to an S35 sensor lose around 2.6 stops. The upshot is a need for roughly six times as much light. One alternative is to go from one large sensor to multiple smaller ones. Another was shown by Hitachi at IBC with the SK-UHD4000, which has four chips instead of three, i.e., two green chips with a half-pixel diagonal offset. Gearhouse Broadcast ordered 50 at the show, he said.
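
The light penalty of that 2/3-inch-to-S35 adaptor is ordinary stop arithmetic. Taking the quoted 2.6-stop loss as given, each stop is a factor of two in light:

```python
# Quick check of the adaptor penalty quoted above: each stop is a
# factor of two, so losing 2.6 stops means needing about 2 ** 2.6
# (roughly six) times as much light for the same exposure.

def light_factor(stops_lost):
    """Multiplier on required light to compensate for a given stop loss."""
    return 2.0 ** stops_lost

print(f"2.6 stops lost -> {light_factor(2.6):.1f}x the light")  # ~6.1x
```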

A third alternative is simply to use an HD camera with 4K upconversion, he said.

“It has no optical filtering problems, and the lens matches the sensor perfectly,” he said.

Another strike against equating more pixels with better pictures is that higher resolutions aren’t as visible at a distance. That’s a function of human physiology. Schubin also noted that human color perception exceeds what any three-color-primary display can reproduce.
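
The distance point can be illustrated with a common acuity rule of thumb, namely that 20/20 vision resolves detail of about one arcminute. The rule, the screen size and the numbers below are illustrative assumptions rather than figures from the talk:

```python
import math

# Illustrative sketch: the farthest viewing distance at which a single
# pixel still subtends about one arcminute (a common 20/20 acuity rule
# of thumb -- an assumption here, not a figure from the talk).

ARCMIN = math.radians(1.0 / 60.0)

def max_useful_distance_m(screen_width_m, horizontal_pixels):
    """Distance beyond which individual pixels can no longer be resolved."""
    pixel_pitch = screen_width_m / horizontal_pixels
    return pixel_pitch / math.tan(ARCMIN)

screen_width = 1.44  # roughly a 65-inch 16:9 panel, width in meters
for label, pixels in (("HD  (1920 wide)", 1920), ("UHD (3840 wide)", 3840)):
    d = max_useful_distance_m(screen_width, pixels)
    print(f"{label}: extra detail resolvable only within ~{d:.1f} m")
```

In this model the extra UHD detail stops being resolvable beyond roughly a meter and a half from a screen that size, which is the physiological effect Schubin describes.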

Schubin drew applause from the post-heavy audience for a comment he made about the psychology of content creation. He gave the example of how phone calls in movies sound as if the unseen participant is mumbling into a can, a deliberate creative effect.

“Engineers should not dictate to creatives what they can do. We should provide them with tools,” he said.

“The Hobbit” in high frame rate generated some complaints, he said. But he noted both that the movie has grossed over a billion dollars to date and that there have been complaints about such other innovations as sound, color and widescreen pictures.

With regard to immersive sound, Schubin said the more immersive the sound to begin with, the less additionally immersed people felt when more channels were added. In a test in which he participated in the 1970s, however, people said stereo sound made the picture seem larger.

Schubin illustrated how sharpness is perceived in terms of a contrast sensitivity function, using a pattern of bars that appeared to have a curve at the bottom, though none was actually drawn. The apparent curve varied with retinal angle and with the observer. He showed that along the same axes of resolution and contrast there’s another curve, a modulation transfer function, which indicates how much contrast a system (a lens, a camera, a whole studio) lets through at different resolutions. The shape of that curve varies with many factors, and the area under it determines our perception of sharpness. It’s easier to increase that area with increased contrast (dynamic range) than with increased resolution.
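
One way to picture the area-under-the-curve argument is a toy calculation, not Schubin’s and with invented parameters: weight a simple falling system MTF by a rough model of human contrast sensitivity, integrate, and compare what doubling the resolution cutoff buys against what doubling the delivered contrast buys:

```python
import numpy as np

# Toy model of "area under the curve" as a sharpness proxy. The
# functional forms and parameters are assumptions chosen to show the
# shape of the argument, not anything presented in the talk.

f = np.linspace(0.01, 60.0, 2000)  # spatial frequency, cycles per degree
df = f[1] - f[0]

def csf(f):
    # Mannos-Sakrison-style approximation of human contrast sensitivity
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

def system_mtf(f, cutoff, contrast):
    # crude system MTF: a contrast scale factor times an exponential roll-off
    return contrast * np.exp(-f / cutoff)

def sharpness_proxy(cutoff, contrast):
    # area under the perceptually weighted MTF
    return float(np.sum(system_mtf(f, cutoff, contrast) * csf(f)) * df)

base = sharpness_proxy(cutoff=15.0, contrast=1.0)
more_resolution = sharpness_proxy(cutoff=30.0, contrast=1.0)
more_contrast = sharpness_proxy(cutoff=15.0, contrast=2.0)

print(f"baseline      : {base:.2f}")
print(f"2x resolution : {more_resolution:.2f} (+{100 * (more_resolution / base - 1):.0f}%)")
print(f"2x contrast   : {more_contrast:.2f} (+{100 * (more_contrast / base - 1):.0f}%)")
```

In this toy model, extending resolution adds area mostly at frequencies where the eye is least sensitive, while boosting contrast scales the whole curve, which is the general shape of Schubin’s point.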

For Schubin, high dynamic range provides the “most bang for the buck,” because it provides visible improvement at a fraction of the data rate required by higher resolution and frame rates. Going HD to 4K generates a 16x increase in data rate for about a half-grade of improvement. Doubling the frame rate yields a full grade of improvement for a lower payload increase, while HDR creates more sharpness with very little increase in data rate.
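
The relative payloads can be sanity-checked with uncompressed bit-rate arithmetic. The frame sizes, frame rates, bit depths and 4:2:0 sampling below are illustrative assumptions, so the multipliers come out differently from the figures quoted above:

```python
# Uncompressed-rate comparison behind the "bang for the buck" argument.
# All parameters are illustrative assumptions (8-bit SDR baseline,
# 10-bit for HDR, 4:2:0 sampling), not figures from the talk.

def raw_gbps(width, height, fps, bits_per_sample, samples_per_pixel=1.5):
    """Uncompressed video rate in Gb/s (4:2:0 -> 1.5 samples per pixel)."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

hd = raw_gbps(1920, 1080, 50, 8)
cases = [
    ("HD baseline", hd),
    ("UHD, 4x the pixels", raw_gbps(3840, 2160, 50, 8)),
    ("HD at double frame rate", raw_gbps(1920, 1080, 100, 8)),
    ("HD with 10-bit HDR", raw_gbps(1920, 1080, 50, 10)),
]
for label, rate in cases:
    print(f"{label:24s}: {rate:5.2f} Gb/s ({rate / hd:.2f}x HD)")
```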


See high frame rate in action in James Nares’ street scene clip.