LOS ANGELES—“As above, so below” may be an apt theme for two presentations this morning on the first day of the SMPTE 2017 Annual Technical Conference & Exhibition in Hollywood, Calif.
One dealt with the steps NASA engineers working with digital cinema camera vendor RED and AWS Elemental took to deliver a live 4K UHD transmission from the International Space Station 250 miles above Earth. The other offered a proposal for a new TV panel calibration method that takes into account the dynamic nature of HDR to provide a predictable panel state without limiting innovation on the part of vendors. While in quite different orbits, both shared the mission of finding a way to deliver the highest possible viewer experience.
CAN YOU SEE ME NOW?
First up was NASA program manager Rodney Grubbs, who described the engineering involved in delivering live 4K UHD video via a RED Epic Dragon camera from the space station to an audience attending the 2017 NAB Show in Las Vegas. The days of NASA building its own custom solutions for imaging from space have passed, and the space agency now relies on commercially available products, which it might modify to meet its unique needs, said Grubbs.
Grubbs co-authored the “Engineering a Live UHD Program From the International Space Station” paper for SMPTE along with Sandy George, an engineer at SAIC, which contracts with NASA.

NASA met with RED at the 2016 NAB Show, and together they decided to use a RED camera as the source for a live UHD downlink from space, he said. However, the RED Epic Dragon is not a broadcast camera, so it lacked embedded audio, Grubbs explained.
To overcome that hurdle, NASA decided to use audio from its existing HD camera onboard the space station and marry its sound output to the 4K video on the ground. But first, NASA had to settle upon the full tech package that would be delivered to the space station in December 2016 on Japan’s Logistics Carrier for the UHD live transmission in spring 2017. It included the RED Epic Dragon; Redcast, a module for the camera that provides four 1080p HD-SDI outputs; and a one-off 4K UHD H.265 encoder, which AWS Elemental, working with RED, built specifically for the space agency, he said. The 18 Mbps UDP stream out of the encoder would then be sent to Earth from the ISS, which is fully internet-compatible and is “an IP node” in space, he said.
Testing revealed an offset of about four seconds between the 4K UHD video and the HD-camera-originated audio, Grubbs said. On the ground in NASA’s audio control room, both the 4K video and the audio from the HD camera were taken back to baseband, synced and re-embedded as HD-SDI to create another UHD stream via another custom AWS Elemental encoder.
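The baseband re-sync Grubbs described is, in effect, a fixed delay applied to the earlier-arriving stream so the two line up again. As a rough illustration only (the function name, frame rate and buffering approach below are assumptions for the sketch, not NASA’s actual implementation), a known constant offset can be absorbed with a simple frame-delay buffer:

```python
from collections import deque

def make_delay_line(offset_seconds, frame_rate):
    """Delay a frame stream by a fixed number of frames to absorb a
    known, constant encode/transport offset. Hypothetical sketch; the
    real sync was performed at baseband in NASA's audio control room."""
    depth = round(offset_seconds * frame_rate)
    buf = deque(maxlen=depth + 1)

    def push(frame):
        buf.append(frame)
        # Emit nothing until the buffer is primed, then emit the frame
        # that arrived `depth` frames ago.
        return buf[0] if len(buf) > depth else None

    return push

# The HD-originated audio led the 4K video by ~4 seconds; delaying the
# earlier stream by that offset re-aligns the two (30 fps assumed here).
delay = make_delay_line(offset_seconds=4.0, frame_rate=30)
```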
To get the live stream to Las Vegas, NASA would route the UDP 4K H.265 stream out of the encoder to the NASA TV hub at Encompass in Atlanta. From there, the stream would be decoded and uplinked to Las Vegas for the live 4K UHD stream at the 2017 NAB Show. As a backup, AWS Elemental put in a terrestrial link from Atlanta through Dallas to Las Vegas, he said. The end-to-end latency between the space station and Las Vegas was 10 seconds, which required a lot of planning but proved to be manageable with practice, said Grubbs.
DETERMINING NATIVE GAMMA CURVE
The second morning presentation, “Proposed Measured Display Characterization File for HDR Consumer Displays,” by Tyler Pruitt of SpectraCal in Seattle, Wash., examined a method that bypasses a display’s HDR processing during calibration to determine the native gamma curve of a consumer display.
In almost all cases, HDR masters exceed the performance of consumer TVs, said Pruitt. At the same time, there is a major performance delta between the highest-quality HDR consumer TVs and the lowest. In many cases this requires color mapping to preserve the creative intent of the content producer. Approaches include static metadata that accompanies the content, dynamic metadata and, in higher-end displays, GPUs and CPUs that analyze frames in real time.
However, most content is viewed on lower-end, less costly sets that do not include these processors, he said. So, the challenge is finding out how to make those sets more accurate. At the same time, any calibration method should not interfere with the algorithms TV manufacturers use to optimize the performance of their panels, said Pruitt.
“If we start adjusting stuff after [the TV manufacturer’s color mapping] happens we are essentially deviating from what the picture-quality engineers at the TV manufacturers have decided is the correct tone map,” he explained. What Pruitt proposed is disabling all HDR mapping and conversion to gamma during calibration. That way, the panel would be measured in its HDR mode with its native gamma response, he said.
In response to a question following his presentation, Pruitt said: “Most of these color management [modes in the set] are done with a 3x3 matrix, and they just put it to unity and give you the panel native vivid mode. Put your 3x3 matrix that controls the color gamut into unity, and let’s measure what the actual primaries are.” A consumer set could then feed that data back to itself and calculate a new 3x3 matrix from the actual measured data, rather than from an average of all panels, he said.
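Pruitt’s proposal amounts to measuring a unit’s native primaries and deriving a per-panel correction matrix from them instead of using factory averages. A minimal sketch of that math, assuming hypothetical measured chromaticities and a Rec. 2020 target (the helper function and the “measured” numbers below are illustrative, not from the paper):

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a 3x3 RGB->XYZ matrix from CIE xy chromaticity coordinates.
    primaries: [(xr, yr), (xg, yg), (xb, yb)]; white: (xw, yw)."""
    # XYZ direction of each primary (columns), normalized to Y = 1
    M = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    # Scale the columns so that R = G = B = 1 reproduces the white point
    Xw, Yw, Zw = white[0] / white[1], 1.0, (1 - white[0] - white[1]) / white[1]
    S = np.linalg.solve(M, np.array([Xw, Yw, Zw]))
    return M * S

# Target colorimetry: Rec. 2020 primaries with a D65 white point
target = rgb_to_xyz_matrix(
    [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)], (0.3127, 0.3290))

# Hypothetical native primaries measured from one panel with its
# gamut matrix set to unity (illustrative numbers only)
panel = rgb_to_xyz_matrix(
    [(0.680, 0.310), (0.230, 0.720), (0.150, 0.060)], (0.3127, 0.3290))

# Per-panel correction: target RGB -> XYZ -> native panel RGB
correction = np.linalg.inv(panel) @ target
```

Because both matrices share the same white point, the derived correction leaves white untouched while remapping the primaries to what this particular panel can actually reproduce.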
Describing this method as a “radical process” for calibrating HDR, Pruitt said the technique would be appropriate for home theater displays, televisions and theater projectors. He urged SMPTE to take up the proposal for standardization.