SMPTE 2017: Sessions Examine Practical Side of UHD

LOS ANGELES—Ultra HD is coming of age, and with that maturity its treatment at industry gatherings, such as the SMPTE 2017 Annual Technical Conference & Exhibition in Hollywood, Calif., is changing.

Hans Hoffmann, session chair of the UHD, Bigger, Better conference track and senior manager of the EBU Technology and Innovation Department, may have said it best at the opening of the second day of the SMPTE conference. “UHD TV really came a long way. Years ago, we were talking of very fundamental research topics,” said Hoffmann as he introduced the morning UHDTV sessions. “Today what you are going to see and hear in the sessions is actually that we have reached now practical, operational questions.”

CLOSED CAPTIONING

First up was how the Korean Broadcasting System (KBS) has deployed closed captioning as part of South Korea’s rollout of over-the-air, ATSC 3.0-based 4K television. YunHyoung Kim, a research engineer at KBS, described a practical implementation of next-gen TV closed captioning that leverages the same captioning information generated by a stenographer for its ATSC 1.0 broadcasts. KBS can take this approach because it simulcasts its programming in HD and UHDTV, he said. However, the work of the stenographer is the only element common to the two standards.

ATSC 3.0 relies on the A/343 captions and subtitles standard. Unlike CEA-708, used in ATSC 1.0, in which captioning information is delivered as a bitstream carried in the video stream’s picture user data, A/343 delivers the information as an XML document in a separate service, he said, adding that the ATSC 3.0 captioning standard is based on IMSC1, the TTML profile for Internet Media Subtitles and Captions 1.0.
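For illustration only, a stripped-down IMSC1-style caption document might look like the XML below, shown here inside a short Python snippet; the markup is a hypothetical, minimal example, not the exact document KBS or the A/343 standard produces.

# Illustrative only: a bare-bones IMSC1-style (TTML) caption document.
# Real A/343 captions carry additional required namespaces, profiles and styling.
import xml.dom.minidom

IMSC1_SAMPLE = """\
<tt xmlns="http://www.w3.org/ns/ttml"
    xmlns:tts="http://www.w3.org/ns/ttml#styling" xml:lang="ko">
  <body>
    <div>
      <p begin="00:00:05.000" end="00:00:07.500" tts:color="white">Caption text goes here.</p>
    </div>
  </body>
</tt>
"""

# Unlike CEA-708, which rides inside the video bitstream, this XML travels
# as its own service; a receiver simply parses the document.
print(xml.dom.minidom.parseString(IMSC1_SAMPLE).documentElement.tagName)  # -> "tt"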

As a result, timing becomes a critical piece of the ATSC 3.0 closed-captioning puzzle, he said: timing information must be inserted alongside the captions so consumer receivers can keep them synchronized with the video. For KBS, accomplishing this requires a two-step process.

The first step involves a caption converter, which converts DTV captions into IMSC1 captions. The converter buffers the incoming DTV captions, inserts timing information, which Kim characterized as still incomplete at this point, and periodically generates TTML captions. The second step is caption generation. The KBS UHD caption generator completes the timing information needed by the display to sync captions and video by inserting UTC time, he explained. Finally, it generates the ROUTE/MMT caption streams.
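A minimal sketch of that two-step flow follows, assuming hypothetical function names and simplified timing; it is not KBS’s code, and the ROUTE/MMT packaging is only noted in a comment.

# A minimal sketch of the two-step flow described above; not KBS's actual code,
# and the function names are hypothetical.
from datetime import datetime, timedelta, timezone

def convert_dtv_captions(buffered_lines):
    """Step 1 (caption converter): turn buffered DTV caption text into
    provisional TTML-style cues. Offsets are relative to the buffering
    interval, so the timing is still incomplete at this point."""
    cues = []
    for offset_s, text in buffered_lines:
        cues.append({"text": text,
                     "begin": offset_s,        # seconds into the interval
                     "end": offset_s + 2.0})   # assumed display duration
    return cues

def complete_timing(cues, utc_anchor):
    """Step 2 (caption generator): anchor the provisional times to UTC so a
    receiver can sync captions with the video. Wrapping the result into
    ROUTE/MMT caption streams is not shown here."""
    for cue in cues:
        cue["begin"] = (utc_anchor + timedelta(seconds=cue["begin"])).isoformat()
        cue["end"] = (utc_anchor + timedelta(seconds=cue["end"])).isoformat()
    return cues

buffered = [(0.0, "Good evening."), (1.5, "Here is the news.")]
print(complete_timing(convert_dtv_captions(buffered), datetime.now(timezone.utc)))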

THE USE OF SUBTITLES

Simon Thompson, a BBC project engineer working on next-generation video systems, also tackled the topic of closed captions in his “Access Services for UHDTV: An Initial Investigation of W3C TTML2 Subtitles” paper.

Subtitle use is increasing in the United Kingdom for several reasons, he said, including an aging population, subtitles’ popularity among children learning a second language and the increased viewership of foreign-language series. The British government also has mandated their use since 2003, he added.

“TTML subtitles are the way forward for UHD,” said Thompson. W3C [the World Wide Web Consortium] developed the TTML subtitle specification, and various organizations, including SMPTE, the EBU and CableLabs, have their own profiles of the spec, he said. TTML files contain the text of the subtitle, as well as on-screen placement, timing and color information, which enables the TV to render closed captions.
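As a rough sketch of how a receiver could pull that rendering information out of a TTML cue, consider the Python below; the fragment, namespace handling and field names are illustrative, not a real IMSC1 renderer.

# Illustrative sketch: extracting render information from a TTML cue.
import xml.etree.ElementTree as ET

TT = "{http://www.w3.org/ns/ttml}"
TTS = "{http://www.w3.org/ns/ttml#styling}"

doc = ET.fromstring(
    '<tt xmlns="http://www.w3.org/ns/ttml" '
    'xmlns:tts="http://www.w3.org/ns/ttml#styling">'
    '<body><div>'
    '<p region="bottom" begin="00:00:05.000" end="00:00:07.500" '
    'tts:color="yellow">Caption text goes here.</p>'
    '</div></body></tt>'
)

for p in doc.iter(TT + "p"):
    cue = {
        "text": (p.text or "").strip(),          # what to draw
        "region": p.get("region"),               # where on screen
        "begin": p.get("begin"),                 # when to show it
        "end": p.get("end"),                     # when to remove it
        "color": p.get(TTS + "color", "white"),  # which speaker color
    }
    print(cue)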

The color of closed captions is important to the BBC because it assigns each speaker on screen one of four colors, he said: white, lime (green), yellow and cyan. Thompson discussed the process the BBC uses to arrive at the TTML colors for captions shown on an HDR display. “We can’t just use the sRGB signal on an HDR screen because, if you use whites, then you stick your subtitle out at peak luminance of the display,” Thompson said. “We tried that. It’s not very nice. You can actually see the text for about 30 seconds [after it was presented on the display].”

This requires a transform from sRGB to hybrid log-gamma (HLG), which in turn requires converting the BT.709 primaries to those of BT.2100, he said. Thompson pointed out that color fidelity is not the objective in this captioning application; the colors just have to be good enough to identify the speakers.

A TTML working group developed a method for converting sRGB to PQ, but after evaluating it for use with HLG, the BBC decided to write its own, said Thompson. The BBC method undoes the sRGB transfer function, performs a color space conversion and applies a simplified HLG Opto-Optical Transfer Function (OOTF).
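A rough numerical sketch of those three stages is shown below; it is not the BBC’s implementation, and the 0.265 scale factor, which places sRGB white at roughly the 75 percent HLG “graphics white” signal level rather than at display peak, is an assumption standing in for the simplified OOTF Thompson described.

# Rough sketch of the three stages described above; not the BBC's code.
# The 0.265 scale (sRGB white -> ~75 percent HLG signal) is an assumption
# standing in for the simplified HLG OOTF in the paper.
import math

def srgb_to_linear(v):
    """Undo the sRGB transfer function (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def bt709_to_bt2020(rgb):
    """Convert linear BT.709 RGB to linear BT.2020/BT.2100 RGB (BT.2087 matrix)."""
    r, g, b = rgb
    return (0.6274 * r + 0.3293 * g + 0.0433 * b,
            0.0691 * r + 0.9195 * g + 0.0114 * b,
            0.0164 * r + 0.0880 * g + 0.8956 * b)

def hlg_oetf(e):
    """BT.2100 HLG OETF: scene-linear light (0-1) to HLG signal value."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

def srgb_subtitle_to_hlg(srgb, graphics_scale=0.265):
    linear_709 = [srgb_to_linear(v) for v in srgb]
    linear_2020 = bt709_to_bt2020(linear_709)
    return tuple(round(hlg_oetf(graphics_scale * v), 3) for v in linear_2020)

# The four BBC speaker colors as sRGB triples in the 0-1 range.
for name, srgb in [("white", (1.0, 1.0, 1.0)), ("lime", (0.0, 1.0, 0.0)),
                   ("yellow", (1.0, 1.0, 0.0)), ("cyan", (0.0, 1.0, 1.0))]:
    print(name, srgb_subtitle_to_hlg(srgb))

With this assumed scaling, white subtitles land near a 75 percent HLG signal level instead of the display’s peak luminance, the kind of headroom Thompson’s comment about burned-in subtitle text points to.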

Thompson said the BBC wanted to answer three questions related to this conversion to HLG: How accurate was the color after the transfer? Does background brightness affect the perceived brightness of the subtitle? And do changes in background brightness create perceived changes in the brightness of the subtitles? To get the answers, the broadcaster presented test subjects with four individual video sequences as well as two with cuts between clips, one from bright to dim and one from dim to bright.

To conclude, Thompson made four points. First, fewer than 5 percent of viewers said they were very annoyed by the variation in subtitle brightness when a screen is forced into power-saving mode by the brightness of the background video. Second, content producers should be aware that scenes with rapid, repeated changes in brightness will cause some viewers to perceive changes in subtitle brightness even when the monitor’s power-saving mode is not triggered. Third, dimmer subtitles should be used when they are displayed on a dim background. Finally, subtitlers should have access to a bigger color palette and should choose appropriate subtitle brightness levels within a scene, Thompson said.

Phil Kurz

Phil Kurz is a contributing editor to TV Tech. He has written about TV and video technology for more than 30 years and served as editor of three leading industry magazines. He earned a Bachelor of Journalism and a Master’s Degree in Journalism from the University of Missouri-Columbia School of Journalism.