Mastering the Digital Frontier: The Importance of QC Checks in UHD Workflows

With the advent of UHD workflows, ensuring a premium viewing experience for audiences has never been more important. This, coupled with the breakneck speed at which content creation and distribution technology is advancing, means media professionals now face significant obstacles and exciting opportunities at the same time.

This article delves into the challenges of an end-to-end UHD workflow and how they can be overcome with advanced QC solutions.

Video Quality Issues in UHD Workflows
UHD, including HDR, offers a premium and immersive experience to viewers. However, as is the case with any new technology, it comes with its own set of challenges regarding adoption. Creating an end-to-end workflow that captures, edits, and transcodes in UHD may not be financially or operationally feasible for many content creators, especially those in their first few years of business. Therefore, such creators may resort to upconverting the resolution, frame rate, dynamic range, bit depth, and color gamut of their content.

However, if these aspects are not handled properly, the result could be a host of video quality issues, including blurriness, digital ghosting, color banding, and tone-mapping artifacts. Furthermore, upconverting from HD to UHD, or poor UHD capture, may result in blurriness at the edges, as these processes may not be able to maintain sharpness in those areas. Blur detection is therefore a critical requirement.
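
As an illustration of what such a check might look like, the minimal sketch below scores frames by the variance of the Laplacian of their luma channel, a common first-pass sharpness metric. OpenCV is assumed, and the threshold value is an arbitrary placeholder rather than a calibrated figure.

```python
import cv2

def laplacian_sharpness(frame_bgr) -> float:
    """Return a simple sharpness score: variance of the Laplacian of the luma."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def flag_blurry_frames(video_path: str, threshold: float = 100.0):
    """Yield indices of frames whose sharpness falls below a (hypothetical) threshold."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if laplacian_sharpness(frame) < threshold:
            yield index
        index += 1
    cap.release()
```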

Likewise, frame rates much higher than 30 fps are required to achieve a natural feel of temporal variation on bigger and wider UHD screens. During the capture of large spatial data at a high temporal rate — 48 fps, 60 fps, or 120 fps — visual issues such as motion blur, ghosting, grain, and dead/stuck pixels may occur.
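
Dead or stuck pixels, in particular, lend themselves to a simple automated check: a sensor defect produces a position whose value barely changes from frame to frame. The sketch below, assuming OpenCV and NumPy, tracks the per-pixel luma range over a sampled run of frames; the frame count and tolerance are illustrative assumptions, and static graphics such as logos will also be flagged.

```python
import cv2
import numpy as np

def find_static_pixels(video_path: str, max_frames: int = 300, tolerance: int = 1):
    """Return (row, col) positions whose luma varies by no more than `tolerance`
    code values across the sampled frames - candidates for dead or stuck pixels."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return []
    first = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
    lo, hi = first.copy(), first.copy()
    for _ in range(max_frames - 1):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
        np.minimum(lo, gray, out=lo)   # running per-pixel minimum
        np.maximum(hi, gray, out=hi)   # running per-pixel maximum
    cap.release()
    static = (hi - lo) <= tolerance
    return list(zip(*np.nonzero(static)))
```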

Furthermore, during the journey through transcoding, upconversion, and editing, perceptual temporal issues like motion jerks and flickering may also be introduced. It is important that such quality issues are discovered before content is delivered to end users, while corrective action can still be taken.
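
As a rough illustration, one crude flicker indicator is an up-and-down swing in average frame luma over just a couple of frames, which an ordinary scene cut would not produce. The sketch below assumes OpenCV and NumPy, and the swing threshold is an arbitrary placeholder.

```python
import cv2
import numpy as np

def mean_luma_series(video_path: str) -> np.ndarray:
    """Return the per-frame average luma as a NumPy array."""
    cap = cv2.VideoCapture(video_path)
    means = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        means.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean())
    cap.release()
    return np.asarray(means)

def flicker_frames(video_path: str, threshold: float = 8.0):
    """Flag frames where average luma swings up and then down (or vice versa)
    by more than `threshold` code values within two frames - a crude flicker
    proxy that a one-way jump at a scene cut does not trigger."""
    y = mean_luma_series(video_path)
    d = np.diff(y)
    return [i + 1 for i in range(len(d) - 1)
            if d[i] * d[i + 1] < 0
            and abs(d[i]) > threshold and abs(d[i + 1]) > threshold]
```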

Another challenging aspect of UHD content is the wide color gamut (WCG) specification (e.g., ITU-R BT.2020, DCI-P3), which extends the range of colors beyond the standard color gamut. It is this extended range that provides the rich, true-to-life colors that make UHD scenes look more realistic on a big screen.

As content passes through editing, transcoding, and tone mapping, its fine, smooth color gradations can be disturbed, leading to the formation of bands that are quite evident on wide screens. This color banding can ruin the overall viewing experience.
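
One crude, illustrative way to flag banding is to look for long flat luma runs that end in a single-code-value step, the signature of a smooth gradient collapsed into bands. The sketch below assumes OpenCV and NumPy; the minimum run length and the interpretation of the score are assumptions, not a production-grade metric.

```python
import cv2
import numpy as np

def banding_score(frame_bgr, min_flat_run: int = 16) -> float:
    """Crude banding proxy: fraction of rows containing a single-code-value
    luma step that follows a flat run of at least `min_flat_run` pixels.
    High values on smooth gradients (skies, vignettes) suggest visible bands."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.int16)
    diff = np.diff(gray, axis=1)          # horizontal neighbour differences
    rows_with_band = 0
    for row in diff:
        flat = 0
        for d in row:
            if d == 0:
                flat += 1
            else:
                if abs(d) == 1 and flat >= min_flat_run:
                    rows_with_band += 1
                    break
                flat = 0
    return rows_with_band / gray.shape[0]
```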

Furthermore, unlike SDR content, where display quality can be predicted by analyzing the loss introduced by compression, HDR content is much more complex: because of the non-linear relationship between decoded and displayed data, perceived video quality analysis becomes far more cumbersome.

Much of this depends on the transfer characteristics chosen when the HDR content was created, an automated or manual process of defining the contrast for various scenes. The content must also be specified with its average and maximum light levels (in nits, or cd/m²). For the analysis system, it is important to measure these light levels whenever the originally graded content undergoes any kind of transformation — such as editing or transcoding.
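
For PQ-coded (SMPTE ST 2084) content, those light levels can be recovered by applying the PQ EOTF to the decoded code values: MaxCLL is the brightest single pixel and MaxFALL the highest frame-average light level, per CTA-861.3. The sketch below assumes frames are already available as RGB arrays of PQ-coded values normalized to [0, 1]; decoding and chroma upsampling are out of scope.

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e_prime: np.ndarray) -> np.ndarray:
    """Map normalized PQ code values in [0, 1] to display light in cd/m2 (nits)."""
    e = np.clip(e_prime, 0.0, 1.0) ** (1.0 / M2)
    y = np.maximum(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / M1)

def content_light_levels(frames):
    """Compute (MaxCLL, MaxFALL) from an iterable of HxWx3 PQ-coded RGB frames
    normalized to [0, 1]. Both metrics are taken over the per-pixel maximum of
    the R, G and B light levels, following CTA-861.3."""
    max_cll, max_fall = 0.0, 0.0
    for frame in frames:
        nits = pq_eotf(np.asarray(frame, dtype=np.float64))
        max_rgb = nits.max(axis=2)                 # brightest component per pixel
        max_cll = max(max_cll, float(max_rgb.max()))
        max_fall = max(max_fall, float(max_rgb.mean()))
    return max_cll, max_fall
```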

A primary factor affecting HDR viewing performance is interoperability between the video and the display. When a legacy SDR display is used to view a picture created with a wide color gamut, contouring — rendering smooth gradient areas as a series of distinct bands — can be an issue if the conversion is not handled carefully. It can be caused by bit depth loss, which is a global issue, or by incorrect region-based tone mapping, a local problem. These conversions need to be handled appropriately in order to maintain a rich viewing experience.

In addition, while photosensitive epilepsy (PSE) guidelines have long been defined for SDR content, the International Telecommunication Union (ITU) and the National Association of Commercial Broadcasters in Japan (NABJ) have now updated the guidelines for HDR content as well. PSE testing therefore needs to be included as an additional component in the analysis of UHD content.

The Need for Advanced QC Solutions
To address the above issues, media professionals must employ advanced QC solutions tailored to the UHD landscape. Such solutions must have comprehensive format support from capture to delivery, including ProRes, DNxHR, TIFF, Sony RAW, DPX, EXR, AVC, and HEVC.

Metadata checks for parameters such as resolution, bit depth, frame rate, WCG, and HDR transfer characteristics are also required to ensure the quality of UHD packages. In the case of HDR, additional checks apply to parameters like MaxCLL, MaxFALL, and white point chromaticity. Because of the larger file sizes, there is also a risk of file corruption when transferring UHD content, so verifying MD5 or SHA-1 checksums may be important for certain workflows.
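
As a simple illustration of both kinds of check, the sketch below reads stream metadata with ffprobe and streams a file through SHA-1 with Python's hashlib. The file name and the expected values are placeholders, and ffprobe is assumed to be installed and on the PATH.

```python
import hashlib
import json
import subprocess

def probe_metadata(path: str) -> dict:
    """Return the first video stream's metadata as reported by ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, check=True, text=True)
    return json.loads(out.stdout)["streams"][0]

def sha1_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-1 so large UHD masters never load fully into memory."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical file name and expected values for illustration only.
stream = probe_metadata("master_uhd.mov")
assert int(stream["width"]) == 3840 and int(stream["height"]) == 2160
assert stream.get("color_transfer") in ("smpte2084", "arib-std-b67")  # PQ or HLG
print("SHA-1:", sha1_of("master_uhd.mov"))
```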

Owing to the higher processing, hardware, and storage requirements of UHD, many media organizations have shifted their workflows to the cloud. For this reason, it is imperative that a QC solution offer stable, optimized support for cloud platforms such as Amazon, Google, and Microsoft, ensuring seamless integration with various workflows.

As UHD becomes widespread, it is critical that providers deliver content in pristine quality to match viewers’ ever-increasing expectations. Such content must be free of artifacts, consistent in quality, and compliant with UHD specifications. This can be achieved with advanced QC solutions tailored specifically for the UHD landscape.

Such solutions, with comprehensive format support, metadata checks, and cloud integration capabilities, ensure that content retains its quality throughout its lifecycle, from capture to delivery — ensuring a premium viewing experience for audiences.

Manik Gupta
Director of Engineering, Interra Systems