SMPTE 2017: Developing the Cinema Experience 2.0

LOS ANGELES—As dwindling theatrical attendance continues to alarm the industry, it certainly seems like a good time for a body such as SMPTE to explore issues related to enhancing the cinema experience. Wednesday’s Cinema Processing and Projection Technology track, held in the TCL Chinese Theatres’ VR Theatre, addressed these issues in a series of presentations: one on multiscreen content; one on variable frame rate projection; and one about factors that must be considered when quantifying high dynamic range (HDR) projection.

The history of multiscreen presentations goes back to Cinerama and the triptych format used for portions of Abel Gance’s 1927 epic “Napoleon.” More recent digital incarnations include Barco’s Escape format. Rather than installing a single, wider screen, these formats use extra screens to expand the effective width of the image, an approach that adapts more readily to the layouts of different theaters exhibiting the same content.

Mapping content onto those varying layouts can be complex, however, and may have discouraged some theaters from adapting their auditoriums for multiscreen content. Sungmin Cho and Kyunghan Lee presented their research introducing the concept of VR Theater, which uses VR technology to map an auditorium’s specific characteristics, including height, width and angle of view for individual seats, to simplify what is now quite an elaborate process.

VR Theater uses the Unity game engine with HTC Vive and Oculus Rift. The idea is to generate an XML file that records where additional screens can and cannot be placed, their angles, and how light reflected from the additional screens affects the center or main screen in a specific room. Work on the VR Theater concept is ongoing, and the speakers noted that they intend to track additional parameters. They believe that solving these issues for presentation venues could be a key to greater adoption of multiscreen cinema.
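To make the idea concrete, here is a minimal sketch of generating the kind of per-auditorium XML file the speakers described. Every element and attribute name below is invented for illustration; the actual VR Theater schema was not published in the talk.

```python
# Hypothetical sketch of a per-auditorium XML description: screen
# placement, mounting angles, estimated reflection onto the main
# screen, and per-seat viewing angles. All names are assumptions.
import xml.etree.ElementTree as ET

root = ET.Element("auditorium", name="Sample House 3")

screens = ET.SubElement(root, "screens")
# Main screen plus one candidate side-screen zone with its angle
ET.SubElement(screens, "screen", role="main", width_m="14.0", height_m="7.6")
ET.SubElement(screens, "screen", role="left-wing", angle_deg="35",
              reflection_on_main="low")  # estimated spill onto the main screen

seats = ET.SubElement(root, "seats")
# Angle of view recorded per seat, as mapped in the VR model
ET.SubElement(seats, "seat", row="F", number="12", view_angle_deg="58")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

A tool consuming such a file could then flag screen positions whose reflections would wash out the main image, which is exactly the kind of check the VR mapping is meant to enable.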


The next presentation concerned variable frame rate projection and was given by Tim Ryan, software system architect, DLP Cinema at Texas Instruments. Ryan explained that while current DLP technology is capable of projecting at up to 120 fps, factors such as throughput capacity make achieving that frame rate at 2K or 4K impractical or impossible for current theaters. (Although Ang Lee’s multi-frame rate feature “Billy Lynn’s Long Halftime Walk” was never mentioned by name, it presents a perfect example of this problem, as only a handful of people were ever able to see it in all its multi-frame rate glory.)
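A back-of-envelope estimate shows why throughput is the obstacle. The specific figures below (12-bit, three-component 4K frames) are illustrative assumptions, not numbers from Ryan's talk:

```python
# Rough data-rate estimate for uncompressed 4K at 120 fps.
# Figures are illustrative assumptions, not from the presentation.
width, height = 4096, 2160        # DCI 4K container
bits_per_pixel = 3 * 12           # three 12-bit color components
fps = 120

bits_per_frame = width * height * bits_per_pixel
gbit_per_s = bits_per_frame * fps / 1e9
print(f"{gbit_per_s:.1f} Gbit/s uncompressed")
```

At roughly 38 Gbit/s uncompressed, the stream is far beyond the 250 Mbit/s ceiling the DCI specification sets for a JPEG 2000 DCP, which illustrates why 4K at 120 fps remains impractical for current theaters.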

Furthermore, Ryan pointed out that there is currently no easy way to build multiple frame rates into a single DCP (Digital Cinema Package). Altering the standard to do so, he said, could be a simple process and would represent a significant advancement, allowing filmmakers—such as Lee, James Cameron and others who have expressed interest—to explore the possibilities of having some scenes run at 24 fps, others at 60 and others perhaps at 120 within a single film.

Applications wouldn’t be restricted to future projects. Ryan discussed communications he’s had with film archivists about movies from the silent era that frequently came with notes to the projectionist about the ideal speed to crank the projector for specific scenes. The people who restore and remaster films from this era have sometimes been able to approximate this effect, but Ryan’s suggested changes could actually allow audiences to see these films projected at the frame rates those accompanying notes suggest.

Ryan said he and the TI team working on this multiple frame rate (MFR) project have come up with a way to add instructions to a revised version of a DCP that would communicate upcoming frame-rate changes to a digital cinema projector, giving it enough time to make each shift seamlessly.
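The cueing idea can be sketched in a few lines. The segment structure and the one-second "advance notice" figure below are invented for illustration; the actual instruction format TI proposed was not detailed in the talk.

```python
# Hypothetical sketch of MFR switch cueing: each playlist segment
# carries its frame rate, and a cue is emitted slightly ahead of each
# change so the projector has time to reconfigure. All values assumed.
segments = [
    {"name": "dialogue",  "duration_s": 300.0, "fps": 24},
    {"name": "chase",     "duration_s": 45.0,  "fps": 120},
    {"name": "aftermath", "duration_s": 120.0, "fps": 60},
]

MIN_NOTICE_S = 1.0  # assumed lead time a projector needs to switch rates

def switch_cues(segments, notice=MIN_NOTICE_S):
    """Return (cue_time, new_fps) pairs 'notice' seconds before each change."""
    cues, t = [], 0.0
    for prev, nxt in zip(segments, segments[1:]):
        t += prev["duration_s"]
        if nxt["fps"] != prev["fps"]:
            cues.append((t - notice, nxt["fps"]))
    return cues

for cue_time, fps in switch_cues(segments):
    print(f"at {cue_time:.1f}s: prepare switch to {fps} fps")
```

The same structure would serve the archival use case Ryan mentioned, with segment frame rates taken from a silent film's original projection notes.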

The work is still in progress, and Ryan acknowledged the generally poor audience reaction to films that depart from the 90-year-old 24 fps standard dating to the introduction of sound. Responding to an audience question, he said the ramifications of MFR for sound encoding still need study. But he noted continued interest in MFR from major directors, and he believes that if people build the technology, the filmmakers will come.


The final presentation came from Martin Richards of Dolby, representing work he and Dolby colleagues Barret Lippey, Peter van Kessel and Dave Schnuelle have done in measuring the factors that determine a true contrast range for HDR theatrical systems such as Dolby Vision, which is capable of 108 nits of peak luminance versus the 48 nits of standard dynamic range D-cinema.

Looking at theaters equipped with the laser projectors used for Dolby Vision, he and his team examined real luminance values from actual motion picture scenes of varying APL (average picture level), with the intention of quantifying, or at least defining, the real-world benefit Dolby Vision delivers.

While Richards reported measuring contrast ratios of up to 400,000:1, he acknowledged that the number is not as definitive as it might seem, because so many variables intervene: shooting and mastering choices, which often use the available brightness sparingly, and extraneous factors in the theater itself, from ambient light and dirty port glass to the color of the walls and seats. All of these can significantly reduce the displayed dynamic range, especially in scenes with a low APL.
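Simple arithmetic illustrates the effect. The 108-nit peak and 400,000:1 sequential contrast come from the talk; the ambient-light figure below is an assumed illustration of port-glass and wall reflections:

```python
# How stray light collapses contrast. Peak luminance (108 nits) and the
# measured contrast (400,000:1) are from the talk; the ambient figure
# is an assumed stand-in for light reflected back onto the screen.
peak_nits = 108.0
native_contrast = 400_000
native_black = peak_nits / native_contrast          # 0.00027 nits

ambient_nits = 0.01  # assumed stray light landing on the screen
effective_contrast = peak_nits / (native_black + ambient_nits)
print(f"native black: {native_black:.5f} nits")
print(f"effective contrast: {effective_contrast:,.0f}:1")
```

Even this tiny amount of stray light drops the delivered contrast from 400,000:1 to roughly 10,000:1, which is why low-APL scenes are the most vulnerable: their shadows sit closest to that raised black floor.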

The three presentations certainly demonstrated that scientists and engineers are working hard to develop, refine and implement new types of cinema presentations. If intrepid artists find ways to use the technology and audiences embrace what they do with it, the movie-going experience might soon look very different.