
Will high frame rate video finally kill off interlace?

I visited a Dolby screening room yesterday for an introduction to Atmos, Dolby’s next-generation cinema sound platform. It is just a cinema product today, so not of pressing interest to broadcasters, but television was once only monophonic and is now up to 7.1 surround, so who knows what could happen in the future? A key addition to surround systems is overhead sound from ceiling-mounted speakers. We saw clips demonstrating how audio “objects” can be placed around the auditorium.

Atmos looks set to add to the cinema experience, and combined with high frame rate (HFR) digital projection, it does differentiate the theater from viewing at home.

I was distracted from the audio by the video presentation, one of the hazards of being an engineer. The second clip was shot on film, and with the superb projection in the preview theater it was easy to see the crawling film grain all over the picture. But what I found most disturbing was the motion judder. It is really accentuated in a darkened room with projection extending to the periphery of vision, and the contrast with the first clip, which was video, made it all the more obvious. 24-frame film does not look good with 2K digital projection; the temporal resolution just does not match the static resolution. I guess we have all got used to the wagon wheels rotating backwards!

However the next clip was 60fps, and stereo. This was like a veil being lifted. It was close to looking at the real thing—if you ignored some of the cue conflicts of depth perception inherent in stereo 3D.

The high frame rate lifts the viewer out of the marginally successful motion portrayal of legacy film and 25/30-frame television. The frame rates of those systems were chosen to be the lowest that viewers would accept, at a time when the novelty of viewing moving pictures overcame objections about image quality. We have put up with them in the interests of restricting the bandwidth of the delivery systems, or, in the case of film, the limits of mechanical projection.

Peter Jackson’s Hobbit is pioneering high frame rate digital cinema, being shot at 48fps, although Doug Trumbull’s Showscan predates it by many years. I remember seeing a Showscan motion ride at the Luxor many NABs ago, and it did look good—though the projection prints didn’t last long.

Broadcasters may smile knowingly at film judder, but 50/60-field interlace is hardly perfect. Interlace was a great idea in the 1930s, when TV processing was performed with a handful of vacuum tubes; back then the technology needed for MPEG decoding was in the realm of science fiction. Interlace allowed a higher refresh rate at half the bandwidth of a progressive-scan system with the same number of lines. The cinema trick of showing each frame twice to avoid flicker was not possible back then; that had to wait until digital frame stores were cheap enough to include in the receiver. The double (or triple) flash might avoid flicker, but the cadence of motion is still unnatural.
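The bandwidth argument is easy to sanity-check with a little arithmetic: an interlaced system scans only half the lines per vertical refresh, so its raw sample rate is half that of a progressive system with the same line count and refresh rate. A minimal sketch (the format figures are illustrative, not a broadcast specification):

```python
# Compare raw luma sample rates for interlaced vs. progressive scanning.
# An interlaced pass (a field) carries half the lines of a full frame,
# so at the same refresh rate it needs half the raw bandwidth.

def raw_pixel_rate(width, height, refresh_hz, interlaced):
    """Raw luma samples per second for one scanning format."""
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * refresh_hz

i50 = raw_pixel_rate(1920, 1080, 50, interlaced=True)   # 1080i50: 50 fields/s
p50 = raw_pixel_rate(1920, 1080, 50, interlaced=False)  # 1080p50: 50 frames/s

print(f"1080i50 raw rate: {i50 / 1e6:.1f} Msamples/s")  # 51.8
print(f"1080p50 raw rate: {p50 / 1e6:.1f} Msamples/s")  # 103.7
print(f"progressive/interlaced ratio: {p50 / i50:.1f}x")  # 2.0x
```

The same 2:1 ratio is why interlace looked so attractive to 1930s engineers, and why dropping it doubles the raw payload before compression gets to work.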

Imaging devices were so soft in the early days of television that artifacts like inter-line twitter were not a problem. When electronic character generators (CGs) came along, the problems of interlace did start to manifest. As a result, CGs included “anti-aliasing” to reduce inter-line twitter. The scientific term sounded good in the marketing material, but what was really happening was that the vertical resolution was being halved, giving the vertical resolution of a 250-line system from a 525-line standard. Interlace got around the frame-flicker issue at the expense of vertical static resolution and motion artifacts (like the combing on moving vertical edges).
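The combing artifact mentioned above is easy to reproduce: weave together two fields captured 1/50th of a second apart while a vertical edge moves horizontally, and the edge lands in a different place on odd and even lines. A toy illustration (the geometry and motion here are invented for the demonstration):

```python
# Demonstrate interlace "combing": two fields of a moving vertical edge,
# woven into one frame. Even and odd lines sample the scene at different
# times, so the moving edge appears serrated.

WIDTH, HEIGHT = 16, 8

def capture_field(edge_x, parity):
    """One field: only lines of the given parity, white (1) left of edge_x."""
    return {y: [1 if x < edge_x else 0 for x in range(WIDTH)]
            for y in range(parity, HEIGHT, 2)}

# The edge moves 4 pixels to the right between the two field captures.
even_field = capture_field(edge_x=6, parity=0)   # earlier in time
odd_field = capture_field(edge_x=10, parity=1)   # one field period later

frame = [even_field[y] if y % 2 == 0 else odd_field[y] for y in range(HEIGHT)]

for row in frame:
    print("".join("#" if v else "." for v in row))
```

On alternate lines the edge sits at column 6 then column 10, producing the familiar comb; on a static image the two fields would align and the artifact would vanish, which is exactly why interlace only falls apart on motion.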

With advances in compression, efficient new schemes like AVC mean that high frame rate progressive systems could be used for delivery to the viewer. The EBU have been great advocates of this move, but apart from a few progressive pioneers, broadcasters remain wedded to interlace and have shown little interest.

As 4K and HFR movies raise viewers’ expectations, how much longer will broadcasters hang on to interlace? The usual excuse is backwards compatibility with SD material. How about forwards compatibility with the progressive displays that every consumer device now uses? I am sure many readers will have a view on this; please comment below.