Taking Editing into the Third Dimension
October 20, 2009
3D is popping out all over in movies these days, and as usual, it is up to the post-production pros to ride herd on all the magic that is coming out of Z-space.
But for editor Christopher Murrie who cut Henry Selick’s recent stop-motion animation hit “Coraline,” it was especially important to make every frame count.
I recently had the opportunity to speak with Murrie about his work on the film for the Laika Inc. animation studio. I structured my interview around the three great editing principles of “context, contrast and rhythm,” a recurring leitmotif of this column, to examine how an editor handles the aesthetics of working in 3D. If you want a refresher on that trinity, you can find it online.
Although Murrie has recently been testing Avid’s 3D-oriented version 3.5 software, he edited “Coraline” on an earlier version of Media Composer that was strictly 2D. He was helped during the fine cut by veteran consulting editor Ronald Sanders.
Coraline gets a surprise. ©2008 LAIKA Inc.
“This was all new territory for us,” Murrie said. “Our biggest problem was not having a simple, quick way to actually see the 3D. We previsualized everything on the storyboard and created extensive animatics from that to guide the animators, but then had to go into a 3D theater to see our cuts in full dimension through polarized RealD glasses.”
Murrie and the lead media engineer on the project, Trevor Cable, were discovering the rules for cutting 3D in context with the film’s overall story as they went along.
“We learned that pushing objects back and forth in Z-space strongly affected how one shot would relate to another,” Murrie said. “But we could only determine their final rhythm once the left and right eyes were streamed together and seen through the 3D glasses in the theater.”
To accommodate this, Murrie helped the animators establish the rhythm of intricate scenes as tightly as possible using the pre-viz animatics.
“There is a very elaborate trapeze act in ‘Coraline’ that went through a half-dozen iterations before the animators began filming their puppets,” Murrie said. “We needed to determine the precise timing of each trapeze swing so everything would line up perfectly once I cut the final shots together. On a stop-motion film like this, editing exists throughout the whole process, servicing the production all the way from preproduction to post.”
Live-action cutters are accustomed to inserting unplanned transitions with the push of a button, but Murrie had to help the “Coraline” animators foresee the need for those extra frames.
“We had all the transitions timed to their exact length before we hit ‘Go!’,” he said. “If we asked for 40 effects frames, that’s exactly what we were going to get. Any time we changed a cut, we had to sit down with writer/director Selick, producer Claire Jennings, and animation supervisor Anthony Scott, to work out what we needed and how it could be scheduled. We varied the shot lengths many times, but I don’t think we changed any of the transitions once filming the puppets began.”
As a practical matter, retakes were only possible while the sets were still hot, but Murrie can’t remember a single time they had to reset a shot.
“There was one time when we changed a background on a scene by rotoscoping around a character, but that was about it. The editor’s job is to make sure everything is well rehearsed and reviewed before they start to shoot the final.”
SHOT IN 4K
Lead media engineer Trevor Cable told me everything was shot in 4K using Redlake cameras from Integrated Design Tools Inc. (IDT) and their files were sent directly to the “data wrangling department,” which prepared them for editorial.
“Our IT department handled all the networking and online storage,” Trevor said. “Of course, since we were shooting left and right eyes in 4K for each shot, it was a tremendous amount of data.”
As opposed to the dual lens cameras or over/under beam splitting rigs common on live-action 3D productions, these Redlake models had only one lens, but could be moved laterally on a small programmable motion-control track to create the interocular distance needed for the illusion of stereoscopic 3D.
Coraline is eager to enter the doorway to another world. ©2008 LAIKA Inc.
“That way we could easily adjust the interocular wedge to place an object behind or in front of the screen,” Trevor said. “Most importantly, it let us set the interocular distance to match the eyes of the puppets. We learned that if we positioned that IO to human scale, everything looked like miniatures. But adjust it to the puppet’s perspective and the sets appeared like the real world in the context of the film.”
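Trevor’s point about matching the interocular to puppet scale falls out of basic stereo geometry: for parallel cameras converged by a horizontal image shift, on-screen parallax depends on the interaxial separation, the focal length, and the convergence and subject distances. A minimal sketch; the formula is standard stereography, and the specific numbers are illustrative, not from the production:

```python
def screen_parallax(interaxial, focal, conv_dist, subj_dist,
                    sensor_width, screen_width):
    """On-screen parallax for parallel cameras converged by horizontal
    image translation. Positive = behind the screen, negative = in front.
    All linear units must match (e.g. mm)."""
    # Parallax on the sensor: t * f * (1/C - 1/Z)
    p_sensor = interaxial * focal * (1.0 / conv_dist - 1.0 / subj_dist)
    # Magnify from sensor to projection screen
    return p_sensor * (screen_width / sensor_width)

# Human-scale rig: 65 mm interaxial, subject 4 m away, converged at 2 m
p_human = screen_parallax(65, 35, 2000, 4000, 24, 10000)

# Shrink the whole world, and the interaxial with it, by 10x (puppet scale)
p_puppet = screen_parallax(6.5, 35, 200, 400, 24, 10000)

# The parallax is identical: scaling the interaxial along with the set
# preserves perceived depth, which is why a puppet-scale IO made the
# miniature sets read as full-size on screen.
```

Scaling the interaxial and every distance by the same factor leaves the parallax formula unchanged, so the tiny sets project exactly the depth a full-scale world would.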
Murrie cut only the left eye image in 2D on his Avid Media Composer, but the wrangling department simultaneously provided daily rushes to the 3D theater.
“That meant there was no way to see a whole sequence in 3D until the data wizards had the time to prepare a conform of it from my EDL,” he said. “So it became challenging to evaluate how a cut would look in 3D because I’d have to remember the way it appeared in the theater when I went back to my flat-image edit system. Sometimes I would have to change the shots based on my memory of how they worked in 3D because of a depth change or mismatched position of an object in Z-space.”
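The conform step Murrie describes, turning his left-eye-only cut into a synchronized 3D screening, boils down to pairing each left-eye edit event with its right-eye counterpart over the same frame range. A hypothetical sketch, assuming a simple `_left`/`_right` clip-naming convention and a bare (clip, in, out) event format, neither of which is Laika’s actual pipeline:

```python
def right_eye_pull_list(left_cuts):
    """Given left-eye cut events as (clip_name, in_frame, out_frame)
    tuples from the editor's cut list, return the matching right-eye
    events so both streams can be conformed in sync for 3D review."""
    pulls = []
    for clip, f_in, f_out in left_cuts:
        if not clip.endswith("_left"):
            raise ValueError(f"unexpected clip name: {clip}")
        # Same frame range, opposite eye
        pulls.append((clip[: -len("_left")] + "_right", f_in, f_out))
    return pulls
```

The key property is that the frame ranges are copied verbatim: any timing drift between eyes would break the stereo illusion, so only the media reference changes.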
This is going to get a lot easier now that Avid has given its software the ability to display the timeline on a 3D monitor right in the edit bay. The capability debuted in version 3.5 and has been enhanced in the new 4.0 shown at IBC2009.
But the most important aesthetic principle Murrie learned about the use of 3D in “Coraline” is not to overuse it. The main character travels between two worlds in the story, and he did not want the illusion to intrude upon the film’s narrative.
“We dialed back the 3D effect in Coraline’s real world so it would have greater impact in her alternative reality,” he recalls. “We wanted the audience to feel the depth, but were careful not to overuse it, and learned to build in a bit of breathing room when cutting from an extremely deep shot to something coming off the screen. Just like cutting 2D, you don’t want the illusion of Z-space to overwhelm the audience. We treated the screen as a proscenium, and kept most of our 3D from breaking the frontal plane.”
Jay Ankeney is a freelance editor and post-production consultant based in Los Angeles. Write him at