
Behind the Post of ‘Avatar’

By the time you read this, millions of 3D film fans will have experienced the forest moon Pandora and its fiercely beautiful blue-skinned inhabitants, the Na’vi. But as of this writing, the post production on James Cameron’s gigantic sci-fi production, “Avatar,” was still in high gear.

In fact, not all of the theaters intending to show “Avatar” in 3D had even been converted to stereoscopic presentation, although they were predicted to eventually number almost 4,000 in the United States alone, including most of the existing RealD, Dolby Digital and IMAX 3D venues; the film would also play on some conventional 2D screens.

However, with upwards of $500 million having been invested in this film’s production and publicity, the industry buzz is rampant: will “Avatar” mark the dawn of a new era of mega hits that will propel 3D into the mainstream of feature film production?

Without revealing too many plot points, “Avatar” tells the story of a paralyzed former U.S. Marine named Jake Sully (played by actor Sam Worthington) who accepts an assignment to an alien world called Pandora where he will be able to exist in a remotely controlled virtual body, or “Avatar.” But finding himself in conflict with a humanoid race of sentient beings called the Na’vi, among them a young Na’vi female, Neytiri (actress Zoe Saldaña), Jake becomes embroiled in a war over the moon’s resources.


Crucial to the 3D look of “Avatar” is the extensive use of the Fusion Camera System developed by Emmy-nominated cinematographer Vince Pace and Academy Award-winning director James Cameron. Originally called the Reality Camera, it was first deployed on Cameron’s underwater documentaries “Ghosts of the Abyss” (2003) and “Aliens of the Deep” (2005). Those films proved Pace’s concept of pairing two Sony HDC-950 HD cameras with lenses that could dynamically adjust their angle of convergence to match the depth of objects in Z-space: the rig significantly reduced the eye strain associated with viewing previous 3D productions, while making an image’s perceived depth effect far more flexible to manipulate.

Paralyzed Marine Jake Sully (Sam Worthington) volunteers to exist as an Avatar on Pandora.

Now upgraded with Sony HDC-F950 cameras, the Fusion Camera System was used on location in New Zealand throughout the live-action sequences of “Avatar” by director of photography Mauro Fiore (2007’s “The Kingdom”). Pace, who is CEO of the Burbank, Calif.-based digital camera system company PACE, also served as second-unit DP in New Zealand and as primary cinematographer on all the scenes shot in studio back in Los Angeles. Like James Cameron himself, Pace has been closely involved with all the post work on the film.

“A key enhancement to our Fusion Camera System used on ‘Avatar’ has been our ability to introduce a software algorithm that controls the convergence so we can extract the best stereo from a shot based on metadata such as focal length and distance to the subject,” Pace explained.

“This patent-pending ‘Constant Convergence Algorithm’ then specifies variables such as the interocular distance between the lenses and their necessary convergence point, and when used in conjunction with an on-set convergence engineer, gave us a guide track for creating images in ‘Avatar’ that most closely emulate the way human eyes perceive depth. We want it to be a totally immersive 3D visual experience.”
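Pace’s description points at the classic toed-in stereo geometry: given the interocular (interaxial) separation between the lenses and the distance to the chosen convergence point, each camera’s toe-in angle follows from simple trigonometry, and an object’s apparent depth relative to the screen plane falls out of its residual parallax. A minimal sketch of that geometry in Python follows; the function names and the simplified parallel-sensor parallax model are illustrative assumptions, not PACE’s actual patent-pending algorithm:

```python
import math

def toe_in_angle_deg(interaxial_m: float, convergence_m: float) -> float:
    """Half-angle each camera rotates inward so that the two optical
    axes cross at the convergence distance."""
    return math.degrees(math.atan((interaxial_m / 2.0) / convergence_m))

def parallax_m(interaxial_m: float, convergence_m: float, object_m: float) -> float:
    """Residual horizontal parallax of an object in a simplified model:
    zero at the convergence plane, negative (appears in front of the
    screen) for nearer objects, positive (appears behind the screen)
    for farther ones, approaching the interaxial at infinity."""
    return interaxial_m * (1.0 - convergence_m / object_m)

# A 65 mm interaxial (roughly human eye spacing) converged on a
# subject 3 m from the rig:
print(round(toe_in_angle_deg(0.065, 3.0), 3))  # toe-in per camera, degrees -> 0.621
print(parallax_m(0.065, 3.0, 3.0))             # at the convergence plane -> 0.0
print(round(parallax_m(0.065, 3.0, 6.0), 4))   # behind-screen parallax -> 0.0325
```

Keeping the convergence point locked to the subject distance, as the metadata-driven algorithm does, is what holds that residual parallax within comfortable limits from shot to shot.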

But that 3D experience has taken almost three years to emerge from the edit bays at 20th Century Fox and at Cameron’s own Malibu home, at the hands of post pros John Refoua (editor of 2007’s “Balls of Fury,” Cameron’s “Ghosts of the Abyss” and many TV episodics) and Stephen Rivkin (three “Pirates of the Caribbean” features along with hits such as 2001’s “Ali” and 1999’s “The Hurricane”). Working closely with Cameron, Refoua and Rivkin have shaped his vision into a 3D thrill ride to Pandora.

“Avatar” was cut on Avid Media Composer systems running software version 2.8.4. Even with the help of stalwart first assistant Jason Gaudio, the editing team did not want to risk upgrading their NLE software in mid-project, despite the fact that Media Composer has been able to play back 3D sequences directly from the timeline ever since version 3.5. So when they wanted to view the 48 terabytes of footage on their Avid ISIS storage system holding both left- and right-eye tracks, they had to run both dailies footage and cut sequences through a QuVIS Acuity 3D playback platform.

Easy access to viewing scenes in Z-space is vital, as editors are still learning the grammar of 3D visual storytelling.

“The 3D in ‘Avatar’ tends to be used to enhance the reality of the visual environment rather than as an effect in itself,” said Refoua. “We want the audience to feel they are actually standing in the scene they are watching. This also enabled us to use faster cuts in some of the action sequences because the audience’s eyes will not be overwhelmed with visual information.”

But it is the unreality of “Avatar” that makes its reality challenging.

Jake Sully meets a young Na’vi female, Neytiri (Zoe Saldaña), and integrates himself into her clan. He finds himself forced to choose sides in an epic battle that will decide the fate of an entire world.

“One of the most intricate aspects was the way Jim dealt with the motion capture for the virtual figures he envisioned in the film,” Refoua said. “First, we would get footage of real people in performance capture suits shot in 2D from different angles. We would cut that together to evaluate the action and performance, and then Jim would shoot those sequences using his virtual camera in 3D space. That footage would then be turned into a crude rendering of the final effects and we would get a new set of dailies. This let us give the ultimate visual effects team a fine cut of each scene so they would only have to create the final output of the exact frames needed for the movie.”

The editors continually presented Cameron with creative alternatives during the post-production process.

“Originally the movie opened with an elaborate backstory sequence of what happened to Jake Sully while still on Earth,” explained editor Stephen Rivkin. “But because we wanted to move the action to Pandora as quickly as possible, we ended up using that footage as flashbacks visualized in Sully’s mind. By editing this exposition into the sequence of the trip to Pandora, the pace of the storytelling could be accelerated. Since Jim was intricately involved during the whole editing process we could experiment with the footage while remaining firmly within his creative vision.”

It’s a fair bet the longer Earth sequences may end up as extras on the “Avatar” DVD or Blu-ray release. Now it is time to track the box office reports of what is being called the most significant 3D movie ever produced to find out if there really is a future for big-budget, live-action Hollywood spectaculars in Z-space.

Jay Ankeney is a freelance editor and post-production consultant based in Los Angeles. Write him at 220 39th St. (upper), Manhattan Beach, Calif. 90266.