Virtual Production Can Be Real for Everybody—Here’s How

Virtual production on Disney+'s "The Mandalorian" (Image credit: Disney)

Think virtual production is the preserve of James Cameron? The confluence of game engines with faster PCs, LED backlots and off-the-shelf tools for everything from performance capture to virtual cameras is bringing affordable real-time mixed reality production to market.

Cameron saw this coming, which is why he has upped the ante to where no filmmaker has gone before and decided to shoot the first “Avatar” sequel as a virtual production under water. Not with CG fluids, either, but with his actors holding their breath in giant swimming pools.

“The technology has advanced leaps and bounds at every conceivable level since 'Avatar' in 2009,” says Geoff Burdick, SVP of Production Services & Technology for Cameron’s production company Lightstorm Entertainment.

Massive amounts of data are being pushed around live on the set of “Avatar 2,” Burdick says. “We needed high frame rate (48fps) and high res (4K) and everything had to be in 3D. This may not be the science experiment it was when shooting the first 'Avatar' but … our setup is arguably ground-breaking in terms of being able to do what we are doing at this high spec and in stereo.”

This is just the live-action part. Performance capture of the actors finished two years ago and is being animated at Weta, then integrated with principal photography at Manhattan Beach Studios.

“Avatar 2” may be state of the art, but it’s far from alone. Most major films and TV series created today already use some form of virtual production. It might be previsualization, it might be techvis or postvis. Epic Games, the makers of Unreal Engine, believe the potential for VP to enhance filmmaking extends far beyond even these current uses.

Examined one way, virtual production is just another evolution of storytelling—on a continuum with the shift to color or from film to digital. Looked at another way it is more fundamental, since virtual production techniques ultimately collapse the traditional sequential method of making motion pictures.

The production line from development to post can be costly, in part because of the timescales and in part because of the inability to truly iterate at the point of creativity. A virtual production model breaks down these silos and brings color correction, animation and editorial closer to camera. When travel to far-flung locations proves challenging, due to COVID-19 or carbon-neutral policies, virtual production can bring photorealistic locations to the set.

Directors can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail.

What the director is seeing, either through the tablet or inside a VR headset, can be closer to final render—which is light years from where directors used to be before real-time technology became part of the shoot.

In essence, virtual production is where the physical and the digital meet. The term encompasses a broad and ever-growing spectrum of computer-aided production and visualization tools and techniques, which means you don’t need the $250 million budget of “Avatar 2” to compose, capture, manipulate and all but finish pixel-perfect scenes live, mixing physical and augmented reality.

GAME ENGINES

The software at the core of modern, graphics-rich video games is able to render imagery on the fly to account for the unpredictable movements of a video-game player. Adapted for film production, the tech consigns the days of epic waits for epic render farms to history.

The most well-known is Epic’s Unreal Engine, which just hit version 5 with enhancements intended to achieve photorealism “on par with movie CG and real life.” Nanite, its virtualized micropolygon geometry system, frees artists to create as much geometric detail as the eye can see. Film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into the engine—anything from ZBrush sculpts to photogrammetry scans to CAD data—and it just works. Nanite geometry is streamed and scaled in real time, so there are no more polygon count budgets, polygon memory budgets or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.
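
For studios scripting their pipelines, that import step can be automated from UE5’s editor Python. The sketch below is illustrative only: the file path, destination folder and the Nanite import flag are assumptions that may differ by engine version.

```python
# Minimal sketch: batch-import a high-poly scan into UE5 via editor Python and
# flag it for Nanite. Paths and asset names are placeholders; the "build_nanite"
# option name is assumed and may vary between engine versions.
import unreal

def import_scan_as_nanite(fbx_path="C:/scans/cliff_photogrammetry.fbx",
                          dest="/Game/Environments/Scans"):
    options = unreal.FbxImportUI()
    options.import_mesh = True
    options.import_materials = True
    # Ask the importer to build Nanite data so no manual LODs are needed.
    options.static_mesh_import_data.set_editor_property("build_nanite", True)

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest
    task.automated = True   # suppress import dialogs
    task.save = True
    task.options = options

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

import_scan_as_nanite()
```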

Epic also aims to put the technology within practical reach of development teams of all sizes by partnering with developers to offer productive tools and content libraries.

It’s not the only game in town. Notch has a new real-time chroma keyer which, combined with its automated Clean Plate Generation, is claimed to produce “fantastic” results with almost no setup or tweaking, while providing the features you’d expect, such as hair and liquid handling and hold-out mattes, all in under a millisecond.

ILM, which uses a variety of engines, also uses proprietary real-time engine Helios, based on technology developed at Pixar.

“The Jungle Book,” “Ready Player One” and “Blade Runner 2049” all made use of Unity Technologies’ Unity engine at various stages of production thanks to custom tools developed by Digital Monarch Media.

For example, on “Blade Runner 2049,” director Denis Villeneuve was able to re-envision shots for some of the digital scenes well after much of the editing was complete, using DMM’s virtual tools to create the desired mood and tempo for the film.

Game engines rely on the grunt power of GPU processing from the likes of Intel, Nvidia and AMD, which has become exponentially faster, enabling real-time compositing.

DIGITAL BACKLOTS

The use of video walls in film and TV goes back at least a decade, to the panels used as a light source playing onto Sandra Bullock and George Clooney in “Gravity.”

More advanced versions playing pre-rendered sequences were deployed by ILM on “Rogue One: A Star Wars Story” and its follow-up “Solo” and during a sequence set on a Gotham metro train in “Joker.”  A system is also being used on the latest James Bond, “No Time To Die.”

The most sophisticated set-ups combine LED walls (and ceilings) with camera tracking systems and game engines to render content for playback not only in real time, but in dynamic sync with the camera’s viewpoint. The result allows filmmakers to stage scenes with greater realism than against a green or blue screen, and with a far better chance of making final decisions on set.
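
Under the hood, keeping the wall’s content locked to the camera’s viewpoint is essentially off-axis (generalized perspective) projection: the LED wall is treated as a window, and the projection matrix is rebuilt every frame from the tracked camera position relative to the wall’s corners. Below is a minimal sketch of that math, assuming a flat rectangular wall and a tracked position expressed in the same coordinate space; production systems such as Unreal’s nDisplay do the equivalent per panel.

```python
# Off-axis ("generalized perspective") projection: the math behind rendering an
# LED wall from a tracked camera's point of view. Assumes a flat rectangular
# wall and a camera position in the same metric coordinate space.
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near=0.1, far=1000.0):
    """pa, pb, pc: wall lower-left, lower-right and upper-left corners (meters).
    pe: tracked camera (eye) position. Returns a 4x4 OpenGL-style projection."""
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))

    vr = pb - pa; vr /= np.linalg.norm(vr)            # wall right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # wall up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # wall normal, towards eye

    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-wall distance

    # Asymmetric frustum extents at the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0]])

# Example: a 10m x 5m wall with the camera 4m back and slightly off-center.
P = off_axis_projection(pa=[-5, 0, 0], pb=[5, 0, 0], pc=[-5, 5, 0],
                        pe=[1.0, 1.7, 4.0])
```

Because the eye position feeds the matrix on every frame, any lag in the tracking data shows up as the background swimming against the foreground, which is why low-latency camera tracking (covered below) matters so much.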

“The big change has come with more powerful GPUs combined with games engines providing the software for real-time rendering and ray tracing,” says Sam Nicholson, who heads Stargate Studios. “When you put that together with LED walls or giant monitors, we think that at least 50% of what we do on set can be finished pixels.”

For HBO comedy-thriller “Run,” the production built two cars outfitted to resemble an Amtrak carriage on a soundstage in Toronto. These rested on airbags that could be shaken to simulate movement. Instead of LEDs, a series of 81-inch 4K TV monitors were mounted on a truss outside each train window displaying footage preshot by Stargate from cameras fixed to a train traveling across the U.S.

Domhnall Gleeson, Merritt Wever in HBO’s “Run.” Photo by Ken Woroner/HBO (Image credit: HBO)

“It’s a smaller scale and less expensive version of Lucasfilm’s production of 'The Mandalorian,' but the principle is the same,” explains cinematographer Matthew Clark. “It effectively brings the location to the production rather than moving an entire production to often hard-to-access locations.”

Any light that played on the actors’ faces or on surfaces in the train had to be synchronized with the illumination outside the windows, otherwise the effect wouldn’t work.

“It was important to line up the picture so when you’re standing in the car your perspective of the lines of train track and power lines has to be realistic and continuous. If the angle of the TV screen is off by just a few degrees then suddenly the wires of a telegraph pole would be askew. When we needed to turn the car around to shoot from another angle the grips could flip all the monitors around to the exact angle.”

LED displays are measured by pixel pitch (the distance in millimeters from the center of one pixel to the center of the adjacent pixel), and pitches are now narrow enough for the images to be photographed directly. The panels are also capable of greater brightness, higher contrast ratios and 10-bit video.

Rental companies in the U.S. offering LED screens or monitors include PRG and Stargate Studios; in the U.K. there are disguise and On Set Facilities, both of which also have operations in LA.

OSF advises that the bigger the pixel, the more light it outputs onto your subject, which means very fine pixel pitches may not be optimal for filming. The pixel pitch of the LED screens used on “The Mandalorian” was 2.8mm.
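
The arithmetic behind those trade-offs is simple enough to sanity-check before committing to panels. In the sketch below, the 2.8mm pitch comes from “The Mandalorian” figure above; the wall dimensions and the rule of thumb that minimum camera distance in meters roughly equals pitch in millimeters are illustrative assumptions.

```python
# Back-of-envelope LED wall math: pixels delivered by a wall of a given size at
# a given pixel pitch, plus a common rule-of-thumb minimum camera distance
# (roughly the pitch in millimeters, read as meters) before individual pixels
# and moire become a risk. The wall size here is an illustrative assumption.
def wall_resolution(width_m, height_m, pitch_mm):
    px_wide = int(width_m * 1000 / pitch_mm)
    px_high = int(height_m * 1000 / pitch_mm)
    return px_wide, px_high

pitch_mm = 2.8                                   # as used on "The Mandalorian"
w, h = wall_resolution(width_m=20.0, height_m=6.0, pitch_mm=pitch_mm)
print(f"{w} x {h} pixels")                       # -> 7142 x 2142 pixels
print(f"suggested minimum camera distance: ~{pitch_mm:.1f} m")
```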

OSF is set up as a fully managed virtual production studio covering in-camera VFX (LED), mixed reality (green screen) and fully virtual (in-engine) production.

It has a partnership with ARRI and also runs its own virtual private network, StormCloud, connected to the Azure cloud for virtual production. StormCloud enables remote multi-user collaboration in Unreal Engine, powered by Nvidia Quadro technology. Entry points currently set up in London and San Francisco are being tested by “a number of Hollywood Studios and VFX facilities,” says the facility.

CAMERA TRACKING

Another essential component is the ability to have the virtual backlot track the camera’s movement via a wireless sensor. This means that as the cinematographer or director frames a shot, the display, which is often the main lighting source, adjusts to the camera’s perspective. That’s no mean feat, and it requires near-zero latency to work.

Professional camera tracking systems from Mo-Sys and N-Cam are the go-to technologies here, but if you’re filming purely inside a game engine there are budget ways of creating a virtual camera.

To create raw-looking handheld action in his short film “Battlesuit,” filmmaker Haz Dulull used DragonFly, a virtual camera plugin (available for Unity, UE and Autodesk Maya) built by Glassbox Technologies with input from Hollywood pre-viz giants The Third Floor.

Another option is the HTC VIVE tracker, which costs less than $150 and has been tested at OSF. “If you want to shoot fully virtual, shooting in engine cinematic is amazing with a VIVE as your camera input,” it sums up. “If you want to do any serious mixed reality virtual production work or real-time VFX previz, you are still going to need to open your pocket and find a professional budget to get the right equipment for the job.”
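
Outside of a game engine, reading a Vive tracker’s pose for a DIY virtual camera takes only a short script. Below is a minimal sketch using the pyopenvr bindings; it assumes SteamVR is running with a tracker paired, and leaves out filtering, error handling and the hand-off into the engine.

```python
# Minimal sketch: read the pose of an HTC Vive tracker via pyopenvr so it can
# drive a virtual camera. Assumes SteamVR is running and a tracker is paired;
# smoothing, error handling and the engine hand-off are omitted.
import openvr

openvr.init(openvr.VRApplication_Other)
vr_sys = openvr.VRSystem()

poses = vr_sys.getDeviceToAbsoluteTrackingPose(
    openvr.TrackingUniverseStanding, 0, openvr.k_unMaxTrackedDeviceCount)

for i in range(openvr.k_unMaxTrackedDeviceCount):
    if vr_sys.getTrackedDeviceClass(i) != openvr.TrackedDeviceClass_GenericTracker:
        continue
    pose = poses[i]
    if pose.bPoseIsValid:
        m = pose.mDeviceToAbsoluteTracking      # 3x4 device-to-world transform
        position = (m[0][3], m[1][3], m[2][3])  # translation column, in meters
        print(f"tracker {i} position: {position}")

openvr.shutdown()
```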

PLUG-IN ASSETS

The Rokoko mocap suit can stream directly into UE via a live link, as demoed by OSF. The facility explains that the suit connects over the wireless network to the UE render engine and into Rokoko Studio, where OSF assigns the suit a personal profile for the performer. It then streams the data into UE by selecting the Unreal Engine option in the Rokoko Studio Live tab (a feature only available to Rokoko Pro license users). The system is being refined at OSF, with tests for facial capture in the works.
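
For teams building their own tools rather than relying on the Live Link route, Rokoko Studio also offers custom network streaming, and the receiving end of that kind of per-frame feed is little more than a UDP listener. In this sketch the port number and JSON field names are placeholders, not Rokoko’s documented schema.

```python
# Illustrative sketch of the receiving end of a per-frame mocap stream sent as
# JSON over UDP, the general pattern behind "custom streaming" options in tools
# like Rokoko Studio. The port and field names are placeholders, not the actual
# Rokoko schema.
import json
import socket

HOST, PORT = "0.0.0.0", 14043                    # placeholder port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))
print(f"listening for mocap frames on udp://{HOST}:{PORT}")

while True:
    packet, _addr = sock.recvfrom(65535)         # assume one frame per datagram
    frame = json.loads(packet.decode("utf-8"))
    # Hypothetical layout: {"actor": "...", "joints": {"hip": {"x": .., "y": .., "z": ..}}}
    actor = frame.get("actor", "unknown")
    hip = frame.get("joints", {}).get("hip")
    if hip:
        print(f"{actor} hip at ({hip['x']:.3f}, {hip['y']:.3f}, {hip['z']:.3f})")
```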

Reallusion makes software for 3D character creation and animation, including iClone and Character Creator. The Unreal Live Link plug-in for iClone transfers characters, lights, cameras and animation into UE. Combining iClone’s simplicity with UE’s rendering gives filmmakers a way to create, animate and visualize real-time digital humans, and the plug-in is free for indie filmmakers.

Character Creator includes a plug-in called Headshot, which generates real-time 3D digital humans from a single photo. Beyond intelligent texture blending and head-mesh creation, the generated digital doubles are fully rigged for voice lip-sync, facial expression and full-body animation. Headshot has two AI modes: Pro and Auto. Pro Mode includes more than 1,000 sculpting morphs plus image mapping and texture reprojection tools, and is designed for production-level, hi-res texture processing and fine face-shape refinement. Auto Mode creates lower-res virtual heads, with additional 3D hair, in a fully automatic process.

OSF put this through its paces, using Headshot to automatically create a facial model that was animated within iClone 7 using data from actors performing in Rokoko mocap suits, streamed live to iClone for real-time previews and recordable takes. OSF also used the LiveFace app (available for any iPhone with a depth-sensing camera) and its own motion capture helmets to capture the facial animation. The next part of the pipeline is to transfer the assets to UE with the Unreal Engine Live Link plug-in and the Auto Character Setup plug-in, which creates skin textures in the same way as Epic Games’ digital humans.

VIRTUAL PRODUCTION ON A BUDGET

British filmmaker Dulull made the animated sci-fi short “Battlesuit” in Unreal Engine on a skeleton budget, with a team of just three including himself.

Rather than creating everything from scratch, they licensed 3D kits and pre-existing models (from Kitbash3D, Turbosquid and Unreal). Dulull animated the assets and VFX in real-time within Unreal’s sequencer tool.

They retargeted off-the-peg mocap data (from Frame Ion Animation, Filmstorm, Mocap Online) onto the body of the film’s main characters. For facial capture they filmed their actor using the depth camera inside an iPad and fed the data live into UE.

“We had to do some tweaks on the facial capture data to bring some of the subtle nuance it was missing, but this is a much quicker way to create an animated face performance without spending a fortune on high end systems,” Dulull says.

Powering it all, including real-time ray tracing, was a Razer Blade 15 Studio Edition laptop with an Nvidia Quadro RTX 5000 card.

Every single shot in the film came straight out of Unreal Engine. There was no compositing or external post apart from a few text overlays and color correction done in Resolve.

“If someone had said I could pull off a project like this a few years ago that is of cinematic quality but all done in real-time and powered on a laptop, I’d think they were crazy and over ambitious,” he says. “But today I can make an animated film in a mobile production environment without the need for huge desktop machines and expensive rendering.”

This story originally appeared on TVT's sister publication Creative Planet Network.

Adrian Pennington

Adrian Pennington is a journalist specialising in film and TV production. His work has appeared in The Guardian, RTS Television, Variety, British Cinematographer, Premiere and The Hollywood Reporter. Adrian has edited several publications, co-written a book on stereoscopic 3D and writes marketing copy for the industry. Follow him @pennington1