Fox Makes VR 'Wild'

Baylor team creates AR app that animates books

HOLLYWOOD—Virtual and augmented reality technology was showcased at the 2015 SMPTE Technical Conference, which wrapped up Oct. 29. TV Technology was able to check out a couple of demos, including a VR short based on the movie “Wild,” starring Reese Witherspoon, and a couple of apps from designers in the Department of Film & Digital Media at Baylor University.

Fox brought the “Wild” demo to a corner of the VR/AR exhibit room, where passing SMPTE-goers sat wrapped in VR goggles, looking around at a forest the goggle-free could not see. The clip was low on action and thus not especially conducive to cybersickness, a common complaint with VR. In the clip, the viewer sits in the middle of a quiet forest on a sunny day. The colors were vibrant but the resolution low, as is also typical of VR. The seams between scene segments were relatively easy to detect, but not distracting.

Several seconds into the clip, Witherspoon walks up in character, wearing a backpack and hiking a nearby trail. She pulls off her pack and sits near the viewer on a rock, seeming to look right at the viewer, though within a few more seconds it’s clear she’s looking at something behind the viewer: Laura Dern, who played the deceased mother of Witherspoon’s character. Essentially, the viewer is sitting between Reese Witherspoon and Laura Dern. Dern disappears when the viewer looks away; Witherspoon takes up her pack and trudges down the trail. A small fox runs through the scene. Cut. (“Wild VR Experience” images courtesy of 20th Century Fox.)

The Baylor team, including Professor Corey P. Carbonara, Ph.D., Assistant Professor Daniel M. Shafer and graduate student Blake Copeland, demonstrated augmented reality apps that create 3D animations from the two-dimensional pages of books. (Depicted in accompanying videos below.)

Team Baylor also brought a VR modeler based on an “open source Scanner sample app in the Structure SDK,” according to Copeland. The app works with an iPad fitted with a $110 LIDAR-equipped camera no larger than a typical portable phone charger. The camera is pointed at the subject while the operator moves slowly around it, capturing data from all angles and producing an .obj file (which we cannot display here; the two still images were captured from the process). Copeland used it to create the author’s avatar in a matter of moments, while onlookers tried to make her crack up.
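For readers curious about the output format: a Wavefront .obj file is plain text, listing vertices and the faces that connect them, which is why scanned models are easy to inspect or convert. Below is a minimal sketch of parsing such a file, using an illustrative hand-written triangle rather than anything from the Baylor demo.

```python
# Minimal sketch of reading a Wavefront .obj mesh (the plain-text
# format the scanning app exports). The sample mesh is illustrative,
# not data from the demo.

sample_obj = """\
# a single triangle
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""

def parse_obj(text):
    """Return (vertices, faces) from ASCII .obj text."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue  # skip blank lines and comments
        if parts[0] == "v":        # vertex record: x y z coordinates
            vertices.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == "f":      # face record: 1-based vertex indices
            faces.append(tuple(int(p.split("/")[0]) for p in parts[1:]))
    return vertices, faces

verts, faces = parse_obj(sample_obj)
print(len(verts), len(faces))  # → 3 1
```

Real scans produce thousands of such records, but the format itself stays this simple, which is part of why .obj remains a common interchange format for 3D capture.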