Virtual reality is upon us, and, like any other sea-change technology in the world of entertainment communication, it’s ultimately going to be up to editors to figure out how to handle it.
Whether it’s called “immersive multimedia environment” or “computer-simulated reality,” virtual reality, or VR, comes in many flavors. But all of them share one factor significant to editors: We’ve lost control over the viewer’s perspective. So to evolve VR from an experiential medium into a sophisticated means of communication, we are going to have to develop a whole new language of visual storytelling.
Fortunately, while we invent a cohesive editorial grammar for VR, many of the tools we’ll need for editing are already being developed.
Assimilate’s SCRATCH VR Suite can take a VR project from ingest to mastering. Avid’s latest software, Media Composer v8.5, can input the multi-image output of a VR camera array once it has been stitched together by third-party software such as Boris FX Continuum Complete. At that point, the 180-degree or 360-degree video is called an “equirectangular panorama,” or on the West Coast, a “latlong” (short for “latitude/longitude”) clip.
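The “latlong” name reflects how the projection works: the frame’s horizontal axis maps linearly to longitude and its vertical axis to latitude. A minimal sketch of that standard mapping, assuming a hypothetical 4096x2048 clip:

```python
# Map a viewing direction (longitude, latitude in degrees) to pixel
# coordinates in an equirectangular ("latlong") frame. The 4096x2048
# frame size here is an illustrative assumption, not tied to any camera.
def latlong_to_pixel(lon_deg, lat_deg, width=4096, height=2048):
    # Longitude -180..180 spreads linearly across the full frame width;
    # latitude 90..-90 spreads linearly down the full frame height.
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height
    return x, y

print(latlong_to_pixel(0, 0))      # straight ahead: center of frame
print(latlong_to_pixel(-180, 90))  # top-left corner of frame
```

Because the mapping is linear in both axes, an NLE can treat the stitched panorama as an ordinary (if very wide) rectangular clip while still knowing where every direction lives.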
“Right now, the Avid Media Composer needs an already-stitched equirectangular panorama that we can edit just like a multi-camera shoot,” said Dave Colantuoni, senior director of product management at Avid. “Since Media Composer is fully capable of handling 4K inside of DNxHD and the large files of 3D stereoscopic editing, we have a technology set that is not far off from handling a VR editing environment.”
Adobe will be bringing out its next version of the Premiere Pro CC NLE later this summer with enhanced VR editing capabilities. “We’ve put in some really solid foundational features for editing VR,” said Bill Roberts, senior director of professional video product management at Adobe. “Since an equirectangular image can easily top 8K in size, our ability to handle large file sizes is very important for VR pioneers.”
POINT OF VIEW
Roberts tells us the first thing Premiere Pro CC provides is a series of settings that defines whether the viewer is in a 180-degree or 360-degree environment, sets the “look at” point of view, and specifies how many degrees of the content the editor wants the viewer to see at one time.
“Usually this is between 115–120 degrees of vision, and that will be displayed on either the fourth monitor or the timeline monitor,” Roberts said, “and you can selectively display only that ‘look at’ point of view while editing. This shows you what the viewer would see if they had a VR headset on.”
Then, either a mouse or a touchscreen will let you pan all around the VR environment just as the viewer will be able to.
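The numbers Roberts cites come down to simple arithmetic: an equirectangular frame’s full width covers 360 degrees, so a “look at” window of roughly 120 degrees occupies about a third of the horizontal pixels. A sketch, using hypothetical frame widths:

```python
# How many horizontal pixels a "look at" viewport spans in an
# equirectangular frame. The frame width covers 360 degrees, so the
# viewport's share of the width is fov / 360. Frame sizes are examples.
def viewport_width(frame_width, fov_deg):
    return round(frame_width * fov_deg / 360.0)

# A 120-degree view of an 8K-wide (7680-pixel) equirectangular clip:
print(viewport_width(7680, 120))  # 2560 pixels
# The 115-degree low end of the range Roberts mentions:
print(viewport_width(7680, 115))  # 2453 pixels
```

This is also why Roberts stresses large-file handling: even though the viewer sees only ~120 degrees at a time, the editor is always carrying the full 360-degree frame through the pipeline.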
“Once the project is edited, the ‘Destination Publish’ output function lets you render out files in the format required by social media sites like YouTube, Twitter or Facebook,” Roberts continued, “and we’ve just added a tab to turn on 360-degree video.”
Actually, if you have the SkyBox VR Player plug-in made by Mettle, you will be able to use Adobe’s own Mercury Transmit technology to send the VR imagery over HDMI to an Oculus Rift headset.
“In effect, the broadcast monitor has been replaced by a VR headset,” Roberts said.
As editors know, all this technology is just a means to an end, and that end is wrapped around the aesthetics of virtual audio/visual communication.
In Madrid, Spain, Adrian Gonzalez has cut several projects on a Mistika edit system at the headquarters of parent company SGO (Soluciones Gráficas por Ordenador) using Mistika’s Virtual Reality mode.
“This mode maps the equirectangular imagery into a 360-degree spherical space,” Gonzalez said, “including all effects and color corrections.”
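The spherical mapping Gonzalez describes is the inverse of the latlong projection: each pixel of the flat frame is wrapped back onto a direction on a unit sphere surrounding the viewer. A sketch of that inverse mapping, with an assumed 4096x2048 frame and a y-up, z-forward coordinate convention:

```python
import math

# Inverse of the equirectangular projection: map a pixel back to a
# direction on the unit sphere, which is how a flat latlong frame gets
# wrapped onto a 360-degree sphere for playback. The frame size and
# axis convention (y up, z forward) are illustrative assumptions.
def pixel_to_direction(x, y, width=4096, height=2048):
    lon = (x / width) * 2.0 * math.pi - math.pi   # -pi .. pi
    lat = math.pi / 2.0 - (y / height) * math.pi  # pi/2 .. -pi/2
    # Spherical angles to a Cartesian unit vector.
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

# The center pixel of the frame looks straight ahead along +z:
print(pixel_to_direction(2048, 1024))
```

Effects and color corrections applied in this mode have to survive that wrap, which is why seams and filters that assume a flat rectangle can misbehave at the frame edges and poles.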
But since he, as an editor, cannot control the direction the viewer is looking, how does Gonzalez determine where to make an edit?
“No question, that is one of the main issues,” Gonzalez said. “You need to make transitions between shots very soft, kind of like in 3D. Usually dissolves work best, since that gives the illusion you are maintaining the same environment through the sequence. Fast editing is out, so you mostly rely on long shots.”
[Photo caption: A demonstration of SGO’s Mistika Virtual Reality mode at the 2016 NAB Show]

VISUAL LANGUAGE
Gonzalez says editors need to use the whole world as the point of view.
“We have tried editing some clips with narrative content, but this demands developing a different visual language that involves using the environment as part of the communication space,” he said. “It is a very radical experience, even more challenging than using the depth space in stereo 3D, because you are not just watching a screen. You and the viewer are together inside a totally surrounding artificial world.”
One facility that has probably posted as many VR projects as any is Assimilate in Santa Clara, Calif. The company’s SCRATCH VR Suite was just announced at the 2016 NAB Show. With SCRATCH VR, editors can take a VR project from ingest to mastering, according to CEO Jeff Edson.
“We typically work with the equirectangular file itself, while viewing it on a 360-degree headset,” Edson said. “But the big difference in choosing edit points between classical editing and 360 editing is that in the former, the point of view is defined by the camera. In the 360-degree VR world you need to define your own point of view.”
And how is that done with cameras all around you?
“You can lead the viewers down a certain direction, showing them the things you want them to see,” Edson said. “Then you transition from shot to shot, keeping in mind that at any time the viewers can look anywhere they’d like to. But when they look back forward, you are taking them down the path you want them to follow.”
Of course, you can call upon other senses to help with that guidance.
“Audio has an even greater impact in the 360 world than in a 2D world,” Edson said. “On a flat screen, if I hear something behind me I notice the sound, but pretty much ignore it. In a 360 environment, if I hear a significant sound coming from behind I’m going to turn and look. The fact is, in the 360 world there is no off-screen audio. So, audio cues are going to be crucial while editing virtual reality when the world is all around us.”
Jay Ankeney is a freelance editor and post-production consultant based in Los Angeles. Write him at JayAnkeney@mac.com.