Love it or hate it, CNN’s use of Star Wars-style “holograms”
on election night was one of the most striking pieces of TV magic to date. But
as impressive as it was for Chicago-based reporter Jessica Yellin and
entertainer will.i.am (of the hip-hop band Black Eyed Peas) to “beam in” to
Wolf Blitzer’s New York Election Center for live “face-to-face” interviews, the
way CNN made it happen was even more astounding. Here is the science behind the
“beam-ins” to CNN’s New York studios.
The driving force behind CNN’s holograms is David Bohrman, the network’s senior
vice president and Washington bureau chief. For 12 years, Bohrman has dreamt of
improving on live remote interviews, which are typically conducted via satellite
with guests in other physical locations. His idea is akin to the famous “Star
Wars” scene in which a 3D talking, moving hologram of Princess Leia is
projected by the droid R2-D2.
“I’ve basically been a crazy mad scientist, trying to get this done,” Bohrman
told CNN anchor Wolf Blitzer on CNN following the election. “This year we
pressed really hard, and about three months ago we launched into developing it,
and it ended up working.”
To make it work, CNN (led by Washington Senior Video Producer Chuck Hurley),
Ran Yakir from Israel-based graphics company Vizrt, and freelance broadcast
engineer/consultant Jason Odell brought together several different broadcast
technologies. These were drawn from Vizrt, SportVU, Shotoku Broadcast Systems,
Thoma and Fujinon. “It was a very complex project,” Odell said, both in scope
and because “there were so many different players, of which CNN Engineering was
just one.”
HOW IT WAS DONE
Yellin and will.i.am’s interviews were shot in two temporary studios, built
inside a massive tent in Chicago. Each studio housed a 220-degree semicircular
chroma-key green stage. Inside this space—a.k.a.
the “Transporter Room”—were 35 HD “machine vision” CMOS cameras.
Spaced six inches apart around the 220-degree arc and pointing towards the
space’s center, these 2D cameras were positioned to provide multiple angles of
the subject; images that could then be composited to create the illusion of a
3D whole. “It is akin to shooting QuickTime VR [virtual reality], except that
the cameras were pointed in at a subject, rather than out to capture a
360-degree view,” said Goshen, Vizrt’s director of usability.
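The article’s numbers pin down the rig’s geometry. Assuming the six-inch spacing is measured along the arc (an assumption; the piece doesn’t say), a quick sketch recovers the stage’s radius and the angular step between cameras:

```python
import math

NUM_CAMERAS = 35
SPACING_IN = 6.0     # inches between adjacent cameras (per the article)
ARC_DEGREES = 220.0  # total arc covered by the camera rig

# 35 cameras create 34 equal gaps along the arc.
arc_length = (NUM_CAMERAS - 1) * SPACING_IN      # 204 inches of arc
radius = arc_length / math.radians(ARC_DEGREES)  # subject-to-camera distance

# Angular position of each camera, measured from the start of the arc.
camera_angles = [i * ARC_DEGREES / (NUM_CAMERAS - 1) for i in range(NUM_CAMERAS)]

print(f"radius ≈ {radius:.1f} in, step ≈ {camera_angles[1]:.2f}°")
# → radius ≈ 53.1 in, step ≈ 6.47°
```

So under that spacing assumption, the subject stood roughly four and a half feet from the camera ring, with a new viewpoint every 6.5 degrees.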
[Diagram: the Vizrt–SportVU workflow]
So how did this create a 3D image?
Hurley described the process as simply taking existing chroma-key technology to
extremes. “Weathermen have been standing in front of green screens for years now,
but that’s [with] one camera,” Hurley said in an interview on CNN.com. “Now we
can do that times 35, so you can see all the way around the subject.”
Adds Odell, “From these 35 fixed cameras we can now derive an infinite number
of camera angles.”
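Odell’s “infinite number” of angles comes from interpolating between the fixed cameras. As a hedged sketch (the actual Viz Engine selection logic is not public), picking the two cameras that flank a requested viewing angle might look like this:

```python
def flanking_cameras(view_deg, num_cams=35, arc_deg=220.0):
    """Return the indices of the two cameras flanking a requested viewing
    angle, plus the blend weight toward the higher-index camera.
    Illustrative only; the real plug-in's selection logic is unpublished."""
    step = arc_deg / (num_cams - 1)                # ~6.47° between cameras
    pos = max(0.0, min(view_deg, arc_deg)) / step  # clamp request to the arc
    lo = min(int(pos), num_cams - 2)               # lower flanking camera
    return lo, lo + 1, pos - lo                    # weight in [0, 1]

print(flanking_cameras(100.0))  # cameras 15 and 16, roughly 45% toward 16
```

Any angle between two physical cameras resolves to a pair of real views plus a fractional weight, which is what makes the set of derivable viewpoints effectively continuous.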
In New York, CNN used positional tracking camera pedestals and a jib from
Shotoku Broadcast Systems. The goal was to bring 3D positional data back to
Vizrt Viz IO camera tracking software (part of Vizrt’s Virtual Studio
solution).
“Every moving part of our pedestals and jib is equipped with high-resolution
encoders,” said Naoki Ebimoto, president of Shotoku (USA) Inc. “This data is
calculated instantly so that we know precisely where the camera is located and
looking in real time without any delay.” Positional tracking signals for CNN’s
handheld cameras were fed to the virtual studio computer as well, along with 3D
data captured by IR cameras in the New York studio using Thoma’s Walkfinder
technology. All
this data was processed by Viz IO and supplied via fiber to Chicago.
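A hedged illustration of the encoder idea Ebimoto describes: each moving axis reports a count that maps directly to an angle, so a camera’s pose is simply all of its axis readings taken together. The counts-per-revolution figure below is illustrative, not Shotoku’s spec:

```python
def encoder_to_degrees(count, counts_per_rev=1_000_000):
    """Convert a rotary-encoder count to an axis angle in degrees.
    counts_per_rev is an assumed figure for illustration only."""
    return (count % counts_per_rev) * 360.0 / counts_per_rev

# A camera "pose" is just every encoded axis read at the same instant.
pose = {
    "pan_deg":  encoder_to_degrees(250_000),  # quarter turn → 90.0°
    "tilt_deg": encoder_to_degrees(27_778),
    "zoom":     0.5,  # lens encoders report zoom/focus the same way
}
```

Because the mapping is pure arithmetic on a live count, the pose is available every frame with no meaningful computation delay, which is what lets the tracking claim to be real-time.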
Back in the Transporter Room, the tracking data told the Vizrt Virtual Studio
software which two cameras should be accessed to provide the right 3D
perspective for creating the hologram. Next, using a Viz Engine plug-in
originally created by SportVU to show where players are on a game field, a 3D
electronic model of the person being shot (Yellin or will.i.am) was rendered in
Viz Engine. On top of this model was laid the video texture captured by the
selected cameras.
“The in-between frames [those locations which do not correspond to a single
camera view]—both in shape and color—were smoothed out by
the SportVU plug-in in Viz Engine,” said Goshen.
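A crude stand-in for that smoothing is a weighted cross-fade between the two flanking camera images. The real plug-in also interpolates the subject’s shape; this color-only sketch is purely illustrative:

```python
def blend_frames(frame_a, frame_b, w):
    """Linear cross-fade between two aligned frames of RGB pixels
    (w=0.0 → all A, w=1.0 → all B). Color-only; the actual plug-in
    smoothed shape as well, which this deliberately ignores."""
    return [
        tuple(round((1 - w) * a + w * b) for a, b in zip(pa, pb))
        for pa, pb in zip(frame_a, frame_b)
    ]

left  = [(0, 255, 0), (10, 20, 30)]    # two pixels from camera N
right = [(255, 255, 0), (30, 40, 50)]  # same pixels from camera N+1
print(blend_frames(left, right, 0.5))  # → [(128, 255, 0), (20, 30, 40)]
```

Feeding the fractional weight from the camera-selection step into a blend like this is what fills the gaps between the 35 physical viewpoints.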
Once the final image was ready, it was sent back to New York via fiber, so that
the local and remote feeds could be blended to create the hologram illusion.
The result: Holograms on television, live!
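The final blend rests on ordinary chroma keying. A toy per-pixel version (nothing like broadcast-grade keying, which adds soft edges and spill suppression) shows the principle: wherever the remote feed is chroma green, show the studio pixel instead.

```python
def composite(remote_px, studio_px, green_threshold=1.3):
    """Per-pixel chroma key: if the remote pixel is dominated by green,
    substitute the New York studio pixel. A toy RGB heuristic for
    illustration, not a production keyer."""
    r, g, b = remote_px
    is_green = g > green_threshold * max(r, b, 1)
    return studio_px if is_green else remote_px

print(composite((20, 200, 30), (90, 80, 70)))    # green → studio pixel shows
print(composite((180, 160, 150), (90, 80, 70)))  # skin tone → remote pixel kept
```

Run over every pixel of the Chicago feed, this is Hurley’s weatherman trick; the novelty was doing it for 35 viewpoints at once rather than one.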
Two points worth noting: First, Wolf Blitzer didn’t actually see the hologram
standing before him in the studio. Instead, he looked in the direction of a ‘red
dot’ on the studio floor while watching the combined play-out on a monitor.
Second, the Star Wars-like blue edging that surrounded Yellin and will.i.am was
not an artifact like the ones around Princess Leia in the movie. (That
sequence—to which CNN’s effort has been constantly
compared—was entirely faked and never done in real time.) Instead,
CNN deliberately added the blue edging during production to alert the viewers
that it was an effect, not an in-studio live body.
One of the most interesting challenges of CNN’s hologram effect was the time
delay between Chicago and New York. The problem was not the speed of the
fiber-optic link, but rather the time it took for the Vizrt/SportVU system to
process the Shotoku/Thoma position data and create the right 3D hologram image.
“It initially required four seconds for this to happen, but we got it down to
three,” said Odell. “Still, this processing time meant that the director in New
York had to decide which camera shot she wanted next, bring it up on her
Preview channel and then send us that data. Once we had it, the 3D image was
created and sent back as a full HD signal; only then could she take the next
shot to air.”
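The workflow Odell describes behaves like a fixed-length pipeline: a shot called now comes back roughly three seconds later, so the director is always working ahead of air. A toy model (the real delay was processing time in Chicago, not a literal frame buffer):

```python
from collections import deque

class DelayedFeed:
    """Toy model of the ~3-second round trip: a frame requested now
    becomes available only after delay_s worth of frames have passed.
    Parameters are illustrative, not CNN's actual pipeline figures."""

    def __init__(self, delay_s=3.0, fps=30):
        self.pipe = deque([None] * int(delay_s * fps))

    def push(self, frame):
        """Request a new frame; return whatever is ready to take to air."""
        self.pipe.append(frame)
        return self.pipe.popleft()
```

With a three-second pipe, every Preview decision the director made was a bet on where the conversation would be three seconds later, which is exactly why the shot had to be chosen before it could be taken.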
Even with three months’ lead time, CNN Engineering and its partners had to work
full-out to make the hologram effect available for the election day broadcast.
Thanks to their efforts, the system “basically worked perfectly,” said Odell.
Still, he would have preferred to have more time to iron out the bugs. “If we
had, maybe we could have had the interview guest directly on top of the ‘red
spot’ in Blitzer’s studio at all times, where they were supposed to be.”
Will CNN’s hologram effect become a staple of broadcast television? Odell
thinks so. “I think people will demand it in sports,” he said.
But Goshen isn’t sure. “It is probably overkill for projects that are smaller
than U.S. elections,” he said. One problem is the degree of training that
camera operators require to ensure that the tracking system functions properly.
“You can’t expect freelancers to come in and know how to do this,” he said.
“We’ll see,” Bohrman told Blitzer during their post-election chat on CNN. “But
television evolves, and how we do things evolves, and at some
point—maybe it’s five years or 10 years or 20 years down the
road—I think there’s going to be a way that television does
interviews like this because it allows for a much more intimate possibility for
a remote interview.”