Extreme skateboarders and BMX bikers appear to hover in mid-air, while home viewers’ vantage points slowly rotate around the acrobatic athletes. Thanks to an innovative visual processing system from New Jersey-based Kewazinga, Inc., ESPN viewers may have mistaken the network’s broadcast of the Summer X-Games for an afternoon of science fiction.
On the surface, the Kewazinga effect is reminiscent of "Bullet Time," the special effect used in the film "The Matrix" to suggest warps in the continuities of time and space. At its core, however, the company’s technology is far more accessible, and far more affordable, than sci-fi magic from the movies.
Like the technology from "The Matrix," Kewazinga’s system uses a curved rail with cameras affixed at regular intervals. Unlike the movie version, however, Kewazinga uses conventional video cameras, not exotic slow-motion film cameras, to capture live, full-motion sequences. And Kewazinga’s patented processing technology delivers results at near-real-time speeds on garden-variety computers from Sun Microsystems and Dell Computer.
BIRTH OF AN IDEA
According to Kewazinga CEO David Worley, it all began with a pie-in-the-sky notion: capturing live events in a way that would, at a later time, give the viewer the look and feel of being able to change their perspective on the event; in effect, to get up and walk around the corner for a better view. To realize their goal, the team envisioned hardware that could take an array of live, real-time cameras and create virtual camera positions between them.
Worley and his partners weighed the benefits of hiring a staff of engineers and scientists for in-house development of the underlying technology. In the end, though, they decided to approach the prestigious Sarnoff Corporation with their idea, at first asking the Princeton, N.J., think tank simply to conduct a feasibility study.
Worley and his colleagues were surprised, however, by the Sarnoff group’s response.
"They said they already had millions of dollars of intellectual property in-house that could be used as the building blocks for such a system," Worley said.
Despite this serendipitous pairing, the Kewazinga project would, according to Worley, present no small challenge.
"We commissioned them to build out a system that had never before been conceived, never before built and operable," he said.
The enterprise then began a hectic push to bring the new technology to market: 18 months of development in the lab, culminating in the opening of Kewazinga’s studio here in late 2000; six months of further testing and refinement in the studio; and then, finally, some real-world tests of the nascent technology. The group trundled off to Florida to shoot test footage during a spring training game between the New York Mets and the Los Angeles Dodgers. The result: Kewazinga’s system performed exactly as intended, and the viewer’s perspective on home plate could be varied with the swipe of a mouse.
SO MUCH TO SEE
Although sporting events are a natural endpoint for Kewazinga’s variable-perspective capabilities, Worley noted that the company’s founders envisioned a host of other applications.
"We had in mind a broad array of areas, including education, health care and medical, defense and aerospace, travel and leisure, telecommunications, and merchandising," he said. "But we also found the need to begin generating revenue, so we selected sports broadcast as the ‘low-hanging fruit.’"
Kewazinga’s cherry-picking netted an impressive harvest: ESPN’s Summer X-Games in Philadelphia.
"ESPN has been wonderful," said Worley, who added that the combination of high visibility and a hip young audience made the X-Games "a perfect beginning for us."
In a two-part collaboration between Kewazinga and ESPN, Worley’s team visited the Woodward Sports Camp prior to the games and filmed 13 athletes on BMX bikes, inline skates and skateboards. This advance footage was turned into one-minute features called "Tricks & Tips," which aired during the August broadcast of the X-Games. Most important, however, was the integration of the Kewazinga system, branded "Axis" in its ESPN incarnation, into the broadcast production process. The Axis camera array captured athletes’ acrobatics during the games, and footage was processed in time for inclusion in the on-air broadcast.
"Turnaround time between capturing, processing and playback can be crucial," Worley said. "The most immediate way to optimize that time is to increase the horsepower of your processing platform."
HARDWARE WITH HORSEPOWER
The essence of the Kewazinga processing engine is its ability to interpolate additional "virtual" camera views between the real, hardware cameras positioned on its curved rail. Processing algorithms create video frames based on the trajectories of the pixels between the hardware cameras.
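Kewazinga’s patented algorithms are not public, but the general idea described above, synthesizing a virtual view partway between two real cameras by following per-pixel trajectories, can be sketched as flow-based view interpolation. Everything below is illustrative: the function name, the grayscale frames, and the assumption that pixels move along straight-line trajectories between adjacent cameras are all stand-ins, not Kewazinga’s actual mathematics.

```python
import numpy as np

def interpolate_view(frame_a, frame_b, flow_ab, t):
    """Synthesize a virtual camera view between two real cameras.

    frame_a, frame_b: (H, W) grayscale frames from adjacent cameras.
    flow_ab: (H, W, 2) per-pixel displacement (dy, dx) mapping A onto B,
             i.e. the "trajectory" each pixel follows between the cameras.
    t: position of the virtual camera, 0.0 (camera A) .. 1.0 (camera B).
    """
    h, w = frame_a.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Sample A a fraction t back along the trajectory, and B a fraction
    # (1 - t) forward, so both warped frames land at the virtual viewpoint.
    ya = np.clip((ys - t * flow_ab[..., 0]).round().astype(int), 0, h - 1)
    xa = np.clip((xs - t * flow_ab[..., 1]).round().astype(int), 0, w - 1)
    yb = np.clip((ys + (1 - t) * flow_ab[..., 0]).round().astype(int), 0, h - 1)
    xb = np.clip((xs + (1 - t) * flow_ab[..., 1]).round().astype(int), 0, w - 1)

    warped_a = frame_a[ya, xa]
    warped_b = frame_b[yb, xb]

    # Cross-fade the two warped frames, weighted by proximity to each camera.
    return (1 - t) * warped_a + t * warped_b
```

Sweeping t from 0 to 1 produces the rotating-viewpoint effect from a single moment of captured video; estimating the flow field itself is the expensive part, which is where the bulk of the processing time goes.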
But it is the tiny Kewazinga player application, a mere 50 KB of programming, that makes the magic visible. "All the heavy lifting, so to speak, is done on the processing side," Worley said. "Once that’s done, it’s basically storage of the original video plus the header files." Headers carry the results of the processing effort: mathematical formulae called up and applied by the player.
The player software is then able to run in real time, meaning that when the viewer calls for a move from left to right, new video frames are created instantaneously. "That’s the truly astounding part of the software," said Worley. "We re-create video on-the-fly… video that doesn’t exist."
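The split Worley describes, heavy analysis done once offline, with only compact "header" parameters shipped to a lightweight player, can be sketched as a two-phase design. This is a toy illustration under loud assumptions: a single global shift stands in for the real per-pixel mathematics, and the function names and header format are invented for the example.

```python
import numpy as np

def process_offline(frame_a, frame_b, search=3):
    """Offline phase: expensive analysis, run once per camera pair.

    Estimates a global (dy, dx) shift between adjacent cameras by
    exhaustive search; the result is the tiny "header" stored alongside
    the original video (a stand-in for the real precomputed formulae).
    """
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(frame_a, (dy, dx), axis=(0, 1))
            err = np.mean((shifted - frame_b) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return {"shift": best}

def play(frame_a, frame_b, header, t):
    """Playback phase: cheap enough to run in real time.

    Applies the precomputed header to produce a virtual view at any
    position t between the two cameras; no analysis happens here.
    """
    dy, dx = header["shift"]
    wa = np.roll(frame_a, (round(t * dy), round(t * dx)), axis=(0, 1))
    wb = np.roll(frame_b, (round(-(1 - t) * dy), round(-(1 - t) * dx)),
                 axis=(0, 1))
    return (1 - t) * wa + t * wb
```

The design choice mirrors the article: all the search and matching cost lands in `process_offline`, while `play` does only a couple of array shifts and a blend per frame, which is why the player can stay tiny and fast.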
The outlook for speedier processing times is positive. "Last January, we were at a five-minute turnaround for one second of video, and now, with Intel’s 1.7 GHz processor, we’re down in the 30- to 35-second range," said Worley. He expects per-second times to drop to near-real-time within the year, due in no small part to the continued development of Intel microprocessors.
"The interesting thing about our system is that it is a software system, which makes it highly adaptable and highly flexible," said Worley. The system runs on either of two computer platforms, and can use almost any type of analog video camera; digital camera interfaces are slated for the future.
With the Axis system firmly in place for sports broadcasts, the Kewazinga team is moving down its list of development challenges. "We’ve got a number of things in the works," noted Worley. "We expect to be out filming this fall on a number of fronts."
One such project is a tie-in to computer animation and gaming. "There is a path we’ve laid out in white papers, for taking our system into game development and hooking it up with 3D animation," he said. Kewazinga technology would allow the insertion of famous actors and real baseball or football players, for instance, into 3D environments. Once a few hurdles are cleared regarding scene depth and collision detection, new titles using this technology could give animated characters the intelligence to interact with "real" video.