NASA Picks Up Hollywood's Methods

Columbia crash leads to lab revamp


Hollywood has a long history of borrowing storylines from the space program, from Buck Rogers through Apollo 13 and beyond. In a turnabout, NASA is leaning heavily on Hollywood technology for its "Return to Flight" program.

Until recently, NASA monitored each launch using 16mm, 35mm and 70mm film, collected by 70 cameras, and analyzed on a projector screen. Critical frames were enlarged on a flatbed scanner for closer analysis.

The Columbia space shuttle accident last February sparked the need to evaluate more closely the film footage of that and prior shuttle launches, to identify what led to the shuttle's disintegration during reentry. Prior to Columbia's flight, NASA had been discussing an upgrade of its image-viewing facility with SGI of Mountain View, Calif. The failed Columbia mission led to a call for immediate action.

NASA wanted the highest resolution and fidelity possible, uncompressed, with the ability to view multiple images, synchronized, in realtime, in multiple locations. Though NASA's dream system did not exist as a whole, most of its elements were already at work in the entertainment community.

"Within about 10 days we put the design together, figured out the workflow and basically presented the whole thing to NASA," said Kevin Smith, systems engineer at SGI.

The engine of the revamped Visualization Analysis Lab at Kennedy Space Center is a 12-processor SGI Onyx 3800 supercomputer with 36 TB of high-speed storage. Film is transferred to video using an Imagica XE film scanner, which can yield a superhigh-definition resolution of 4,000 by 4,000 pixels (4K).
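A back-of-the-envelope calculation shows why 36 TB of storage is not extravagant at these resolutions. The figures below are illustrative assumptions, not NASA's actual file format: 4,000 x 4,000 pixels, three color channels, 16 bits (2 bytes) per channel.

```python
# Rough storage arithmetic for uncompressed 4K film scans.
# Assumptions (illustrative, not NASA's actual format):
#   4,000 x 4,000 pixels, 3 color channels, 2 bytes per channel.
WIDTH, HEIGHT, CHANNELS, BYTES_PER_CHANNEL = 4000, 4000, 3, 2

frame_bytes = WIDTH * HEIGHT * CHANNELS * BYTES_PER_CHANNEL
print(f"One frame: {frame_bytes / 1e6:.0f} MB")            # 96 MB

storage_bytes = 36e12  # 36 TB of high-speed storage
frames = storage_bytes / frame_bytes
print(f"Frames that fit in 36 TB: {frames:,.0f}")          # 375,000

# At a 100 fps acquisition rate, that is roughly an hour of footage.
seconds = frames / 100
print(f"Footage at 100 fps: {seconds / 60:.0f} minutes")   # ~62 minutes
```

Under those assumptions, a single scanned frame runs close to 100 MB, so even a multi-terabyte array holds only on the order of an hour of high-frame-rate footage.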

The next shuttle launch is scheduled for no earlier than next September, but the lab is already hard at work.

"They're presently digitizing a lot of 35mm films ... we have to answer certain questions that the External Tank Project Office has," said Pete Rosado, senior system engineer at Kennedy Space Center.

Rosado said NASA relies on film as an acquisition medium for two reasons. The first is resolution.

"The data we can get out of video cameras is not as much as we can get out of the 35mm," he said.

Frame speed is another issue.

"Right now, the plan is that we're going to run the 35mm at no less than 100 frames per second," said Rosado, noting that some of the cameras run at 200 fps. As for current high-speed video technology, Rosado said resolution and color do not remain stable.


After the film is transferred onto high-resolution, uncompressed video, playing it back in realtime at high frame rates uses a lot of bandwidth. The SGI system is capable of accessing data at 2 GBps, serving the high frame rate video to 7 x 7-foot SGI Reality Center Insight DLP displays.
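The same illustrative arithmetic suggests why bandwidth is the constraint, and why (as noted below) full 4K frames are handled as non-realtime stills while realtime playback runs at lower resolution. The frame sizes here are assumptions for the sketch, not the lab's specified formats.

```python
# Why realtime playback strains even a 2 GBps storage system.
# Assumes the same illustrative frame size as an uncompressed 4K scan:
# 4,000 x 4,000 px x 3 channels x 2 bytes = 96 MB per frame.
frame_bytes = 4000 * 4000 * 3 * 2          # 96 MB
bandwidth = 2e9                             # 2 GBps sustained

max_fps_4k = bandwidth / frame_bytes
print(f"Max uncompressed 4K rate: {max_fps_4k:.1f} fps")   # ~20.8 fps

# An HD-resolution transfer (1920 x 1080, 3 channels, 2 bytes/channel)
# leaves far more headroom for high-frame-rate, multi-stream playback.
hd_frame_bytes = 1920 * 1080 * 3 * 2        # ~12.4 MB
max_fps_hd = bandwidth / hd_frame_bytes
print(f"Max uncompressed HD rate: {max_fps_hd:.0f} fps")   # ~161 fps
```

On these assumptions, 2 GBps cannot sustain even 24 fps of full-resolution 4K, but comfortably serves high-frame-rate playback at HD-class resolutions.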

For closer, single frame analysis, non-realtime 4K images are served off an SGI Octane2 visual workstation to be viewed on superhigh-resolution IBM T221 flat-panel monitors. The files are manipulated using InteractiveFX Piranha HD software, which SGI's Smith terms the "utility knife" of digital film production.

"It gives people the ability to cut, transcode and play out in real-time, film frames in a computer," he said. "All of the color space manipulation, various imaging capabilities, converting the color space of the imagery, and transcoding to other image formats or other kind of output devices can be done with Piranha HD."

The capability of viewing from multiple angles at the same time is critical, said NASA's Rosado, because objects flying across the screen during launch that are suddenly hidden from one view can be picked up from another.

Another advantage of working in the digital domain, said Smith, is the ability to manipulate the color-space of the video, which reveals things never seen before in the images.

"Putting things in different color environments, like bringing up the red gun, bringing down the green, changing the color-space, is something a human can't do without help," he said.

This kind of video manipulation is not only being applied to the final Columbia mission images, but to film transfers from previous missions, yielding new information to the analysts.

Another piece of software popular in the Hollywood production community also plays a critical role at the Visualization Analysis Lab. The same CORRECT film restoration software from MTI Film that is used to remove scratches and restore old films has a similar role for NASA.

The problem is determining whether a small speck is a scratch, mar or other damage to the film itself, or an object of interest. Smith said the MTI operator applies "mathematical algorithms to the imagery to ascertain what it actually is that they're looking at."
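MTI's exact algorithms are not described here, but a common film-restoration technique for this problem is temporal comparison: dirt or a scratch typically appears in a single frame and vanishes in its neighbors, while a real object persists across frames. A toy sketch of that idea, with a hypothetical function and threshold (not MTI's implementation):

```python
# Toy temporal check on grayscale frames stored as 2D lists of 0-255
# values. Film dirt/scratches typically appear in ONE frame only;
# a real object shows up at (roughly) the same spot in adjacent frames.
def classify_speck(prev, curr, nxt, y, x, threshold=40):
    """Label an anomaly at (y, x) in `curr` by comparing neighbor frames."""
    deviates_prev = abs(curr[y][x] - prev[y][x]) > threshold
    deviates_next = abs(curr[y][x] - nxt[y][x]) > threshold
    if deviates_prev and deviates_next:
        return "likely film damage"     # present in this frame alone
    return "possible real object"       # persists across frames

# Example: a bright speck at (0, 1) that exists only in the middle frame.
prev = [[10, 12, 11]]
curr = [[10, 250, 11]]
nxt  = [[11, 13, 10]]
print(classify_speck(prev, curr, nxt, 0, 1))   # likely film damage
```

A production tool would also compensate for motion between frames before comparing; this sketch assumes a static scene for clarity.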

Though NASA relies on film for image acquisition during launch, video acquisition has played a part. Many of the film cameras have video taps where NTSC feeds have been recorded on VTRs for viewing and analysis before the film reels are processed and available for screening.

With increased emphasis on careful analysis of potential launch damage as soon as possible after the spacecraft leaves the pad, NASA says there is a role for high-definition video acquisition.

"We plan on getting some HD cameras for the return to flight, download all those videos into the Onyx, and all three centers (Kennedy, Marshall and Johnson space centers) are going to be involved in the quick review," Rosado said.

That presents a problem NASA has yet to solve: finding a way to distribute that massive amount of data quickly across the country.

"It's a massive amount of data on the server, and trying to transmit that is going to be a big headache," he said, but NASA does have the better part of a year before this capability will be needed.

Reflecting on the NASA project, Smith noted, "Our first two customers for SGI were NASA and the Walt Disney Company."

He said while the company was building imaging and 3D systems for feature films 20 years ago, "the government was taking the same technology and applying it in different ways to create aircraft, to create understanding of the universe, that sort of stuff."