Military uses instant replay technology in Afghanistan

The U.S. military is now using the same video technology the NFL relies on for instant replay during football games to monitor battlefields in Afghanistan. The technology helps analysts sift through video from the drones and robots deployed in the continuing conflict.

The growing use of robotics in the war has overwhelmed analysts, who face an ever-larger amount of video to monitor. U.S. Air Force drones, for example, collected about 1,800 hours of video a month in Iraq and Afghanistan in 2009, nearly three times as much as in 2007.

If a single analyst had to watch all the video now being generated, it would take about 24 years of continuous viewing, the "New York Times" reported. The amount of video is expected to continue growing exponentially.

Within the next year, the Air Force will outfit 10 General Atomics MQ-9 Reaper unmanned aerial vehicles with "Gorgon Stare" sensors, a package of high-powered cameras that can film an area 2.5 miles around from 12 different angles. The military eventually wants to equip drones with 92-camera arrays.

Harris Corporation, based in Melbourne, Fla., has devised a customized video analysis system to cut the time needed to analyze trillions of bytes of video from weeks to minutes. U.S. broadcasters currently handle some 70,000 hours of instant replay video a day in sports production.

Harris's Full-Motion Video Asset Management Engine, or FAME, can work with data relayed from the sensors in Reaper drones. The technology uses metadata tags appended to each frame of video to encode details such as time, date and location in space.
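
As a rough illustration of what such per-frame tagging might look like, here is a minimal sketch in Python. The field names and values are hypothetical, not Harris's actual FAME schema; they simply pair a frame with the time and place it was captured.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class FrameTag:
    """Hypothetical per-frame metadata record: when and where a frame was shot."""
    frame_number: int
    timestamp: datetime   # time and date the frame was captured
    latitude: float       # sensor target location, decimal degrees
    longitude: float
    altitude_m: float     # sensor altitude in meters

# Example: tag one frame from a fictional Reaper video feed
tag = FrameTag(
    frame_number=1024,
    timestamp=datetime(2010, 7, 14, 13, 45, 2, tzinfo=timezone.utc),
    latitude=34.5167,
    longitude=69.1833,
    altitude_m=7600.0,
)
print(tag)
```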

Knowing when and where each image was taken on the battlefield is fundamental to depicting a scene on, say, a Google Earth map. It is akin to knowing where each camera is in a football stadium during an instant replay shot.
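
To make that concrete: once a frame carries a time and a position, placing it on a Google Earth map is largely a matter of writing the position out in KML, the XML format Google Earth reads. A minimal sketch, with purely illustrative coordinates:

```python
def frame_to_kml_placemark(name: str, lat: float, lon: float, when_iso: str) -> str:
    """Render one tagged frame as a KML Placemark that Google Earth can display."""
    return f"""<Placemark>
  <name>{name}</name>
  <TimeStamp><when>{when_iso}</when></TimeStamp>
  <Point><coordinates>{lon},{lat},0</coordinates></Point>
</Placemark>"""

# Illustrative only: drop a marker where frame 1024 was taken
print(frame_to_kml_placemark("frame-1024", 34.5167, 69.1833, "2010-07-14T13:45:02Z"))
```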

"As an analogy, you want to not only find a book in a library using the index card system, but you want to be able to find a word on page 36 in chapter 12. Tagging information properly can help you find things instantly to review and analyze it," John Delay, director of strategy for Harris's government solutions business unit, told "TechNewsDaily.com."

Additional metadata about the event can then be layered on, such as who is shown and what is happening, as well as links to other documents, satellite photographs, cell phone call recordings, map databases or other files.
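
One way to picture that layering, as a sketch under assumptions of my own rather than FAME's actual data model, is a list of annotations attached to each frame, each optionally pointing at an external artifact:

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A layered piece of metadata: what it says and, optionally, what it links to."""
    label: str                 # e.g. who is shown or what is happening
    source: str | None = None  # path or URL of a linked document, photo or recording

@dataclass
class AnnotatedFrame:
    frame_number: int
    annotations: list[Annotation] = field(default_factory=list)

frame = AnnotatedFrame(frame_number=1024)
frame.annotations.append(Annotation("white pickup truck entering compound"))
frame.annotations.append(Annotation("satellite photo of same compound",
                                    source="imagery/compound_2010-07-13.jpg"))
print(frame)
```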

"If properly tagged, the speed at which you can garner intelligence is mind-boggling," Delay said. "If you have to watch 1800 hours of content a day, that's a problem. You want to know which content is relevant to what you're looking for in real time to speed up the pace of analysis."

Soldiers narrating what they see can increase the value of the metadata. "It'd be like announcing a football game, except here you could also have the system listen in and extract words of interest, say, 'IED' [improvised explosive device] or 'combatant,' which helps with search, retrieval and analysis of data," Delay said. "You can also imagine drawing a circle around a mosque and saying, 'No, that's not the building we're trying to bomb' for more sophisticated situational awareness."
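
A crude version of the keyword spotting Delay describes might look like the following, given a time-stamped transcript of a soldier's narration; the transcript, watch list and timestamps here are all invented for illustration.

```python
WATCH_LIST = {"ied", "combatant"}   # words of interest, per Delay's example

# (seconds into the video, transcribed narration) -- invented for illustration
transcript = [
    (12.0, "two vehicles approaching the checkpoint"),
    (47.5, "possible IED on the left shoulder of the road"),
    (63.2, "crowd dispersing, no combatant visible"),
]

# Flag every narration segment containing a watch-list word
hits = [
    (t, text) for t, text in transcript
    if WATCH_LIST & {word.strip(".,").lower() for word in text.split()}
]
for t, text in hits:
    print(f"{t:7.1f}s  {text}")
```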