Using User Feeds

Grass Valley MediaFUSE
LOS ANGELES
As broadcast news facilities face decreased budgets and increased competition in the new year, they can take comfort in one fact: newsrooms can cover more territory for less money by incorporating feeds from viewers. And thanks to convenient upload software, contributions are plentiful.

"One of the European broadcasters I talked to said when there's breaking news they could get up to a thousand clips an hour," said Straker Coniglio, senior market solutions manager for broadcast at Avid Technology in Tewksbury, Mass. "On a typical day, they get about a thousand clips."

But broadcasters are on their own to comb through the feeds—sniffing out fakes, viruses, obscenities, subliminal messages, and dubious rights to content—to determine what's appropriate.

WORKFLOW

The workflow for incorporating user-generated video has three stages: the submission layer, the broadcast environment, and the bridge of application programming interfaces (APIs) in between. Generally, the broadcaster's IT department provides the first layer, which scans, uploads and stores feeds in a database; network automation systems provide the third, the API bridge.
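In practice the split looks roughly like this: an in-house submission service scans and stores each upload, and a thin bridge hands the result to the production systems. The Python sketch below is illustrative only; the folder paths, the virus-scan hook and the MAM registration stub are all assumptions, not any vendor's actual interface.

```python
import hashlib
import shutil
from pathlib import Path

INBOX = Path("/ingest/ugc/inbox")   # submission layer: where viewer uploads land
STORE = Path("/ingest/ugc/store")   # scanned-and-accepted clips

def virus_scan(clip: Path) -> bool:
    # Placeholder: a real deployment would call an actual scanner here.
    return clip.stat().st_size > 0

def submit(clip: Path, contributor: str) -> dict:
    """Submission layer: scan the upload, store it, and record basic metadata."""
    if not virus_scan(clip):
        raise ValueError(f"{clip.name} failed the malware scan")
    digest = hashlib.sha256(clip.read_bytes()).hexdigest()
    dest = STORE / f"{digest}{clip.suffix}"
    shutil.copy2(clip, dest)
    return {"path": str(dest), "sha256": digest, "contributor": contributor}

def bridge_to_mam(record: dict) -> None:
    """API bridge: hand the stored clip and its metadata to the production MAM.
    In practice this would be a web-service call into the newsroom system."""
    print(f"registering {record['path']} from {record['contributor']}")
```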

Avid introduced its Interplay production/asset management system in 2006, and continues to add APIs. "The APIs previously just dealt with extracting metadata and maybe triggering a transfer to or from an external system," said Jim Frantzreb, senior market segment manager, broadcast for Avid. "With the new Interplay Web Services [introduced in iNEWS 3.0 in December], you're providing the ability to create jobs, transfers, do all kinds of operational things—not just inquire 'do you have this media?' or 'what is the attribute of this media?' It actually [triggers] actions."
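To illustrate what "creating jobs" through such a web-services layer looks like, the hypothetical sketch below posts a transfer request over HTTP. The endpoint URL, payload fields and response field are invented for illustration; the real Interplay Web Services interface is vendor-documented, so treat this only as the general shape of triggering an action rather than merely querying metadata.

```python
import requests  # pip install requests

# Hypothetical web-service endpoint; not Avid's published interface.
WORKGROUP_URL = "http://mam.example.local/jobs/transfer"

def request_transfer(asset_id: str, destination: str) -> str:
    """Ask the asset-management layer to move a clip toward production."""
    resp = requests.post(
        WORKGROUP_URL,
        json={"assetId": asset_id, "destination": destination},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["jobId"]   # assumed response field

# Example: push a flagged viewer clip toward the editing environment.
# job = request_transfer("UGC-2009-1117-0042", "editorial/ingest")
```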

There's no really good way to take user-generated video direct to air, according to Coniglio. Avid's system, for example, currently enables a producer to flag the video and push it through Avid's APIs into Interplay, then into a production environment for editing.

"Broadcasters are looking for systems that will not only accept the content, but will have some level of intelligence to pre-index and verify—there really isn't anything that is off-the-shelf that really solves the problem," said Coniglio. "You can put Web-submitted media into a media asset management system—but how can you [automatically] filter and analyze the video?"

Coniglio said Avid is currently working with a European customer to streamline the ingest process.

"So far, broadcasters who are having the most success have in-house developed systems for the submission layer and bridge to existing MAM [media asset management] and production systems," said Coniglio. "The overhead is significant—it's not easy to maintain these systems from the ground up."

Scott Matics, product manager for Grass Valley's MediaFUSE content repurposing and distribution system, agreed that content control, to date, has required manual labor. But he said MediaFUSE lessens the pain by enabling broadcasters to set up watch folders.

"They would have a watch folder set up specifically for user-generated content—it's a quick and efficient way to look at a list of all the content [and] make sure there's nothing unacceptable there," said Matics. "I would have a number of options that I would set up to act on the content that comes into those folders."

For example, embargo presets could designate content that required various levels of pre-approval.
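A bare-bones version of that watch-folder pattern, with a per-folder embargo rule, might look like the following sketch; the folder paths, approval levels and review queue are hypothetical.

```python
import time
from pathlib import Path

# Hypothetical watch folders mapped to the approval each one requires.
WATCH_FOLDERS = {
    Path("/watch/ugc_breaking"): "producer_approval",
    Path("/watch/ugc_general"):  "editor_approval",
}

seen: set[Path] = set()

def review_queue(clip: Path, approval_level: str) -> None:
    # Stand-in for pushing the clip onto an operator's review list.
    print(f"{clip.name} held for {approval_level}")

def poll_once() -> None:
    """One pass over the watch folders; new clips are embargoed until approved."""
    for folder, approval_level in WATCH_FOLDERS.items():
        for clip in folder.glob("*.m*"):        # e.g. .mov, .mp4, .mpg
            if clip not in seen:
                seen.add(clip)
                review_queue(clip, approval_level)

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(5)   # simple polling interval
```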

Plans are in the works to incorporate metadata into the process.

"Being able to read the metadata will be in our next release of MediaFUSE," said Matics. "Once we get to the point of being able to bring all the metadata in, we can have a field in our user-generated upload system that would ask the [feed providers] to describe what they saw—that would appear as a textual accompaniment. Whoever's operating the console could go in and spell check and make grammar corrections."

FROM PHONE TO NEWSROOM TO PHONE

At IBC2009, Harris showed iPhone-generated video and audio of the show floor being incorporated into a newsroom station and republished live as hi-res and lower-res offerings on iPhone and BlackBerry receivers. The real-time demonstration used Harris' Nexio Amp 5.7 servers, Velocity editor 2.0, and an Invenio asset management software app prototype with the working title "Citizen Journalist."

The setup not only played the video in real time but also input metadata—location, time, provider, format—for digital asset management, said Sam Lee, director of development, servers and editing at Harris Corp.'s Broadcast Communications division. It also enabled the video provider to input comments—title, notes—automatically.
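A common way to capture that kind of technical metadata at ingest is to probe the file itself. The sketch below pulls format, duration and creation time with ffprobe; the provider field would come from the upload form, and the record layout is an assumption rather than the Invenio schema.

```python
import json
import subprocess

def probe_clip(path: str, provider: str) -> dict:
    """Pull basic technical metadata from a phone clip for the asset manager."""
    raw = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_format", path],
        capture_output=True, text=True, check=True,
    ).stdout
    fmt = json.loads(raw)["format"]
    return {
        "provider": provider,                                  # from the upload form
        "format": fmt.get("format_name"),
        "duration_sec": float(fmt.get("duration", 0)),
        "created": fmt.get("tags", {}).get("creation_time"),   # if the phone wrote it
    }

# Example: probe_clip("show_floor.mov", "attendee_017")
```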

Nexio's new Instant Online module, a conforming-engine upgrade, can output different formats (cell phone, hi-res, Web, etc.) in parallel, according to Lee. And, he said, the Velocity 2.0 upgrade introduced Web and phone formats that could be edited and output directly from the Velocity timeline.

"You can take the phone video and put it right on the timeline," said Lee. The system "can understand the video type and mix it with your HD and SD and graphics without any pre-processiwng."

Invenio's broadcaster-determined metadata fields can include expiration dates and rights information. Newsroom editors can manually adjust video and audio quality for brightness, contrast and zoom. A preset can shuttle clips to a broadcaster's Web department to prep for Web deployment.

Lee expected commercial release of these products during the first quarter of 2010, and anticipated significant interest, based on feedback from Harris' Customer Advisory Board.

"There has been a lot of discussion about how to qualify that media," he said.