Linking the digital chain

It wasn't long ago that an archive within a broadcast environment consisted of a huge tape library at the end of a corridor or in another building. Then the IT revolution spun the broadcast world on its axis, promising greater return on investment and unlimited opportunity for growth. The golden ticket was the digital asset management (DAM) system within an automation system that could respond seamlessly to each area of the technology chain, delivering an integrated workflow and access to an archive. In essence, this is how the industry is evolving; however, as technology progresses and systems become more integrated, the potential for confusion grows, and the full benefits of what is achievable are not yet being realized.

Assets and archives

IT and networking technology have delivered the ability to retrieve and reuse all of the material within the system's purview. (See Figure 1.) The business drivers of modern broadcasting demand that content is used and reused to the maximum extent possible. Therefore, a digital archive is needed, first, as a secure repository for material that a broadcaster wants to keep and, second, as a program resource so that material can be found, restored and reproduced easily (in other words: put back through the production cycle).

It is extremely important that the asset and media management systems that drive the modern production environment are capable of accessing the archive and of providing users with the ability to browse, recover and reuse material. Broadcasters must remember that the physical archive, which is essentially a bank account of content, is still a combination of disks and tapes. The asset management system and the automation system provide users with the ability to search, retrieve, reprocess and reschedule for playout the material within that archive.

Legacies of videotape

So what exactly is a file-based workflow? There is still an awful lot of tape-based thinking that occurs even within file-based environments. Tape represented far more than just a lump of content in a box. What was written on the tape box or the tape itself constituted the metadata and the indexing. Users knew what was on the tape because it was written on the label, and as long as the tape was in the box, the metadata and the tape never got separated. Also, possession of the tape conveyed ownership of the content, so there was a territorial element that conveyed power.

In the digital domain, there are numerous issues that need to be addressed before broadcasters can have total confidence in the security of their assets. The most important is ensuring that the asset management system knows that the media recorded on the tape or the disk, which to the user effectively exists in cyberspace, is adequately labeled and described for restoration. It is also important to impose some form of hierarchical access control to protect content against accidental or willful damage. Finally, there is the potential for two people to change the same piece of content simultaneously.
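The concurrent-change problem described above is commonly handled with optimistic locking: every asset record carries a version number, and a write based on a stale version is rejected. The following is a minimal sketch; the class and method names are illustrative, not taken from any real asset management product.

```python
class ConflictError(Exception):
    """Raised when a write is based on a stale version of an asset."""

class AssetStore:
    """Toy in-memory store illustrating optimistic locking."""

    def __init__(self):
        self._assets = {}  # asset_id -> (version, payload)

    def read(self, asset_id):
        return self._assets[asset_id]

    def write(self, asset_id, payload, expected_version):
        version, _ = self._assets.get(asset_id, (0, None))
        if version != expected_version:
            raise ConflictError(
                f"{asset_id}: expected v{expected_version}, found v{version}")
        self._assets[asset_id] = (version + 1, payload)

store = AssetStore()
store.write("CLIP001", "rough cut", expected_version=0)
version, _ = store.read("CLIP001")
store.write("CLIP001", "fine cut", expected_version=version)
# A second editor still holding version 1 would now get a ConflictError.
```

The same idea generalizes to a check-out/check-in model, where the system additionally records who holds the working copy.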


The key question: How do you systematically apply metadata to content? File-based workflows are in their infancy, with everyone having different requirements and different business drivers. The Material eXchange Format (MXF) is an attempt to take a very complex published standard and superimpose some real-world workflow measures so that the standard can be used in practice. The advantage of MXF is that it lets multiple manufacturers deal with media plus metadata on the same basis, allowing material to pass seamlessly from one end of the chain to the other without losing any information.

It is important that metadata is organized so that broadcasters can create a metadata inventory covering different content components in multiple versions of media that often looks very similar. For example, if a broadcaster wants to subtitle in French, German and English, each language version has a different metadata package. It follows that broadcasters increasingly need to apply business metrics to another part of the metadata. In other words: How much is this content worth?
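One way to picture such an inventory is a master record carrying per-version metadata packages, each with its own business metric. This is a hedged sketch; the field names, file names and values are invented for illustration, not drawn from any real schema.

```python
# Hypothetical metadata inventory: one master asset, several language
# versions, each carrying its own metadata package and a business-value
# figure. All identifiers and values below are illustrative assumptions.
inventory = {
    "material_id": "PRG-2007-0042",
    "title": "Wildlife Special",
    "versions": [
        {"language": "en", "subtitles": "en.stl", "rights_value_gbp": 12000},
        {"language": "fr", "subtitles": "fr.stl", "rights_value_gbp": 4500},
        {"language": "de", "subtitles": "de.stl", "rights_value_gbp": 5200},
    ],
}

def version_for(inventory, language):
    """Return the metadata package for one language version."""
    return next(v for v in inventory["versions"] if v["language"] == language)

def total_value(inventory):
    """Sum the business metric across all versions of the asset."""
    return sum(v["rights_value_gbp"] for v in inventory["versions"])
```

Keeping the business metric inside the same package as the descriptive metadata is what lets a search answer both "what is this?" and "what is it worth?" in one pass.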

To meet the metadata requirements of the industry, the Advanced Media Workflow Association (AMWA) is attempting to create workflow models to standardize how metadata is applied to production and playout material. The association demonstrated this at NAB2007. Metadata is infinite, and it's impossible to predict the criteria you might want to use to find something in the future. It is difficult to come up with a sensible universal metadata schema even for one organization. For a big multinational broadcaster, it's almost impossible.

The AMWA has put a great deal of effort into deciding how the standard should work, but implementation and industry acceptance are a long way off. However, the AMWA is one of the first genuine multivendor workflow experiments using a published standard.

The core of the implementation question: How do you ensure that work does flow and each part of the chain knows what the other is doing? The answer: With considerable difficulty. The solution is an organized systematic approach.

Broadcasters cannot rely solely on a one-off consortium of manufacturers to develop the solution; they also have to play their own part. The point for broadcasters is not necessarily to look at the real-world benefits of installing a fully file-based workflow, but to look at the dangers if they don't. (The dangers are that they get stuck in a legacy-driven past, and Google and YouTube eat their lunch!)

Digital archives

Over the next five years, digital archiving will undoubtedly get faster and cheaper. The digital storage industry has taken HD more or less in its stride, despite the vast bandwidth involved. The interesting area of activity for digital archiving companies over the next few years is tighter integration with the rest of the acquisition and production process so archiving is completely seamless with ingest and production.

Broadcasters are still nervous about applying file-based processes to their workflows because they don't know how to go about it. There are several things that broadcasters need to do before they start investing in this kind of technology. It may sound obvious, but it is important that broadcasters understand what they currently do, where they want to be, why they want to be there and what their business imperatives are. They then have to figure out how to handle the process. Next they need to sit down and talk with manufacturers.

This is where storage and automation companies can partner to provide solutions based on those requirements. By integrating a storage API with an automation system, material can be restored from the archive to the video server before playout by the automation system.
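The restore-before-playout integration described above can be reduced to a simple loop: walk the transmission schedule and restore anything the video server does not already hold. This is a minimal sketch assuming hypothetical Archive and VideoServer interfaces; a real storage API would of course deal in asynchronous transfers and partial restores.

```python
# Minimal sketch of automation-driven restore ahead of playout.
# Archive and VideoServer are stand-ins for a vendor storage API
# and a broadcast video server; names are assumptions.
class Archive:
    def __init__(self, holdings):
        self.holdings = set(holdings)

    def restore(self, material_id, server):
        """Copy archived material onto the video server if we hold it."""
        if material_id in self.holdings:
            server.clips.add(material_id)

class VideoServer:
    def __init__(self):
        self.clips = set()

def prepare_playout(schedule, archive, server):
    """Restore every scheduled item not yet present on the server."""
    for material_id in schedule:
        if material_id not in server.clips:
            archive.restore(material_id, server)

archive = Archive(["CLIP001", "CLIP002"])
server = VideoServer()
prepare_playout(["CLIP001", "CLIP002"], archive, server)
```

In practice the automation system would run this far enough ahead of transmission to cover tape-robot latency.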

Broadcasters need reassurance that, regardless of the type of system, they can still access and use the archive layer at its best level of performance. The archive layer stores different types of media irrespective of format, and it can be accessed when required by, for example, the automation, MAM and news systems. Broadcasters need to be confident that their workflow and business choices are not compromised by limitations in the archive layer. Some storage providers deliver this facility, and the protection of investment it brings, through both the standard Video Archive Control Protocol (VACP) and a rich, flexible API that uses nonproprietary IT-standard technology built around a Microsoft suite of tools.

To provide an example, media is ingested into a production system. (See Figure 2.) In the production environment, the editor can work on the rushes and create the finished program. When the program is complete, the production system can publish that content to the archive and notify other control systems, such as automation and MAM, with the title, duration, start time code and material ID. These are the important metadata elements required for all these systems to access the archived media. The common denominator is that all of the systems are able to access the archive layer.
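The four metadata elements named above can be pictured as a small notification record that the production system publishes to every subscribing control system. This is a hedged sketch; the field names and the publish mechanism are assumptions for illustration, not any real product's API.

```python
from dataclasses import dataclass

# The four metadata elements the publish step carries, as a record.
# Field names are illustrative assumptions.
@dataclass
class ArchiveNotification:
    title: str
    duration_frames: int
    start_timecode: str
    material_id: str

def publish(note, subscribers):
    """Hand the same notification to every control system."""
    for system in subscribers:
        system.append(note)  # each subscriber modelled as a simple inbox list

automation_inbox, mam_inbox = [], []
note = ArchiveNotification(
    title="Evening News",
    duration_frames=45000,
    start_timecode="10:00:00:00",
    material_id="MAT-0099",
)
publish(note, [automation_inbox, mam_inbox])
```

Because every system receives the same material ID, each can later resolve the content independently through the shared archive layer.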

The production system can then place a copy of that finished program wrapped in an MXF file into the video server. The automation system recognizes that the file is on the server, opens the MXF wrapper, reads the metadata, populates its database and sends a request to the API to move that file to the archive. (See Figure 3.) To further this scenario, another production area might have, for example, an MXF file from a different provider. Therefore, two completely disparate systems can now deliver content into one location ready for playout with the archive always being controlled by the API.
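The automation-side steps just described (detect the file, read the wrapper metadata, populate the database, request the archive move) can be sketched as a short handler. Here `read_wrapper_metadata` is a stub standing in for a real MXF demuxer, and `ArchiveAPI.move` is a hypothetical storage call; none of these names come from an actual product.

```python
# Hedged sketch of the automation workflow on file arrival.
def read_wrapper_metadata(path):
    """Stub for an MXF demuxer: a real one would parse the header partition."""
    return {"material_id": "MAT-0100", "title": "Finished Programme",
            "duration_frames": 90000, "start_timecode": "10:00:00:00"}

class ArchiveAPI:
    """Hypothetical storage API; records move requests it receives."""
    def __init__(self):
        self.archived = []

    def move(self, path, material_id):
        self.archived.append((path, material_id))

def on_file_arrival(path, db, api):
    meta = read_wrapper_metadata(path)            # open wrapper, read metadata
    db[meta["material_id"]] = meta                # populate automation database
    api.move(path, meta["material_id"])           # request move to the archive

db, api = {}, ArchiveAPI()
on_file_arrival("/server/MAT-0100.mxf", db, api)
```

The same handler works no matter which production system wrote the file, which is the point of the two-disparate-providers scenario: the wrapper, not the source, carries the contract.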

Like many new concepts in the broadcast world, file-based workflows will completely revolutionize the way that media is acquired, edited, ingested, stored and transmitted. But it will take a while for those involved to fit together all the links in the chain.

Bernie Walsh is sales director at SGL, and Adrian Scott is chief marketing officer of Pro-Bel.