Inside automation
Granby Patrick / 12.01.2008 12:00 PM
Your guide to achieving successful systems integration in a cross-platform, multipurpose digital broadcasting environment starts here.

Despite recent advances, the successful integration of systems over a network to enable efficient workflows in a cross-platform, multipurpose broadcasting environment continues to be a challenge. If we look at what's actually involved in this process, we can understand why this challenge remains and identify ways to achieve an efficient solution.

To begin, there are 14 parameters, ranging from frame size through compression, that need to be matched in order to move a clip between two boxes. And yet, this still doesn't actually improve our workflow. All it achieves is what we used to do by taking a tape from one VTR to another! File-based workflows start to pay dividends when the clip carries its own description, i.e., the metadata, so that the system can automatically file it, process it, index it and deliver it.
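As a rough illustration of that matching step, a transfer pre-check could compare the two boxes' technical parameters before attempting a move. The parameter names below are hypothetical, not a standard list, and the full set would run to all 14 parameters:

```python
# Hypothetical subset of the technical parameters that must match
# before a clip can move between two servers (names are illustrative).
REQUIRED_MATCH = ["frame_size", "frame_rate", "scan_type", "bit_depth",
                  "color_sampling", "audio_channels", "audio_sample_rate",
                  "compression"]

def transfer_mismatches(source: dict, target: dict) -> list:
    """Return the parameters on which source and target disagree."""
    return [p for p in REQUIRED_MATCH if source.get(p) != target.get(p)]

src = {"frame_size": "1920x1080", "frame_rate": 25, "compression": "MPEG-2"}
dst = {"frame_size": "1920x1080", "frame_rate": 25, "compression": "DV"}
print(transfer_mismatches(src, dst))  # → ['compression']
```

A non-empty result means the clip cannot move as-is and must be transcoded or rewrapped first.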

Metadata

The number of metadata variants is as great as the number of manufacturers' products multiplied by the number of different applications that the users require. Even if we found a standard way of encoding the metadata, there would never be a standard for the different pieces of metadata that need to be carried, because these are tied to the specific workflow we are trying to achieve. There have been a number of attempts at creating metadata frameworks, such as DMS-1, Dublin Core, SMEF and, most recently, BXF. However, when made sufficiently flexible, they become very complex, making them difficult to use. This is unfortunate because a workflow often only needs a very simple but specific set of metadata.

Identifying the essential metadata — and ensuring that this is carried through the system along with the clip material — is key to the success of integrated workflows.
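One minimal sketch of that idea is a small, workflow-specific metadata record written as a sidecar file next to the media, so the description travels with the clip. The field names and the sidecar convention here are illustrative assumptions, not a standard:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ClipMetadata:
    """A deliberately small, workflow-specific metadata set (illustrative)."""
    material_id: str
    title: str
    duration_frames: int
    start_timecode: str

def write_sidecar(meta: ClipMetadata, media_path: str) -> str:
    """Write the metadata as a JSON sidecar beside the media file, one
    simple way of keeping the description travelling with the clip."""
    sidecar_path = media_path + ".json"
    with open(sidecar_path, "w") as f:
        json.dump(asdict(meta), f)
    return sidecar_path
```

Real systems more often hold this information in a database or clip reference file, as discussed later, but the principle is the same: a small, agreed set of fields that every stage of the workflow can read.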

Workflow

A workflow can signify many different things. In this article, it's the human activity required to make the system deliver the desired output, whether it be a program, a news bulletin, a promo or a whole channel. Obviously the objective is to ensure that the people using the system add as much value as they can at each stage without having to perform repetitive tasks that reduce overall efficiency.

Repetitive tasks should be automated. Information, once entered, should become part of the system, and housekeeping tasks should let the user make the critical decisions easily while the rest of the work is carried out automatically. It becomes clear that an important aspect of achieving this efficiency is tight integration between the different parts of the system.

Systems

A system from a single manufacturer may be efficient at performing the precise tasks for which it was designed. However, caution needs to be applied because, in the future, the manufacturer may develop the system to benefit the maximum number of customers, which may not suit your own development path.

An option is to build a system from a selection of manufacturers' file-based products — each chosen to suit your application. These products must be integrated to support efficient workflows. Different products have different needs from their files.

An ingest or playout server will have a file structure that is designed to allow the server maximum levels of performance when recording or streaming the clips. This may mean that the video and the audio can be interleaved. It may also mean that interframe compression is more efficient than intraframe compression.

An archive system will be primarily concerned with the size of the files and the ease with which they can be managed. An editing system will be more tuned for rapid access to any part of the file, favoring intraframe compression, and allowing the playing of large numbers of separate audio files with the video. This tends to lead to a separate component (video and audio) file structure. Effects systems may store uncompressed video to reduce the effect that multiple generations of compression have on image quality.

All of these systems have different file structures for different reasons, but they all need to be integrated. An integration layer will ensure that material can be moved as seamlessly as possible. Ideally, this integration layer will access one product in its native file format, convert the file and deliver to the target product in its native format, all without any interim copies, and as a continuous data stream offering the highest performance.
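The shape of such an integration layer can be sketched as a chunked pipeline: read from the source in its native format, convert on the fly and deliver to the target, with no interim copy landing on disk. The three callables here are placeholders for real reader, converter and writer components, which would be far more involved in practice:

```python
def stream_convert(read_chunks, convert, write_chunk):
    """Move media as a continuous data stream: pull chunks from the
    source, convert each one, and deliver it to the target immediately,
    so no interim copy of the whole file is ever created.
    All three arguments are caller-supplied callables (illustrative)."""
    total_bytes = 0
    for chunk in read_chunks():
        out = convert(chunk)     # e.g., rewrap or transcode this chunk
        write_chunk(out)         # deliver to the target's native format
        total_bytes += len(out)
    return total_bytes
```

The key design point is that conversion happens per chunk inside the stream, which is what allows the highest performance and avoids doubling the storage requirement during a transfer.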

Moving the file is not enough. The metadata must stay with the file so that the workflow remains efficient. In most cases, the metadata relating to a clip is not stored with the clip itself but in a separate data structure — whether that be a database or simple clip reference file. The integration layer needs to fetch the metadata, translate it and deliver it to the target systems at the same time as it is delivering the media. (See Figure 1.)
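At its simplest, the metadata translation step is a mapping between the field names of the source system's schema and those of the target. The field names below are invented for illustration; every real product defines its own:

```python
# Hypothetical mapping from a source system's metadata field names
# to a target system's names; real schemas are product-specific.
FIELD_MAP = {
    "MaterialID": "asset_id",
    "Title": "title",
    "TXDate": "air_date",
}

def translate_metadata(source_record: dict) -> dict:
    """Rename the fields the target system understands and drop the
    rest, so only valid metadata is delivered alongside the media."""
    return {FIELD_MAP[k]: v for k, v in source_record.items()
            if k in FIELD_MAP}
```

In a real integration layer this mapping would also cover value formats (date styles, timecode bases, controlled vocabularies), not just field names.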

Control

The ability to move media is not useful without a method to control those movements. As with video routers, the control system may be built into the integration layer application. Alternatively, there may be a need to manage and monitor these movements from a higher level control system that has business and workflow logic built into it. Of course, if both methods exist, any transactions initiated via an external control system should be visible using the manual method as well. In this way, it can act as a backup strategy in the event of a failure.

In the context of media file integration, the control system may be a conventional automation system, but it is more likely to be an asset management system with some degree of workflow automation. As well as providing a mechanism for initiating the transfers, there must be a mechanism that allows the receiver to be aware of the incoming material and its transfer status.

For major installations and enterprise-level performance, the system should be able to gracefully handle as many exception cases as possible. Files may fail to transfer for many reasons, and it is important to notify the users (probably both source and destination users) that the transfer has failed.
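The retry-then-notify behavior described above can be sketched as follows. The `do_transfer` and `notify` callables are assumptions standing in for a real transfer engine and messaging mechanism:

```python
import time

def transfer_with_retry(do_transfer, notify, max_attempts=3, delay_s=0.0):
    """Attempt a transfer, retrying on failure; on final failure,
    notify both the source and destination users, then re-raise.
    `do_transfer` and `notify` are caller-supplied callables
    (an illustrative interface, not a real product API)."""
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            return do_transfer()
        except Exception as exc:
            last_error = exc
            time.sleep(delay_s)  # back off before the next attempt
    notify("source", f"Transfer failed after {max_attempts} attempts: {last_error}")
    notify("destination", f"Incoming transfer failed: {last_error}")
    raise last_error
```

Notifying both ends matters: the destination user is otherwise left waiting for material that will never arrive.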

Systems used for this kind of integration need to be flexible so they can be configured for a range of media formats and metadata structures. These systems must also be precise so that everything transferred is valid and not spurious or corrupted. This ensures that the resulting files at the destination are formed for the intended target device. These two requirements tend to conflict, making the detail of providing reliable integration difficult to achieve and maintain over a long period.

Manufacturers will upgrade their products and software versions, changing the manner in which third-party systems communicate with them. The integration layer must have a flexible architecture that allows it to be adapted to this continuously changing landscape, while still offering consistent user interface and workflow behavior.
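One common way of containing that churn is an adapter pattern: the integration layer keeps a registry of vendor- and version-specific adapters behind one stable interface, so a product upgrade means swapping an adapter rather than reworking the whole system. The vendor names and methods below are invented for illustration:

```python
# Registry of vendor/version-specific adapters behind one interface
# (an illustrative pattern; names and schemas are hypothetical).
ADAPTERS = {}

def adapter(vendor, version):
    """Class decorator that registers an adapter for a product version."""
    def register(cls):
        ADAPTERS[(vendor, version)] = cls
        return cls
    return register

@adapter("acme", "1.x")
class AcmeV1:
    def export(self, clip_id):
        return {"legacy_id": clip_id}   # older schema

@adapter("acme", "2.x")
class AcmeV2:
    def export(self, clip_id):
        return {"uuid": clip_id}        # schema changed after upgrade

def get_adapter(vendor, version):
    """Pick the adapter matching the installed product version."""
    return ADAPTERS[(vendor, version)]()
```

The user-facing workflow calls the same `export` interface either way; only the registered adapter changes when the manufacturer ships a new version.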

Movement

All of this is often carried out in an IT environment. Designing a secure network that supports the file movement and sharing needed requires skills that are in short supply. Most engineers skilled in general IT applications have no experience of the kinds of network and file traffic found in a broadcast facility, and most broadcast engineers lack the experience to design and configure network systems that are both efficient and secure. In many cases, over-engineering and unnecessary limitations on the workflow and flexibility of the system are the compromises accepted in order to deliver a network that can be used in these environments.

Some broadcasters have already found the answer to their cross-platform, multi-application integration problems by using specialized integration software. Such solutions act as media highways for content to move freely between a range of applications irrespective of their hardware platforms or software architectures.

This approach not only overcomes any interconnectivity bottlenecks, but also optimizes workflow efficiency through advanced manipulation and management of metadata. Media is wrapped and streamed for movement through the production process, allowing the metadata to remain attached. Such software allows for tightly integrated workflows based uniquely on the user's objectives and resources.

Benefits

As well as optimized interoperability between best-of-breed products, including solid-state and disk-based camcorders, an array of other benefits can be achieved by using specialized integration software. These include improved media tracking and accessibility, financial and time savings through fewer manual processes, better use of existing assets, and improved reporting, since the metadata can provide useful data to back-office systems for integration with administrative tasks such as billing and statistics reporting.

Most frustratingly, when it all comes together and works, the result looks so smooth and easy that it leaves everybody asking, “How come that was so difficult?”


Granby Patrick is partner director of technology for Marquis Broadcast.


