File-based workflows

The technology opens entirely new and efficient ways to produce and air content.

Attracted by the promises of lower costs, broadcasters are eagerly adopting file-based workflows.

There is nothing new about processing video as files. Files have long been the video format of choice for editing, and more recently for playout. Editing files on disk brings the great advantage of random access, which lies at the heart of the nonlinear editor. In playout, playing files from disks saves wear and tear on tapes, decks and robotics.

The challenge today is to link these two islands into a seamless workflow from acquisition to transmission. It is conventional to store and transfer material as videotape. To handle a production as files all the way to the consumer requires cost-effective and secure distribution and storage.

Over the years, broadcasters have developed tried and trusted procedures to ensure the safe transport of tape from one department to another. The issues for long-term storage of tape are no different from the storage of books or documents.

To join together many file-based islands into a connected workflow is not a trivial task. Once the issues of distribution and storage are resolved, then the real issues of a workflow emerge. In the world of videotape, a new facility can be added without affecting other technical areas. The transport of tape decouples each area, neatly avoiding many issues of integration. A file workflow cannot be simply overlaid onto existing processes if the true benefits are to be realized.

Security of transmission

It is commonplace to find a lone VTR in a playout area. This can be used to play a tape that arrived too late for ingest, or it can be used as a fallback to play a tape to air if the server or automation systems were to fail. Before a broadcaster will part with that tape deck, a file-based system must offer a higher level of reliability.

Acquisition

In the last few years, manufacturers have released video cameras that can shoot as files and store on a variety of media, including solid-state memory, optical disk and hard drive. Some manufacturers have gone as far as removing the tape transport altogether from the camcorder.

These original camera files can be delivered to the post house and ingested by the NLE. The editor software developers have kept pace with the acquisition formats, so in most cases, the ingest process is pain free. The post house is where the issues start. Post involves offline, online and numerous review and approval processes. Tapes are checked in and out of the vault, and handling media as files demands a similar form of control. Digital or media asset management offers such control, plus it can handle all the processes of reformatting and managing transfers and file storage.

Tapeless acquisition has several advantages, one being the speed with which the rushes can be loaded to a laptop editor so that logging and shot selection can start. But again, there can be problems. Files can be accidentally deleted or become corrupt. Much of this stems from a lack of established procedures and from the use of smaller crews.

On a film shoot, the DP could have a camera operator, a grip to look after the camera and a clapper loader to look after the precious negative. On a video shoot, one person may have to cover all these roles, plus record the sound. It's not surprising that a flash memory card goes missing.

Film crews have developed a methodology to get the negative to the lab without mishap, and video crews have worked out a logical way to handle tapes — set the record lock, make notes for the editor. File shooters must do the same by figuring out a way to back up files as they shoot, log metadata and create a routine that is as safe as legacy workflows.
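One way to make a file shoot as safe as the legacy workflows described above is to copy every clip off the card and record a checksum per file, so a corrupt or incomplete copy is caught before the card is wiped. The sketch below is illustrative; the function name and manifest layout are assumptions, not part of any camera vendor's toolset.

```python
import hashlib
import shutil
from pathlib import Path

def back_up_card(card_dir: str, backup_dir: str) -> dict:
    """Copy every clip off the card and record an MD5 per file,
    so a damaged or missing copy can be detected before the card
    is erased and reused. Returns a name -> checksum manifest."""
    manifest = {}
    dest = Path(backup_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for clip in sorted(Path(card_dir).rglob("*")):
        if clip.is_file():
            copy = dest / clip.name
            shutil.copy2(clip, copy)  # copy2 preserves timestamps
            # Checksum the *copy*, proving the backup itself is intact
            manifest[clip.name] = hashlib.md5(copy.read_bytes()).hexdigest()
    return manifest
```

Running the same checksum pass against the original card before erasing it, and comparing manifests, closes the loop in the same way a clapper loader's paperwork protects the negative.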

File formats

The media business had to construct a set of file formats from the ground up. The many formats spawned by the multimedia business were soon found to be lacking in basic features considered essential for broadcast applications. Time code is a good example of this.

A broadcast format must work with existing videotape formats and ideally support both streaming and conventional file transfer. A media file is usually a container holding video, audio and metadata essence, or at least references to atomic essence files. The files in the container must be synchronized so that the audio plays in a correct time relationship to the video track.
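The synchronization requirement can be illustrated with a toy model of a container: each essence track counts edit units against its own edit rate (frames for video, samples for audio), and the tracks are in sync when their durations agree. This is a minimal sketch of the idea, not any real container format's API.

```python
from dataclasses import dataclass

@dataclass
class EssenceTrack:
    kind: str          # "video", "audio" or "metadata"
    edit_rate: float   # edit units per second (e.g. 25 for 25 fps video)
    edit_units: int    # number of edit units recorded

    def duration_s(self) -> float:
        return self.edit_units / self.edit_rate

def tracks_in_sync(tracks, tolerance_s: float = 0.001) -> bool:
    """Tracks play in a correct time relationship when their
    durations match within a small tolerance."""
    durations = [t.duration_s() for t in tracks]
    return max(durations) - min(durations) <= tolerance_s

# One minute of 25 fps video against one minute of 48 kHz audio
video = EssenceTrack("video", 25.0, 25 * 60)
audio = EssenceTrack("audio", 48000.0, 48000 * 60)
print(tracks_in_sync([video, audio]))  # True: durations match exactly
```

A container that failed this kind of check would exhibit the classic lip-sync drift that broadcast formats are designed to prevent.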

International standard or proprietary?

Until the development of the DV and IMX tape formats, the only way to get video off a tape was to play it out in real time as analog or SDI, and then ingest and convert back to a file format. To get beyond this, the DV Digital Interface Format (DV-DIF) used the IEEE 1394 protocol as a transfer format. This enables data to be transferred from the videotape to a hard drive without an intermediate video connection, removing the need to digitize video into an NLE. Instead, the files can be ingested over FireWire, making laptop editing a possibility.

Sony soon followed with the e-VTR, based on 4:2:2 I-frame MPEG-2 recording at 50Mb/s (IMX), in contrast with the closed recording format used by Digital Betacam (DB). The IMX tape format is now standardized as SMPTE D10.

GXF

As playout servers became the favored format to broadcast commercials and interstitials, broadcasters found that they were locked into one vendor because the internal file formats were proprietary. Copying a file from one server to another from a different vendor required a decompress to SDI and then a re-encode, with the attendant quality issues of concatenation of codecs. Since most servers used variants of motion JPEG or MPEG-2 for storage, why not transfer content as files?

Transferring content as files avoids the decode/re-encode step of a baseband SDI dub, and files promise further efficiencies: transfer over IP networks, vendor independence and faster-than-real-time transfer. The answer was the development of the General Exchange Format (GXF), later standardized as SMPTE 360M.

MXF

Figure 1. An end-to-end broadcast chain can now be built using files throughout the workflow processes. MXF greatly eases file exchange between platforms.

The Material eXchange Format (MXF) extends the principles established by GXF by adding a great deal of flexibility so that it can be used throughout the broadcast workflow. (See Figure 1.) MXF addresses the requirements of a file-based replacement for videotape, and adds many more features that tape could never offer, like multiple video tracks.

For some, it is too flexible, leading to complexity in the implementation. In response, some user groups are developing recommended practices that constrain the number of options for a given application. A good example is the MXF Mastering Format, which aims to simplify file handling and content repurposing in playout centers.

The standardization processes have sensibly limited the number of codecs to around 10 of the popular SD and HD formats. This avoids all the issues that have existed in the world of graphics, where hundreds of formats, predominantly proprietary, have been spawned.

Limiting the number of codecs not only simplifies the design of file readers, but will also earn thanks from future generations as they mine our program archives.

File transport

The cost of long-haul telecommunications links has prevented the widespread national or intercontinental distribution of broadcast-resolution files. Short items, like advertising spots, have been distributed by satellite or fiber for some time, but the distribution of programs has taken longer. To make a business case, fiber must be cheaper than conventional courier delivery of a videotape (along with the costs of dubbing, QC and manual handling).

Fiber is gradually becoming an alternative to the courier, as telcos move toward Multi-Protocol Label Switching (MPLS) and away from the constraints of the digital hierarchies (PDH, SDH and SONET) designed for voice and data circuits. In many metropolitan areas, it is feasible to connect two premises with 1Gb/s fiber at a cost that makes the tape courier obsolete.
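The arithmetic behind the courier-versus-fiber comparison is straightforward. A one-hour program at 50Mb/s is 180Gb of data; even allowing generously for protocol overhead, a 1Gb/s metro link moves it in a few minutes. The efficiency factor below is an assumption for illustration, not a measured figure.

```python
def transfer_time_s(size_bits: float, link_bps: float,
                    efficiency: float = 0.7) -> float:
    """Wall-clock time to move a file, with an assumed factor
    for protocol and network overhead."""
    return size_bits / (link_bps * efficiency)

# One-hour program at 50 Mb/s = 50e6 b/s * 3600 s = 180 Gb (22.5 GB)
program_bits = 50e6 * 3600
minutes = transfer_time_s(program_bits, 1e9) / 60  # 1 Gb/s metro fiber
print(f"{minutes:.1f} minutes")  # roughly 4 minutes, versus hours by courier
```

At that speed the link cost, not the transfer time, decides the business case.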

Alongside private networks, several companies offer digital media networks specifically designed for the transport of video content. Typically, these commercial networks use WAN acceleration to overcome the disadvantages of FTP. Together, private and commercial networks supply the missing link that connects the television production islands.

Maximizing server cost/performance ratios

It would not be cost-effective to store a library of content on video servers. Video servers include integral coders and decoders, and they are designed for real-time operation. Therefore, video servers are unnecessary for general file storage. The most cost-efficient way to store video is on a hierarchy of storage, with the classic video server used for real-time ingest and playout, low-cost SATA/SAS arrays for nearline libraries, and data tape for backup and archive. (See Figure 2.) This concept is not new. Viewing copies were circulated as VHS tapes or DVDs before file-based workflows became commonplace.

To minimize the cost of servers, files must exist at different resolutions. Browse resolution can be stored online for instant access, and broadcast-quality archive files can be laid off to data tape. Playout server systems must be optimized to minimize the capacity needed to meet the demands of traffic. Long-GOP encoded files can be used to advantage in transmission. About half the size of the editable I-frame archive files, long-GOP files maximize the capacity (in program hours) of the more expensive video servers and can be transferred faster between the nearline and air servers.
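The capacity argument above can be made concrete: halving the bit rate doubles the program hours each terabyte of expensive server storage holds. The figures below follow directly from the bit rates in the text; the helper function itself is just illustrative arithmetic.

```python
def hours_per_terabyte(bitrate_mbps: float) -> float:
    """Program hours that fit in 1 TB of storage at a given bit rate."""
    bytes_per_hour = bitrate_mbps * 1e6 / 8 * 3600
    return 1e12 / bytes_per_hour

# Editable I-frame archive copy vs. a long-GOP transmission copy at half the rate
print(f"I-frame 50 Mb/s:  {hours_per_terabyte(50):.0f} h/TB")   # ~44 h
print(f"Long-GOP 25 Mb/s: {hours_per_terabyte(25):.0f} h/TB")   # ~89 h
```

The same ratio applies to transfer times between the nearline library and the air servers, which is why long-GOP files move twice as fast.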

If multiple resolution content files are used, media management must include transcoding engines. To maximize the potential video quality, the process pipeline for video files should minimize the concatenation of artifacts. The archive format is the benchmark, and that will set the best quality that can be achieved. Any other files will be derived from that, such as the long-GOP TX copies or low-resolution browse.

Selecting an archive format is an important decision. Ideally it should not be a lower quality than the acquisition format, but cost is always an issue.

Many broadcasters have adopted 50Mb/s I-frame 4:2:2 MPEG-2 (SMPTE D10) for SD content shot on DB. The data rate is less than half that of DB, but the increase in artifacts is considered minimal compared with the cost savings. Future archives for HD could be AVC I-frame, JPEG2000 or HDCAM (D-11). Who knows what is around the corner? It is all a balance between cost and quality.

Security

Tapes have several advantages when considering security. They are difficult to copy. To do so requires having two VTRs and the physical tape in your possession. Also, it is obvious if a tape is about to be over-recorded, as the record interlock must be switched — hardly an accidental operation. There is only one master tape, and it is relatively easy to keep track of clones. In contrast, it is simple to copy a file. It is also easy to overwrite a file.

The move to file-based production introduces many problems that never existed before. Take security: if you lose a tape, it may be inconvenient, but if a central file storage system gets corrupted, you have a major problem.

Tape systems are resilient to security breaches, a consequence of their dispersed nature and the physical nature of the recording medium. To many creative personnel, the adoption of the IT security measures found in banks is anathema. It is common to deliver graphics files or audio clips on USB sticks to an edit suite. Media organizations are learning the hard way that removable media can carry a computer virus, neatly bypassing an expensive firewall.

The physical tracking of tape using library software and bar codes has to be replaced with something much more complex for files. Standard IT practices of user authentication (i.e. passwords) and authorization can prevent a file from being copied or corrupted over the computer network. A user can be authorized to access specific folders and directories.

The operating system (OS) associates each file with an owner (usually the person who created the file), a group of users (project team, workgroup, department) and everyone else (others). The file then carries flags indicating who may access it and whether each has read-only or read-and-write privileges. These flags can be set separately for “user,” “group” and “others.” For example, the archivist may have write permission, which means they can delete the file; an editor may have permission to read the file but not to write it, so to edit it, he or she would have to work on a copy.
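On a POSIX-style system, those user/group/others flags are ordinary permission bits. A short sketch of the archivist example, using standard library calls on a throwaway temporary file:

```python
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()  # throwaway file standing in for an archive asset
os.close(fd)

# Archivist-style permissions: the owner may read and write (and so delete);
# the group may only read; everyone else gets no access at all.
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o640: rw- for user, r-- for group, --- for others

os.remove(path)
```

An editor in the file's group could open it read-only but any attempt to write would fail, forcing them to work on a copy, exactly the behavior described above.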

DAM systems add much more to authorization, with support for fine-grained group definitions that can cater to different roles within a workgroup. They also allow more flexibility when it comes to personnel, like the ability for a supervisor to override permissions if the user is on vacation but the file must be processed immediately. This complex, role-based authorization gives the flexibility to use files in the broadcast environment, and extends that offered by an OS.

Aside from access and authorization, files must be protected against equipment failure. Backup procedures and RAID storage systems cover most of these issues. The other form of protection is to duplicate the main file archive at a remote site to permit simple recovery from a disaster like a bomb, fire or earthquake.

The level of backup and disaster recovery facilities is a balance between cost, convenience and the value of the content assets. If the archive is compromised, and it takes a day to retrieve a file from the disaster recovery site, then that may be acceptable. In master control, however, the backup must be available instantly.

Summary

File-based workflows offer many advantages to the broadcaster, including the potential to lower costs, enable collaboration and simplify multiformat publishing. However, a file-based workflow should not be dropped on existing systems without due consideration of the implications.

Staff members need training in a new approach to security, and IT procedures must reflect the level of threats. Media organizations will have to take precautions similar to those used by banks and government institutions. This is not easy, as media workers expect unfettered access to the Internet for research and communication. Plus, a great deal of file interchange is needed in the regular production workflows. Graphics and audio assets are frequently handled by different companies or departments from those handling the video. All gateways must be carefully designed to reflect the best security practices, but at the price point that the value of the content warrants.

The broadcast community is now in a position to adopt files throughout the workflow, from acquisition to delivery to the consumer. We have all the pieces to build the content layer: broadcast-friendly file formats, cost-effective contribution-quality file distribution networks, and secure management procedures.

The remaining issues are business and human. File operations need new, optimized workflows to realize the maximum cost-savings. These can be achieved through the use of methodologies like business process management. Television is set to join the enterprise business community.