The emergence of OTT television and “TV Everywhere” distribution, combined with the dwindling availability of tape stock, is the primary force driving the broadcast industry's transition to file-based workflows. Consumers, meanwhile, are demanding ever more mobile content. Though file-based workflows make certain jobs easier, such as distribution to local or affiliate stations and the quick retrieval of archival content, issues remain. For the setup to function, a strong combination of IT and broadcast expertise is needed. Because each distribution outlet has its own unique file format, multiple transcodes must be performed on a large volume of content in a short amount of time.
It is no longer the case that content earmarked for Internet distribution can be of lower quality than content for broadcast distribution. Broadcasters delivering content to online outlets such as iTunes and Netflix must grapple with the same QC issues as they do with traditional broadcast outlets. QC checks for audio, video and associated metadata must be in place. Because each distribution method calls for its own specific file format, with its own parameters, the volume of content being managed keeps increasing while staffs keep shrinking. Automation is an important element in this scenario, as it lessens the load, but it is only effective when combined with a strong human support team combining IT and broadcast engineering expertise.
IT vs. broadcast quality
When it comes to developing and maintaining a file-based workflow, both IT and broadcast considerations must be taken into account. On the IT side, staff must ensure the storage and server infrastructure is adequate and keep the equipment itself running. On the broadcast side, workers must manage the quality of the content being processed and eventually aired. IT personnel can handle integration issues such as getting content into the workflow and files out of it. Once all servers and software are in place and tested, the system can run itself. If an issue does arise, IT personnel can troubleshoot as they normally would when a typical Exchange server goes down, until the issue is resolved.
IT personnel can ensure that files are getting from point A to point B, but often fall short when it comes to checking the quality of the content being sent out. Some of this has to do with IT staff having grown up with lower expectations of video and audio quality for online or mobile outlets, where standards were previously lower. Those who have been evaluating video all along for traditional broadcast will take a different approach, making sure the content is broadcast quality. It's hard to find one person who does it all well, but if one person is focused on each area, the two can work hand-in-hand. The broadcast side can set the standards and handle the QC checks, while the IT side can make sure the setup is working as it should.
Automation is key
Even with the proper personnel in place, a successful file-based broadcast operation depends on technology. When it comes to the workflow setup, broadcasters aren't progressively adding human resources; they're adding servers and physical hardware. Because the volume of content isn't going down, the setup works only if it is automated. Most broadcasters would not add 100 people to an organization to brute-force this increase in files. Instead, they want staff to set parameters and let the automated technology handle the rest.
There are many software and hardware options on the market that claim to be automated but in practice are not. How they actually achieve automation is key to determining whether they can back up these claims. If a file needs to be reingested after each step in the workflow, it is not automated. True automation is achieved when multiple processes are integrated under one unified user interface. This allows the file to move through every stage of preparation, from ingest through to distribution, without human involvement. Some may assume that achieving these goals means developing or revising a workflow from scratch, but it does not. What they should look for is software that allows the programs currently used for processing to be integrated into one interface.
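The unified-interface idea above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual API: each processing stage is registered once, and the same media record flows through every stage in order, with no write-out and reingest between steps.

```python
# Minimal sketch of a unified processing pipeline (all names hypothetical).
class Pipeline:
    def __init__(self):
        self.stages = []  # ordered list of (name, function) pairs

    def add_stage(self, name, func):
        self.stages.append((name, func))
        return self  # allow chaining when wiring up the workflow

    def run(self, item):
        # The same in-memory record moves through every stage;
        # nothing is written out and reingested between steps.
        for name, func in self.stages:
            item = func(item)
            item.setdefault("history", []).append(name)
        return item

# Stub stages standing in for real ingest/transcode/QC tools.
def ingest(item):
    return {**item, "ingested": True}

def transcode(item):
    return {**item, "format": "XDCAM HD422"}

def qc_check(item):
    return {**item, "qc_passed": True}

pipeline = (Pipeline()
            .add_stage("ingest", ingest)
            .add_stage("transcode", transcode)
            .add_stage("qc", qc_check))

result = pipeline.run({"source": "show_101.mxf"})
```

The point of the sketch is the `history` list: one record accumulates every stage it has passed through, which is exactly what is lost when a file must be exported and reingested between disconnected tools.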
QC falls under the umbrella of process automation. With Internet and mobile content now expected to match the quality of a traditional broadcast, media outlets need to rethink every part of the content going out, from audio and video to ancillary data. Soon, viewers will demand closed captioning for mobile content, and loudness regulations will make their way to the mobile arena. Broadcasters can prepare for these changes now by accommodating transcoding to these different file formats, as well as preparing files to meet audio and video standards.
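As one illustration of per-outlet preparation, a workflow can keep a profile for each distribution target and generate the transcode job from it. The profiles below and the use of ffmpeg's loudnorm filter are my assumptions for the sketch, not a recommendation from the article; a real system would use its integrated transcoder's own settings.

```python
# Hypothetical per-outlet transcode profiles; loudness targets in LKFS.
PROFILES = {
    "broadcast": {"vcodec": "mpeg2video", "bitrate": "50M", "loudness": -24},
    "web":       {"vcodec": "libx264",    "bitrate": "5M",  "loudness": -24},
}

def build_transcode_cmd(src, dst, outlet):
    """Build (but do not run) an ffmpeg command for the given outlet."""
    p = PROFILES[outlet]
    return [
        "ffmpeg", "-i", src,
        "-c:v", p["vcodec"], "-b:v", p["bitrate"],
        # EBU R128-style loudness normalization toward the profile target
        "-af", f"loudnorm=I={p['loudness']}:TP=-2",
        dst,
    ]

cmd = build_transcode_cmd("ep1.mxf", "ep1_web.mp4", "web")
```

Keeping the profile data separate from the command builder is what lets staff "set parameters and let the technology handle the rest": adding a new outlet is a data change, not a workflow change.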
Which file transcodes a software system can support is important, not only for compliance but also for media outlets that are changing their archive setup. A lot has changed since file-based workflows were first implemented; broadcasters are converting the file formats in their archives to go along with new equipment or to better fit the needs of the outlet. To maintain automation throughout, these processes should not be merely added on, but fully integrated into the current software workflow.
With software now able to transcode files to almost any format and even correct for loudness and audio issues, how can one be sure that nothing has happened to the file during the process? Automated QC checks for audio, video and ancillary data need to be put in place to keep the process moving forward. Checks should be performed after media ingest to look for issues before moving a file into the preparation or transcoding stage. Users will also want to check content post-transcode, and before distribution, to mitigate the risk of sending out bad content. Some QC software will also quarantine a problem file, letting users decide how to proceed. This further automates the process: problem files do not hold up the rest of the files being processed, and users do not have to reingest them.
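The quarantine step described above can be sketched as follows. The specific check names and thresholds here are illustrative, not any particular product's rules; the point is that failures are set aside with a reason attached while the rest of the batch keeps moving.

```python
# Hypothetical automated QC pass with quarantine.
def qc_checks(media):
    """Return a list of QC failures for one media record."""
    failures = []
    if media.get("video_dropouts", 0) > 0:
        failures.append("video dropout detected")
    # Illustrative loudness gate: within 2 LKFS of a -24 LKFS target.
    if abs(media.get("loudness_lkfs", -24.0) - (-24.0)) > 2.0:
        failures.append("loudness outside target range")
    if not media.get("captions_present", False):
        failures.append("closed captions missing")
    return failures

def process_batch(batch):
    passed, quarantined = [], []
    for media in batch:
        failures = qc_checks(media)
        if failures:
            # Quarantine: record why, and keep the rest of the batch moving.
            quarantined.append({**media, "qc_failures": failures})
        else:
            passed.append(media)
    return passed, quarantined

batch = [
    {"name": "ep1.mxf", "loudness_lkfs": -24.0, "captions_present": True},
    {"name": "ep2.mxf", "loudness_lkfs": -13.5, "captions_present": True},
]
ok, held = process_batch(batch)
```

Because the quarantined record carries its failure list, an operator can decide how to proceed without reingesting the file or stopping the pipeline.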
It is also important to think through how and when the processing software connects with the trafficking system. Previously, a trafficking system would link up with the processing software only when a file was fully ready for distribution. Software available today can bridge the gap between the archive and distribution, which makes it easier to coordinate separate advertising opportunities for each distribution method.
By linking the media processing software to the station's broadcast trafficking and rights system, users can schedule content by release date for air. This is assisted by integration standards such as the Broadcast eXchange Format (BXF) and the Framework for Interoperable Media Services (FIMS), which allow users to easily integrate their processing system with their broadcast trafficking system. Because integration now takes place during the processing phase, the air schedule can be put in place at that time, along with the rules around it dictated by upstream business processes. Another process that should fall within this area is checking the rights associated with a file. Software is available that can alert users when those rights have expired; if all of the right items are not in place, it will not allow the content to go to air.
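The rights gate at the end of that paragraph amounts to a simple check between the traffic schedule and playout. This is a hypothetical sketch (the record layout is my assumption): content whose rights window does not cover the scheduled air date is held back with a reason, rather than allowed to air.

```python
from datetime import date

def clear_for_air(item, air_date):
    """Return (approved, reason). Approved is False if rights are missing
    or the scheduled air date falls outside the rights window."""
    rights = item.get("rights")
    if rights is None:
        return False, "no rights record on file"
    if not (rights["start"] <= air_date <= rights["end"]):
        return False, "rights expired or not yet active for this air date"
    return True, "cleared"

item = {"title": "Movie A",
        "rights": {"start": date(2013, 1, 1), "end": date(2013, 6, 30)}}

ok, why = clear_for_air(item, date(2013, 7, 15))  # past the rights window
```

Running this check during the processing phase, rather than at playout, is what the BXF/FIMS-style integration described above makes possible.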
Broadcast engineers and IT engineers have different, but complementary, skill sets. IT resources are needed to keep the systems running, focus on storage and networking support, and maintain a consistent data flow between the tiers of the workflow. Broadcast engineers are still needed to understand the wide range of video, audio and caption specifications, and to put policies and workflows in place that maintain compatibility with the hardware used for playout to consumers. Broadcast engineers are also a vital resource in the QC process: having a resource with “golden eyes” or “golden ears” is key to delivering the best-quality video to the consumer. Clear communication between both parties is essential when setting up and working with the elements of a file-based workflow, including broadcast trafficking systems. This allows both parties to understand the needs of all staff and the limitations of the technologies involved.
Smart TVs have eclipsed 3-D TVs in sales, forcing broadcasters to streamline their workflows and the processes around that architecture. There is no standard file format across these distribution outlets, which further adds to the amount of content being managed. Automation helps ensure that files keep moving forward, but it is effective only when combined with proper QC processes.
Kirk Marple is president and chief software architect of RadiantGrid Technologies.