The following steps, when done correctly, help prepare content for distribution formats.
While a file-based workflow is typically thought of as nonlinear, its steps should be followed in a linear order to ensure successful results. The steps — ingestion, indexing, quality control (pretranscode), transcoding, quality control (post-transcode), publishing, distribution and notification — when done in the proper order, help prepare content for the various distribution formats, including online, video on demand and cable. Depending on the transcoding processes being performed, this workflow can combine both hardware-based and software-based technologies. Other considerations to keep in mind when selecting elements for your workflow are the turnaround time, the number of files being managed and the staff available to help with the process. Let's take a look at each of the steps involved so you can better understand their benefits.
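The linear ordering of the steps can be sketched as a simple dispatch loop. The step names mirror the list above, but the `run_pipeline` helper and handler scheme are hypothetical, not any product's actual API:

```python
# A minimal sketch of the linear step order described above; the
# run_pipeline helper and handler mapping are illustrative only.
STEPS = [
    "ingestion",
    "indexing",
    "qc_pretranscode",
    "transcoding",
    "qc_posttranscode",
    "publishing",
    "distribution",
    "notification",
]

def run_pipeline(asset, handlers):
    """Apply each step's handler to the asset strictly in order."""
    for step in STEPS:
        asset = handlers.get(step, lambda a: a)(asset)
    return asset
```

Whatever software performs each step, the point is that every asset passes through the same stages in the same order.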
Ingestion
All content must be brought into the workflow via satellite transmission, a file from an editing workstation, or physical media such as a videotape, CD or DVD. When selecting software to handle a file-based workflow, it is best to select one that can handle the majority, if not all, of these ingestion formats. During ingestion, files are preprocessed into a form that is optimal for the transcoding stage. This can mean breaking down, or demuxing, files into their essence formats. From these essences, mezzanine streams, which are optimal for transcoding, can be made from the source media.
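An ingestion step along these lines can be modeled as splitting a container into its essence streams. The file name and stream layout below are invented for the sketch; real demuxing would use a media framework rather than plain dictionaries:

```python
# Illustrative ingestion step: demux a container into essence streams
# and derive a mezzanine description. Names and layout are assumptions.
def ingest(source):
    """source: dict describing a container, e.g. {"name": ..., "streams": [...]}."""
    essences = {}
    for stream in source["streams"]:
        essences.setdefault(stream["type"], []).append(stream)
    return {
        "source": source["name"],
        "video": essences.get("video", []),
        "audio": essences.get("audio", []),
    }

# A hypothetical ingested master with one video and two audio essences.
tape = {"name": "show_master.mxf",
        "streams": [{"type": "video", "codec": "mpeg2"},
                    {"type": "audio", "codec": "pcm"},
                    {"type": "audio", "codec": "pcm"}]}
```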
Indexing
While ingestion creates the assets in the repository, indexing — the next step — creates the metadata attached to those assets. Via indexing, all media-specific metadata (e.g. file length, frame rate and codec) is extracted. Because metadata carries a big portion of the file-based communication process, the best software solutions are those that make the data easy to manage and easy to change. Many software platforms offer catalog management, which allows all items associated with a file (a thumbnail view, a preview version, the master) to be packaged and delivered with the metadata throughout the workflow, along with the ability to search by any of these connected items.
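A catalog of that kind can be sketched as indexed metadata records with a field-based search. The field names (`duration_s`, `frame_rate`, `codec`) and the `Catalog` class are assumptions for illustration:

```python
# Illustrative indexing step: register media-specific metadata in a
# searchable catalog. Field names and the Catalog class are assumptions.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    asset_id: str
    duration_s: float
    frame_rate: float
    codec: str
    related: dict = field(default_factory=dict)  # thumbnail, preview, master

class Catalog:
    def __init__(self):
        self._entries = {}

    def index(self, entry):
        self._entries[entry.asset_id] = entry

    def search(self, **criteria):
        """Return entries whose fields equal every given criterion."""
        return [e for e in self._entries.values()
                if all(getattr(e, k, None) == v for k, v in criteria.items())]
```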
Quality control pretranscode
Before the transcoding process starts, the file needs to be checked to make sure that all elements of the ingested file, such as the frame size and the bit rate, are correct, or there will be issues later on. Quality control can be performed in two ways. One is through an external or Web-based program that checks the policies set for the project. The second is a software program that can be integrated into the workflow. The time available for this stage will help determine which option works best for a particular workflow.
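A policy check of that kind reduces to comparing a file's measured properties against project rules. The policy keys and threshold values here are invented for the sketch:

```python
# Illustrative pretranscode QC: validate an ingested file against project
# policies before transcoding. Policy keys and values are assumptions.
def check_policies(file_info, policy):
    """Return a list of violations; an empty list means the file passes."""
    violations = []
    if file_info.get("frame_size") != policy["frame_size"]:
        violations.append("frame_size: expected %s, got %s"
                          % (policy["frame_size"], file_info.get("frame_size")))
    if file_info.get("bit_rate", 0) < policy["min_bit_rate"]:
        violations.append("bit_rate below minimum")
    return violations

# A hypothetical HD project policy.
policy = {"frame_size": (1920, 1080), "min_bit_rate": 15_000_000}
```

Whether this runs in an external tool or inside the platform, the output is the same: a pass, or a list of problems to fix before transcoding.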
Transcoding
All audio, image, closed-captioning and subtitle processing happens in the transcoding phase. Transcoding can take place entirely within the software platform or within a combination of software and external hardware. Many transcoding software platforms now include audio and video processing that was previously found only in hardware. Because that processing is now in software, multiple transcoding processes can run simultaneously within the platform, which can speed up the process, particularly when working with a large number of files.
For those looking to use a combination of hardware and software processors, transwrapping (also called transmuxing) is another option. With transwrapping, the source file is ingested, and the video and audio are demuxed into the essence streams. This allows either the video or the audio to be processed within the platform; then both are muxed back together, and the transwrapped file can be processed using hardware processors.
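Transwrapping can be pictured as demuxing the essences, reprocessing only one of them in software, and muxing the result back into a container for downstream hardware. All names here are invented for the sketch:

```python
# Illustrative transwrap: demux essence streams, process only one of them
# in software, then remux for hardware to finish. Names are assumptions.
def transwrap(container, process_video=None, process_audio=None):
    video = [s for s in container["streams"] if s["type"] == "video"]
    audio = [s for s in container["streams"] if s["type"] == "audio"]
    if process_video:
        video = [process_video(s) for s in video]
    if process_audio:
        audio = [process_audio(s) for s in audio]
    # Mux the (possibly reprocessed) essences back into one container.
    return {"name": container["name"], "streams": video + audio}
```

For example, the audio could be re-encoded in software while the video essence passes through untouched for a hardware processor.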
Another feature to consider is the ability to assemble one or more assets into the final product for distribution. One common scenario is stitching together a black slug, a promotional interstitial, the master asset (movie, TV show, etc.), a trailing interstitial and, finally, a trailing black slug. Some developers also offer multitrack assembly, which can serve as a basic nonlinear editor so that different takes of the same project can be put together.
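The stitching scenario above amounts to concatenating segments into one timeline. The segment durations here are invented for the sketch:

```python
# Illustrative assembly: stitch slugs, interstitials and the master asset
# into one timeline in the order described above. Durations are invented.
def assemble(segments):
    """Concatenate segments in order and compute the total running time."""
    return {"timeline": [seg["name"] for seg in segments],
            "duration_s": sum(seg["duration_s"] for seg in segments)}

final = assemble([
    {"name": "black_slug", "duration_s": 2},
    {"name": "promo_interstitial", "duration_s": 15},
    {"name": "master_asset", "duration_s": 1320},
    {"name": "trailing_interstitial", "duration_s": 15},
    {"name": "trailing_black_slug", "duration_s": 2},
])
```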
Along with the software and the processes handled within the platform, another factor to consider is the actual transcoding process being used and how the work is distributed across the servers. Some software distributes transcoding tasks across the transcoding farm as capacity becomes available. Though effective, this can limit the speed at which a file can be converted. A second option is grid transcoding, which transcodes the source content in parallel across all available transcoding resources and can speed up the transcoding process.
Quality control post-transcode
A completed transcode doesn't always guarantee a conforming file. As with pretranscode quality control, this stage is where the transcoded file is validated, and the testing can again be performed either by an external program or by one within the software platform.
Publishing
In the publishing step, the transcoded files and metadata are taken into the repository and packaged for delivery. Publishing doesn't touch the actual media file, but it may place the files into a special directory structure or rename them so they are properly identified for output.
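Because publishing only arranges and renames files, it can be sketched as a mapping from transcoded files to delivery paths. The directory scheme below is invented for illustration:

```python
# Illustrative publishing step: the media bytes are untouched; files are
# simply mapped into a delivery layout. The naming scheme is an assumption.
def publish(asset_id, rendition_files, target="vod"):
    """Map each transcoded file to its path in the delivery package."""
    return {f: "{}/{}/{}".format(target, asset_id, f) for f in rendition_files}
```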
Distribution
The second-to-last step of the process is distribution, which takes the generated files, meaning the transcoded files and possibly the metadata, and pushes them out to a file server. From there, the files could be posted to a website, sent to an online cable provider, made available to a download store such as iTunes for distribution and purchase, or delivered to online video services such as Hulu.
Notification
Even though there is a lot of software involved in a file-based workflow, things never happen in a vacuum. The final stage, notification, can be handled either by humans or by an automated process. For example, this could mean simply sending an e-mail, sending a message via a Web service, or using a notification system within the software platform to tell the final user that the files are there. The number of people using the files and their general proximity will determine the best solution.
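An automated version of this stage can be sketched as fanning one "files are ready" event out to whatever channels are registered. The channel names and `notify` helper are assumptions for illustration:

```python
# Illustrative notification step: send one event to every registered
# channel (e-mail, Web service, in-platform). Names are assumptions.
def notify(event, channels):
    """Deliver the event to each channel; return the names notified."""
    delivered = []
    for name, send in channels.items():
        send(event)
        delivered.append(name)
    return delivered

# Hypothetical channels that just record what they would have sent.
sent = []
channels = {
    "email": lambda e: sent.append(("email", e["asset"])),
    "webhook": lambda e: sent.append(("webhook", e["asset"])),
}
```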
Clearly, a file-based workflow is a combination of old practices and new. Software now allows much of the process to be automated, but there will always be a human element in the workflow; for example, taking a file out of one piece of software, loading it into another, transcoding it and placing the result in another folder. Having a better understanding of the various elements involved in a file-based workflow will help you create better results for your projects and give you the ability to better manage your content.
Kirk Marple is president and chief software architect of RadiantGrid Technologies.