
By the time you read this column, the complete overhaul of content workflow within public television will have started. As described in my last column, the first changes will take place within PBS and will focus on the processes of ingesting programs, performing technical evaluations, screening for subjective content-related issues and, eventually, archiving programs and interstitials into permanent storage.


Our goals for this process were easily stated, but as it turned out, accomplishing them took quite a lot of business process re-engineering, substantial inter-vendor cooperation and several exhaustive iterations of workflow analysis. Our aim was to create a system that would completely eliminate tape activities after ingest and simultaneously enable our content producers to migrate to file-based program submissions as system evolution allows.

The entire sequence of events starts with the creation of a program record in our database. Via the Web, program producers provide PBS with initial content metadata such as title, segment timings, air rights and other program-related information.
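A minimal sketch of the kind of initial metadata record a producer might submit is shown below; the field names here are my own illustrations, not PBS's actual schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative only: these field names are assumptions, not PBS's schema.
@dataclass
class ProgramRecord:
    title: str
    segment_timings: List[str]   # e.g. SMPTE timecode in/out pairs
    air_rights_expire: str       # ISO 8601 date
    extra: Dict[str, str] = field(default_factory=dict)

record = ProgramRecord(
    title="Example Program",
    segment_timings=["00:00:00;00-00:12:30;00", "00:12:30;00-00:26:40;00"],
    air_rights_expire="2008-12-31",
)
```

Capturing this structure up front is what allows everything downstream to be triggered automatically rather than re-keyed by hand.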

This initial data entry automatically triggers several downstream processes:

Through an XML-based Microsoft BizTalk interface, work orders for processing the content will be created in ScheduALL, a facilities and human resource management package that allows us to use all our resources better. In time, these work orders will create invoices directly in our Oracle Financials ERP system.
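To make the idea concrete, a work-order message of this kind might be assembled as below; the element names are illustrative assumptions, not the actual BizTalk/ScheduALL schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical message shape; element names are illustrative and are not
# the actual BizTalk/ScheduALL schema.
def build_work_order(program_id: str, task: str) -> str:
    order = ET.Element("WorkOrder")
    ET.SubElement(order, "ProgramID").text = program_id
    ET.SubElement(order, "Task").text = task
    return ET.tostring(order, encoding="unicode")

message = build_work_order("PROG-0001", "ingest")
```

The point of an XML interchange layer like this is that the same message can later feed other systems, such as invoice creation in the ERP, without re-entry.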

Simultaneously, the proper folder and file structures will be created in our Avid environment to receive the incoming program essence.
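Pre-creating the layout might look something like the sketch below; the subdirectory names are my assumptions, not Avid's actual structure:

```python
import tempfile
from pathlib import Path

# Sketch only: the subdirectory names are assumptions, not Avid's layout.
def create_program_folders(root: str, program_id: str) -> Path:
    base = Path(root) / program_id
    for sub in ("essence", "proxy", "metadata"):
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base

base = create_program_folders(tempfile.mkdtemp(), "PROG-0001")
```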

Depending on the original data entry, workflow steps will be created that control the program's progress from initial ingest to first broadcast.
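Such a progression can be modeled as a simple ordered sequence of states; the step names below are illustrative, not the system's actual states:

```python
# Sketch of a linear progression of workflow states; the step names are
# illustrative, not the system's actual state machine.
STEPS = ["ingested", "tech_evaluated", "screened", "archived", "ready_for_air"]

def advance(current: str) -> str:
    """Return the next workflow state, or the same state if already last."""
    i = STEPS.index(current)
    return STEPS[min(i + 1, len(STEPS) - 1)]
```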

These data elements are eventually matched with the program essence when it arrives at PBS. Initially, this process will still rely on content producers creating program master and backup tapes and sending them to PBS where they will be ingested directly into Avid Adrenaline workstations. During ingest, frame-accurate timings will be verified against the original metadata while the program is checked for any technical defects potentially introduced by the tape processes.


Once a program is complete, contains all the necessary elements (closed captioning, descriptive video services, secondary languages), and is ready for air, we will immediately create and archive three copies encoded at different bit-rates:

The first copy will be archived in the IMX50 format and will later be used for post-production, including the creation of promos, promo reels or the repackaging of the program with different content underwriters. This process will leverage the Dynamically Extensible Transfer interface between the Avid system and our MassTech asset management system.

The second copy will leverage a Data Handling Module interface between Avid and the Omneon servers to store an IMX50 file which, under complete automation control, will be turned around to produce an MPEG-2 distribution file encoded at about 12 Mb/s (8 Mb/s for video and 4 Mb/s for audio). These distribution-quality files will be permanently archived in storage for later rebroadcast. They will contain the necessary elements to be leveraged via real-time streaming or in the IP-based multicasting file delivery system that we will deploy as part of public television's Next Generation Interconnection System (NGIS). The NGIS will be covered in one of my next columns.
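The column's figures imply a straightforward storage budget per broadcast hour, sketched here as back-of-envelope arithmetic:

```python
# Back-of-envelope storage math for the distribution copy, using the
# figures above: roughly 12 Mb/s total (8 Mb/s video + 4 Mb/s audio).
video_mbps, audio_mbps = 8, 4
total_mbps = video_mbps + audio_mbps           # 12 Mb/s
hour_seconds = 3600
megabytes = total_mbps * hour_seconds / 8      # megabits -> megabytes
gigabytes = megabytes / 1000                   # about 5.4 GB per broadcast hour
```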

The third and final copy to be created by this automated process will be a 1.5 Mb/s low-resolution proxy that will enable our programming operations staff to screen the programs specifically to identify and flag coarse language, profanity, graphic violence or any other content that might be deemed unsuitable for a general audience under current FCC regulations. These proxy files will be available on a screener's desktop and can be called up from the Broadview scheduling and traffic system. Their availability enables additional verification of segment timing and immediate entry of any content-related flags into the program's database, obviating the VHS dub-based system and data re-entry that we use today.
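A screener's flag entry might be captured as a small structured record along these lines; the categories and fields are my assumptions, not Broadview's actual data model:

```python
from dataclasses import dataclass

# Illustrative record of a screener's content flag; the categories and
# fields are assumptions, not Broadview's actual data model.
@dataclass
class ContentFlag:
    program_id: str
    timecode: str     # where in the program the issue occurs
    category: str     # e.g. "language", "violence"
    note: str = ""

flag = ContentFlag("PROG-0001", "00:42:10;12", "language", "brief profanity")
```

Because each flag lands directly in the program's database record, the same data is immediately available to scheduling and traffic without re-keying.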

As they stand today, these metadata- and database-driven workflows will give PBS and its member stations far more efficient and accurate processing of every program.

Eventually, we expect that the initial tape-based ingest will be replaced with Internet-based file transfers. After that, step-by-step, we will continue to optimize the public television content supply chain.

I have no doubts that our new content ingest workflows represent the beginning of what will be a long and arduous but nevertheless rewarding journey.

Count on IT!

(I would be remiss if I didn't point out the tremendous cooperation that made the emergence of these workflows possible. Our operations folks led by Steve Scheel and Wendy Allen, our programming folks led by Sharon Drayton and Caryn Ginsberg, and Michael Hunt as the overall project manager put in very long hours to create this substantial leap in processing efficiency.)