Digital spot delivery

As Lifetime's network operations facility made the transition to digital technology, daily deliveries of commercial spot reels seemed increasingly anachronistic. The facility processed thousands of spots annually, and each reel required a technician to manually transcribe its label data into the traffic system and then perform an ingest step to encode the spot to a digital file format.

Thousands of manually processed spots represented thousands of opportunities for error and thousands of boxes and reels to dispose of. The network kept two aging one-inch VTRs in operation solely to handle spot ingest. When satellite and IP-based digital spot delivery services began to appear, its motivation to investigate them was high.

DG Systems was the first such service the facility encountered, followed by FastChannel, Media DVX and Vyvx ADS. Though each has distinguishing features, the premise is the same: Commercial spots in digital file format are delivered to a catch server in the facility via satellite or other connection. With this type of system, ingest is done by the delivery service. No more spot reels!

As is often the case with a transition to a new technology, the hardware was ready, but the software, the processes and the workflow needed careful adaptation, even total re-invention. The most important realization was that spot-prep processes had evolved over the years to include multiple quality-checking and gatekeeping operations. The network began an end-to-end process review to ensure that these needs continued to be met.

As the analysis began, some questions were as simple in nature as they were profound in their impact: When a spot is delivered electronically to an ever-waiting catch server, how will the channel know? If the facility drags the spot onto its on-air server, how does it update its traffic and automation databases? When does the network view the spot, or does it have to? Questions mounted, and an analysis took shape. The station asked: What did it hope to get out of this change? What did it want to avoid?

First steps

The evolution began with the network receiving small numbers of files via one of the services. Most processes remained the same, sans videotapes. Whereas the delivery of a spot reel in the morning's mail was self-evident, the delivery of a file to a server was heralded by a fax or e-mail to a designated recipient. That notification became the source for the label data, which still was transcribed manually into the traffic system.

This traffic system entry continued to serve as the signal to the traffic department that the spot master was in-house. Traffic assigned house ID numbers to the priority spots so they would be included in that evening's dub order, which the librarian then generated both in hard copy and as a text file on diskette. Master control ingested the spots upon receiving the reels and the dub order. One change was that the dub order was amended to group the disk-based spots together for ease of processing and to ensure that the techs knew where to find the spot masters.
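To make the grouping concrete, here is a minimal sketch in Python of generating such a dub order. The pipe-delimited record layout and the field names are assumptions made for illustration; the actual dub-order format isn't described here.

```python
# Minimal sketch: write a dub-order text file that groups disk-based
# spots ahead of tape-based ones. The record layout is an assumption.

def write_dub_order(spots, path):
    # Sorting on a boolean key puts "disk" sources (False) first, so the
    # file-based spots are grouped together for the master-control techs.
    ordered = sorted(spots, key=lambda s: s["source"] != "disk")
    with open(path, "w") as f:
        for spot in ordered:
            f.write("{house_id}|{title}|{duration}|{source}\n".format(**spot))

write_dub_order(
    [
        {"house_id": "LT1001", "title": "Promo :30", "duration": "00:30", "source": "tape"},
        {"house_id": "LT1002", "title": "Cereal :15", "duration": "00:15", "source": "disk"},
    ],
    "dub_order.txt",
)
```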

Initially, a catch server was interfaced to the automation ingest station to behave like a VTR. A tech first loaded the dub-order text file into the automation database. Then, commercial spots were ingested by decoding each file to video and re-encoding it to the on-air content server in real time, matching each spot to a record in the dub order. While not as efficient as a file transfer, this was far more efficient than laying the spots off to tape for later ingest — a step the facility had no interest in taking. These measures were a step forward, but only a small one. Worse, the channel could process only a few spots this way because only one of the catch servers could emulate a VTR.
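The matching step itself is simple to picture. Building on the hypothetical dub-order format sketched above, the automation side amounts to loading the records into a lookup keyed by house ID and checking each arriving spot against it:

```python
# Sketch of the matching step, using the assumed record layout above:
# load the dub order into a dict keyed by house ID, then look up each
# spot as it is ingested.

def load_dub_order(path):
    records = {}
    with open(path) as f:
        for line in f:
            house_id, title, duration, source = line.rstrip("\n").split("|")
            records[house_id] = {"title": title, "duration": duration, "source": source}
    return records

def match_spot(house_id, records):
    record = records.get(house_id)
    if record is None:
        # No dub-order entry: the spot should not go to the on-air server.
        raise LookupError(f"no dub-order record for spot {house_id}")
    return record
```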

A leap forward

A great leap forward took place when the station installed the Telestream FlipFactory Traffic Manager (TM), which combines FlipFactory transcoding with tools to manage the flow of metadata.

The workflow automation system provided a clearinghouse for the catch servers, aggregating the metadata from arriving spots and transcoding the spot files for compatibility with the facility's play-to-air servers. Folders and reports, as well as proxy copies of the spots, were available on the TM server, viewable from desktops around the facility. This negated the need for faxes, e-mails and VHS screening copies.
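The aggregation idea can be illustrated with a rough sketch, without claiming anything about FlipFactory's internals. Assume each catch server drops an XML metadata sidecar into a watch folder; the folder paths and field names below are invented for illustration:

```python
# Illustrative only: poll one watch folder per delivery service and
# aggregate spot metadata into a single list. Paths, the XML sidecar
# convention and field names are assumptions, not FlipFactory's actual
# mechanism.

import glob
import os
import xml.etree.ElementTree as ET

CATCH_FOLDERS = ["/catch/dg", "/catch/fastchannel", "/catch/vyvx"]  # hypothetical

def poll_catch_folders():
    arrivals = []
    for folder in CATCH_FOLDERS:
        for sidecar in glob.glob(os.path.join(folder, "*.xml")):
            meta = ET.parse(sidecar).getroot()
            arrivals.append({
                "service": os.path.basename(folder),
                "advertiser": meta.findtext("advertiser", default=""),
                "title": meta.findtext("title", default=""),
                "media_file": sidecar.replace(".xml", ".mpg"),  # assumed pairing
            })
    return arrivals
```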

In order for the traffic and library departments to access these features, the station's IT and engineering departments needed to create a gateway between the engineering and corporate LANs. With viruses and hackers a growing concern, they paid strict attention to network security.

A key goal was to eliminate manual transcription of spot data during processing. The TM server did a handy job of collecting and mapping that metadata, partially conforming it to the facility's needs. However, a key problem remained: The traffic system demanded that certain fields conform to a table of values, and the incoming metadata often didn't match those values. “Procter and Gamble” wouldn't work when “P&G” was required. A job that was intuitive for a person was complex for a software application. There were other metadata issues, such as product names and spot titles being omitted or transposed. Also, the network wanted the data to move to its databases electronically, but not unattended. To solve these problems, the facility found helpful partners in its IT department.

LISA

To provide an opportunity for some gatekeeping and data grooming, IT developed Lifetime Information System Authority (LISA), which allows the user to view the list of newly arrived spots and map values in the metadata to valid values in the traffic system tables. Lookup tables and check boxes make this a keystroke-free operation. This simple tool provides the best of both worlds, allowing the data to flow electronically while providing an opportunity to check and correct it. Hours of daily, manual transcribing are reduced to minutes of “picking and clicking.”
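A minimal sketch of that grooming step might look like the following. The lookup-table contents are invented for illustration; the point is that recognized values map automatically, while unrecognized ones are flagged for a person to resolve from a list rather than guessed at:

```python
# Sketch of LISA-style value grooming: map incoming metadata values to
# the traffic system's required values via a lookup table, and flag
# anything unrecognized for manual review. Table contents are invented.

VALID_ADVERTISERS = {
    "procter and gamble": "P&G",   # incoming form -> required traffic value
    "procter & gamble": "P&G",
    "p&g": "P&G",
}

def groom_advertiser(raw):
    """Return the traffic-system value, or None to flag for review."""
    return VALID_ADVERTISERS.get(raw.strip().lower())

for raw in ("Procter and Gamble", "Proctor + Gamble"):
    mapped = groom_advertiser(raw)
    if mapped is None:
        print(f"REVIEW: no traffic-table match for {raw!r}")  # user picks from a list
    else:
        print(f"{raw!r} -> {mapped!r}")
```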

LISA also associates a house ID (assigned by traffic) with each spot file on the TM server. Once traffic assigns a house ID, all conditions are met to move the content to the content server automatically. Doing so, however, means that the spot could air without anyone first checking for quality, content or timing. The station wasn't ready to take that leap of faith. Therefore, it disabled the system's auto-flip feature and preserved the all-important operation of having a technician oversee the transcoding of files to the content server, viewing each one and making corrections as needed.
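Expressed as a rule, the gatekeeping looks something like this sketch; the field names and the auto-flip flag are assumptions used to illustrate the logic:

```python
# Sketch of the gatekeeping rule: a spot needs a house ID from traffic,
# and, with auto-flip disabled, a technician's sign-off as well before
# it moves to the content server. Field names are assumptions.

def ready_to_flip(spot, auto_flip_enabled=False):
    has_house_id = bool(spot.get("house_id"))
    if auto_flip_enabled:
        return has_house_id              # the "leap of faith" configuration
    return has_house_id and spot.get("tech_approved", False)

spot = {"house_id": "LT1002", "tech_approved": False}
assert not ready_to_flip(spot)           # waits for the quality check
spot["tech_approved"] = True
assert ready_to_flip(spot)
```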

The new workflow has taken shape. Now, catch servers receive spots and metadata, which are aggregated onto the TM server. A librarian can use LISA to monitor and groom the metadata before it flows to the traffic system. The system then delivers dub lists and files to master control for ingest, where a technician supervises the delivery of spots to the on-air servers, quality-checking them after the operation. The station will continue to “rack-and-roll” the dwindling number of tapes it still receives.

Some strides remain to be made. An electronic interface between the traffic and automation systems would eliminate the dub-list diskette; someday the facility will take that leap of faith and start auto-flipping.

Thanks to ingenuity and invention, numerous obstacles were overcome, and the channel's goals realized. The workflow changed, but the station maintained its “checks and balances” and can — at last — take full advantage of file-based commercial spot delivery.

George Krug is vice president of network operations at Lifetime Television.