Streamlining the Promo Process

Every day, broadcast, cable, satellite, IPTV and mobile handheld programmers must produce dozens of interstitial promotion clips that inform viewers of an upcoming program and invite them to stay tuned.

This content has traditionally been produced offline. The sophisticated graphics, video clips and sound bites once required an editing system composed of multiple video and audio components. Nonlinear editing systems have eased those requirements, reducing some of the effort, but they seldom lessen the impact on systems and people.

From a workflow perspective, as each clip is completed, it is individually printed to videotape, carried to the ingest bay and dubbed into a video server under automation control.

Operators enter metadata about the completed transfer into the automation and server databases. Pre-assigned house numbers identify each instance of the material to the traffic database.

The log conveys when the clip goes into the program schedule, and that information is subsequently passed to the automation system. Automation queries the server database to ascertain that the material is ready for air, and at the appropriate point in the programming day it cues the server and the segment plays.
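To make the handoff concrete, here is a minimal sketch in Python of that verification step, assuming simple in-memory stand-ins for the server and traffic databases; the house numbers, field names and air times are hypothetical.

```python
from datetime import datetime

# Hypothetical server database: what has actually been dubbed in and verified.
server_db = {
    "PRM-10423": {"duration_s": 15, "status": "ready"},
    "PRM-10424": {"duration_s": 30, "status": "ingesting"},
}

# Hypothetical traffic log: when each house number is scheduled to air.
traffic_log = [
    {"house_number": "PRM-10423", "air_time": datetime(2006, 5, 1, 14, 58, 45)},
    {"house_number": "PRM-10424", "air_time": datetime(2006, 5, 1, 15, 28, 30)},
]

def verify_log(log, db):
    """Automation pass: confirm each scheduled clip is ready for air."""
    for event in log:
        clip = db.get(event["house_number"])
        state = "cued" if clip and clip["status"] == "ready" else "NOT READY"
        print(f"{event['air_time']:%H:%M:%S}  {event['house_number']}  {state}")

verify_log(traffic_log, server_db)
```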

Of course there are many variations on the server-to-air sequence, and another set of variations in creating the content. Yet nearly every one of these steps, from the creative process to the moment of play-out, requires human intervention, even with today's automation and server systems.

There is opportunity for error at each step, so checking and rechecking is often standard operating practice. Each facility's particular combination of personnel, hardware, traffic and automation systems (even the wiring) makes these tasks unique, and more cumbersome than station management would like.

Networked nonlinear editors, file transfers, database translators and transcoding software have eased the workflow somewhat. Still, the production and handling requirements for this content don't change significantly.

Promos for daily shows are program specific and time sensitive, often requiring their own audio or video tags or graphic overlays. In many cases they are repetitive, with only minor changes, e.g., the words "coming up next" versus "today at three." Each clip uses the same video and the same graphics foundation. This sets the stage for template-based production, often referred to as "bag-and-tag" or "badge-and-brand."
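As an illustration of the bag-and-tag idea, the sketch below (in Python, with hypothetical clip and layout names) models a promo as a fixed video and graphics foundation plus one variable tag field:

```python
from dataclasses import dataclass

@dataclass
class PromoTemplate:
    background_clip: str   # shared video foundation
    graphics_layout: str   # shared lower-third / end-page design
    tag_text: str          # the only element that changes per airing

    def render(self) -> str:
        # A real system would composite these in a graphics engine;
        # here we just describe the assembled promo.
        return (f"{self.background_clip} + {self.graphics_layout} "
                f"+ tag '{self.tag_text}'")

base = dict(background_clip="promo_bg.mxf", graphics_layout="evening_slate")
print(PromoTemplate(**base, tag_text="coming up next").render())
print(PromoTemplate(**base, tag_text="today at three").render())
```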

Broadcasters have long sought a means to mitigate this work effort. Some simply reduced the amount of production, while others developed their own automated solutions for some of the processes. Ultimately, the ideal would be to make the tasks entirely live-to-air.

Station automation systems could conceivably produce clips live-to-air. However, this requires a sophisticated set of secondary events, entered into the traffic log and monitored by a live operator, plus a complement of outboard CGs, audio file servers, multilayer keyers and 2D positioners.
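The secondary-event list would amount to a timed sequence of device commands keyed off the primary event, something like the following hypothetical sketch:

```python
# Each secondary event fires at an offset from the primary event's start.
# Device names, commands and offsets are illustrative only.
secondary_events = [
    {"offset_s": 0.0,  "device": "server", "command": "play PRM-10423"},
    {"offset_s": 0.5,  "device": "cg",     "command": "load tag_page_7"},
    {"offset_s": 1.0,  "device": "keyer",  "command": "key on, layer 2"},
    {"offset_s": 1.0,  "device": "audio",  "command": "fire VO_tag_7.wav"},
    {"offset_s": 13.5, "device": "keyer",  "command": "key off"},
]

for ev in sorted(secondary_events, key=lambda e: e["offset_s"]):
    print(f"T+{ev['offset_s']:>5.1f}s  {ev['device']:<7}{ev['command']}")
```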

Such automation sequences of commands and triggers (essentially a live edit decision list) are subject to timing errors, and any form of previewing before air is nearly impossible. The risk and the degree of human effort probably offer little advantage over traditional offline editing.

Broadcast interstitial delivery is headed toward a new domain: a template-based, rendered live-to-air solution. The PC spurred this opportunity. Coupled with the same graphics engines used in today's CGs and clipstores, the power of these devices essentially turns what was once a set of separate video and audio engines into a single entity.

OFFLINE SYSTEMS


Using a non-video server to carry clips, audio and predeveloped templates over a media network is nothing new.

Master control switchers added audio and video clip servers years ago, with images created as files in offline systems, converted to native device formats and imported into onboard graphics engines. Outboard live branding engines helped supplement the on-air look, moving some production tasks away from NLEs.

The latest live-to-air processes build on the foundations of digital signage, and appear to be the next promise for live bag-and-tag activities in broadcast television. Digital signage fills template-based graphics with current information pulled from a database. Because the databases are updated externally, the output becomes more real time and market sensitive.
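A minimal sketch of that fill-from-database pattern, using an in-memory SQLite table with hypothetical table and column names, might look like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE promo_fills (promo_id TEXT, field TEXT, value TEXT)")
# These rows stand in for fill data pushed by an external updater.
conn.executemany(
    "INSERT INTO promo_fills VALUES (?, ?, ?)",
    [
        ("PRM-10423", "headline", "Storm Watch Tonight"),
        ("PRM-10423", "tag", "today at three"),
    ],
)

TEMPLATE = "{headline} / {tag}"  # stand-in for a rendered graphics template

fills = dict(
    conn.execute(
        "SELECT field, value FROM promo_fills WHERE promo_id = ?",
        ("PRM-10423",),
    ).fetchall()
)
print(TEMPLATE.format(**fills))  # renders with whatever was last sent
```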

The broadcast equivalent might be the marketing department's promotions grid, a spreadsheet that identifies what the tag for each promo should say depending on when it plays. Tag information is collected throughout the day and updated long after the log was called the day before.

Traffic would only need to schedule the position of the appropriate video background clip and promotion, then send the fill information to the branding engine database.

Depending on the time of day and the program episode, variations in the text, audio tag, and even snipes or short animation clips could be automatically inserted directly on the air.
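One way to picture the grid-to-air flow is a lookup that selects the tag text and audio by air time; the dayparts, file names and fallback below are purely illustrative:

```python
from datetime import time

PROMO_GRID = [
    # (window_start, window_end, tag_text, audio_tag)
    (time(5, 0),  time(14, 0), "today at three",    "tag_today_3.wav"),
    (time(14, 0), time(15, 0), "coming up next",    "tag_next.wav"),
    (time(15, 0), time(23, 0), "tomorrow at three", "tag_tomorrow.wav"),
]

def tag_for(air_time: time):
    """Pick the tag variant whose daypart window contains the air time."""
    for start, end, text, audio in PROMO_GRID:
        if start <= air_time < end:
            return text, audio
    return "tonight", None  # fallback outside defined windows

print(tag_for(time(14, 28)))  # -> ('coming up next', 'tag_next.wav')
```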

This operation is not unlike the wheel concept. Template-based systems link live-to-air graphics devices to third-party databases that semi-automatically collect and assemble the sequences of text and graphics elements, then populate a real-time rendered graphic with purpose-created messages, news updates, weather information, and even Web-scraped content from the station's own Web sites.
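The wheel itself can be thought of as a simple rotation through message sources; the sources and fetch functions below are hypothetical placeholders:

```python
from itertools import cycle

def next_promo_tag():   return "Coming up next on Channel 7"
def next_news_update(): return "News at 11: city budget vote"
def next_weather():     return "Tonight: clear, low 48"

wheel = cycle([next_promo_tag, next_news_update, next_weather])

# Each turn of the wheel populates the rendered graphic with the
# next message in the rotation.
for _ in range(5):
    print(next(wheel)())
```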

The message is clear. Video servers, an important element in content and commercial collection and in the play-to-air role, remain. Smaller, purpose-built platforms are being deployed to brand the content that plays out from the larger video server systems.

In fact, NAB2006 demonstrated that this concept is much closer to reality, whether as a real-time, file-centric-to-baseband video renderer or as compressed-video tagging at the file level for real-time and non-real-time replication.

Karl Paulsen

Karl Paulsen is the CTO for Diversified, the global leader in media-related technologies, innovations and systems integration. Karl provides subject matter expertise and a visionary perspective on advanced networking and IP technologies, workflow design and assessment, media asset management, and storage technologies. Karl is a SMPTE Life Fellow, an SBE Life Member and Certified Professional Broadcast Engineer, and the author of hundreds of articles on industry advances in cloud, storage, workflow, and media technologies. For over 25 years he has contributed regularly to TV Tech magazine, penning its Storage and Media Technologies and Cloudspotter's Journal columns.