SD and HD: Coexisting in a multi-resolution environment


Starz Encore uses OmniBus Systems' Colossus in its master control center to control 72 channels of video.

Television stations upgrading their workflow from traditional videotape-based acquisition, editing, production and playout to an IT model, where video exists as files, face questions about doing so in a mixed-resolution world where SD and HD coexist.

Should digital content exist as two separate versions, one SD and one HD? Or should a single version exist that is encoded for each output? If so, how should graphics be handled?

In this edition, High Definition Technology Update turns to OmniBus Systems Vice President of Technology John Wadle for some insight. OmniBus Systems provides asset management and automation solutions for the television industry.

HDTU: Ideally, how would stations integrate the demand for mixed-resolution content (i.e., HD, SD and even other resolutions, such as those for cell phone video display) into an IT-centric workflow where production and playback are done with files stored on servers?

John Wadle: With the realization of an IT-based station, the issues brought on by concurrent transmission of HD, SD, narrowband streaming and datacasting will disappear.

In the ultimate digital workflow scenario, all traditional broadcast technology between the point of initial content acquisition (ingest) and the final steps in the transmission chain (digital encoder/mux/transmitter) would be eliminated.

HDTU: How would that take shape in terms of playout for various distribution demands?

JW: A system based entirely on IT servers and clients connected via Gigabit Ethernet would manage digital content files and associated metadata, communicate with traffic via Web Services, and originate (play) scheduled content from IT storage.
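As a rough illustration of that arrangement, the sketch below models a playlist exchange between automation and a traffic system. The JSON payload, field names and ScheduleEvent class are hypothetical and not part of any OmniBus interface; they simply show scheduled content and its metadata traveling as data rather than as video.

```python
import json
from dataclasses import dataclass
from datetime import datetime

# Hypothetical payload as it might arrive from a traffic system's Web Service.
TRAFFIC_RESPONSE = """
[
  {"event_id": "E1001", "clip_id": "PROMO_2345", "start": "2005-06-01T18:00:00",
   "duration_s": 30, "graphics": ["bug_hd"], "notes": "HD master on IT storage"},
  {"event_id": "E1002", "clip_id": "MOVIE_0078", "start": "2005-06-01T18:00:30",
   "duration_s": 7200, "graphics": ["bug_hd", "rating"], "notes": "SD master"}
]
"""

@dataclass
class ScheduleEvent:
    event_id: str
    clip_id: str        # key into the shared IT content store
    start: datetime
    duration_s: int
    graphics: list      # graphics the mix server should overlay at playout
    notes: str

def load_schedule(payload: str) -> list:
    """Turn the traffic system's response into playable schedule events."""
    return [ScheduleEvent(e["event_id"], e["clip_id"],
                          datetime.fromisoformat(e["start"]),
                          e["duration_s"], e["graphics"], e["notes"])
            for e in json.loads(payload)]

if __name__ == "__main__":
    for event in load_schedule(TRAFFIC_RESPONSE):
        print(event.start, event.clip_id, event.graphics)
```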

One or more play servers would decode and stream scheduled content files as IP to a mix server for each channel, which would provide graphics overlays and video mix functions entirely in software to produce an IP-based channel stream. As a final step in the process, the channel stream would be converted to digital video (601) via a card in the mix server.
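A simplified way to picture the play-server/mix-server split is as a software pipeline: a play-server stage yields decoded frames, and a mix-server stage composites graphics before handing the result to an output stage. The frame objects, overlay logic and function names below are invented for illustration; the real decode, mix and 601 output would be actual video processing rather than Python objects.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """Stand-in for one decoded video frame in the IP channel stream."""
    clip_id: str
    index: int
    overlays: list = field(default_factory=list)

def play_server(clip_id: str, frame_count: int):
    """Decode a scheduled content file from IT storage into a frame stream."""
    for i in range(frame_count):
        yield Frame(clip_id=clip_id, index=i)

def mix_server(frames, graphics):
    """Composite graphics overlays and produce the channel stream in software."""
    for frame in frames:
        frame.overlays.extend(graphics)   # keying / video mix would happen here
        yield frame                       # IP-based channel stream out

def output_stage(channel_stream):
    """Final step: hand the stream to the 601 output card / encoder chain."""
    for frame in channel_stream:
        print(f"{frame.clip_id} frame {frame.index} with {frame.overlays}")

if __name__ == "__main__":
    stream = mix_server(play_server("MOVIE_0078", 3), graphics=["bug_hd"])
    output_stage(stream)
```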

Digital video streams from this process would be encoded and multiplexed for DTV as required and sent to the digital transmitter, cable system or DBS provider.

Alternatively, the channel stream could be sent to a stream server for IP distribution to the Internet and mobile devices.
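The fan-out described here, with one finished channel stream feeding either the DTV encoder/mux chain or a stream server, could be sketched as a simple dispatcher. The path names and the route_channel_stream function are illustrative only; each real hop would transform the stream (encode, multiplex, packetize) rather than merely label it.

```python
# Illustrative fan-out for a finished channel stream; all names are invented.
DTV_PATH = ("MPEG encoder", "multiplexer", "digital transmitter / cable / DBS")
IP_PATH = ("stream server", "Internet / mobile devices")

def route_channel_stream(channel: str, destination: str) -> str:
    """Trace the same channel stream down either the broadcast or the IP path."""
    hops = DTV_PATH if destination == "dtv" else IP_PATH
    # The point is only that one origination feeds both delivery paths.
    return f"{channel}: " + " -> ".join(hops)

if __name__ == "__main__":
    print(route_channel_stream("channel 1 stream", "dtv"))
    print(route_channel_stream("channel 1 stream", "ip"))
```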

HDTU: It seems like you’re saying that the real problem stations will face in serving up programming in various resolutions is integrating graphics for SD, HD and other resolutions into the right video stream, and that the mix server solves that problem.

JW: A mix server as described could handle a mixed schedule of SD and HD content on a single channel. A second mix server could originate the narrowband stream for cell phones, also from SD and HD content. Both servers would use the same clips regardless of their resolution, delivering to each device in the native resolution required.
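One way to read "the same clips regardless of their resolution" is that each mix server scales the shared source to its own output profile at playout. The sketch below makes that concrete with invented profile numbers (an HD raster, an SD raster and a small mobile raster); in practice the scaling would be done by the mix server's software rather than by storing multiple versions of the clip.

```python
from dataclasses import dataclass

@dataclass
class OutputProfile:
    name: str
    width: int
    height: int

# Hypothetical output profiles all served from the same stored clip.
PROFILES = [
    OutputProfile("HD channel", 1920, 1080),
    OutputProfile("SD channel", 720, 480),
    OutputProfile("mobile stream", 320, 240),
]

def render_plan(clip_id: str, source_width: int, source_height: int):
    """Describe how one stored clip is delivered in each device's native resolution."""
    for p in PROFILES:
        action = ("pass through"
                  if (source_width, source_height) == (p.width, p.height)
                  else "scale in software")
        yield f"{clip_id}: {action} to {p.width}x{p.height} for {p.name}"

if __name__ == "__main__":
    # An HD master feeding HD, SD and mobile outputs from a single file.
    for line in render_plan("MOVIE_0078", 1920, 1080):
        print(line)
```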

HDTU: How does this IT-centric solution to the problem of delivering programming in a variety of resolutions compare with doing the same thing in a traditional video operations world?

JW: As long as the process includes the playout, routing and mixing of traditional video signals, handling multiple resolutions will require complex and expensive up- and downconverters, multiple-format or duplicate routers and mixers, and associated complexity in the automation process.
