Video over IP

Video over IP conjures up thoughts of bringing high-resolution video content to the home over the Internet. Bringing this content to viewers is a good thing for broadcasters, but video over IP within the broadcast facility, in a low-resolution format, is another benefit that broadcasters can leverage. Many broadcast functions can be accomplished more efficiently using non-broadcast-quality, low-res video within a dynamic IT system than by performing the same functions with broadcast-quality, high-res video within a static broadcast infrastructure. Here are some things to consider before making the transition.

Low-res video

Until recently, broadcast operators handled all video content within the broadcast facility in high-res format. This has worked fine, but it requires a substantial hardware investment and dedicated facilities to transport video for viewing and editing. (See Figure 1.) The ability to perform a significant number of broadcast functions using low-bit-rate video over IP networks introduces some interesting possibilities for the broadcaster, especially as the industry moves to a file-based workflow.

The availability of low-res video content, associated with an original high-res version, can benefit multiple departments within, and even outside, the broadcast facility. Low-res video is becoming more common in the newsroom, production and master control departments; however, areas such as traffic, program management and promotions are just beginning to realize its value. Certain functions that once required access to video monitors or tape machines can now be performed in less technical areas of the facility where a ubiquitous IP LAN/WAN connection suffices. For example, certain approval and timing operations (where durations and start and end time code positions are captured) can now be performed within the traffic department instead of the master control/ingest areas. (See Figure 2.)

High-res video sources, both baseband (tape, live signal, etc.) and file-based (commercial and program electronic delivery systems), can be captured using current acquisition methods before being converted to low-res format. Depending on the input source, the low-res content can either be persisted, resulting in files (proxies), or left unpersisted, resulting in video streams.

Streams are typically used for monitoring purposes. For example, an off-air signal for one or more channels (one's own or even competitors') may be monitored remotely using a low-bit-rate capture, which can easily be displayed at multiple locations within the broadcast facility on standard desktop computers. Alarms can also be sent to relevant personnel or other software components when certain conditions are detected in the stream (e.g., bad or missing video and audio). This same stream can additionally be recorded for logging purposes to meet certain FCC requirements or to provide to third parties, replacing expensive and cumbersome tape-based logging systems. Look for solutions that will not overload the network and will scale easily if the number of channels or monitoring stations increases over time.
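As a rough illustration of that kind of monitoring, the sketch below periodically probes a confidence stream and raises an alarm when video or audio disappears. It is a minimal example only; the stream address, polling interval and alarm mechanism are assumptions, and it relies on the widely available ffprobe tool rather than any particular vendor's monitoring product.

    # Minimal sketch, assuming the off-air confidence feed is available as a
    # low-bit-rate stream at a hypothetical multicast address and that ffprobe
    # is installed. A real system would also check for black/frozen frames,
    # silence and other fault conditions.
    import json
    import subprocess
    import time

    STREAM_URL = "udp://239.10.0.1:5000"   # hypothetical multicast address
    PROBE_SECONDS = 5                      # how long ffprobe analyzes the stream

    def probe_stream(url):
        """Return the set of codec types (video/audio) found in the stream."""
        cmd = [
            "ffprobe", "-v", "error",
            "-analyzeduration", str(PROBE_SECONDS * 1_000_000),
            "-show_entries", "stream=codec_type",
            "-of", "json", url,
        ]
        try:
            out = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
        except subprocess.TimeoutExpired:
            return set()                   # stream unreachable: treat as no signal
        if out.returncode != 0:
            return set()
        streams = json.loads(out.stdout).get("streams", [])
        return {s.get("codec_type") for s in streams}

    def raise_alarm(message):
        # Placeholder: a real deployment would e-mail, SNMP-trap or notify
        # other software components instead of printing to a console.
        print("ALARM:", message)

    while True:
        found = probe_stream(STREAM_URL)
        if "video" not in found:
            raise_alarm("no video detected on off-air channel")
        if "audio" not in found:
            raise_alarm("no audio detected on off-air channel")
        time.sleep(60)                     # poll once per minute

The same probing logic scales to many channels or monitoring stations simply by running it against additional stream addresses, which is where the network-load and scalability questions above come into play.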

Low-res proxies

Low-bit-rate proxies can be created from baseband video via real-time encoders, in parallel with the high-res ingest ordinarily performed. They can also be produced from existing high-res video files on broadcast video servers via a transcoding operation, potentially faster than real time. High-res files may be the result of prior standard baseband ingests, but they may also appear directly on a video server because commercial, syndicated and in-house content is increasingly acquired electronically rather than on tape. In some cases, the video server itself can generate these proxies; otherwise, third-party tools can do so, integrated with other solutions in use within the broadcast facility, such as automation and digital asset management systems.
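To make the transcoding path concrete, the following sketch generates a browse proxy from a high-res file already sitting on a server. It assumes the open-source ffmpeg tool is available; the paths, codec choices and bit rates are illustrative only and would, in practice, be dictated by the facility's automation and asset management systems.

    # Minimal sketch, assuming ffmpeg is installed and the high-res file has
    # already arrived on the video server. File paths, codec and bit rates are
    # hypothetical examples, not recommendations for any particular product.
    import subprocess

    def make_proxy(hires_path, proxy_path, bitrate="800k"):
        """Transcode a high-res file into a low-bit-rate H.264 browse proxy."""
        cmd = [
            "ffmpeg", "-y", "-i", hires_path,
            "-vf", "scale=640:-2",         # shrink the picture, keep aspect ratio
            "-c:v", "libx264", "-b:v", bitrate,
            "-c:a", "aac", "-b:a", "96k",
            proxy_path,
        ]
        subprocess.run(cmd, check=True)    # can run faster than real time

    make_proxy("/mnt/server/spot_12345.mxf", "/mnt/proxies/spot_12345.mp4")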

Access to proxies has various applications, from simple browsing to more complex timing and editing functions. Browsing for the purpose of simply recognizing the original content merely requires a relatively low-res proxy, whereas browsing as part of a QC approval step may require a higher-resolution proxy. The format of these browsing proxies can vary, so standard players, such as Windows Media Player or QuickTime, can be used. Some browsing functions may also require that ancillary data from the source content be captured, such as closed-captioning or subtitling data, whose existence may need to be verified or which may be extracted as metadata to facilitate future content searches.
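As a simple illustration of verifying that such ancillary data exists, the sketch below uses ffprobe to look for subtitle or caption streams in a captured file. The file path is hypothetical, and captions embedded in the video's ancillary data rather than carried as separate streams would require a dedicated extraction tool instead.

    # Minimal sketch, assuming the source has been captured to a file and that
    # captions/subtitles are carried as streams ffprobe can enumerate.
    import json
    import subprocess

    def list_caption_streams(path):
        """Return codec names of any subtitle/caption streams in the file."""
        cmd = [
            "ffprobe", "-v", "error",
            "-show_entries", "stream=codec_type,codec_name",
            "-of", "json", path,
        ]
        info = json.loads(subprocess.run(cmd, capture_output=True,
                                         text=True, check=True).stdout)
        return [s["codec_name"] for s in info.get("streams", [])
                if s.get("codec_type") == "subtitle"]

    # A digital asset management system could store this list as searchable metadata.
    print(list_caption_streams("/mnt/server/program_789.mxf"))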

Timing and editing

Timing and editing functions, however, require adequate resolution, a guaranteed 1:1 frame correspondence with the original high-res copy and the ability to identify the time code of each of those frames. For example, timing operations may need to capture accurate SOM and EOM time codes that will later be used for playback to air. An operator may use a proxy to create closed-captioning data files indexed by the time codes of the frames that will trigger display of the captions. Or an editor may use the proxy of some raw footage to create a new piece of content; via an edit decision list, a nonlinear editing application acting on the original high-res copy can then produce a frame-accurate, broadcast-ready copy of this new asset.
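The value of that 1:1 frame correspondence is easy to see in code. In the minimal sketch below, frame numbers marked while scrubbing the proxy are converted to SMPTE-style time codes that apply equally to the high-res original. The frame rate is illustrative, and drop-frame time code (29.97 fps) is deliberately ignored for brevity.

    # Minimal sketch: map proxy frame numbers to HH:MM:SS:FF time codes.
    # Assumes non-drop-frame time code at an integer frame rate.

    FPS = 25  # illustrative frame rate

    def frame_to_timecode(frame, fps=FPS):
        """Convert an absolute frame count into an HH:MM:SS:FF time code."""
        ff = frame % fps
        total_seconds = frame // fps
        ss = total_seconds % 60
        mm = (total_seconds // 60) % 60
        hh = total_seconds // 3600
        return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

    # An operator marks frames 125 and 75_000 while scrubbing the proxy; because
    # the proxy is frame-for-frame identical to the high-res file, the same
    # numbers yield the SOM/EOM time codes used by automation at air time.
    som = frame_to_timecode(125)      # '00:00:05:00'
    eom = frame_to_timecode(75_000)   # '00:50:00:00'
    print(som, eom)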

Other features

Audio also needs to be captured accurately and kept in sync with the video of the original, not just so the operator can understand what is said but, as in the closed-captioning example, to make sure the captions will appear at the appropriate time. Another useful feature to look for in a capture tool is access to the proxy before the capture is complete. This allows some of the aforementioned functions to begin almost as soon as the capture starts, clearly a more efficient process.

Because the transcoding method of generating proxies can produce multiple versions, the video content can also be repurposed for platforms where low-bit-rate formats are commonly used, such as the Web, mobile devices and smartphones, opening potential new revenue streams for broadcasters.

Working effectively and realizing efficient workflows across departments in this low-bit-rate world requires proper asset management and careful handling of associated metadata so that video content can be accessed quickly and easily. With that in place, broadcasters can achieve significant cost savings, as well as improvements in their workflows, by transferring a large portion of their usual daily broadcast video processes to the corresponding low-res world.

Eric J. Piard, Ph.D., is VP of research and development at Florical Systems.