Why build now for digital? - TvTechnology



The next technical/business model is wide area connectivity to the storage devices.

We have seen great promise with regard to technology, digital television and the future of broadcasting in general. As the transition to digital moves forward, many facilities have fully incorporated a digital infrastructure or islands of digital equipment in their plants. However, the full transition has barely begun. As the deadline imposed by the FCC approaches, fewer than 200 broadcasters are transmitting a digital signal. Very few of these facilities can create HD images in their studios, and fewer still intend to do so in the future.

The initial move to digital meant dealing with 270Mb/s component digital video. I designed my first fully digital facility in 1995 when the available equipment was cumbersome and challenging to work with. It also required us to compromise the desired functionality. The industry has managed to move through the initial digital conversion and new tools are available that allow stations to do much more than they were capable of doing in the composite analog world.

Legacy engineering and business models must be revised as alternative entertainment and technologies compete for market share. Remember, just a couple of years ago the fighting words were convergence, extensible, open platform, and, worst of all, compression. At the 1996 NAB show, the thought of accepting compression of any kind on a video image was totally taboo. By 1999, compression was accepted as the norm and the issues were what type of compression each manufacturer applied and at what bit rate.

With the acceptance of compressed media, the next transition begins: open platforms with media captured, stored, managed and shared as a file. With this philosophy, the movement of media throughout a facility doesn't depend on a particular video format. It is a file, regardless of the resolution, frame rate or encoding scheme. While this is not realistic for real-time synchronous broadcasts, generators of content will thrive in this environment. Unfortunately, most of the major video server manufacturers do not provide open connectivity to the storage side of the system. Currently, you are required to purchase the storage solution directly from the manufacturer or its strategic partners. We do see this being whittled away slowly as more IT-savvy clients and system integrators demand these features. The next technical/business model is wide area connectivity to the storage devices.
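The "media as a file" idea above can be sketched in a few lines. This is a minimal illustration, not any vendor's API; the class, field names and paths are hypothetical, chosen only to show that once content is a file plus metadata, the same operation applies regardless of resolution, frame rate or encoding scheme.

```python
# Hypothetical sketch: media handled as a file with descriptive metadata.
from dataclasses import dataclass

@dataclass
class MediaFile:
    path: str
    resolution: str   # e.g. "1920x1080" or "720x486"
    frame_rate: float
    codec: str        # encoding scheme, e.g. "MPEG-2"

def move_to_storage(asset: MediaFile, destination: str) -> str:
    """Format-agnostic: the same move applies to any asset."""
    return f"{destination}/{asset.path.split('/')[-1]}"

sd = MediaFile("ingest/promo.mxf", "720x486", 29.97, "MPEG-2")
hd = MediaFile("ingest/show.mxf", "1920x1080", 29.97, "MPEG-2")

# Resolution, frame rate and codec ride along as metadata;
# the workflow step itself does not change between SD and HD.
print(move_to_storage(sd, "/mnt/shared"))
print(move_to_storage(hd, "/mnt/shared"))
```

The point of the sketch is the symmetry: the SD and HD assets go through identical code paths, which is exactly what format-dependent baseband plumbing cannot offer.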

Over the last few years we have seen broadcasters consolidate their business model as they compete for programming and advertising revenue. For better or for worse, the day of the mom-and-pop broadcaster is rapidly disappearing. As broadcasters continue this consolidation model, I envision a consolidation of the technical model as well. For example, streaming media over the Internet works on the concept of "edge caching." In other words, media is pushed across the Internet to points of presence (POPs) in major cities throughout the country. This places the content geographically closer to the "last mile" for pickup by the viewer. The same technology can apply to broadcasters who have consolidated their operations from local engineering business models to a centrally operated model. The local broadcaster now takes on the functionality of the POP via wide area connectivity and provides the cache along the edge. In this case the "edge" is the feed to the transmitter. Local insertion of advertising, news coverage, etc., all resides within this server-based facility and is dropped in as scheduled. Business management and core programming are managed from the central offices of the newly formed networks.
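The centralcast/POP model described above can be sketched as a toy cache. Everything here is hypothetical illustration (class and slot names invented for the example): the central office pushes core programming to every market, and each local station overlays its own scheduled insertions before play-out to the transmitter.

```python
# Hypothetical sketch of a local station acting as an edge POP.
class EdgePOP:
    """A local station caching the network feed at the edge."""

    def __init__(self, market: str):
        self.market = market
        self.cache = {}   # slot -> content pushed from the central office
        self.local = {}   # slot -> locally inserted content

    def receive(self, slot: str, content: str):
        """Core programming pushed across the wide-area link."""
        self.cache[slot] = content

    def insert_local(self, slot: str, content: str):
        """Local ads or news coverage scheduled at this station."""
        self.local[slot] = content

    def play_out(self, slot: str) -> str:
        """Local insertion wins; otherwise fall back to the network feed."""
        return self.local.get(slot, self.cache.get(slot, "filler"))

# Central office pushes the same schedule to every market...
pop = EdgePOP("Denver")
pop.receive("18:00", "network news")
pop.receive("18:30", "syndicated program")
# ...and the local station drops in its own coverage as scheduled.
pop.insert_local("18:30", "local weather special")

print(pop.play_out("18:00"))  # network news
print(pop.play_out("18:30"))  # local weather special
```

The design choice worth noting is the fallback order in `play_out`: the edge always has the network feed cached, so a missing local insertion degrades gracefully to core programming rather than to dead air.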

Part of managing all of this media as data will require multiple servers for capture, storage and encoding purposes. In streaming media facilities and network operations centers we typically manage hundreds of servers, PCs and encoders using what we call a KVM matrix. This equipment allows us to route the keyboard, video and mouse of multiple servers to multiple user stations. These devices provide direct connectivity to the remote server, including a hard reboot if necessary. Category 5 or Category 6 cable connects the user stations and servers to the matrix.

As new facilities are designed, there will be three basic types of connectivity throughout the plant: wide-bandwidth coaxial cable (for baseband audio and video); fiber; and Category 6 (or better) for network connectivity. Future designs must include demarcation points throughout the facility to break out or repurpose wiring infrastructure in order to accommodate continued changes in equipment or functionality. In the IT industry these are called Intermediate Distribution Frame, or IDF, closets. Basically, an IDF is a wiring closet serving a major area of the facility. These closets are built adjacent to the central machine room, server farms and large work areas.

In the near future, look for super-wide bandwidth routing backbones that will allow very low data rates as well as full-bandwidth HDTV. Consider the possibility of multiple 4Mb/s pre-encoded transport streams arriving at your facility for re-transmission. This is very common at satellite facilities that handle incoming feeds as SMPTE310 or DVB-ASI. Along with these compressed transport streams comes the need for more advanced bit splicing tools able to perform cuts and transitions within the compressed stream.
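A little arithmetic shows why multiple pre-encoded streams are worth planning for. Assuming the nominal ATSC payload of roughly 19.39 Mb/s (the rate carried over a SMPTE 310M interface), a quick calculation gives the number of 4 Mb/s transport streams that could share one channel; the constants below are the only assumptions.

```python
# Back-of-the-envelope multiplex math for pre-encoded transport streams.
ATSC_PAYLOAD_MBPS = 19.39   # nominal ATSC 8-VSB payload rate
STREAM_RATE_MBPS = 4.0      # per-program rate from the example above

streams_per_channel = int(ATSC_PAYLOAD_MBPS // STREAM_RATE_MBPS)
headroom_mbps = ATSC_PAYLOAD_MBPS - streams_per_channel * STREAM_RATE_MBPS

print(f"{streams_per_channel} streams fit, "
      f"{headroom_mbps:.2f} Mb/s of headroom remains")
```

Four such streams fit with headroom to spare, which is precisely why splicing and transitions inside the compressed domain, rather than decode-recode cycles, become attractive.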

In discussing this digital transition five years ago with a station director of engineering, he said, “The way things are going, in another five years, you'll come in here and the only technical staff you'll see is a manager of information systems, the director of engineering to keep the license up to date, and a shipping department for when things break.” He missed the date, but he was on the right track. The next technological wave I see coming to the broadcast facility is the movement of data, data storage and asset management. This will come in the form of IP network connected devices and file-based content on open platforms across network-based storage devices.

Webster's Dictionary defines transition as "a passing from one condition, form, stage, activity, place, etc. to another." Regardless of the FCC mandate, competition and market conditions will require broadcasting to transition to a new business model, and new technology will be required to support this new model. The first step for the broadcaster to prepare for this next wave in technology is education. It is imperative that your staff has a basic understanding of network architecture, IP management, data storage, media archiving and asset management.

Start now!

Greg Doyle is president of Doyle Technology Consultants, Inc.