The essence of a BOC

The physical layer lies at the foundation of a modern BOC — where the long-established technologies and practices of broadcast engineering have reigned since the birth of television.

Introduced in the last Transition to Digital newsletter, a broadcast operations center (BOC) can be conceptualized using a four-layer model consisting of physical, media network, applications and security layers. The discussion begins with the physical layer.

All the systems required for production workflows that process, distribute and assemble audio, video and graphics comprise the physical layer of a BOC. In addition, all fundamental support for the entire building and its infrastructure is also part of this layer.

The physical layer is implemented by various combinations of core, processing, distribution and assembly system resources. To get an idea of the intricacy of each system, the major components will be briefly described.

Core systems

Every building has core systems that enable installation, operation and maintenance of resources. A BOC requires the usual core systems plus a few that are particular to TV facilities.

AC power, HVAC and UPS systems may have digital components, but sinusoidal power is analog, and both analog and digital signals are susceptible to EMI, RFI and other forms of noise. Consider the elaborate coding techniques used in ATSC 8-VSB modulation, or the shielding of coax cables. A clean, stable power source is imperative.

IFB and all other production and support communications must be installed, configured and maintained. Communication while on air must be 100 percent reliable, with appropriate backup. If any equipment malfunction or operator error threatens the integrity of the on-air signal, support must be reachable instantaneously.

Command, control and monitoring (CC&M) uses GPI and RS-422, but is evolving to include RJ-45 network connections and TCP/IP networking methodologies. It is challenging to interface these varied types of signals into a reliable integrated system.
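One common way to tame such a mix is to hide each control transport behind a single trigger interface. The sketch below (Python; the pin numbers, host, port and command string are entirely hypothetical, not any real device's protocol) shows the idea:

```python
# Sketch only: GPI pins and TCP commands here are illustrative,
# not any real device's protocol.
class GpiTrigger:
    """Contact-closure control: fire by pulsing a relay on a pin."""
    def __init__(self, pin):
        self.pin = pin

    def fire(self):
        # A real implementation would pulse relay hardware here.
        return f"GPI pin {self.pin} pulsed"


class NetworkTrigger:
    """TCP/IP control: fire by sending a command string to a device."""
    def __init__(self, host, port, command):
        self.host, self.port, self.command = host, port, command

    def fire(self):
        # A real implementation would open a socket and send the command.
        return f"sent {self.command!r} to {self.host}:{self.port}"


# Control logic can then fire any device the same way,
# regardless of the underlying transport.
triggers = [GpiTrigger(pin=3), NetworkTrigger("10.0.0.8", 9993, "PLAY")]
results = [t.fire() for t in triggers]
```

The integration burden then shifts from the control logic to the individual device drivers, which is where it belongs.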

Processing systems

Digital production occurs on the baseband “essence” layer. Essence can be defined as analog or digital, audio or video with the common property of being a continuous real-time signal.

Where there was once one analog NTSC presentation format, now there are many DTV choices: 1080i, 720p and 480i. Transcoding from interlaced to progressive scanning, or between various raster formats, must be done on a pixel-by-pixel basis, which is essentially a baseband process.
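A minimal sketch of the pixel-by-pixel idea (Python, nearest-neighbor mapping on a single luma plane; real transcoders apply far more sophisticated filtering and deinterlacing):

```python
def resize_nearest(frame, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbor raster conversion: map each destination
    pixel back to its closest source pixel."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h          # source row for this output row
        row = frame[sy]
        out.append([row[x * src_w // dst_w] for x in range(dst_w)])
    return out

# A toy 4x2 "frame" upconverted to 8x4
small = [[0, 1, 2, 3],
         [4, 5, 6, 7]]
big = resize_nearest(small, 4, 2, 8, 4)
```

Every output pixel requires touching baseband sample data, which is why format conversion cannot be done on the compressed stream.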

Audio processing and distribution, once solely analog, has been supplemented by AES and MADI digital streams. Digital audio has higher fidelity and less line noise. On the other hand, drop a bit and at best you get silence, at worst a loud noise.

Graphic effect (GFX) building happens in RGB color space and must be converted to an appropriate video format and file structure. Illegal colors can be produced because CG systems use the full 8-bit 0-255 range, whereas ATSC-legal colors are limited to the 16-235 range. QC is extremely important, and difficult, in that encoding, decoding and transcoding artifacts become noticeable on the physical layer. Conformance to legal color space must be verified, unless you'd like viewers to see saturated, bleeding reds. Remember, just because your graphics equipment can create it doesn't mean your encoders or transmitters can properly send it.
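The range mapping itself is simple arithmetic. A sketch (Python, single luma channel only; real conversions also involve RGB-to-YCbCr matrix math per channel):

```python
def full_to_legal(code):
    """Map a full-range 8-bit value (0-255) into the broadcast-legal
    luma range (16-235)."""
    return 16 + round(code * 219 / 255)

def is_legal(code):
    """True if an 8-bit luma code falls inside the 16-235 legal range."""
    return 16 <= code <= 235
```

A QC pass over rendered graphics can then flag any pixel for which `is_legal` returns False before it ever reaches an encoder.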

Distribution systems

Moving signals through the production process can be done in several ways. Analog component video distribution requires three precisely timed signal paths, one per component; each channel of audio requires an additional path, and any needed H and V sync signals require two more.

Using SDI rather than component video reduces the required discrete signal paths to one and also eliminates the need for component timing and precise cable lengths. The H and V sync signals are a part of the SDI stream. By embedding audio in an SDI signal, only one wire is necessary for everything, greatly simplifying routing and distribution.
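The cabling savings reduce to simple arithmetic. A sketch (Python; the counts assume each audio channel occupies its own analog path, per the description above):

```python
def analog_component_paths(audio_channels, separate_sync=True):
    """Count discrete signal paths for analog component distribution:
    three timed video components, optional H and V sync paths,
    plus one path per audio channel."""
    return 3 + (2 if separate_sync else 0) + audio_channels

def sdi_paths(audio_channels):
    """With audio embedded in SDI, everything rides on one coax,
    regardless of channel count."""
    return 1

# Stereo program: 7 analog paths vs. 1 SDI path
saved = analog_component_paths(2) - sdi_paths(2)
```

Multiply that difference by every source and destination in a router, and the impact on facility wiring becomes obvious.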

Audio distribution can be embedded, discrete, TDM or Dolby E. If audio is embedded, it follows the video on a single wire, but it must be de-embedded for any processing; breaking audio out of an SDI stream requires that embedding/de-embedding equipment be placed in the signal distribution path. TDM likewise allows audio to travel on a single wire, and it must be multiplexed and demultiplexed at the appropriate points. Dolby E uses light mezzanine compression to move multiple channels of audio on a single AES pair. Discrete audio requires an individual wire for each signal, significantly adding to the audio router port count. Consider carefully all of these options and your station's operational factors. Handling audio for TV used to be simple; today it is a much more complicated process.
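To illustrate the TDM idea, a toy sketch (Python; real TDM frames also carry sync, status and error-check bits): channels are interleaved sample by sample into one serial stream, then pulled apart again at the far end.

```python
def tdm_mux(channels):
    """Interleave equal-length per-channel sample lists into one
    serial stream, one sample per time slot."""
    return [sample for frame in zip(*channels) for sample in frame]

def tdm_demux(stream, n_channels):
    """Recover the original channels by taking every n-th sample."""
    return [stream[i::n_channels] for i in range(n_channels)]

left = [1, 2, 3]
right = [10, 20, 30]
stream = tdm_mux([left, right])   # one wire carries both channels
```

The single-wire economy is the same as embedding, which is why both approaches demand mux/demux (or embed/de-embed) gear at every processing point.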

Assembly systems

Integration of audio, video and graphics elements is done in both program control rooms (PCR) and master control rooms (MCR). Insertion and switching of audio, video and graphics is done on the physical layer, with SDI and AES being the primary essence formats.

Insertion and switching is a real-time activity. A production switcher, loaded with macros, controls the firing of servers. CC&M communication typically occurs over a combination of LAN, RS-422 and GPI interfaces. The result must be the seamless assembly and presentation of program elements. You get no second takes.

A decision will have to be made as to what type of house sync (tri-level, H&V or composite NTSC) will be distributed in a facility. Signals must be timed to house sync for production and master control switching. Lip sync has become challenging due to compression encoding and decoding delays that vary for audio and video during the production workflow. These delays must be compensated for when the essence is reconstructed as a baseband signal and the final program is assembled. There is no better way to look stupid than to broadcast a segment that has a lip sync error!
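Once the path delays are measured, the compensation itself is simple arithmetic. A sketch (Python; 48 kHz is the standard professional audio sample rate, and the 80 ms/20 ms figures below are purely illustrative):

```python
def audio_delay_ms(video_path_ms, audio_path_ms):
    """Delay to insert so audio and video arrive together.
    Positive means delay the audio; negative means delay the video."""
    return video_path_ms - audio_path_ms

def delay_in_samples(delay_ms, sample_rate=48000):
    """Express a delay in audio samples at the given sample rate."""
    return round(delay_ms * sample_rate / 1000)

# Illustrative: video encode/decode adds 80 ms; the audio chain adds 20 ms.
needed_ms = audio_delay_ms(80, 20)
needed_samples = delay_in_samples(needed_ms)
```

The difficulty in practice is not the arithmetic but the measurement: the delays change whenever a device in the chain changes, so they must be re-verified through the entire workflow.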

The more you know, the less you know

In the past, mastery of physical-layer engineering principles and practices was sufficient; a broadcast chief engineer was expected to command every technical aspect of physical layer systems.

However, with the addition of IT technologies there is a danger in possessing only a little knowledge. As an engineer studies the hardware, software and conceptual engineering of any of the four BOC layers with which they are less familiar, it should rapidly become obvious that the more you learn about any technology, the more details there are to learn and master.

Preparing for and passing the Certified Broadcast Television Engineer (CBTE) exam can help ensure that technical personnel possess the requisite knowledge of physical broadcast infrastructure. It can be particularly helpful to have your IT personnel also earn SBE CBTE certification.

The path to be taken

An important design consideration is implementing an infrastructure with a physical layer that can stand on its own if any or all of the media network, application or security layers crash. You don't want corrective action to be impeded by a login lockout. Nor would you want a frozen computer to take you off the air, or a congested network to prevent the timely transfer of a hot graphic for a breaking news segment. Implementing a failsafe end-to-end SDI signal path can avoid these problems.

With analog NTSC simulcast still required during the DTV transition, all broadcasters will have to live with the challenge of dual format production and dissemination. Until NTSC is retired, you will be on the air and transitioning at the same time. You are required to accomplish the nearly impossible task of changing the airplane from a propeller-driven model to a digital jet while keeping everything flying safely.

Reading sources

SBE bookstore:
SMPTE bookstore:

Relevant standards
