Crawford Communications encompasses video and IT


The master control playout pod uses OmniBus automation to trigger network feeds and send them through their dedicated air chains. The pod can support up to 20 in-house networks.

In their transition to digital television, broadcasters are absorbing IT into their video infrastructures. Increasingly, they are shutting down legacy operations or merging them with computer-based systems to reduce their vast videotape libraries. Meanwhile, they continue to become more comfortable with automation and its associated workflow benefits. Crawford Communications, based in Atlanta, merged video and IT and created a roadmap to a completely tapeless, file-based facility.

Crawford's initial foray into the project was a response to physical storage limitations. The company currently provides satellite origination for 34 networks and turnaround services for more than 30 additional networks from this site. Years of long-term videotape storage created a severe lack of space in its network library storage room, the central point for receiving and logging clients' videotapes. The influx of new content finally reached the point where the facility was nearing the limit of new business it could accommodate.

Rather than expand the facility's square footage, the company opted for a completely automated, server-based, multichannel master control facility supported by a data-based, hierarchical, storage-management system. From this strategy grew a digital library system called ENCOMPASS, which stands for ENCOde, Manage and Protect your ASSets. The facility called on Digital System Technology (DST) to design and integrate the system.

The design

ENCOMPASS is generally divided into three areas: an expandable 40-channel master control playout center, a multistation ingest area and a terminal-gear room with separate IT and video infrastructures. Tie lines from these new areas interface with the facility's existing operations, including its technical operations center (TOC) for incoming and outgoing feeds, Avid editing suites and traffic systems. The team also built entirely new air chains to establish an automated digital workflow from the network origination rooms to the playout center and TOC.

The systems integrator consolidated drawings and data delivered by the principal vendors (Masstech, SGI and OmniBus). It used these technical drawings as guidelines for building the master control console space, router integration, and associated wiring and cabling. Later, the integrator created drawings for the other aspects of the project, including:

  • Audio and video signal flows.
  • Time code.
  • Closed captioning and other metadata.
  • Control.
  • Routing.
  • Synchronization.
  • Timing.
  • Networking.
  • Proxy servers and gateways.
  • Test and measurement.
  • Professional time-base generators and clocks.
  • Signal-reference ground.
  • Cable management.


The video row houses equipment for adding graphical elements to network feeds within the air chain.

Playout pod

The facility designated a central area of the 135,000-sq-ft building for the system. The integration phase focused on incorporating IT into a functional serial digital facility. The design team selected proven information technologies to connect the facility's back-end storage infrastructure with new and existing off-the-shelf video servers and equipment.

The system's centerpiece is the master control playout center or playout pod. The pod can support up to 20 in-house networks. Each network has a redundant stream with a seamless, automatic switchover to the backup if the main stream fails. The facility has room to build an additional playout pod of the same size for another 20 networks as business warrants.
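
The switchover behavior can be pictured as a simple selection rule: prefer the main stream and fall back to the backup the moment the main reports a failure. The sketch below is only illustrative; the stream names and health flag are hypothetical, and the real changeover is performed by the automation and changeover hardware, not by software like this.

    from dataclasses import dataclass

    @dataclass
    class Stream:
        name: str
        healthy: bool  # whatever the monitoring layer reports for this stream

    def select_output(main: Stream, backup: Stream) -> Stream:
        """Return the stream that should feed the air chain: main if healthy, else backup."""
        return main if main.healthy else backup

    # Example: the main stream drops, so the output follows the backup.
    print(select_output(Stream("NET01-main", False), Stream("NET01-backup", True)).name)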

The playout pod resembles a half-circle, with four Barco display screens for signal monitoring. Each screen accommodates five networks. For each network, a vertical layout of the primary server, backup server, air chain and IRD provides the feed's history. Alarms and other warnings indicate the point of failure if signal problems arise.

Large SVGA monitors are spread across the length of the playout pod for signal monitoring of all servers, asset-management devices, graphics playout devices, automation components such as playlists, and other IT devices. A large KVM router with nearly 15 monitoring and control stations allows operators and engineers to select sources to monitor from multiple servers and automation devices. Two operators, one at each side of the playout pod, monitor sources and access automation playlists for multiple networks through the facility's OmniBus Colossus multichannel automation system. Playlists for 10 networks are available on each automation PC. A supervisor seated in the middle can make playlist changes and fix problems reported by a network monitoring operator.


At the pod console, a supervisor can make changes to the air playlist and correct problems reported by a network monitoring operator.

The playout pod requires very little equipment at the console due to the scope of the automation system. Most devices that would normally be featured at the console are positioned in the air chain in the terminal-gear room. The automation system triggers the network feeds and sends them through their designated air chains. It feeds each network through a Pinnacle DekoCast graphics playout device or an Evertz logo inserter to add bugs, graphics and squeeze credits. Ensemble Design backup switches provide automatic bypass functions so the signal reaches its final destination regardless of signal failure at any point in the air chain.
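
One way to picture the air chain is as an ordered series of processing stages, each of which can be bypassed on failure so the feed still reaches its destination. This Python sketch is a loose analogy only; the stage names are invented, and the actual bypass is done by the hardware changeover switches described above.

    def run_air_chain(signal, stages):
        """Pass a feed through each stage; a failing stage is bypassed, not fatal."""
        for name, process in stages:
            try:
                signal = process(signal)
            except Exception:
                continue  # bypass the failed stage and keep the feed moving
        return signal

    stages = [
        ("graphics", lambda s: s + " +graphics"),
        ("logo", lambda s: s + " +logo"),
    ]
    print(run_air_chain("NET01 feed", stages))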


IT routing

The facility has largely transitioned to an IT routing environment, moving files rather than video into servers and storage systems. Hewlett-Packard and Cisco equipment provides several layers of IT routing over Ethernet to transfer files throughout the facility. A Leitch Integrator router, built as 96×64 and expandable to 128×128, handles video and audio routing, although it is generally used only for live events. Rather than playing video out of the editing systems and ingesting it into the server as video, the facility now saves content from the editing systems to a storage array. That storage array links to the playout servers for broadcast to air.

The new ingest and playout processes have ushered in operational changes that dramatically alter the facility's workflow. The old tape-based workflow required operators to physically retrieve tapes and load VTRs for playout and duplication in various network origination areas. The computer-based system streamlines the process by moving files between servers through FTP and automatically recalling files from archival storage.
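
As a rough illustration of that file-based hand-off, a finished piece of material can be pushed to a playout server with a standard FTP transfer. The host name, credentials and file path below are placeholders, not the facility's actual configuration.

    from ftplib import FTP

    def send_to_playout(local_path, host, remote_name):
        """Upload a media file to a playout server over FTP."""
        with FTP(host) as ftp:
            ftp.login("user", "password")  # placeholder credentials
            with open(local_path, "rb") as f:
                ftp.storbinary(f"STOR {remote_name}", f)

    # Example call (hypothetical paths and host):
    # send_to_playout("/edit_storage/NET01_promo.mxf", "playout01.example", "NET01_promo.mxf")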

Although these changes are steering the facility toward a tapeless operation, most clients still ship their air material on videotape. Operators in the network library storage room enter new videotape information into the facility's database and send the tapes directly to the ingest area, which accommodates 12 channels of simultaneous videotape ingestion. The Leitch Panacea ingest router transfers material to storage and playout servers. Six VTRs installed in an adjacent equipment rack serve QC purposes prior to ingestion. Overhead speaker domes let operators monitor their own audio sources without hearing sound from neighboring stations.
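
The tape-logging step amounts to a simple database insert before the tape moves on to ingest. The SQLite table and fields below are illustrative stand-ins; the facility's actual database and schema are not detailed here.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE tape_log (barcode TEXT PRIMARY KEY, client TEXT, title TEXT, status TEXT)"
    )

    # Log a newly received tape and mark it as routed to the ingest area.
    conn.execute(
        "INSERT INTO tape_log VALUES (?, ?, ?, ?)",
        ("T000123", "Example Network", "Episode 101", "sent to ingest"),
    )
    conn.commit()
    print(conn.execute("SELECT * FROM tape_log").fetchall())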


In the IT rows located in the terminal-gear room, SGI ingest and playout servers deliver air material to playout servers at 15Mb/s.

Servers

Omneon, Pinnacle and SGI ingest and air playout servers are key to the new workflow process. The cache servers use MPEG-2 compression to deliver air material to playout servers at 15Mb/s and to the facility's existing editing suites at 50Mb/s. The facility can ingest long-form programming such as movies at either data rate. The cache servers obtain program material from three sources: legacy Avid nonlinear editing systems and their associated Unity network and storage system, FTP transfers from various servers, and SGI ingest servers.
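
The two data rates translate directly into storage cost per hour of programming. A quick back-of-the-envelope conversion (video essence only, ignoring overhead):

    def gigabytes_per_hour(mbps):
        """Approximate storage for one hour of material at a constant bit rate in Mb/s."""
        return mbps * 3600 / 8 / 1000  # Mb/s -> MB/s -> MB per hour -> GB per hour

    print(round(gigabytes_per_hour(15), 1))  # about 6.8 GB per hour at the air rate
    print(round(gigabytes_per_hour(50), 1))  # about 22.5 GB per hour at the edit rate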

MAM

The 34 network-originated channels rely heavily on the Masstech MassStore, a software-based A/V asset- and archive-management system. DST installed a main and a backup system to integrate program material into playout streams for satellite delivery through the facility's teleport. MassStore tracks the content stored on the servers, intermediary caches and a Spectralogic data-tape library. It also tracks videotape storage in the network library storage room.
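
Conceptually, the hierarchical storage management that MassStore provides boils down to tracking which tier each asset currently lives on and restoring it toward the playout servers when it is needed for air. The record fields and tier names in this sketch are illustrative, not MassStore's actual data model.

    from dataclasses import dataclass

    @dataclass
    class Asset:
        asset_id: str
        tier: str  # e.g. "playout server", "cache", "data tape" or "shelf videotape"

    def prepare_for_air(asset):
        """Play directly if the asset is already on a playout server; otherwise restore it."""
        if asset.tier == "playout server":
            return f"{asset.asset_id}: ready for playout"
        return f"{asset.asset_id}: restore from {asset.tier} to playout server"

    print(prepare_for_air(Asset("NET07_movie_0042", "data tape")))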

The data-tape robot is responsible for the archive portion of the facility's hierarchical storage-management system. It retrieves stored material, along with its accompanying metadata, for playout to air. The robot features Python drives that use the Sony Advanced Intelligent Tape (SAIT) format, which delivers a tape-to-cache speed of 120Mb/s and high storage density.
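
Those numbers give a feel for restore times. For example, a two-hour program stored at the 15Mb/s air rate comes back from tape in roughly 15 minutes:

    program_megabits = 15 * 2 * 3600        # two hours of material at 15Mb/s
    restore_minutes = program_megabits / 120 / 60
    print(restore_minutes)                   # 15.0 -> about 15 minutes tape-to-cache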

Gear

The new terminal-gear room houses all of the system's electronics, including the data-tape robot, video and audio routers, terminal gear, graphics devices and QC stations. Roughly 30 equipment racks are divided among three rows. The team built the rows to separate IT and video gear for bandwidth and security reasons. On the security side, access to the public Internet is not allowed, which avoids any possibility of file corruption caused by an outside source.

Two of the three rows are dedicated to IT equipment. OmniBus, Omneon and SGI gear occupy one row; the other row holds Masstech gear, proxy servers and the Spectralogic robotics. The third row houses video terminal gear. This includes Leitch distribution amplifiers and converters, as well as Ensemble Design bypass switchers, frame synchronizers, processing gear and embedders. A PESA Cougar router transfers satellite feeds from IRDs into loggers for recording to servers. The Leitch Integrator router, Leitch Panacea ingest router, Pinnacle DekoCast and Evertz logo inserters are also housed in the video racks. Synchronizers and other traditional terminal-gear products are not needed on the IT side, provided the staff quality-checks and synchronizes everything at the ingest point.

Engineers in the terminal-gear room can use the QC station to monitor feeds and waveforms for three facility routing systems: the video/audio router, the ingest router and the Cougar router. Patch panels and terminal gear (DAs, converters) are positioned at both sides of the QC station so engineers can patch into equipment connected to any of the routing systems while trying to pinpoint problems.

Each equipment rack connects to two power strips, each on a separate breaker, to ensure redundancy. Similarly, all devices within the racks connect to redundant power supplies. The failure of any power device will trigger a seamless transition to the backup source. The facility also provides full-facility electrical redundancy through backup generators and UPS systems.

New opportunities

The facility estimates that it can easily accommodate up to 120 networks simply by adding more digital storage. The transition to a new IT-based workflow and a digitized, completely scalable automated system creates new business opportunities and opens the way for a completely tapeless environment in the future.
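
A rough, purely hypothetical sizing exercise shows why storage is the main variable. Assuming, for illustration only, that each network keeps 200 hours of 15Mb/s material online (a figure chosen for this example, not one from the facility):

    networks = 120
    hours_per_network = 200                  # assumed for illustration
    gb_per_hour = 15 * 3600 / 8 / 1000       # about 6.75 GB per hour at the air rate
    total_tb = networks * hours_per_network * gb_per_hour / 1000
    print(round(total_tb))                   # roughly 162 TB of online storage under these assumptions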

Jack Verner is vice president of engineering and technology at Digital System Technology.

Design team

Crawford:
Jim Schuster, sr. vp, satellite op.
Michael Connell, dir., adv. tech. (ENCOMPASS)
Don Rodd, CE, satellite op.
Bill Elsholz, lead broadcast eng., satellite op.
Jay Pound, database admin.
Carol Burton, traffic mgr., network op.

DST:
Cindy Hutter, eng./sr. project mgr.
Chris Spacone, IT eng.
Mickey Kroll, broadcast eng.
Brian Kincheloe, installation sup.

Equipment list

ADC patchbays

Barco video monitors

Ensemble Design
5460 changeover switches
5600 audio de-embedders
8500 TBC/frame sync/video processor/A-to-D card
8510 audio embedder modules

Evertz 9625LG+DCP logo generators

Leitch
Integrator 96×64 a/v router
Panacea ingest routing switcher
VDA6800 analog video DAs

Masstech Group
MassProxy low-res MPEG-4 proxy servers
MassStore A/V MAM

Omneon ingest/playout servers

OmniBus Colossus multichannel automation

PESA
Cougar satellite/IRD feed router
LN82DV1 8×2 switchers

Pinnacle DekoCast graphics

SGI ingest/playout servers

Spectralogic HSM data tape robots

TBC supervisors and MC consoles

Teranex noise reducers

Wohler ARS-21B and ARS-11 audio routers