The PBS No-Downtime Move

Six weeks in March and April—that’s all the time PBS technical headquarters had to move from 1320 Braddock Place in Alexandria, VA, to the network’s new Network Operations Center (NOC) on the other side of town.

We’re not just talking about relocating an engineer’s bench or two: PBS had to move its entire storage, playout and data center facilities for the network’s full range of TV services, as well as its entire IT infrastructure. In fact, the only element that didn’t have to make the switch was PBS’s Media Operations Center (MOC), the fully integrated ingest facility that directs the flow of content for the network. The MOC had just been built at PBS’s new corporate headquarters in Crystal City, VA (close to downtown Washington, DC), after the network moved into a 130,000-square-foot facility there in February.

Still, even with the MOC in place, the task facing PBS VP and Chief Technology Integration Officer André Mendes and his staff was daunting. “In making the move, we executed a complete upgrade of our entire technology infrastructure, all while maintaining regular services at our old location,” Mendes explained. “We didn’t have any additional staff to help us make the move beyond one project manager. So we had to maintain our ongoing workload while we were going through all this system installation and integration.”

The move of technical and playout operations from one location to another was further complicated by PBS’s decision to launch several new technical initiatives at the same time. “Over a six-week period, we completely changed the way we manage our content, moving away from a tape-based system to a file-based methodology,” Mendes recalled. “This meant building both new facilities completely from the ground up. Once the switchover was done, we did move some equipment from our old facility, such as our tape library, but by and large we abandoned our old plant.”

BUILT FOR BROADCAST & CONTROL
To appreciate the scope of the PBS technical upgrade, you have to examine the changes Mendes made in detail. A case in point: “We completely ripped out and replaced our traffic and scheduling system,” he said. “In its place, we deployed BroadView Software’s suite of products. We then deployed our own ACE playout infrastructure, composed of Omneon for our on-air video servers, Masstech’s MassStore as the archive manager, Miranda’s iControl system for routing and monitoring, and Omnibus to automate it all. What we ended up with is a fully optimized broadcast management environment.”

PBS also moved from a standard master control environment, with control rooms for each service, to an integrated joint master control that supports its own ACE system and those deployed at participating ACE member stations. ACE is a mostly IP-based, multi-channel master control solution that—at a cost of $1.3 million per participating PBS member station—provides five channels of SD playout (one of them a spare) and one channel of HD, and it can run largely unattended.
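For illustration only, the per-station channel complement described above could be written down as a small configuration table; the field names and layout in this sketch are assumptions, not the actual ACE configuration format.

```python
# Illustrative sketch of the per-station ACE channel complement described
# above: five SD channels (one of them a spare) plus one HD channel.
# Field names and structure are assumptions, not the real ACE schema.
ACE_STATION_CHANNELS = [
    {"id": "SD-1", "format": "SD", "role": "primary"},
    {"id": "SD-2", "format": "SD", "role": "primary"},
    {"id": "SD-3", "format": "SD", "role": "primary"},
    {"id": "SD-4", "format": "SD", "role": "primary"},
    {"id": "SD-5", "format": "SD", "role": "spare"},
    {"id": "HD-1", "format": "HD", "role": "primary"},
]

def active_channels(channels):
    """Channels expected to run largely unattended in normal operation."""
    return [c for c in channels if c["role"] != "spare"]
```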

“ACE is designed with the ability to be remotely managed,” Mendes added, “so in our joint master control we have combined the management of our own ACE system with the ability to access and manage the ACE systems that are already installed in the field. Making the move allowed us to create a facility that closely integrates it all.”

On the content ingest side, Mendes said PBS did a thorough integration between hardware and software packages from BroadView, Avid, Masstech, Omneon, Harris Automation, ScheduAll, BizTalk, Remedy and Oracle Financials. “The result is a fully integrated system that stores content in several formats and allows for entirely file-based content after the initial ingest,” he said. “Content is permanently stored in Avid native AAF, IMX-50, 8 Mbps MPEG-2 long-GOP and 1.5 Mbps Windows Media.”
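As a rough sketch of that multi-format storage model, each ingested program could be tracked with one record per stored copy. Only the format names and bit rates below come from the quote above; the structure, field names and “purpose” labels are assumptions, not PBS’s actual data model.

```python
# Sketch of a per-program record covering the storage formats named above.
# The dataclasses, field names and "purpose" labels are illustrative
# assumptions; only the formats and bit rates come from the article.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StoredEssence:
    format_name: str               # format as named in the article
    bitrate_mbps: Optional[float]  # nominal rate, where one is stated
    purpose: str                   # assumed role of this copy

@dataclass
class ProgramAsset:
    program_id: str
    title: str
    essences: List[StoredEssence] = field(default_factory=list)

asset = ProgramAsset(
    program_id="EP-0001",     # hypothetical identifier
    title="Example Episode",
    essences=[
        StoredEssence("Avid native AAF", None, "editing master"),
        StoredEssence("IMX-50", 50.0, "high-quality master"),
        StoredEssence("MPEG-2 long-GOP", 8.0, "on-air playout copy"),
        StoredEssence("Windows Media", 1.5, "low-bit-rate browse proxy"),
    ],
)
```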

The MOC is located at corporate HQ to allow easy contact between program management, screeners and content ingest staff. It is linked to the NOC, 9.2 miles away, by two OC-3 (155 Mbps) fiber circuits, with an OC-3 microwave link as backup.
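A back-of-envelope calculation shows why that link budget comfortably supports file delivery between the two sites. The file sizes below are derived purely from the bit rates mentioned in this article, and the sketch assumes the full OC-3 line rate is usable per circuit; real throughput would be lower once protocol and file-transfer overhead are counted.

```python
# Back-of-envelope transfer-time estimate over the MOC-to-NOC OC-3 links.
# Assumes the full 155 Mbps line rate per circuit; actual throughput would
# be lower after SONET/IP/file-transfer overhead.

OC3_MBPS = 155.0   # one OC-3 circuit
CIRCUITS = 2       # two fiber circuits (the microwave link is standby)

def transfer_minutes(duration_hours, program_mbps, link_mbps):
    """Minutes needed to move a program file of the given encode rate."""
    file_megabits = program_mbps * duration_hours * 3600
    return file_megabits / link_mbps / 60

# One-hour program at the 8 Mbps MPEG-2 on-air rate: about 3 minutes.
print(round(transfer_minutes(1, 8.0, OC3_MBPS), 1))
# The same hour at the IMX-50 rate: about 19 minutes on one circuit,
# or roughly half that if the load is split across both circuits.
print(round(transfer_minutes(1, 50.0, OC3_MBPS), 1))
print(round(transfer_minutes(1, 50.0, OC3_MBPS * CIRCUITS), 1))
```

Even at the heavier IMX-50 rate, an hour of programming moves in well under an hour, which is what lets files, rather than physical tapes, travel between the two facilities ahead of air.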

“This effectively makes the NOC the first node on our Next Generation Interconnection System [NGIS],” said Mendes, “since the content from the MOC at HQ is actually sent to the remote ACE system as files that are then stitched together by the Omnibus automation as directed by the BroadView playlists that are generated at the headquarters facility.”
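Conceptually, that stitching amounts to the automation resolving playlist entries against files already delivered to the ACE node. The sketch below is a generic illustration using invented names and structures; it does not reflect the actual BroadView or Omnibus interfaces.

```python
# Generic illustration of playlist-driven playout: a schedule generated at
# HQ references content by file ID, and the automation at the ACE node
# resolves those IDs against files already sitting in local storage and
# plays them back-to-back. All names and structures here are assumptions.

playlist = [
    # (file_id, start_time) -- a minimal stand-in for a playlist entry
    ("PROMO_001.mxf", "06:00:00"),
    ("EP_0001_SEG1.mxf", "06:00:30"),
    ("UNDERWRITER_A.mxf", "06:28:30"),
    ("EP_0001_SEG2.mxf", "06:29:00"),
]

local_store = {"PROMO_001.mxf", "EP_0001_SEG1.mxf",
               "UNDERWRITER_A.mxf", "EP_0001_SEG2.mxf"}

def build_playout_sequence(playlist, local_store):
    """Return the air order, flagging anything not yet transferred."""
    sequence = []
    for file_id, start in playlist:
        status = "ready" if file_id in local_store else "missing: request transfer"
        sequence.append((start, file_id, status))
    return sequence

for start, file_id, status in build_playout_sequence(playlist, local_store):
    print(start, file_id, status)
```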

The NOC is a 60 x 20-foot room with a main video screen that displays all content that is being originated by PBS, as well as all of the signal returns. “The other large screen displays thumbnails from every channel going to air on each of the remote ACE station systems,” he noted.

LESSONS LEARNED
After six weeks, the new NOC came online on Sunday, April 9, at 5:59:50 a.m. To ensure that his people were ready for the change, Mendes put his team through extensive training before the move.

“The biggest challenge was integration,” Mendes admitted. “We went from a siloed environment where everything was separated to a setting where everything is really integrated. Our goal was to automate all of our downstream functions to our affiliates, so that we only had to do each job once. Still, it was an incredible task to get our heads around moving from a physical tape-based system to virtual media and all the associated metadata that goes with it.”

Today, PBS’s NOC is responsible for storing the MPEG-2 on-air content and uplinking it as directed by the program schedulers at HQ. “All of the video files are kept in storage until they are scheduled for air on the BroadView system, at which time they are transferred from the MassStore to our Omneon video servers,” Mendes explained. “The playout is then automatically controlled by Omnibus software, and the programming is transmitted via satellite to our affiliates.”
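The transfer step Mendes describes can be pictured as a schedule-driven loop: anything approaching air time that exists only in the archive is queued for movement to the playout servers. The sketch below uses invented data structures and a placeholder staging step, not the real BroadView, MassStore or Omneon integration.

```python
# Conceptual sketch of the schedule-driven archive-to-server staging flow
# described above. The schedule format, the six-hour lead time and the
# "staging" step are all assumptions, not real MassStore or Omneon calls.
from datetime import datetime, timedelta

TRANSFER_WINDOW = timedelta(hours=6)   # assumed lead time before air

def stage_upcoming_items(schedule, now, archive, playout_server):
    """Queue archive-only items that air within the transfer window."""
    for item in schedule:
        airs_soon = now <= item["air_time"] <= now + TRANSFER_WINDOW
        on_server = item["file_id"] in playout_server
        if airs_soon and not on_server and item["file_id"] in archive:
            playout_server.add(item["file_id"])   # stand-in for the transfer
            print("staging", item["file_id"], "for air at", item["air_time"])

# Tiny usage example with made-up data:
now = datetime(2006, 4, 9, 0, 0)
schedule = [{"file_id": "EP_0001.mxf", "air_time": datetime(2006, 4, 9, 5, 59)}]
archive = {"EP_0001.mxf"}
playout_server = set()
stage_upcoming_items(schedule, now, archive, playout_server)
```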

Looking back at the madcap move, the only thing Mendes would change is how much emphasis he placed on communications. “We were so involved in the process that it was often very difficult to make time for thoroughly explaining such a complex endeavor,” he recalled. “With the amount of integration that took place, we could have all benefited from additional information being propagated both internally and to the field.”

Still, Mendes is pleased with the new file-based way of doing business, as it streamlines operations. “If someone wants to pull a clip from the archive and modify a specific underwriter, all they have to do is create a new defining metadata container, drag-and-drop the new content information in the proper place, and let the system stitch the proper elements together automatically. Before, an editor would have to ingest the program from tape and make the change manually,” he said. “When you are looking at repackaging hundreds of episodes, the old process was just plain unwieldy and prevented us from taking advantage of many sponsorship opportunities.”
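To make the underwriter-swap example concrete, the sketch below shows the general idea of a “defining metadata container” as an ordered list of references to assets already in the archive, so a repackage becomes a metadata edit rather than a re-ingest. The structure and helper function are illustrative assumptions, not the actual PBS metadata schema.

```python
# Illustration of the metadata-container idea: a package is an ordered list
# of references to archived assets, so swapping an underwriter credit is a
# metadata edit, not a re-ingest. Field names and the helper are assumptions.
import copy

original_package = {
    "package_id": "EP_0001_PKG_A",   # hypothetical identifiers throughout
    "segments": [
        {"slot": "program", "asset": "EP_0001.mxf"},
        {"slot": "underwriter", "asset": "UNDERWRITER_A.mxf"},
    ],
}

def repackage(package, new_package_id, slot, new_asset):
    """Clone a package and point one slot at a different archived asset."""
    new_pkg = copy.deepcopy(package)
    new_pkg["package_id"] = new_package_id
    for seg in new_pkg["segments"]:
        if seg["slot"] == slot:
            seg["asset"] = new_asset
    return new_pkg

# A new version of the same episode with a different underwriter credit:
new_package = repackage(original_package, "EP_0001_PKG_B",
                        "underwriter", "UNDERWRITER_B.mxf")
print(new_package["segments"])
```

The automation can then stitch the referenced elements together at playout time, as the quote above describes, without anyone touching the original program file.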

As for the future? According to Mendes, “With the proper amount of cooperation from our content producers, and with the upcoming deployment of the NGIS system, we can truly leverage automated processes and concentrate on fulfilling public television’s mission by dedicating our resources to creating more of our wonderful content—rather than spinning our wheels with repetitive manual processes.”

James Careless covers the television industry. He can be reached at jamesc@tjtdesign.com.