RIO DE JANEIRO—During his more than eight years with NBC, Darryl Jefferson, vice president for Post Operations & Digital Workflow, NBC Sports & Olympics, has seen the rapid evolution of production advances for the network’s Olympics coverage. He’s directed the network’s Olympics production to an entirely file-based workflow, which has enabled NBC to enhance and accelerate its move to multiplatform distribution and allowed its Stamford, Conn., headquarters to serve as the hub for Olympics production. In a recent interview with TV Technology, Jefferson discussed his role at the Rio Games and how NBC is ramping up its technical infrastructure to deliver more than 2,000 hours of Olympic linear programming across 11 networks.
TV TECHNOLOGY: What is the difference in your role now as opposed to what it was in London?
DARRYL JEFFERSON: I now oversee the post-production operations for the sports group, so all of our football, NASCAR, all the other things we do in the so-called “off-season” and the Olympics. I also oversee our digital workflow group that does asset management, file movement, file registration, graphics, post-audio, that type of thing. The team is larger and also does more than the Olympics.
TVT: In your transition to VP, was it a case of basically adapting a lot of the techniques that you learned in Olympics production to other sports production at NBC?
DJ: Every workflow has to be slightly tailored, but yes. Some of the technology—such as the technology stack that we're using in the Olympics—we know will scale to this many concurrent programs, so you're able to really test the waters at this kind of intense event and know that for 17 weeks of football, or for the length of the hockey season, or for some longer period of time, you have solid technology.
TVT: What is different in what you're doing in Rio than what you were doing in London, or Sochi for that matter?
DJ: There's a lot that’s different. In London, for example—because, at the time, we had what we thought was a lot of connectivity between New York and London—we replicated both high- and low-res footage. Things would be recorded in London and replicated in both high- and low-res back to our “mothership,” which at that point was Studio 8H at 30 Rock.
NBC Olympics has a team of approximately 1,000 managing its Olympics programming at its Stamford, Conn., sports facility, which opened in 2013.
Nowadays, there's so much recording and file movement that it's much more efficient to replicate just the proxies, in both directions, between the venue and our new home base in Stamford, Connecticut. What that means is we now have 120 channels in two cities—one in Rio and one in Stamford—and the proxies are replicating from Stamford to Rio and from Rio to Stamford, so all users see all recordings. This is in lieu of moving unnecessary content around. If someone goes to do something—for instance, cut a sequence and send something to a playout device, or send something for edit—the system goes back to the place the content was recorded, grabs just that subset of the high-res, and sends it to where it needs to be.
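The pattern Jefferson describes—replicate lightweight proxies everywhere, pull only the needed high-res span from the site that recorded it—can be sketched roughly as follows. This is a hypothetical illustration; the class and field names are invented for clarity and are not NBC's actual software.

```python
# Illustrative sketch of a proxy-everywhere / high-res-on-demand workflow.
# Names ("Site", "record", "fetch_subset") are hypothetical, not a real product API.
from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    proxies: dict = field(default_factory=dict)   # clip_id -> proxy metadata (small)
    high_res: dict = field(default_factory=dict)  # clip_id -> full-res essence (large)

    def record(self, clip_id, media, peers):
        """Store high-res locally; replicate only the small proxy to every peer."""
        self.high_res[clip_id] = media
        proxy = {"src": self.name, "summary": media[:16]}  # stand-in for a low-res proxy
        self.proxies[clip_id] = proxy
        for peer in peers:
            peer.proxies[clip_id] = proxy

    def fetch_subset(self, clip_id, start, end, peers):
        """Pull just the needed high-res span from wherever it was recorded."""
        src_name = self.proxies[clip_id]["src"]
        src = self if src_name == self.name else next(p for p in peers if p.name == src_name)
        return src.high_res[clip_id][start:end]

rio, stamford = Site("Rio"), Site("Stamford")
rio.record("swim-final", b"high-res essence bytes" * 100, peers=[stamford])
# Stamford sees the proxy immediately, but only moves the high-res span it needs:
cut = stamford.fetch_subset("swim-final", 0, 22, peers=[rio])
```

The key property is that the full-res media stays where it was recorded until an editor actually requests a subset, which is what keeps the Rio-Stamford link free for the transfers that matter.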
TVT: In other words, you're not duplicating the high-res in Rio, only moving it when needed?
DJ: Correct. It's much more efficient, and yes, we have more connectivity, but it really helps us be more efficient with the bandwidth used for every record. It also lets us capture more things, and they're seen by more people. In addition, in Rio, our five big venues—athletics, swimming, gymnastics, beach volleyball, and golf—can also peer into all of those records, clip off shots, and send them to themselves, and other cities in North America can reach into Stamford and do the same, including the Golf Channel in Orlando, the Comcast Media Center in Colorado, NBC News, 30 Rock, and of course Stamford. All those people are using the two main hubs of Stamford and Rio to peer into this mass of content rolling into the facility.
TVT: What’s the workforce breakdown between Rio and Stamford in terms of personnel numbers?
DJ: I have a whole team here in Rio. We also have about 1,000 people at our home base in Stamford. There's a whole team there managing that effort. What used to be in 8H is now in the Stamford center.
TVT: What's new on the Highlight Factory side of things now?
DJ: The Highlight Factory has gotten bigger. We have more people working in Stamford, in the Highlight Factory—more people physically in the room—and we're using a bunch of different tools. The entire [metadata] logging application is happening in Stamford. We have just over 70 loggers adding metadata that's immediately visible to everyone looking at media. Those are some new things.
TVT: Has the goal of multiplatform distribution pretty much remained the same over the past two to four years?
DJ: It's the same production path, but the world has changed a lot. We're pumping out to a lot more. We're pumping out to smart televisions and set-top boxes for a variety of cable providers; we're pumping out to more people with closed captioning, more people with video description. The number of outlets we're reaching is growing as well. We're streaming to troops overseas via the Armed Forces Network; there are just more outlets. So yes, it's the same, but the same is now twice as big.
TVT: I've got to ask this question. Is there any part of the production now that is not file based?
DJ: That's a fair question, but no. Occasionally we make a dub for other broadcasters—we do multiformat output to tape for others—but no, nothing. Our cameras are all card- or drive-based.
TVT: Did you find all the facilities in Rio are up to your standards, as far as being fiber-based and having the ability to just plug in and go?
DJ: Yeah, and to be fair, the IBC (International Broadcast Centre) is a hub for more than just NBC, and its fiber infrastructure is here. We brought a whole lot of fiber—that was one of the big new things we had for these Games—and moving forward, we're changing our backbone in the IBC to a fiber infrastructure, then tying that into a POP (point of presence) and getting diverse connections. We have 40 Gb/s of connectivity between here and Stamford. That's partially procured by NBCU and partially by OBS (Olympic Broadcasting Services). We have partnerships with Level 3, AT&T and OBS to get that connectivity.
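A quick back-of-envelope calculation shows what a 40 Gb/s Rio-Stamford link buys for the on-demand high-res transfers described earlier. The file size and link-efficiency figures below are illustrative assumptions, not NBC's actual numbers.

```python
# Back-of-envelope transfer time over the Rio-Stamford link.
# All figures here are assumptions for illustration, not NBC's real parameters.

def transfer_seconds(size_bytes: float, link_bps: float, efficiency: float = 1.0) -> float:
    """Wall-clock time to move a file, given a usable fraction of the link."""
    return size_bytes * 8 / (link_bps * efficiency)

LINK = 40e9            # 40 Gb/s between Rio and Stamford
ONE_HOUR_HIRES = 45e9  # assume ~100 Mb/s contribution encode -> ~45 GB per hour

# At 80% usable throughput, an hour of high-res moves in seconds, not minutes:
print(f"{transfer_seconds(ONE_HOUR_HIRES, LINK, efficiency=0.8):.2f} s")
```

At those assumed rates, an hour of high-res footage crosses the link in roughly 11 seconds, which is what makes fetching high-res only on demand practical.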
TVT: How are you handling 4K/UHD coverage?
DJ: It's certainly its own kind of beast. We have basically a linear channel, linear delivery, for 4K, with a 4K control room cutting a linear program. We're covering the opening and closing ceremonies in 4K, and we're also mixing the opening ceremony in Atmos on a trial basis. We're also delivering assets via file delivery out to our meet-me point in Colorado for cable companies and others to pick up. It's a different path. It's not like we're taking everything and simply converting it for a regular broadcast; it's a parallel process at this point.
Anecdotally, it's just interesting to see it, and it feels like globally we're all putting our toes in the water. I think broadcasters are seeing what they can do, people are excited to see it at home, but really we have to run through the paces, and it'll be a process to find out where and how it's appropriate, and if and when people have an appetite for it. We hope they do.
TVT: Anything new on the audio side?
DJ: In addition to using Atmos for the Opening Ceremony, we're actually leaving our post-produced audio at home, so there's still a Pro Tools rig in Stamford—a Pro Tools room for sweetening—and that will be done remotely through file exchange. That's absolutely new. We're also mixing each of the venues in 5.1; there's a patchwork of 5.1 and stereo at the venues.
TVT: How are you handling storage?
DJ: Harmonic MediaGrid is our central SAN, the place where we store all of the video files being recorded on the record wall. That's all captured by MediaDecks, the principal recording devices, which write to the MediaGrid central SAN. We have 988 TB of MediaGrid in Rio and just under that in Stamford, so that's almost two petabytes of storage, and we'll fill it within the 17 days. I have at least a handful of dollars riding on that, just internally some bets. "You'll never need that much storage," they say, and I laugh and say, "I'll bet you on which day we'll fill it." The Stamford facility runs on MediaGrid, the central SAN, 24/7, and it only makes sense to have a sister grid in our host city and have the two grids replicate between them.
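Filling 988 TB over 17 days implies a surprisingly sustained average write rate. The arithmetic below uses only the figures quoted above; the helper function is illustrative.

```python
# Sanity check of the "fill 988 TB in 17 days" bet, using figures from the interview.

def sustained_ingest_bps(total_bytes: float, days: float) -> float:
    """Average byte-per-second write rate needed to fill a store in the given window."""
    return total_bytes / (days * 86400)

RIO_GRID = 988e12  # 988 TB of MediaGrid in Rio
rate = sustained_ingest_bps(RIO_GRID, 17)
print(f"{rate / 1e6:.0f} MB/s average, or {rate * 8 / 1e9:.2f} Gb/s sustained")
```

That works out to roughly 670 MB/s (about 5.4 Gb/s) of continuous ingest around the clock, which puts the scale of the record wall in perspective.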
TVT: You have redundancy in both Rio and Stamford, basically?
DJ: It's both redundant and also recording different content. Stamford is recording our cable dayparts, as they’re based out of Stamford. They're recording all the local production happening in Stamford. They're also doing the other sports we're focused on. Here in Rio, we're recording all the live competition feeds at their different stages in some of the control rooms coming out of here, some of the NBC feeds from trucks at some of the venues, and so on. They're recording different things and replicating to each other. It is redundant equipment, but it's not recording the same thing, and neither is simply a backup for the other.
Tom has covered the broadcast technology market for the past 25 years, including three years handling member communications for the National Association of Broadcasters followed by a year as editor of Video Technology News and DTV Business executive newsletters for Phillips Publishing. In 1999 he launched digitalbroadcasting.com for internet B2B portal Verticalnet. He is also a charter member of the CTA's Academy of Digital TV Pioneers. Since 2001, he has been editor-in-chief of TV Tech (www.tvtech.com), the leading source of news and information on broadcast and related media technology and is a frequent contributor and moderator to the brand’s Tech Leadership events.