STAMFORD, CONN.—NBC Olympics’ coverage of the Rio 2016 Summer Games was our most ambitious effort yet in bringing content to U.S. viewers. Over the course of 17 days, we recorded and stored 9,267 hours of content, and that’s a lot of content—more than a year of continuous viewing!
Through every iteration of the games, I’m amazed at the level at which we are producing content and at the massive amount of content consumed. With as many as 30 events going on at any given time, and with simultaneous multicamera shows taking place across different sports and venues, the 2016 Summer Games demanded an otherwise unheard-of level of concurrent production.
The infrastructure NBC Olympics uses to enable large-scale long-distance production has evolved over the years, and in Rio, I feel we realized the full potential of this approach and the technology supporting it. Our remote production model enabled fast, cost-effective creation of a wealth of broadcast and multiplatform content—including live streaming of every competition—by staff working at our Stamford, Conn., facilities, as well as at other NBCUniversal facilities across the U.S.
How did we do it? Through an array of parallel workflows, all with access to an Avid MediaCentral asset management portal that looked into the growing files on Harmonic storage infrastructure, we supported network production including daytime, prime time and late night shows; cable production by 10 different NBCUniversal channels and various specialty sport channels; digital-only productions; the Gold Zone, a collection of the best live action of the moment; a hyperfocused gymnastics program; and all web, mobile, live streaming, STB, TV everywhere, and XFINITY MVPD delivery. Our profiles and features group and our news division also peered into this portal, as did teams from the Golf Channel and Telemundo for both linear and nonlinear content creation.
Even on-site production teams at remote venues—swimming, athletics, gymnastics, beach volleyball, golf—could log in, look into our massive store of content, and pull content to complement the action being recorded live. At any given time, 100 simultaneous users would be working through the browser-based interface, which enabled them to find content and send it to any edit platform or to our EVS systems for playout.
At the front end of all these parallel workflows, we used the ScheduALL ERM system to turn events on the competition schedule into individual work orders dictating the router feed, the target channel and the record devices to be used, whether Avid AirSpeed server, an EVS system, or even a tape deck. Cyradis Technology software used ScheduALL work orders to automatically trigger recordings to our Harmonic Spectrum MediaDeck integrated media server systems, which facilitated on-the-fly capture and proxy generation for all incoming materials, and to give each record event a unique ID. Details such as the sport, event, round, heat, and competitor names also were included in that original work order, and that metadata traveled along with video in an XML file, ultimately populating the MediaCentral asset management system.
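The work-order-to-metadata flow described above can be sketched in miniature. The XML schema, element names, and sample values below are hypothetical (the real ScheduALL and Cyradis formats aren't public); the point is how a unique ID and the event metadata travel together with each record event:

```python
import uuid
import xml.etree.ElementTree as ET

def build_work_order(sport, event, round_, heat, competitors,
                     router_feed, channel, device):
    """Build a work-order XML sidecar like the one that traveled with each
    recording. All element names here are illustrative assumptions."""
    order_id = str(uuid.uuid4())                    # unique ID for this record event
    root = ET.Element("WorkOrder", id=order_id)
    ET.SubElement(root, "RouterFeed").text = router_feed
    ET.SubElement(root, "TargetChannel").text = channel
    ET.SubElement(root, "RecordDevice").text = device   # AirSpeed, EVS, or tape deck
    meta = ET.SubElement(root, "Metadata")
    for tag, value in [("Sport", sport), ("Event", event),
                       ("Round", round_), ("Heat", heat)]:
        ET.SubElement(meta, tag).text = value
    comps = ET.SubElement(meta, "Competitors")
    for name in competitors:
        ET.SubElement(comps, "Competitor").text = name
    return ET.tostring(root, encoding="unicode")

xml_doc = build_work_order("Swimming", "Men's 100m Freestyle", "Final", "1",
                           ["Kyle Chalmers", "Pieter Timmers"],
                           "VENUE-AQ-01", "NBCSN", "AirSpeed")
```

Because this sidecar ultimately populates the asset management system, everything downstream—loggers, editors, the news division—can search on the same fields the scheduling system wrote at record time.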
Harmonic MediaDeck media servers deployed in Rio simultaneously recorded as many as 60 incoming venue feeds as both XDCAM-HD at 50 Mbps and H.264 low-resolution proxy. NBC Olympics staff in Stamford also had independent control over 60 additional channels of local ingest. Each recording’s proxy was written in near real time to a local 960-TB Harmonic MediaGrid storage system at the IBC and, about 40 to 50 seconds later, to a second 920-TB MediaGrid in Stamford, connected via two 10-Gigabit circuits. Working with this near-live recording, teams of loggers in Stamford continually added metadata about specific moments—an awesome three-point shot by Diana Taurasi during a women’s basketball game, or a fantastic block by Kerri Walsh Jennings at beach volleyball—and inserted time-based markers as events unfolded.
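Some back-of-the-envelope arithmetic shows why this dual-resolution design worked over two 10-Gigabit circuits. The 60-feed count and the 50 Mbps XDCAM-HD rate come from the workflow above; the 2 Mbps proxy bitrate is an assumed, typical value for an H.264 proxy, not a published figure:

```python
# Rough capacity check on the Rio-to-Stamford links (all rates in Mbps).
VENUE_FEEDS = 60
HI_RES_MBPS = 50      # XDCAM-HD, per the workflow description
PROXY_MBPS = 2        # assumed typical H.264 proxy bitrate

proxy_total = VENUE_FEEDS * PROXY_MBPS     # near-live proxy replication load
hi_res_total = VENUE_FEEDS * HI_RES_MBPS   # worst case: every feed pulled at once
link_capacity = 2 * 10_000                 # two 10-Gigabit circuits

# Proxy replication is a tiny fraction of the pipe, which is why every feed's
# proxy could land in Stamford within about a minute; the remaining headroom
# is what made on-demand high-resolution transfers practical.
headroom = link_capacity - (proxy_total + hi_res_total)
```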
The extensive live logs and stats, scoring and timing information embedded as metadata enabled MediaCentral users across NBC Olympics and NBCUniversal to create shot lists, which in turn were automatically conformed in one of two ways: in proxy resolution for streaming along with live multiscreen services, or in high resolution for craft editing in Stamford and at other NBCUniversal facilities. Through the same MediaCentral portal, users had access to nearly 300,000 hours of archive content, as well as the ability to mark clips for delivery to themselves, wherever they were working. Because directors and editors made edits with proxy clips, with the system itself conforming the full-resolution footage accordingly, we only needed to transfer the full-resolution media necessary for a given package.
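The proxy-edit-then-conform step can be illustrated with a small sketch: given an edit decision list built against proxy clips, only the merged in/out spans of the full-resolution sources need to cross the wide-area link. This is an illustrative simplification, not Avid's actual conform logic, and the clip IDs are invented:

```python
from collections import defaultdict

def spans_to_transfer(edl):
    """Given an EDL of (source_id, in_sec, out_sec) entries cut against proxy
    media, merge overlapping spans per source so only the high-resolution
    material actually used in the package needs to be transferred."""
    by_source = defaultdict(list)
    for source, t_in, t_out in edl:
        by_source[source].append((t_in, t_out))
    plan = {}
    for source, spans in by_source.items():
        spans.sort()
        merged = [list(spans[0])]
        for t_in, t_out in spans[1:]:
            if t_in <= merged[-1][1]:            # overlaps or touches previous span
                merged[-1][1] = max(merged[-1][1], t_out)
            else:
                merged.append([t_in, t_out])
        plan[source] = [tuple(s) for s in merged]
    return plan

plan = spans_to_transfer([
    ("GYM-W-FX-001", 120, 150),    # hypothetical floor-exercise clip
    ("GYM-W-FX-001", 140, 180),    # overlapping reaction shot, same source
    ("SWM-M-100F-001", 10, 45),
])
# → {"GYM-W-FX-001": [(120, 180)], "SWM-M-100F-001": [(10, 45)]}
```

The payoff is the one described in the article: a few minutes of conformed spans move over the circuits instead of hours of full-resolution source recordings.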
Production teams used the MediaCentral portal to access this content and turn around segments for live programming or to grab highlights for a wrap-up show or for professional analysis. For example, a news group user at 30 Rock looking for shots for “Today” or “NBC Nightly News” could find and access near-live content from Rio using the wealth of detail associated with every recording and maintained in the asset management system. If you heard a rumor that something big was happening over on the third table of table tennis, you knew that someone had already found that clip.
All live production over router and IP feeds was turned around in Avid edit or EVS playback rooms and cut together from two control rooms in Rio, as well as another nine in Stamford used to assemble all cable programming. As this happened, NBC Olympics staff called all those control room outputs, generated closed captions and, for the NBC prime time show open and close, added video description for the blind. The team did a Spanish call for NBCUniversal and Telemundo and added SAP for every men’s soccer match and all U.S. women’s soccer matches.
To add play-by-play and commentary to the massive number of events we covered, we augmented our staff of 3,000 across the Rio broadcast center and all venues with another 1,000 personnel in Stamford, where many were involved in digital production. We brought researchers, producers and experts on specific sports into our Off Tube Factory to provide knowledgeable insight and color for live programming—broadcast and on our web and mobile platforms—so that for a broad array of sports, we could offer the same high level of coverage typical of our on-site productions, but from our Stamford home.
We streamed all of the competition from Rio, running live IP feeds—45 host feeds from OBS and another 12 feeds (English and Spanish) from its multilateral distribution service—back to Stamford, the release point for all our cable networks, in-pattern streaming and digital originals, and then on to iStream Planet in Las Vegas for streaming and out to our Akamai CDN. Working in parallel, a unit in Stamford cut packages for STB delivery for MVPDs and for our Comcast XFINITY and TV Everywhere services. Short-form content was cut in Stamford, long-form content was cut in Englewood Cliffs, N.J., and all of it was delivered to a “meet point” in Denver (Dry Creek), where Comcast and other MVPDs could readily grab it.
Through a new relationship with BuzzFeed, which had people embedded in both Rio and Stamford doing social media on Snapchat, Facebook and Instagram, we delivered our first live content via social media. This was a new reach for us, and it got NBC Olympics content in front of a lot of new eyeballs. While the BuzzFeed team did have access to MediaCentral and all content on our MediaGrid systems, its staff also created a lot of interesting original content from around Rio.
Again in parallel, we ran our Highlights Factory to create VoD clips for web and mobile. Our HR department brought in more than 70 people, largely interns who were first-timers to television, to cut clips for sports with which they had some experience. The cool thing for this group was that they got to know the heroes of each sport, and what excitement and success look like in that sport. Clips that drew a lot of views often were sent to other platforms, such as MVPDs, for further distribution.
(Photo caption: NBC hosted much of its Olympics coverage from an outdoor venue at Copacabana Beach.)
And finally, our bread and butter: broadcast production at the prime time, daytime, and late night sets with Bob Costas, Mary Carillo, Ryan Seacrest and Mike Tirico, as well as our presence across the larger competition venues. We had a 75,000-square-foot footprint within the IBC, and this space included two large control rooms, a small control room, two studios, a small insert studio, and a news-production area. Connections between our large control rooms and the studio at Copacabana Beach supported the late-night show and our daytime network production.
For the first time, we augmented our HD broadcast programming with distribution of UHD coverage to cable, satellite, telco providers, and other partners, with Dolby Atmos providing enhanced audio for the opening ceremony.
Our greatest achievement at Rio 2016 was pulling all of this off in parallel. In London 2012 we did a lot of broadcast and multiplatform coverage, but still we were surprised by the level of interest and volume of consumption. We saw clearly that the appetite is there to consume Olympics content in a variety of formats and flavors. To do all of this at once and have it go off successfully was our biggest challenge, and the most fun and rewarding part of these Summer Games. For Rio 2016 we had a complex coordinated coverage strategy with many legs that we needed to keep moving smoothly in parallel. From opening to closing ceremony, our coverage of the Games of the XXXI Olympiad was a symphony of moving parts, and we got to do it all over again for the Rio 2016 Paralympics, from Sept. 7-18.
I really enjoy covering the Paralympics and sustaining interest in the games. Though there were fewer events and fewer people on the ground, the level of competition was just as high and the quality of play just as dynamic. In Stamford we worked with a huge team of former Paralympians to create an exciting, informed play-by-play call that enhanced the viewer experience and offered insight into what achievement looks like in each sport. We had the opportunity to apply the lessons we’d learned about working at this port of call, and to think more about how to bring those lessons to South Korea and Japan.
There is no way of telling what media consumption will look like in 2020. Vancouver 2010 doesn’t seem that long ago in a lot of ways, but that’s the year that the first iPad came out. Now a lot of people are using consoles and connected TVs to stream content. What will be the next leap in how people interact with and consume content? We know that for Tokyo the host broadcaster will do a lot of 8K, but it remains to be seen how much 4K will grow between now and the Winter Games. And how will IP-based devices, enhanced audio, and other new aspects of media production and consumption play a role?
Looking ahead to this future, we’re thrilled. With each Games, we are in the front row, learning something more about our country—who watches what type of content and how. Are the kids watching on a particular device and parents on the TV? Do they watch together? Do people watch at work? We gather the microdata that will help us determine how to get content out to viewers the next time around. Because NBC invests in it and trusts us to extend this reach, we’re able to push ourselves technologically and push what’s possible in TV.
It’s a heavy and humbling responsibility to cover this kind of event. The athletes of the Olympic and Paralympic games have lived and breathed for this moment, and the fact that we can bring it to a kid riding his bike halfway across the world is simply amazing. It never gets old.
Darryl Jefferson is the vice president for Post Operations & Digital Workflow, NBC Sports & Olympics.
This story originally appeared on TVT's sister publication TVB Europe.