RIO DE JANEIRO—ESPN’s coverage of the 2014 Fédération Internationale de Football Association World Cup has begun.
The world’s most popular event in the world’s most popular sport continues to grow in popularity, and there’s proof in the numbers: a worldwide audience of more than 3.2 billion watched live coverage from South Africa in 2010 for a minimum of one minute, according to FIFA research.
It’s also growing in popularity in the U.S., where an estimated 15,545,000 viewers were parked firmly in front of their flat screens during the 2010 World Cup final between Spain and the Netherlands, which aired on ESPN’s sister network, ABC.
That will no doubt be the case again this year, with the World Cup broadcast from 12 venues in Brazil through the final match on July 13 to viewers in their living rooms and in local bars—as well as to what is expected to be a record number of viewers on their mobile devices.
Key to this year’s broadcast will be ESPN’s extensive use of fiber to bring the action to the tournament’s millions of viewers, wherever they’re watching and on whatever technology they choose.
To prepare for such a huge operation, some of ESPN’s production crew arrived in Brazil in late April to implement construction plans for the broadcast—with every element of the custom operation to be dismantled a week after the tournament’s end.
Among those behind the plan is Claude Phipps, director, remote production operations for ESPN. He started preparing three years ago and had “already made about 10 trips to Brazil” before he arrived in late April to stay for the duration.
“What we have done during past coverage of the World Cup is build our overall facility in one central area, as we did in 2010 in South Africa,” said Phipps, “with the studio and the technical control room infrastructure pretty much within walking distance, aside from Mandela Square,” which was a half-hour drive.
However, Brazil’s “significant” infrastructure challenges mean that this year’s facilities “are spread out across 30 miles. With bad traffic, that can mean a two-hour drive.”
The reason for the distance is that Barra da Tijuca (a borough of Rio de Janeiro) isn’t especially iconic at present, due in part to construction for the 2016 Summer Olympics. “So our set will feature a great view from ESPN’s Copacabana Beach-based production headquarters, the Clube dos Marimbás, in Rio de Janeiro, which is connected to the IBC [International Broadcast Centre, which will host media companies covering the event], in Barra da Tijuca, via fiber.”
It’s that distance that has helped make fiber crucial to the broadcast. “We’re sending all content back to Bristol [Conn.] for distribution to various areas including our Avid facilities there, and to other editing sites in Connecticut,” via the Net Insight Nimbra platform, said Phipps. “That’s a change as well, because in South Africa, we did everything in the IBC. But this new way is saving money.
“Between file transfer, multiple feeds, telecom and Internet broadband, fiber connectivity is what we’re relying on more than anything else,” he said. “That’s how we get our disparate setup to work in unison. While there may be some challenges with latency and communications, we have people with multiple SNGs in the field.”
Described by designer Christopher Lee as “the most perfect stadium in South America,” the Estádio das Dunas in Natal, in the Brazilian state of Rio Grande do Norte, was built to host a number of World Cup matches.

It’s all part of ESPN’s four-pronged approach: the Clube will be the site of the host set for domestic and international operations, which will air on ESPN Deportes; feeds and satellite transmissions will be handled out of the IBC; Bristol will be the landing point for the content; and ESPN Brazil (which is also set up in ESPN’s IBC space) will receive host feeds via fiber from the IBC and transmit them to its headquarters in São Paulo.
“It’s satellite one way, fiber the other,” Phipps said, “and the producers have to make it all work.”
HBS, the multimedia host broadcaster for the Cup, is providing 34-camera coverage for each match, with cameras including the GVG LDX 80, Panasonic P2 AJ-HPX2100G and Sony HDC-1500; ESPN, like most FIFA broadcasters, is working hand-in-hand with HBS to test newer systems such as Ravenna, an audio-over-IP protocol that gives the user multiple-commentary audio.
Also heavily involved in the broadcast is Belgium-based EVS, which has its technology at all 12 venues “with containers that were pre-equipped in Germany,” said Nicolas Bourdon, senior vice president, marketing for EVS. “Each container will contain 16 EVS XT3 servers, [eight channels per server], controlled by the LSM remote, and new MultiReview systems [used for clip compilation and emotion channels].”
Any content recorded into the venue server will be accessible by the off-site production teams (in the IBC) for review, selection and import of HD files for production, according to Bourdon. Immediately after the multiple live feeds (nine per venue) are ingested into the servers, producers will be able to browse clips in low resolution, then import selected clips into the broadcast in high resolution, Bourdon said.
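The proxy-then-conform workflow Bourdon describes can be sketched in a few lines. This is an illustrative model only, assuming a simple clip record and a feed-based selection step; none of the class or function names here belong to any actual EVS API.

```python
# A minimal sketch of the browse-low-res / import-high-res workflow
# described above. All names are hypothetical, not EVS software.

from dataclasses import dataclass


@dataclass
class ProxyClip:
    venue: int       # venue number (1-12)
    feed: str        # e.g. "tactical", "team A", "ISO"
    tc_in: str       # timecode in
    tc_out: str      # timecode out


def browse_and_select(proxies, wanted_feed):
    """Producers browse the low-res proxies and pick clips from one feed."""
    return [c for c in proxies if c.feed == wanted_feed]


def conform_to_hd(selected):
    """Request the matching high-res media for each selected proxy clip."""
    return [f"HD:{c.venue}:{c.feed}:{c.tc_in}-{c.tc_out}" for c in selected]


proxies = [
    ProxyClip(1, "tactical", "00:12:04:00", "00:12:19:00"),
    ProxyClip(1, "team A", "00:31:40:00", "00:31:55:00"),
]
picks = browse_and_select(proxies, "tactical")
hd_files = conform_to_hd(picks)
```

The point of the two-stage design is bandwidth: only the clips a producer actually selects travel as full-resolution HD files, while everything else stays as lightweight proxies.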
The IBC will be equipped with a central media storage system known as FIFAMAX, which integrates with the servers managing the live incoming feeds (nine per match: the extended stadium feed with and without graphics, tactical, team A and B, player A and B, clip compilations 1 and 2, and ISO feeds). With a total of 5,500 hours of HD and the same number of low-res video files, off-site media rights licensees can browse, select and import media locally using the EVS IPWeb interface.
The feeds will all be sent, via satellite, to FIFAMAX and logged with 12 EVS IPDirector systems during live ingest. “So, all of the feeds will be recorded on the EVS media server,” said Bourdon.
UP WITH STREAMING
Based on EVS’ C-Cast remote connectivity technology, the multimedia distribution platform will, for the first time, connect HBS’ live production infrastructures to a central cloud-based platform, aggregating live streams, multi-angle clips, stats and social network feeds. This content will then be distributed to affiliate broadcasters, including ESPN, who can then make it available to their own viewers and bring streaming to the fore.
While traditionally only about 10 percent of the content collected on EVS servers hits the airwaves, more of it will be included in ESPN’s transmissions via streaming. That’s where Akamai Technologies comes in.
“We use a distributed network architecture and work with ISPs and network providers to ensure that our technology is embedded as deeply as possible into the networks, close to the viewers,” said Kurt Michel, director of product marketing for media at Akamai, which will provide HD video streaming, site performance and security services to more than 30 worldwide World Cup rights holders.
“Deploying many servers over a large number of networks provides higher throughput and reliability,” said Michel. “Our architectural approach builds in redundancies to eliminate single points of failure, so that when something inevitably fails, the viewers are unlikely to notice.”
Those capabilities are part of what Michel feels will ensure a smooth broadcast.
“Between the recent summer and winter Olympics, a span of just 18 months, we saw a total traffic increase of 72 percent,” he said, noting that traffic peaked at 873 Gbps during the 2012 Summer Games. “During the Winter Games, the peak shot to 3.5 terabits per second in real time—four times higher than the prior Games’ peak,” primarily fueled by the men’s U.S. vs. Canada hockey semifinal.
So demand and quality are growing, and fast. Michel said the average bit rate that Akamai delivered during the 2014 Winter Olympics was 1.8 Mbps, up 50 percent from 1.2 Mbps during the 2012 Summer Olympics.
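Michel’s figures check out arithmetically. A quick calculation, using only the numbers quoted in this article:

```python
# Sanity check on the Akamai figures quoted above.

summer_2012_peak_gbps = 873     # 2012 Summer Games peak
winter_2014_peak_gbps = 3500    # 2014 Winter Games peak (3.5 Tbps)

# Peak-to-peak ratio: "four times higher than the prior Games' peak"
peak_ratio = winter_2014_peak_gbps / summer_2012_peak_gbps
print(round(peak_ratio, 1))     # 4.0

# Average delivered bit rate: "up 50 percent from 1.2 Mbps"
avg_2012_mbps = 1.2
avg_2014_mbps = 1.8
growth = (avg_2014_mbps - avg_2012_mbps) / avg_2012_mbps
print(f"{growth:.0%}")          # 50%
```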
“We can see how mobile viewing is driving traffic,” Michel continued. “During the 2010 Winter Olympics, Apple’s HLS format, primarily used for mobile devices, was 0.3 percent of viewership; in 2012, it was 7.4 percent and in 2014 it was almost 20 percent. Now, we have more devices and more social channels, and those devices are being used to consume and share content. This results in massive, immediate traffic spikes.” Delivery networks must be able to scale to huge capacities, almost instantaneously. “This is critical,” he said, “since our expectation is that the World Cup will be a huge online event.”
Last-mile bandwidth has gotten three to four times as fast as it was four years ago, for the last World Cup, according to Michel, “so that means the viewers are going to get higher bit rates, thus better experiences overall.”
What’s interesting about sports events, said Michel, is that you’re never sure where the spike in viewership might occur. “It could be a late goal or it could be some kind of news event that can take on global significance.”
He thinks the World Cup will continue the trend of setting new records for how people consume live global events over the Internet, and that these large events give the industry a keener look at how online viewing is changing.
“These watermark events set some nice comparison points for online consumption trends,” Michel said.
Mark R. Smith has covered the media industry for a variety of industry publications, with his articles for TV Technology often focusing on sports. He’s written numerous stories about all of the major U.S. sports leagues.
Based in the Baltimore-Washington area, Smith has also served as the long-time editor-in-chief of The Business Monthly in Columbia, Md. His byline first appeared in TV Technology and in another Futurenet publication, Mix, in the late ’90s, and his work has also appeared in numerous other publications.