SMPTE ST 2110: A Success in Review


LONDON—One of the two Technical Emmys that SMPTE will pick up at the NAB Show later this year (in person or otherwise) was awarded for its work standardizing 2110.

Building on earlier work at the Video Services Forum (VSF), the first four parts of ST 2110 were published by SMPTE in late 2017, to provide for a standardized interconnect of media across an IP-based network. This has the additional benefit of providing a common backbone for the facility, instead of broadcasters having to worry about running the correct type of cable and signal to various locations. Engineering personnel can easily configure and control network switches and routers remotely, allowing them to work from wherever they are.


One purpose of 2110 was to get the industry onto a single standard, according to John Mailhot, systems architect for IP convergence at Imagine Communications. He edited ST 2110 documents through publication. “We watched what happened in the audio industry where vendors worked in different directions creating a dozen audio over Ethernet systems all incompatible with each other.”

The premise of 2110 was to avoid similar fragmentation and to consign the proprietary nature of black box development—which had begun to hinder systems installation and business growth under SDI—to history.

Rival approaches to video over IP, notably Aspen (championed by Evertz) and Sony’s Networked Media Interface (“and the prospect of more,” says Mailhot), were eventually subsumed under the 2110 umbrella.

Also driving development was the desire to replace arcane point-to-point signal transport with an entirely new essence-based mechanism.

“Studios built with SDI were constrained by what cables were used,” says Bruce Devlin, vice president of standards for SMPTE and chief media scientist for Dalet Digital Media Systems. “Trying to change a facility’s purpose involved physical uplift and fixing of patch panels. In no way was this giving the versatility you needed to be a responsive studio business.

John Mailhot

John Mailhot, systems architect for IP convergence at Imagine Communications (Image credit: Imagine Communications)

“The dream of 2110,” he adds, “was to put down cabling with enough bandwidth and then figure out what you were going to put on it and in what direction the streams were going to flow.”

A key part of the work was to mirror the rock-solid timing of SDI by incorporating precision timing protocols derived from IEEE 1588.
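The timing exchange that IEEE 1588 defines can be sketched in a few lines. A minimal illustration in Python (the four-timestamp model and symmetric-delay assumption come from the 1588 specification; the function name is ours):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Estimate clock offset and one-way path delay from one
    IEEE 1588 message exchange:
      t1: master sends Sync        t2: slave receives Sync
      t3: slave sends Delay_Req    t4: master receives Delay_Req
    Assumes the network path delay is symmetric."""
    offset = ((t2 - t1) - (t4 - t3)) / 2  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay
    return offset, delay

# A slave clock running 50 microseconds fast across a
# 100-microsecond path (times in seconds):
offset, delay = ptp_offset_and_delay(0.0, 150e-6, 300e-6, 350e-6)
```

In a real ST 2110 plant this calculation runs continuously inside every device's PTP servo; the sketch only shows why two message round trips are enough to separate clock error from network delay.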

Devlin says, “We’ve gone from SDI—where timing, video, audio, some metadata and unidirectional routing were all locked together on one cable—to a general-purpose cable that has logically (rather than physically) separate flows for every flavor of video, audio and metadata. Each stream is separately routed and separately timed.

“The beauty of ST 2110 is that it allows this complexity,” Devlin adds. “It does everything from SD all the way up to 8K RGB 12-bit uncompressed, demanding a 100Gb link streaming video in one direction. In either case you are using the same protocols, the same switching, the same standard IT architectures.”
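The bandwidth arithmetic behind those figures is straightforward. A rough sketch (active-picture payload only, ignoring RTP/IP packet overhead; the formats and sampling structures are illustrative choices):

```python
def uncompressed_gbps(width, height, fps, bits, samples_per_pixel):
    """Active-picture payload rate in Gb/s. ST 2110-20 carries only
    active video, so blanking is excluded; packet overhead is ignored."""
    return width * height * fps * bits * samples_per_pixel / 1e9

# HD 1080p59.94, 10-bit 4:2:2 (2 samples per pixel on average):
hd = uncompressed_gbps(1920, 1080, 60000 / 1001, 10, 2)   # ~2.5 Gb/s
# 8K 4320p60, 12-bit RGB (3 samples per pixel):
eight_k = uncompressed_gbps(7680, 4320, 60, 12, 3)        # ~71.7 Gb/s
```

The 8K figure is why Devlin's top end demands a 100Gb link: a single uncompressed 12-bit RGB 8K stream consumes roughly three quarters of it.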

From that standpoint 2110 is about as successful as anything the TV industry has ever done, Mailhot says. “At the beginning [2110 compatible product] was built speculatively. Today, it is available because the market demands it.”

For just about any piece of kit you could need in a TV facility—multiviewers, cameras, replay systems—buyers have the option of one whose primary interface is ST 2110.

“Where 2110 excels is building large-scale facilities,” Mailhot adds. “Legacy projects were built around fixed-size matrices and, almost no matter how big they were, six months later you’d wish they were just a bit larger. In IP there are limits too, but you can build an IP system with 2110 and evolve it over time to really extraordinary sizes.”

The Alliance for IP Media Solutions (AIMS), an industry group focused on education and adoption of IP standards for media, also characterizes 2110 as a success and “a world removed from the market confusion of 2015,” according to AIMS Chairman Mike Cronk, who is also vice president for advanced technology at Grass Valley. “The number of systems deployed and the scope and breadth of companies deploying 2110 has increased.”

It’s not all been plain sailing.

“There was definitely a lack of education at the outset,” admits Devlin. “Today we’ve got quite a lot of education about the fundamentals of the technology but what is missing is the ‘junior Jedi’ level of apprentice training. With new technologies being introduced faster than educators can create courses, I don’t know where to send enthusiastic students to learn in-camera VFX or IP lighting control or ST 2110 operational needs. I can see the industry re-invigorating the apprentice type training in these cutting-edge technologies because it’s the only way to scale quickly.”

Feedback from early 2110 installs is that the standard could use some simplifying. Cronk says, “Large facilities might have the IP equivalent of a 3,000-input x 10,000-output router, and while the switches are standard Cisco, Arista or Mellanox, if you have to manually type IP addresses into an Excel sheet, it quickly becomes tedious. For every video source you might have 16 separate audio and eight data streams, so one active goal is to make a 2110 system more plug-and-play.”
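The scale of that addressing problem is easy to quantify. A sketch using the per-source counts Cronk cites (the 25-flow breakdown is from his example and varies by facility; AMWA's NMOS IS-04/IS-05 specifications are the usual automated alternative to spreadsheet management):

```python
def flows_per_source(video=1, audio=16, anc=8):
    """Separately routed ST 2110 flows for one video source,
    using the per-source counts from Cronk's example."""
    return video + audio + anc

def total_flows(sources):
    """Each flow typically needs its own multicast group address."""
    return sources * flows_per_source()

# A 3,000-source facility: 25 flows per source,
# 75,000 multicast addresses in all.
facility = total_flows(3000)
```

Seventy-five thousand hand-typed addresses is the tedium Cronk describes, and the reason discovery and registration were handed to software rather than spreadsheets.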


An under-reported success of 2110 is the collaboration that has had to take place among different organizations. Sharing the Emmy honor with SMPTE are VSF, European Broadcasting Union (EBU) and Advanced Media Workflow Association (AMWA). Together, they formed the Joint Taskforce on Network Media (JT-NM) to coordinate the effort. 

Bruce Devlin

Bruce Devlin, vice president of standards for SMPTE and chief media scientist for Dalet Digital Media Systems (Image credit: Dalet Digital Media Systems)

“These four organizations have made the 2110 ecosystem successful,” says Devlin. “The JT-NM meets every couple of weeks to try to keep the supertanker moving forwards. The rudder movements have to be aligned early enough for the ship to turn.”

The VSF is at the heart of the group exploring transport and connectivity issues. Its TR-03 and TR-04 recommendations were the building blocks for transporting essences (individual signals) rather than the composite media.

AMWA took the lead on the core control and management software, enabling straightforward interoperability between products from a wide range of manufacturers.

The EBU’s role is to keep its finger on the pulse of users. “It is collecting user requirements for the things we should look at next,” says Devlin. “They are trying to figure out, if we’re to use 2110 as an ecosystem rather than simply as a better piece of string to connect a camera to a screen, then what do we have to do to make it better?”


The underlying business model of broadcasters has been decimated by OTT. Financing for huge broadcast infrastructure projects, such as studio refurbs or new builds, has been curtailed. Add in the Covid-era shuttering of nearly all live studio-based production—for which 2110 was principally designed—and its replacement by remote workflows, and ST 2110 faces something of an identity crisis.

“It is not really that 2110 is the wrong standard, it’s that the means of content consumption has started to change rapidly,” Devlin says. “The global pandemic accelerated this when live sports and stage events, all the stuff that 2110 is dedicated to, almost vanished overnight.

“The result is far less investment into live infrastructure overall and what investment there is is switched to remote live and remote scripted infrastructure. Investment in a lot of production is not hitting the 2110 nail on the head at the moment.”

The JT-NM thinks it has a role to play in adapting 2110 for content shuffling between facilities (or between OB and studio) and more broadly in live remote scenarios.

“We’re having to find ways to use the 2110 ecosystem to connect nanosecond-accurate studio environments with remote operations over the internet or in the cloud, where hard and fast PTP accuracy may not exist,” Devlin explains.

An option for this is the Internet Protocol Media Experience (IPMX), a proposed set of standards and specs designed to address the ProAV industry’s need for protocols that ensure interoperability for AV over IP. IPMX is promoted by AIMS and AMWA and is based on 2110. A quarter of AIMS’ 100+ members have a foot in both broadcast and ProAV camps.

“It seems silly to have a hard wall between gold-plated 2110 excellence and the exuberance and creativity of the vast pool of those who can’t afford 2110 in the ProAV space,” Devlin says. “How do you bridge the two to make something that is better than both of those?”

He adds, “Covid has forced us to look at this area. It helps to get more creatives included into what might otherwise be an ivory tower of exclusivity.”

Mailhot points out that remote production actually involves equipment either virtualized in the cloud or physically racked in a room, with 2110 connecting it while the operating team controls the kit remotely: “2110 still does its job of being the data plane even as the mechanisms the operations team uses to make production decisions are remoted.”

He says there is increasing interest in compressed schemes in the context of 2110 with JPEG XS the frontrunner. Detailed specs for how JPEG XS can be mapped into a 2110-22 ecosystem as a mezzanine compression are all but complete at the VSF. Vendors like Imagine plan to add JPEG XS capability to product this year.

“2110 in its uncompressed native form is built around the notion that bandwidth is free and cheap on a campus but we are working to use JPEG XS to get to and from cloud,” Mailhot says.
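The trade-off Mailhot describes can be put in numbers. A rough sketch (the 10:1 ratio is a typical JPEG XS mezzanine operating point assumed here for illustration, not a value taken from the standard):

```python
def jpeg_xs_gbps(uncompressed_gbps, ratio=10.0):
    """Mezzanine bitrate after JPEG XS at a given compression ratio.
    10:1 is assumed as a typical operating point; XS is commonly
    run anywhere between roughly 4:1 and 20:1."""
    return uncompressed_gbps / ratio

# ~10 Gb/s of uncompressed UHD shrinks to ~1 Gb/s,
# a rate practical for WAN and cloud ingress links:
wan_rate = jpeg_xs_gbps(10.0)
```

That order-of-magnitude reduction, combined with JPEG XS's very low latency, is what makes it the frontrunner for carrying 2110-22 flows off campus.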


The pandemic may have stalled heavyweight infrastructure projects but demand for ST 2110 product solutions is expected to return.

“The reason is that people get hooked on 4K,” surmises Devlin. “We can shoot, store and process in 4K and content shot in it has a long tail. As soon as you try to shoot in 4K, particularly a live event, a lot of other tools start to struggle.”

He also points to the rise of game engines and virtual sets within mainstream production as the new benchmark for quality.

“To generate VFX in-camera on multi-camera shoots like ‘The Mandalorian,’ you will need high bandwidth to work with uncompressed 4K and 8K,” says Devlin. “With shows of this calibre every bit counts. For this reason, even scripted content producers will realize that the network infrastructure they need will be 2110.”

Adrian Pennington

Adrian Pennington is a journalist specialising in film and TV production. His work has appeared in The Guardian, RTS Television, Variety, British Cinematographer, Premiere and The Hollywood Reporter. Adrian has edited several publications, co-written a book on stereoscopic 3D and writes marketing copy for the industry. Follow him @pennington1