Test & Measurement’s Evolution in a Software-Based World

Telestream test & measurement (Image credit: Telestream)

In today’s data-driven, IP broadcast world, the days of technicians in white coats bench-testing cameras and microphones with specialized test and measurement (T&M) devices such as oscilloscopes, signal generators and spectrum analyzers would appear to be very much in the past. This is increasingly the case due to the widespread adoption of software for both setting up and troubleshooting equipment and, perhaps more significantly, monitoring channel output for quality control (QC) and regulatory compliance.

The two areas remain very distinct but not entirely separate, with T&M playing a role in compliance through specific test equipment and techniques. “As broadcasters migrate to IP-based environments, HDR workflows, UHD/8K and cloud operations, packet loss, jitter and PTP [precision time protocol] timing become both a T&M and a QC concern,” says Kevin Salvidge, sales engineering and technical marketing manager at Leader Electronics of Europe.

“Software-based solutions are establishing themselves in cloud-native QC platforms and are widely used in file-based and OTT workflows,” he says. “The adoption of ST 2110 has seen the introduction of software-based waveform and vectorscope monitors, but there are still a number of features only hardware solutions can provide, including latency and PTP timing. Because of this, most broadcasters today are looking for a hybrid approach.”

Achieving the Same Goal Regardless
Even though software is offering new ways of doing established jobs, it is not changing how broadcasters approach T&M and QC, as Matthew Driscoll, vice president of product management at Telestream, observes: “The team looking at camera shading and SRT [Secure Reliable Transport]/Zixi delivery to a cloud workflow are very different than the folks doing the QC on what is streamed to a subscriber. The fact that things are migrating to software, the cloud or hybrid workflows doesn’t matter in the sense that you’re doing the same jobs, just in a different location.”

Mark Simpson (Image credit: Triveni Digital)

Driscoll’s colleague Ravi McArthur, product manager for Telestream’s Qualify automated QC-in-the-cloud system, says the aim is to allow users to work where they want to be, rather than dictating how to use the technology. This has resulted in Qualify now being made available for on-prem operation.

“In the cloud, you effectively get infinite horizontal scaling, which is a big bonus for people with vast and wide media supply chains,” he says. “But sometimes, depending on the kind of resolutions and bit rates people are dealing with, they want their software on-prem near high-speed storage so they can process files as quickly as possible.”

In the view of Ashish Basu, executive vice president of worldwide sales and business development at Interra Systems, the situation has “changed for good,” with software and cloud now “almost everywhere” in T&M and QC.

“Broadcasters are saying they’re increasing the level of automation in checking audio-video quality as much as they can,” he says. “But that is not necessarily aligned with the compliance side. Some of the most advanced broadcasting organizations may have no interest in looking at captioning QC because it’s not mandated in their specific geographic region. We see that variation, but otherwise broadcasters are still interested in delivering pristine content.”

Increasing Complexity
What is clear is that the way broadcasters are approaching T&M and QC is driven by new distribution and platform technologies, Triveni Digital President and CEO Mark Simpson says.

“A key trend is the increasing diversity and complexity of the service being delivered,” he says. “ATSC 3.0 is a more complex standard than what came before, not only in the technical underpinnings but also because more services are being deployed. It is something we feel should have a more integrated, comprehensive approach rather than a lot of different subsystems.”

On the compliance side, Mediaproxy has been completely software-based since it was founded in 2001. “There are virtually no hardware solutions used for QC and compliance anymore,” CEO Erik Otto says. “There’s probably still a lot of hardware involved in T&M, but I’m sure they will have to change as well.”

Even so, Otto adds, there may still be a requirement for dedicated physical devices when it comes to detecting faults such as jitter. “When you deal with packets of data, you need to detect jitter properly for correct synchronization of audio and video streams,” he explains. “To do that you need a clock, so technically, a piece of hardware to give the necessary accuracy.”
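Otto’s point can be made concrete with a small sketch (illustrative only, not any vendor’s implementation) of the RFC 3550 interarrival-jitter estimator, the standard calculation behind the kind of jitter measurement he describes. The result is only as good as the clock supplying the arrival timestamps, which is exactly why dedicated hardware can still matter. The timestamp values in the usage comment are hypothetical.

```python
def update_jitter(jitter, transit_prev, transit_curr):
    """Smoothed interarrival jitter update per RFC 3550, section 6.4.1.

    transit = arrival_time - media_timestamp, both in the same clock units;
    the estimate moves 1/16 of the way toward each new sample, which damps
    out noise while still tracking sustained network variation.
    """
    d = abs(transit_curr - transit_prev)
    return jitter + (d - jitter) / 16.0


def measure_stream(arrivals, timestamps):
    """Run the jitter estimator over paired arrival/media timestamps."""
    jitter = 0.0
    transit_prev = arrivals[0] - timestamps[0]
    for arr, ts in zip(arrivals[1:], timestamps[1:]):
        transit = arr - ts
        jitter = update_jitter(jitter, transit_prev, transit)
        transit_prev = transit
    return jitter


# Perfectly paced packets (constant transit time) yield zero jitter;
# any variation in arrival times raises the estimate:
#   measure_stream([0, 10, 20, 30], [0, 10, 20, 30])  -> 0.0
#   measure_stream([0, 12, 20, 31], [0, 10, 20, 30])  -> a positive value
```

Note that the estimator itself is trivial software; the hard part Otto identifies is obtaining arrival timestamps accurate enough, relative to the media clock, for the numbers to mean anything.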

The Impact of AI
While a physical timepiece might appear to be irreplaceable, even this area could eventually follow the software trend due to that inevitable game-changer, artificial intelligence (AI). “With AI being everywhere, we are putting it throughout many of our products where the idea of a clock seems like a skeuomorphism,” Telestream’s McArthur says. “We’re very focused on practical AI and are responding to customer requests for features including clock and slate reading, contextual analysis and object detection.”

Ashish Basu (Image credit: Interra Systems)

Interra has been working with AI and machine learning for three to four years, Basu says, and has incorporated both into many of its products. “We use them in areas like video signal quality or degradation measurements,” he says. “However, we do not say we are an AI company or that we produce AI products. But there is space where AI can make a significant difference for our customers.”

Triveni has had early versions of AI in its products from the beginning because they were rules-based, Simpson says: “AI is a kind of rules-driven technology and it’s getting more sophisticated. Over time, we’ll see an increasing use of AI techniques—for example, our monitoring system with quality scoring for the feeds based on a number of observations.”

Otto has a different take on how AI could possibly change T&M, or, ultimately, the need for it: “Problems with satellite links, 4/5G, internet connections with terrestrial transmitters and networks in general can’t be controlled or predicted,” he says. “Technically, everything else—software and hardware—is in everyone’s control. Because AI is probably better at producing better code and outcomes, it could, in the long term, shrink the need for T&M.”