Today’s broadcasters face continued consolidation, centralization of staff, and ever-increasing pressure to improve efficiency—challenges that can take their toll on quality of experience for customers. Fortunately there are affordable, reliable, flexible solutions for remote monitoring that can help overcome these challenges. These solutions perform all-important proactive quality checks at audio/video service hand-offs to ensure maximum QoE.
The latest generation of compact, low-cost remote monitoring solutions offers an expanded range of functions, including QoE-based content monitoring with recording, remote viewing, and troubleshooting of A/V feeds across linear, on-demand and interactive services. This functionality is especially important now that over-the-top (OTT) and Internet streaming services have joined broadcasters' already long list of services requiring monitoring.
Just as content distribution has evolved in recent history, from over-the-air to digital and now to OTT and Internet streaming services, so too have the need for and complexity of content monitoring grown. It used to be that aired content was recorded and watched afterward to look for faults, an inefficient, tedious, time-consuming task in which problems often took hours or even days to detect. It was never a practical approach, to be sure, but at least it was possible. Today, given the scope and complexity of services requiring monitoring, that manual method would be impossible even for the most well-staffed, well-funded operations.
The introduction of OTT and streaming services boosts the number of portals through which content can be consumed. Add to this the wide range of viewing devices—PCs, tablets and smartphones—and associated “flavors” of content they require, and the challenge of assuring the best possible experience (within the constraints of all components) can be enormous.
With media being delivered directly to viewers, who might be scattered across the country or around the world, there is no longer a “middleman” to share responsibility for QoE. Thus every operation, regardless of size, must be able to monitor the availability and quality of services across platforms, CDNs and video service providers—all from a single location—in order to offer a high standard of quality. Such solutions are especially important for operations that lack dedicated monitoring staff and budgets.
Fortunately the industry has already addressed remote monitoring of linear, on-demand and interactive services, eliminating the need for expensive, time-consuming manual and visual channel inspections. With these tools, broadcasters can proactively identify and respond to faults rather than waiting for customer complaints.
Advanced monitoring solutions today can scan hundreds of channels around the clock, automatically test signal integrity, issue alerts (via e-mail and SNMP), and capture the problematic content when channels do not conform to preset limits. Positioned "behind" the set-top box (STB), such solutions give operators a single system and location from which to access and monitor the video output continuously. Remote monitoring capabilities enable engineers to review video and audio for issues such as static or a black screen, as well as errors in closed captions and audio levels.
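The preset-limit checks described above can be sketched as simple threshold tests on sampled video and audio measurements. The thresholds and field shapes below are illustrative assumptions, not values from any particular product; a real system would feed these results into its e-mail or SNMP alerting path.

```python
# Minimal sketch of threshold-based channel checks. Assumes a sampled frame
# arrives as a list of 8-bit luma values and audio as a dBFS level; both
# thresholds are illustrative, not standard-mandated values.

BLACK_LUMA_MAX = 16        # average luma at or below this suggests a black screen
SILENCE_DBFS_MIN = -60.0   # sustained audio below this suggests silence

def check_channel(name, luma_samples, audio_dbfs):
    """Return a list of alert strings for one sampled moment on a channel."""
    alerts = []
    avg_luma = sum(luma_samples) / len(luma_samples)
    if avg_luma <= BLACK_LUMA_MAX:
        alerts.append(f"{name}: black screen suspected (avg luma {avg_luma:.1f})")
    if audio_dbfs < SILENCE_DBFS_MIN:
        alerts.append(f"{name}: audio silence suspected ({audio_dbfs:.1f} dBFS)")
    return alerts
```

In practice the alert strings would be dispatched as SNMP traps or e-mail, and the offending content captured alongside them for later review.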
In terms of on-demand content, today's sophisticated monitoring systems can ensure content availability and system capacity, record DPI ad insertions to prove ad conformance, and monitor interactive STB guides to verify the customer experience. With complete STB command access, broadcasters can perform troubleshooting more effectively and use a historical review of content and services to address intermittent yet chronic issues. Refined for intuitive use, such systems often combine familiar, VCR-like controls with color cues that clearly indicate the channels being displayed, whether live or recorded. Layouts and source selection are managed with simple mouse clicks, and a built-in clock facilitates navigation to the desired time stamp.
Remote monitoring technology is already deployed in applications ranging from competitive news analysis to monitoring of “out-of-footprint” broadcast and distribution channels. Now broadcasters are applying the technology to OTT services.
Long gone are the days of monitoring the quality of a single, linear broadcast. Now broadcasters must not only assure video quality for multiple content streams in multiple formats to multiple devices, but, through OTT services, they must also deliver a personalized user experience alongside that video content. It’s a scenario that effectively multiplies their outputs. On top of that, they’re working with a variety of distribution platforms and might need to deliver content via a number of CDNs. The situation is further complicated by the fact that the groups of files delivered to each CDN will need to accommodate a wide range of devices, each with its own profile. It’s easy to see why it’s a significant QoE challenge. There is no plausible way to monitor all of these outputs and versions all the time, but today’s monitoring solutions make it possible for broadcasters to institute OTT service monitoring strategies that work.
The most viable monitoring strategy makes assessments at key points in the delivery workflow: ingest, encoding, packaging, delivery and distribution to the viewer (albeit in a controlled environment free of the vagaries of ISP service). It’s a mostly passive monitoring process that can give broadcasters reasonable confidence that the content they are delivering is packaged correctly in the formats compatible with target devices.
(Pictured: Volicon's Observer Media Intelligence Platform)
Monitoring ingested files is critical because they are frequently used as a high-bit-rate coded reference for all future transcodes of that content. Monitoring at ingest is straightforward, as it typically requires continuous monitoring of just one feed.
Signal monitoring becomes more demanding in the encoding stage because of the number of files it produces. Working with as many as a dozen files per asset, broadcasters must shift to passive monitoring methods and examine data about each file rather than the video itself, a far more cost-effective approach when dealing with large numbers of files. By checking the bit rate, syntax, reference timestamps and alignment of each file, the broadcaster can make a sound determination of the files' integrity.
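Those metadata-level checks can be sketched as a small validation pass over already-parsed stream properties. The field names (`target_bitrate`, `first_pts`, etc.) and the 10 percent bitrate tolerance are assumptions for illustration; real values would come from a stream analyzer.

```python
# Hypothetical sketch of passive, metadata-level checks across encoded
# renditions of one asset. Each rendition is a dict of values assumed to be
# parsed from stream headers; field names and tolerance are illustrative.

def validate_renditions(renditions, bitrate_tolerance=0.10):
    """Return a list of integrity errors found across a set of renditions."""
    errors = []
    for r in renditions:
        target, measured = r["target_bitrate"], r["measured_bitrate"]
        if abs(measured - target) / target > bitrate_tolerance:
            errors.append(f"{r['name']}: bitrate {measured} deviates from target {target}")
    # Segment timestamps must align across renditions so players can
    # switch bit rates cleanly mid-stream.
    first_ts = renditions[0]["first_pts"]
    for r in renditions[1:]:
        if r["first_pts"] != first_ts:
            errors.append(f"{r['name']}: first PTS {r['first_pts']} misaligned with {first_ts}")
    return errors
```

Because it inspects metadata rather than decoded video, a check like this scales cheaply to every rendition of every asset.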
Once files have been packaged in the appropriate formats for the target CDNs and platforms, they are delivered along with a manifest describing those files. The simplest way for the broadcaster to confirm that the packaging is being performed properly and that delivery is correct and on schedule is to “accept” files in the same way that a CDN would. After that point, the broadcaster no longer has control over the content and must hope that the CDNs—and, subsequently, ISPs—will carry it to the viewer without introducing faults or compromising quality.
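Accepting files "as a CDN would" amounts to cross-checking the delivered files against the manifest that describes them. A minimal sketch, assuming an HLS-style media playlist where non-comment lines name segment files:

```python
# Sketch of a CDN-style acceptance check: confirm that every segment named
# in an HLS-style playlist was actually delivered. Assumes non-comment lines
# in the manifest are segment URIs, which holds for simple media playlists.

def missing_segments(manifest_text, delivered_files):
    """Return segment URIs listed in the manifest but absent from delivery."""
    delivered = set(delivered_files)
    segments = [line.strip() for line in manifest_text.splitlines()
                if line.strip() and not line.startswith("#")]
    return [s for s in segments if s not in delivered]
```

Run periodically against each package destined for each CDN, a check like this confirms that packaging and delivery are complete and on schedule before control passes out of the broadcaster's hands.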
Streaming and OTT Services Require Active Monitoring
When it comes to monitoring the quality of streaming content, passive monitoring won’t work. Instead, broadcasters must apply sampled active monitoring on the tail end of the chain. This approach acknowledges not only the complexity of multiplatform media delivery, but also the exponential leap in the volume of media being delivered.
Actively capturing a sampling of outputs can give a broadcaster an accurate idea of the quality most of its OTT viewing audience is experiencing. Thus, for a relatively modest investment in time and money, particularly when compared with the cost of monitoring all outputs all the time, broadcasters can assure that most of their customers are enjoying quality service most of the time.
Besides active monitoring at the end of the delivery chain, broadcasters are also taking advantage of "round-robin" emulation with different devices, rates/resolutions and CDNs. In round-robin monitoring, the broadcaster alternately checks the lower, middle and upper bit rates; examines Apple HLS, Microsoft Smooth Streaming and Adobe HDS formats; and monitors content quality. By taking these measurements in a controlled environment, broadcasters can easily separate the issues they can control from the ones that occur during ISP delivery.
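The rotation above can be sketched as a simple cycle over every bit-rate tier and streaming format combination. The tiers and format list mirror the examples in the text; the function itself is an illustrative scheduler, not any vendor's implementation.

```python
import itertools

# Sketch of a round-robin sampling schedule. The tiers and formats are the
# examples named in the text; a real deployment would also rotate device
# profiles and CDNs.
BITRATE_TIERS = ["low", "middle", "upper"]
FORMATS = ["HLS", "Smooth Streaming", "HDS"]

def round_robin_schedule(n_checks):
    """Yield the next n_checks (tier, format) pairs, cycling through all combinations."""
    combos = itertools.cycle(itertools.product(BITRATE_TIERS, FORMATS))
    return [next(combos) for _ in range(n_checks)]
```

Each scheduled pair would then drive one emulated playback session in the controlled test environment, so every variant gets sampled in turn without monitoring everything at once.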
With a combination of active sampling and round-robin emulation, a broadcaster can effectively become a consumer of its own OTT services. When these monitoring tasks are automated and alerts have been configured to warn engineers of any issues, the broadcaster can maintain a proactive approach to monitoring across its full complement of services.
For this model of active sampling to work, the broadcaster’s video monitoring and logging system must touch on a multitude of MPEG transport stream points and monitor adaptive bit rate streaming of both encrypted and unencrypted media. In order to monitor OTT media delivered through apps, the system can employ an external device to emulate and record app behavior. This method accommodates both decryption and authentication while illustrating the user experience. With this functionality, the broadcaster can effectively monitor streaming media (encrypted or unencrypted) in any format.
Federal Communications Commission regulations demand that both broadcast and OTT content include descriptive video and captioning, so monitoring OTT content for compliance purposes is just as important as maintaining QoE. Fortunately, the very monitoring tools and techniques that support QoE monitoring of OTT services also enable broadcasters to make sure that their services comply with regulations from the FCC and others.
Advertising, encoding, delivery mechanisms, target devices and other variables combine to make monitoring across OTT services a challenging but necessary task. The simplest and most cost-effective way to address new OTT services is to extend installed monitoring and logging technology. In this way, broadcasters can take advantage of proven technology and workflows to assure that they are delivering the high-quality personalized content today's media consumers desire.
Gary Learner is chief technology officer at Volicon. This paper was presented at the 2014 NAB Show Broadcast Engineering Conference in Las Vegas. You can find additional papers from the 2014 NAB Show Broadcast Engineering Conference by purchasing a copy of the 2014 BEC Proceedings at www.nabshow.com.