‘Consistent Experience’ Is Key To Combining OTT, Linear TV

WASHINGTON—Collectively, the growing number of streaming and OTT versions of linear programs has generated a tidal wave of usage, posing challenges for producers and distributors who are hustling to create a “consistent experience” for audiences who watch programs on a variety of devices and via multiple delivery systems.

According to a Parks Associates analysis last month, 52 percent of U.S. broadband households subscribe to both pay TV and one or more OTT video services. The study also found that about two-thirds of U.S. broadband households subscribe to an OTT service, and 38 percent have more than one OTT subscription.

A separate study by the Interactive Advertising Bureau found that in North America, 65 percent of viewers watch streaming video once or several times per day—slightly less than in Europe and Asia/Pacific, and far less than in South America and the Middle East. Yet another report, from PricewaterhouseCoopers, predicts that OTT video revenue will reach nearly $31 billion by 2022, up from about $22 billion this year.


Despite such rosy data, however, the video streaming outlook faces multiple challenges. For example, Dan Rayburn, a principal analyst at Frost & Sullivan, frets that the current streaming infrastructure cannot keep pace with the growth in streaming demand. In a recent blog post, Rayburn contended that “if online video consumption continues to take market share from traditional distribution channels and ultimately replaces traditional broadcast distribution, then streaming video infrastructure will need to increase its capacity by over 3x in the coming years.”

The issue of latency in high-speed streaming of sports has become more prominent due to the recent U.S. Supreme Court decision allowing sports betting.

Moreover, Rayburn predicts that if 4K UHD replaces the current high-definition signals for streamed content, then “the required bandwidth will increase by a factor of another 4x.”

Adding 8K, used for “true virtual reality,” would trigger the need for “another 3x increase in capacity,” Rayburn said in a blog post last month. He concluded that such growth scenarios would require a 20-fold increase in streaming video infrastructure during the coming decade.

“Building 20x the data centers and filling them with 20x the servers, in addition to constructing the power plants to power this infrastructure, is clearly not an economically viable or sustainable strategy,” Rayburn added. But his modest conclusion was only that “we need to start talking now about how to meet the ‘capacity gap’ challenge.”
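
Rayburn’s multipliers compound quickly, as a back-of-the-envelope Python sketch shows. The assumption that the factors apply to the same traffic base is ours for illustration, not Rayburn’s own model:

```python
# Back-of-the-envelope reading of Rayburn's capacity multipliers.
# Compounding them on a single traffic base is an illustrative
# assumption, not Rayburn's published calculation.
share_shift = 3   # streaming replacing traditional distribution: >3x capacity
uhd_4k = 4        # 4K UHD replacing HD streams: ~4x bandwidth each

combined = share_shift * uhd_4k
print(f"{combined}x")  # 12x before any 8K/VR traffic is layered on
```

Layering the further ~3x he attaches to 8K and virtual reality onto even part of that traffic pushes the total toward his 20x decade-long estimate.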


That reality—the necessary coexistence of linear and streaming content—has triggered aggressive efforts to equip the media supply chain with new tools to handle workflows for the emerging market.

“Workflows, as repeatable processes, are evolving constantly in reaction to new content manufacturing and distribution scenarios,” said Richard N. Yelen, head of North American Business Development for Accenture Digital Video Products and Services. He cited “the migration from traditional licensed content distribution to ‘app syndication,’ as content owners take greater ownership over the viewing experience.”

“We are increasingly looking at ways of using data [aggregated from consumption platforms, AI-generated from media analysis and other sources] in a highly secured way to drive automation into the supply chain,” Yelen added.

He acknowledged the “distinct differences” between broadcast and cable operator products and services “with obvious intersections” but emphasized that both of those traditional platforms are “quite different” for video on demand and live streaming.

“The difference is between file-based workflow vs. active real-time streaming,” Yelen said. “Live streaming requires more careful/intense real-time management of service continuity and user experience since popular large live events, such as sporting matches, can cause unpredictable viewership spikes.

“These spikes often have load implications across the viewing platform ecosystem, including both core platform services [e.g. user authentication], as well as network and caching [e.g. CDN performance],” Yelen added.

Jon Alexander, senior director of product management for Akamai, noted the special considerations for live vs. on-demand.

“For live and linear delivery, we emphasize live origin capabilities—ingesting video into our network from as close to the origination point [ideally production or playout] as quickly as possible,” Alexander said, noting that for on-demand content, Akamai seeks to move content “into our storage environment or from a customer’s on-premise or cloud storage platform.” He emphasized an architecture designed to enable “capabilities such as multisource multicast and prepositioning of content.”


The technologies Akamai leverages for OTT video delivery differ from those used for traditional broadcast and cable transmission, according to Alexander.

“The workflows themselves are different, as is encoding and security,” he said. “For OTT, we’re delivering content files in chunks, and we’re using the open, unmanaged internet as the transport layer. While there are changes on the horizon, the majority of OTT delivery today is unicast—one stream to one device—as opposed to one-to-many broadcast delivery.”
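
The scale implications of one-stream-per-device delivery are easy to see with illustrative numbers (the audience size and bitrate below are assumptions, not Akamai figures):

```python
# Illustrative numbers only: why unicast OTT delivery is expensive at
# scale compared with one-to-many broadcast.
viewers = 1_000_000
stream_mbps = 5                               # assumed per-viewer ABR bitrate

unicast_gbps = viewers * stream_mbps / 1_000  # total CDN egress required
broadcast_mbps = stream_mbps                  # one shared signal, any audience size

print(unicast_gbps)  # 5000.0 Gbps for a million concurrent unicast viewers
```

Broadcast capacity is flat no matter how many sets tune in; unicast capacity grows linearly with the audience, which is why live-event viewership spikes strain OTT platforms.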

But although the underlying technologies are very different, Akamai is working to enable the same experience, he said.

“The bar for consumer expectations has been set by broadcast, so we’re aiming to reach and ultimately exceed those expectations by tuning every layer of the delivery stack from the client and network transport protocols all the way through the application,” he added. “This helps enable efficient video delivery across the internet from the origin to the end-viewer device, whatever or wherever it may be.”

Aspera, an IBM operating unit, has developed a range of streaming video solutions based on its Fast Adaptive Secure Protocol (FASP), a bulk-data transport protocol that eliminates TCP’s loss and error handling and the erratic transfer-rate swings that result.

“FASPStream enables centralized, real-time production of high bit-rate programming, eliminating the need to colocate costly production staff at remote venues,” Todd Kelly, director of product marketing, explained. “It also supports in-line transcoding, packaging and delivery to accelerate live video delivery and facilitates open file workflows for real-time editing and production. By lowering the production costs for all kinds of events, especially ‘tier 2’ and ‘tier 3’ events, broadcasters can create and deliver new content [that targets] new audiences.”

Kelly also emphasized the value of collaboration, citing his company’s work with Telestream to build a joint solution that integrates FASPStream technology into Telestream’s Vantage and Lightspeed Live software.

“This API-level integration provides a truly innovative approach to delivering live, broadcast-quality video streams from the venue to the remote production facility, enabling workflows never before possible,” Kelly said.

SeaChange International is also focusing on the multiplatform ecosystem with its new “cFlow solutions” portfolio, which debuted in June. According to SeaChange Marketing Vice President Kurt Michel, the solution was developed to serve broadcast, cable and OTT customers who “want a single workflow management system that handles all of the associated workflows.” In addition, SeaChange’s PanoramiC platform, a pre-integrated end-to-end cloud-based video delivery platform, can be used by content producers, broadcasters, cable and mobile operators and aggregators (such as vMVPDs) for multiplatform delivery.

“We have customers who perform over 1,000 quality checks on incoming content, before they even start processing it,” Michel explained. “With adaptive bitrate streaming, their workflows can literally produce dozens of variations of a given title,” including content rights, geo restrictions and other factors.
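
Why a single title fans out into dozens of variations is simple combinatorics. The dimensions below are hypothetical, not SeaChange’s actual workflow parameters:

```python
# Hypothetical packaging dimensions to show how adaptive bitrate
# renditions multiply; not SeaChange's actual workflow configuration.
from itertools import product

bitrate_ladder = [400, 800, 1600, 3200, 6000]     # kbps ABR rungs
formats = ["HLS", "DASH"]                         # packaging formats
drm_systems = ["Widevine", "FairPlay", "PlayReady"]

variants = list(product(bitrate_ladder, formats, drm_systems))
print(len(variants))  # 30 deliverable variations of one title
```

Add rights windows and geo restrictions as further dimensions and the count multiplies again, which is what drives the heavy quality-check load on incoming content.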

He characterized SeaChange’s “medium-agnostic” approach as a “cohesive, viewer-centric system” with modular elements that are “medium-specific where technical requirements dictate.”

Michel cited ad insertion technologies that can be adapted to the mechanisms used on different networks.

“Our session management capabilities allow a viewer to stop watching a title on their traditional TV, and pick up where they left off on their smartphone,” he explained.



IBM Aspera’s Kelly underscored the importance of low latency for streamed content.

“Capturing and producing content for live events such as sports, news, training or education programs poses unique challenges, and true real-time, remote production has long been one of the ‘holy grails’ for broadcasters that produce live events,” he said.

The issue of latency in high-speed streaming of sports has become more prominent due to the recent U.S. Supreme Court decision allowing sports betting in the U.S., Kelly added.

Akamai’s Alexander agreed that latency is one of the most significant differences between traditional broadcast/cable delivery and OTT systems.

“We’ve done a lot of development to reduce end-to-end latency to 10 seconds on our generally available delivery and live origin products... equivalent to the latency you see on traditional television,” Alexander told TV Technology. “This is especially important for the live sports experience, where you want viewers—no matter how they’re watching—to see a big play at roughly the same time rather than hear cheers or groans or see something on social media before they actually see it live.”

“While 10-second latency is acceptable for most applications, we continue to push for lower latency,” Alexander added. “We have demonstrated two- to three-second end-to-end latency using chunked-transfer encoding, which is being successfully tested with several customers at the moment.”
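
A simplified model shows why chunked transfer cuts latency so sharply: the player no longer waits for whole segments before it can start. The durations and buffer depths below are illustrative assumptions, not Akamai’s measurements:

```python
# Simplified glass-to-glass latency model; segment/chunk durations and
# buffer depths are illustrative assumptions, not Akamai's figures.
segment_s = 6.0          # typical HLS/DASH segment length
buffered_segments = 2    # segments a player buffers before starting

# Whole-segment delivery: wait for a full segment, then buffer more.
segment_latency = segment_s * (1 + buffered_segments)   # 18.0 s

chunk_s = 0.5            # CMAF-style chunk within a segment
buffered_chunks = 3      # chunked delivery permits a much shallower buffer

# Chunked transfer: partial segments flow out as soon as they are encoded.
chunked_latency = chunk_s * (1 + buffered_chunks)       # 2.0 s

print(segment_latency, chunked_latency)
```

Shrinking the unit of delivery from a multi-second segment to a sub-second chunk is what moves OTT from the 18-second range toward the two- to three-second figures Alexander cited.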

SeaChange’s Michel also addressed the latency challenges for sports events and other live programming, which he characterized as part of a bigger reality: “workflow complexity is increasing.”

“Real-time content protection [adds]... an additional layer of complexity,” he said. “Viewer tolerance for problems with live sporting events is usually very low, since they are emotionally attached and the content is highly perishable. The live experience is everything.”

He emphasized that providers are not satisfied with an either/or choice between live and video-on-demand.

“They require both live and VOD,” he said.


As vendors look toward the next steps in managing conventional and future video delivery, one theme comes through: the need for a consistent experience.

“A consistent experience across viewing devices and platforms” is the objective of SeaChange’s portfolio of content management, advertising management and back-office systems, explained Michel, vice president of marketing for the Acton, Mass.-based company.

“Consistency of the viewing experience” is Akamai’s goal, said Alexander.

In more technical terms, IBM Aspera’s Kelly described the goal as a “uniform, unaltered, in-order byte stream that equally supports constant bit rate and adaptive bit rate formats.”

“Reliable functioning of many large platforms,” is how Accenture’s Yelen summed it up.

“OTT will continue to grow and to dislodge traditional viewing experiences, with mobile traffic growing even further in prominence,” Yelen said. “The rollout of 5G services will be an additional accelerator because it will provide much higher quality and less compressed images when streamed to a device or television. Better connectivity in highly populated urban areas as a result of 5G may also boost consumers’ desire and ability to stream videos.”

Alexander expects to see more mainstream adoption of the HEVC and CMAF [Common Media Application Format] standards for streaming this year.

“We’re still early in the OTT adoption cycle, so consistently meeting performance expectations at 10x or 100x current traffic levels is a major point of emphasis for Akamai,” Alexander said.


Kelly, citing IBM Aspera’s work with Fox Sports on World Cup coverage, pointed out that waiting for transfers to complete before editing is “not an option.” He explained that while monitoring the soccer action in Russia, the Fox Sports production team in Los Angeles was “able to create clips while files are still growing,” constantly keeping in mind that delays “would not meet the audience expectation for live viewing.”

Collectively, all that exuberance seems to justify the forecast of Colin Dixon, chief analyst and founder of nScreenMedia, a Silicon Valley consultancy.

“By 2022, video streaming will scale to TV audience size,” Dixon said in June. “Linear TV broadcasters will be taking advantage of technology to provide fully addressable linear experiences.”


Gary Arlen

Gary Arlen, a contributor to Broadcasting & Cable, NextTV and TV Tech, is known for his visionary insights into the convergence of media + telecom + content + technology. His perspectives on public/tech policy, marketing and audience measurement have added to the value of his research and analyses of emerging interactive and broadband services. Gary was founder/editor/publisher of Interactivity Report, TeleServices Report and other influential newsletters; he was the long-time “curmudgeon” columnist for Multichannel News as well as a regular contributor to AdMap, Washington Technology and Telecommunications Reports; Gary writes regularly about trends and media/marketing for the Consumer Technology Association's i3 magazine plus several blogs.