AI and Machine Learning Transform Video Market

LAS VEGAS—Few technologies have arrived with as much impact as artificial intelligence/machine learning. The technology is disrupting established business models, changing whole categories of jobs and supercharging the pace of innovation. Still, the technology’s integration is nuanced.

“There has been a lot of hype around AI, but now we’re seeing very specific use cases that actually do give you a competitive advantage,” said Alfonso Peletier, CEO and founder of Epic Labs.

Jon Klein, president of Vilynx, agreed. “What opens people’s eyes is real results. For most media companies, seeing is believing.” Klein will present today’s session “AI Is NOT Technology. It’s Strategy,” in the AI+Cloud Campus.

The AI+Cloud Campus, which is on the show floor, is holding a number of 30-minute sessions through Thursday. Sponsors of the pavilion include AWS, Google Cloud, Sony, Harmonic and Vilynx.

Attendees at NAB Show will get to see firsthand just what machine learning can accomplish. Here’s a taste.


As venture capitalist Mario Gavira and others have observed, Netflix derives much of its success from the clever exploitation of data. It knows what its users watch, when and for how long they watch it, on what devices and much more. The company has leveraged this data not only to improve content delivery, but to tailor the creation of original content with data-backed confidence that it will find a receptive audience. Broadcasters, production companies and other video businesses have plenty of data of their own, and machine learning technologies are helping them level the playing field.

Large production companies have vast libraries of content that languish unused and unorganized because paying people to tag it all is too expensive, said Ryan Steelberg, president of Veritone. Thanks to improvements in deep learning and computer vision, this video can now be ingested and tagged automatically.

This metadata enrichment has grown very sophisticated with AI, Steelberg said. Veritone’s aiWARE “cognitive engine” can detect logos, objects, faces and spoken or written keywords, then automatically generate tags that make the video discoverable. Vilynx is demonstrating an AI engine that not only extracts metadata but works in tandem with a knowledge graph, built by scanning social media sites and more than 50,000 websites, to match content recommendations to trending topics.

The challenge to date with AI-generated metadata tags is that the machine can struggle to separate what’s genuinely relevant from what’s extraneous, said Vilynx Co-Founder and CTO Elisenda Bou-Balust. That’s why the company relied on an approach called unsupervised learning to train its algorithm. “We teach the machine skills,” Bou-Balust said. “It uses face detection to spot a face, and if it doesn’t recognize who it is, it will search the internet to figure it out.”

Extracting more precise metadata from video assets offers a one-two punch for content firms, Steelberg said. First, it helps them understand what they have; second, when paired with customer insights, it improves their ability to make personalized content recommendations to viewers. “Media companies can harness this to make assumptions and correlations in real time about what’s working and what’s not,” he added.
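The “teach the machine skills” idea Bou-Balust describes — try a local detector first, then fall back to a broader search when it comes up empty — can be sketched in a few lines. This is an illustrative stand-in, not Vilynx’s actual system: the face index, `local_face_match` and `web_lookup` helpers here are hypothetical stubs.

```python
# Illustrative sketch of a recognize-then-fall-back tagging step.
# KNOWN_FACES stands in for a local face index; web_lookup stands in
# for an internet reverse-image search. Neither reflects Vilynx's API.

KNOWN_FACES = {"face_001": "Anchor A"}  # toy local index

def local_face_match(face_id):
    """Return a name if the face is in the local index, else None."""
    return KNOWN_FACES.get(face_id)

def web_lookup(face_id):
    """Stand-in for an external search when local recognition fails."""
    return f"unidentified:{face_id}"

def identify(face_id):
    """Prefer the local match; fall back to the wider lookup."""
    return local_face_match(face_id) or web_lookup(face_id)
```

The point of the pattern is that the expensive external step only runs when the cheap local skill fails, which keeps most tagging fast while still resolving unknowns.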


“AI is super important for us,” said Andreas Jacobi, CEO and cofounder of Make.TV. The company’s Live Video Cloud uses a variety of machine vision and AI-based tools from several cloud services to improve the curation and qualification of live content or, as Jacobi puts it, to “separate the best from the mess.” Jacobi will do a deep dive on AI in his session “How AI Can Help Broadcasters Manage the Shifting Content Cosmos.”

“Companies are struggling to create content fast enough,” Jacobi said. Machine learning can help by not only flagging potentially promising video clips from streams of content but by automating the actual clipping, freeing editors to work on more in-depth and creative projects. “We definitely see AI aiding content producers on the editing side,” Jacobi said.

Evan Michaels, vice president of Video Product Management at Evolphin, agreed. At NAB Show, the company is demonstrating a machine learning-based tool that can prepopulate an editing timeline with relevant clips from ingested video. “It’s a rough cut of a highlight reel based on machine learning data,” Michaels explained. A human editor will still be on hand to make the final edit, but they’ll enjoy huge efficiency gains by not having to hunt for promising clips.
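The rough-cut idea Michaels describes — machine-scored clips prepopulating a timeline for a human editor — can be sketched as a simple greedy selection. This is a minimal illustration under assumed inputs, not Evolphin’s implementation: the clip tuples and time budget are hypothetical.

```python
# Illustrative sketch (not Evolphin's tool): fill a highlight reel with
# the highest-scoring clips until a time budget is reached, then hand
# the result to a human editor in timeline order.

def rough_cut(clips, max_seconds):
    """clips: list of (start_time, duration, relevance_score) tuples."""
    picked, total = [], 0.0
    # Take clips in descending relevance while they fit the budget.
    for clip in sorted(clips, key=lambda c: c[2], reverse=True):
        if total + clip[1] <= max_seconds:
            picked.append(clip)
            total += clip[1]
    # Re-sort the chosen clips by start time for the editing timeline.
    return sorted(picked, key=lambda c: c[0])

clips = [(0, 10, 0.9), (30, 20, 0.5), (60, 10, 0.8)]
reel = rough_cut(clips, max_seconds=25)
```

The editor still makes the final cut; the machine’s job is only to shrink the haystack.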


While improved compression codecs such as HEVC and VP9 and protocols like HLS and MPEG-DASH have enabled over-the-top services to improve content delivery to viewers, there’s still room to wring greater efficiency out of the delivery pipeline. Epic Labs spent two years studying how machine learning could be applied to the challenge, said founder Alfonso Peletier. The result is LightFlow, which uses a combination of deep learning and computer vision to analyze video streams, plus aggregate analytics on end-point devices, to create what Peletier dubs “content-aware optimization.”

Rather than simply compress video using a crude set of rules, LightFlow can “tweak the levers” of compression standards like H.264 and H.265 based on the kind of content that’s being transmitted and the kind of device that’s receiving it. The result, Peletier said, “is a much better user experience and a reduction of operating costs” associated with encoding, storage and CDN usage.
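The “tweak the levers” idea can be made concrete with a toy sketch: map a per-title complexity score and a device class to encoder settings. This is purely illustrative and not LightFlow’s actual logic; the `choose_encode_settings` function, the complexity score and the CRF mapping are all assumptions for the example.

```python
# Illustrative content-aware encoding sketch (not LightFlow's logic):
# busier content gets more bits (a lower CRF), and the delivery device
# caps the resolution ladder.

def choose_encode_settings(complexity, device):
    """Map a 0-1 content-complexity score and device class to encode params.

    complexity: e.g. fraction of frames flagged as high-motion/high-detail
    device: "phone", "tablet", or "tv"
    """
    # CRF 30 for easy content down to CRF 22 for the hardest content.
    crf = round(30 - 8 * complexity)
    max_height = {"phone": 720, "tablet": 1080, "tv": 2160}[device]
    return {"codec": "h264", "crf": crf, "max_height": max_height}

settings = choose_encode_settings(complexity=0.9, device="phone")
```

The same shot-level decision could drive an H.264 or H.265 encoder; the saving comes from not spending top-rung bitrate on content, or screens, that can’t use it.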


One unifying theme among those working on machine learning is that the wave of video data (professional and user-generated) and viewer analytics has barely crested. “We’re just about to be flooded with a ton of new data — everyone is producing more content, and the sheer number of devices we have for distribution is huge,” Steelberg observed. In such an environment, it becomes practically impossible to make sense of, manage and monetize this deluge without machines. But the reliance on AI tools won’t necessarily spell doom for those who make a living in the affected industries.

“I don’t think machines will do creative things,” Bou-Balust said. “Machines will be assistants. They’ll be your superpowers.”