The 24-hour news cycle means more and more content needs to be produced, presenting challenges not only in archiving and retrieval, but also in ensuring that the right clip can be found when it is needed. While electronic news gathering (ENG) has been around for decades, today’s setup can involve anything from a lone reporter with a single camera to an entire crew on location. The latest developments in ENG tools empower news production teams by expanding their options for delivering content and tagging it with metadata.
“It is hard to walk into a TV station today that isn’t using some form of ENG or automation,” said Paul Shen, CEO and founder of TVU Networks in Mountain View, Calif. “It is now driven by on-demand technology and virtually all forms of distribution.”
SAME QUALITY IN HALF THE BANDWIDTH
The vast amount of content that news teams produce daily is placing increasing demands on bandwidth. In response, vendors showing the latest developments in ENG at this year’s NAB Show will focus on the continuing role of High Efficiency Video Coding (HEVC), also known as H.265 and MPEG-H Part 2, in production. This video compression standard, designed as a successor to the widely used AVC (H.264, or MPEG-4 Part 10), could be crucial given how much content must be delivered over existing systems.
“From our perspective we’re really watching how HEVC is fitting in with ENG this year,” said Mike Savello, vice president of sales at LiveU in Hackensack, N.J. “You get the same quality in half the bandwidth, and this ensures a good picture where bandwidth has been the biggest challenge.”
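As a rough illustration of that “half the bandwidth” rule of thumb, the sketch below compares the transfer size of a news clip contributed over AVC versus HEVC at comparable quality. The bitrates and clip length are assumed for illustration, not figures from any vendor:

```python
# Rough illustration of the "same quality in half the bandwidth" rule of
# thumb for HEVC (H.265) versus AVC (H.264). The bitrates below are
# illustrative assumptions, not figures from any vendor.

def transfer_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Size in megabytes of a stream at a given bitrate and duration."""
    return bitrate_mbps * duration_s / 8  # megabits/s -> megabytes

AVC_BITRATE_MBPS = 8.0                     # assumed 1080p AVC contribution bitrate
HEVC_BITRATE_MBPS = AVC_BITRATE_MBPS / 2   # roughly half for comparable quality

clip_seconds = 90  # a typical news package length (assumed)
avc_mb = transfer_size_mb(AVC_BITRATE_MBPS, clip_seconds)
hevc_mb = transfer_size_mb(HEVC_BITRATE_MBPS, clip_seconds)

print(f"AVC:  {avc_mb:.0f} MB")   # 90 MB
print(f"HEVC: {hevc_mb:.0f} MB")  # 45 MB
```

Over a congested cellular link in the field, halving the payload either halves the upload time or frees headroom for a more reliable picture.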
Whether it is local news or a network correspondent across the country or the world, teams in the field need to stay connected with the newsroom.
“Our focus is on the journalist working in the field,” said Mike Fass, vice president of broadcast operations at Gray Television, an Atlanta-based station group. “Nowadays they don’t spend as much time in the station; rather, they have to work in the field, gather news and send it back on a steady basis. On top of that, they are working solo.”
These demands require improved connectivity between the crew in the field and back at the station. “This is really important as it gives smaller teams the ability to cover the local community,” added Fass.
“You want to have your teams be as close as possible to the action and make the operation between the field reporting teams and the newsroom as seamless as possible,” said Kevin Savina, director of product strategy at Dalet, a provider of newsroom production software.
At the NAB Show, Dalet will show an enhanced version of the DaletOneCut news editing tool that offers a seamless editing experience, whether the content is located in an archive or the reporter’s laptop.
“We use the cloud to make that experience seamless and work even when you have very bad network conditions,” Savina said. “That is also one of the big topics related to ENG in general; making the field teams more efficient by using collaboration tools between those field teams and the central newsroom, and making the sharing of information and media as transparent as possible to the journalists.”
ENG can be a bridge not only between remote teams, but also with other new emerging technologies that are now being adopted by broadcasters.
“The industry is moving to IP, and we’re seeing the role that ENG is playing in being the bridge between video over IP and the cloud,” added Shen. “This is an important part of helping broadcasters make the transition to IP. We are building the workflow engine that can allow the broadcaster to evolve.”
WHAT ABOUT AI AND MACHINE LEARNING?
As more content is produced, broadcasters face new challenges in finding the sequences needed to build a compelling narrative for news stories. Here is where advances in artificial intelligence and machine learning can help journalists sift through all that content.
At NAB, Dalet will highlight how its Media Cortex toolset can be used to enrich media workflow with AI.
“When you are implementing ENG workflows, you can have the teams in the field providing content back to the base, either as streamed content or as a file delivery of rushes, or even of an edited package,” said Savina. “That can go through two types of AI. The first is automatic tagging: for example, basic speech-to-text, or detecting objects or themes inside the media being loaded, saving the manual tagging that is otherwise required when you’re providing rushes to the central system. That’s one of the usages that we are seeing take off.”
AI and machine learning can also be used to tag content with metadata as it’s created, giving journalists improved search capabilities along with automatic recommendations of content relevant to the story in production.
“It’s not just tagging, but it’s really helping the journalist with suggested content,” added Savina. “We see that use of AI in machine learning as being a growing opportunity in 2019, and we have new products around that in Media Cortex.”
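The automatic-tagging idea Savina describes can be sketched in miniature: derive searchable metadata from a speech-to-text transcript as rushes arrive. A real system such as Media Cortex would use trained AI models; the keyword matcher and vocabulary below are assumptions that only illustrate the workflow:

```python
# Minimal sketch of automatic tagging: derive metadata tags from a
# speech-to-text transcript. A production system would use trained AI
# models; this keyword lookup and its vocabulary are illustrative only.

TAG_VOCABULARY = {
    "election": "politics",
    "senate": "politics",
    "wildfire": "weather/disaster",
    "touchdown": "sports",
}

def auto_tag(transcript: str) -> set:
    """Return metadata tags for vocabulary words found in the transcript."""
    words = set(transcript.lower().split())
    return {tag for word, tag in TAG_VOCABULARY.items() if word in words}

clip_transcript = "The senate vote on the election bill was delayed today"
print(auto_tag(clip_transcript))  # {'politics'}
```

Tags produced this way land in the asset-management index automatically, so the journalist never has to log rushes by hand.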
It could even have a role to play in live news production.
“AI isn’t just aiding in the retrieval of pre-recorded content,” said LiveU’s Savello. “We’re seeing more and more demand for live content and if you’re a network with 200 or so affiliates, you need to sort through the content as it comes in and AI can be important in determining what is a priority.
“This will aid the speed of retrieval, as the combination of AI and metadata will allow stories to come together so much more effectively,” Savello added. “The benefit that AI and metadata bring in this context is simply too great for any production team to ignore.”
One other area where AI is already beginning to make a major impact is in automating mundane production tasks, making them easier to tackle.
“We’ve seen a lot of emphasis in automating closed captioning,” said Fass. “There is a real advantage in having machine learning handling auto transcription, and if the machine can do it that can be a huge time saver. Another facet is facial recognition so that AI can go through and find clips featuring a person—that is another task that takes so much longer if a person has to do it. If you’re looking for a specific sound bite, this will allow you to jump to a clip very easily.”
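The sound-bite retrieval Fass describes boils down to searching a timestamped transcript and seeking the player to the match. The segments below are invented sample data; in practice they would come from an auto-transcription service:

```python
# Minimal sketch of jumping to a sound bite via an auto-generated
# transcript. The segments are invented sample data; a real transcript
# would come from a speech-to-text service with per-segment timestamps.

transcript = [
    (0.0,  "good evening and welcome to the newscast"),
    (12.5, "the mayor said the budget will be balanced by spring"),
    (30.2, "in sports the home team won in overtime"),
]

def find_sound_bite(segments, phrase):
    """Return the start time (seconds) of the first segment containing the phrase."""
    phrase = phrase.lower()
    for start, text in segments:
        if phrase in text.lower():
            return start
    return None

print(find_sound_bite(transcript, "budget"))  # 12.5
```

The same index supports the facial-recognition case: swap the text segments for per-frame face tags and the lookup is identical.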
Another improvement in news production is coming from ENCO, which has been strengthening the integration of its MOM (Media Operations Manager) system with third-party solutions to streamline newsroom workflows. For NAB, ENCO will demonstrate a Media Object Server (MOS) interface and advanced newsroom computer system (NRCS) plug-ins that seamlessly bridge television news production and playout operations.
“Plug-ins for AP ENPS, Avid iNEWS and Octopus Newsroom allow journalists and news producers to access MOM asset libraries directly within their familiar NRCS user interface,” said Ken Frommert, President, ENCO. “This further enables users to incorporate MOM elements including graphics, audio and video into their scripts.” Rundowns are automatically synchronized with the MOM system, which can then be controlled by the NRCS via the MOS protocol to play out the assets.
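To give a feel for the kind of traffic behind that rundown synchronization, the sketch below builds a rundown-creation message modeled loosely on the MOS protocol’s `roCreate` form. All IDs and slugs here are invented, and real MOS exchanges carry many more required fields than this simplified fragment:

```python
# Simplified sketch of pushing a rundown toward playout, modeled loosely
# on a MOS roCreate message. All IDs and slugs are invented; a real MOS
# exchange carries many more required fields than shown here.

import xml.etree.ElementTree as ET

def build_rundown(ro_id, slug, story_slugs):
    """Build a minimal rundown-creation message as an XML string."""
    mos = ET.Element("mos")
    ro = ET.SubElement(mos, "roCreate")
    ET.SubElement(ro, "roID").text = ro_id
    ET.SubElement(ro, "roSlug").text = slug
    for n, story in enumerate(story_slugs, start=1):
        st = ET.SubElement(ro, "story")
        ET.SubElement(st, "storyID").text = f"{ro_id}.{n}"
        ET.SubElement(st, "storySlug").text = story
    return ET.tostring(mos, encoding="unicode")

xml = build_rundown("RO-2019-001", "6PM NEWS", ["OPEN", "WEATHER", "SPORTS"])
print(xml)
```

When the NRCS reorders or drops a story, it sends a corresponding update message, which is how the rundown and the playout system stay in lockstep.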
Another company that has invested heavily in AI is TVU Networks, which introduced its MediaMind media supply chain concept at last year’s NAB. This year, it plans to show the latest developments in its AI-enabled platform.
The MediaMind story-centric workflow is designed to simplify content creation, production and distribution while integrating automation, AI, streaming, 4G and 5G. “By automating and optimizing the media supply chain process, broadcasters can realize the benefits of taking raw content and re-using it across many platforms, programs and even organizations,” Shen said.
2019 AND BEYOND
News production could see other improvements this year and beyond, and higher-speed mobile networks could play a significant role for news gathering teams.
“The evolution of 5G is going to bring new opportunities, but some challenges as well,” said Savello. “5G is going to be the next gamechanger, and alternative technology for mobile delivery will start to fall by the wayside.”
Another game changer could come on a different kind of field—namely the one in the sports arenas.
“Broadcast and production teams working in the sports arena are increasingly seeking a cinematic look and feel that cameras such as the Sony F55 or Venice deliver,” said Frank Jachetta, president of MultiDyne.
“Fiber-optic camera back systems exist to convert cinema cameras to live cameras for broadcast television or IMAG screens, while still retaining that dramatic feel that these cameras offer,” he added. “MultiDyne is addressing this trend with the SilverBack V cameraback transceiver. The same system addresses ENG requirements in a more basic configuration.”
For news teams these tools will allow them to do their job better, but in the end it is delivering relevant content to the viewer that is most important.
“It isn’t just ‘embrace it or be left behind,’ however; technology should allow you to do your job more productively,” said Fass at Gray Television. “It is too easy to get caught up in the technology, but at the end of the day it has to make the journalists’ job easier.”