The whizz of a mid-air football. A sweaty, mud-splattered face. Slow-motion clips. Instant replay highlights. The fast-paced world of live sports is largely responsible for the push toward innovation in video server technology. Today, the capture, storage and production processes of the sports community are being adopted — and adapted — for a range of broadcast programming far outside the world’s sporting venues.
While server technology in the professional broadcast realm has always been about recording, storage and playout with little, if any, signal degradation, servers have advanced to ingest and store greater amounts of content in varying formats and to play out multiple synchronized, simultaneous video streams. Now, more and more, equipment delivering the fast turnaround, flexibility and integration of live sports is being used in studio environments: for news magazines, live shows, reality programming, onstage entertainment and dramatic TV series. Broadcasters are applying, to full advantage, the speed and mobility developed for live sports across a wide range of onscreen programming.
But that’s not the whole story. To grow and thrive, broadcasters and media companies today must meet consumer demand for the richest media experience. Higher quality, faster content delivery to multiple screens and devices, with more interactive capability, is the mantra for our age of converged communications. And the much-discussed convergence between IT and traditional broadcast technology must be addressed. For broadcast environments that still rely on tape, today’s servers can ease the move to a tapeless workflow.
Integrated, end-to-end workflows
In today’s broadcasting environments, disparate technologies such as cameras, production servers, controllers, switchers, and editing and automation systems must offer a higher level of integration for smooth and efficient workflow processes. Standard communication protocols have been developed progressively so that these technologies can work as a whole. Central to many production workflows, servers have evolved from stand-alone ingest and replay devices into fully integrated platforms, acting as a backbone for media ingest, editing, browsing and playout.
Best-of-breed servers now deliver a fully integrated workflow, from content capture through post production, with tools and capabilities to support all production processes. They also feature an open architecture, allowing a host of marketplace production controllers to access and control all content. Integration with third-party editing tools, and enabling third-party systems to control server ingest and playout channels, gives broadcasters valuable, extended production capabilities.
Most advanced live production servers can also accept a wide range of remote control protocols, making them simple to integrate with standard automation systems, controllers and switchers; switchers, controllers, automation systems, linear editors and other systems speaking one of these protocols can interact easily and transparently with the server to control its content. Native support for different codecs is also critical to a fully integrated workflow: when production and post production use the same codec, transfer time between the two is reduced. In a fast-turnaround production workflow, content being recorded and encoded on the server is simultaneously accessible to post production. (See Figure 1.)
Figure 1. In a fast-turnaround environment, a production server can prepare content for post production as it is ingested.
The end results are major gains in speed, efficiency and productivity throughout the entire production chain. Integrated workflows streamline processes and speed workflow tasks and collaboration.
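As a rough illustration of the read-while-record behavior described above, the sketch below models a server that keeps ingesting frames while editors read material already captured. All names here are hypothetical; a real production server exposes this through its own network protocols and file formats, not a Python class.

```python
import threading

class ProductionServer:
    """Toy model of read-while-record: ingest keeps appending frames
    while post production reads any material already captured."""

    def __init__(self):
        self._frames = []
        self._lock = threading.Lock()

    def ingest(self, frame):
        # Recording side: append an encoded frame as it arrives from the feed.
        with self._lock:
            self._frames.append(frame)

    def available(self):
        # Post production can query how much material exists so far.
        with self._lock:
            return len(self._frames)

    def read(self, start, count):
        # Editors pull frames that are already stored, even mid-record.
        with self._lock:
            return self._frames[start:start + count]

server = ProductionServer()
for i in range(100):            # simulated live feed
    server.ingest(f"frame-{i}")

clip = server.read(10, 5)       # an editor retrieves a clip while "recording" continues
print(server.available(), clip[0])
```

The lock is what lets ingest and editorial access proceed concurrently without conflict, which is the essence of the fast-turnaround workflow the article describes.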
Let’s look at the goals of a newsroom. Broadcasters seek to improve workflow efficiency so they can gather, prepare and broadcast news as it breaks. Live production server technology can underpin end-to-end news production systems, integrating with third-party equipment such as camcorders and craft editors as well as archive and newsroom computer systems (NRCS). Most importantly, these systems speed overall time to broadcast, enabling live feeds to air immediately as they are received from the local reporter or news source, without interruption or processing time. Further, systems that enable a modular and scalable approach to production, integration with existing newsroom infrastructures and workflow streamlining are particularly valuable in news environments.
Real-world reality show workflow
Servers have been deployed in the post-production environment to improve media access for editors and to improve workflow efficiency. Take, for example, a reality show format. Because the format relies on nonstop recording, editors can browse and retrieve media while the server is still recording, a significant productivity improvement over the previous VTR workflow. (See “Reality show benefits” sidebar at the end of this article.)
Editing, content management and storage
Whether integrated into the server system or accessed through plug-ins, editing tools are capable of delivering edits without the time-consuming rendering processes of older systems. Editing software can also be optimized for the specific application, allowing support for multiple formats and resolutions on the same timeline and extended metadata management features. Some editing tools and servers are fully interoperable with all existing post-production systems. Open content management systems are critical to third-party integration and successful workflows.
Some server technology incorporates comprehensive management suites that allow ingest control, metadata management, on-the-fly browsing and editing, and playout scheduling, all managed from a single interface. Running on a common platform, all network personnel can instantly share content, metadata, edits and rough cuts. The software suite integrates with third-party systems, simplifying the transfer of media to post-production tools or archiving. Suites such as these also allow management during recording, keeping operators continuously linked to the media. In post production, editors can use the descriptive information to search for certain clips or content. The system retrieves the matching clips, and editors can manipulate them through a simple drag-and-drop sequence.
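The metadata-driven retrieval described above can be sketched as a simple keyword search over a clip index. The index structure and field names below are purely illustrative; real management suites expose comparable searches through their own interfaces and APIs.

```python
# Hypothetical in-memory clip index built from descriptive metadata and logging.
clips = [
    {"id": "c1", "keywords": {"goal", "replay", "team-a"}, "tc_in": "10:02:13:05"},
    {"id": "c2", "keywords": {"interview", "coach"},       "tc_in": "10:45:00:00"},
    {"id": "c3", "keywords": {"goal", "team-b"},           "tc_in": "11:12:41:10"},
]

def search(index, *terms):
    """Return clips whose descriptive metadata matches every search term."""
    wanted = set(terms)
    return [c for c in index if wanted <= c["keywords"]]

hits = search(clips, "goal")
print([c["id"] for c in hits])
```

Once a search narrows thousands of clips down to a handful, a drag-and-drop step into the edit timeline is all that remains, which is where the time savings come from.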
The increase in the volume of content of various formats and spread over countless platforms makes for complex media content management. Features of a centralized media management system may include: a Web-based interface to instantly find content throughout multiple platforms; intelligent media browsing based on descriptive metadata and logging; automated media digitization; rewrapping based on integrated hardware and software tools; and robust processes for instant transcoding and media delivery to simplify the promotion and distribution of content.
The key benefit of all these tools is reduced post-production time. Faster access to specific content through metadata, for example, lets editors spend more time on creative work rather than logging.
Live production servers today can be designed for different applications. While some servers can be used for all workflow operations, it is possible to focus servers on specific parts of the workflow, such as ingest or playout. Tailoring capabilities to specific requirements reduces equipment cost and provides valuable efficiencies to the user.
Some servers are designed specifically for studio applications. Based on broadcasting and IT networking capabilities, recorded media is instantly and directly available throughout the production network for simultaneous preview, rough editing, archiving, playback or post production. These servers can support a range of formats and codec configurations to fit all types of workflows, while streamlining media transfers to better preserve quality and enable multiple configurations.
The rise of the second screen and the growing significance of social media are having a profound effect on applications now being developed. It’s estimated that more than 70 percent of tablet owners are watching sports on their TV sets while consulting their Web-connected device as a second screen or exchanging tweets with other members of the Web community.
The technology now exists for broadcasters to take advantage of social media engagement, offering viewers original second-screen content and more reasons to engage with what they’re watching. All descriptive metadata associated with event footage is used as keywords to facilitate content retrieval on Web-connected devices. This capability also provides revenue opportunities. A second-screen system allows broadcasters to engage a new generation of viewers who no longer sit passively in front of their televisions. Broadcasters can provide original content to consumers via their second screens, turning the often large amount of unused content sitting on servers into a valuable asset that can be monetized and delivered in near real time as additional premium content.
In addition to building upon second-screen systems and increasing viewer interaction, next-generation servers will be able to support multiple codec types in parallel. When a video feed is recorded, encoding will begin at the same time, making content immediately available for exchange. Additionally, lower-resolution streams will be used for access and review purposes to help speed the editing and production processes and keep bandwidth use low. Final output will, of course, be in high resolution.
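The parallel hi-res/proxy encoding just described can be sketched as follows. The profile names and the trivial "encoder" are stand-ins, not a real server API; the point is only that one incoming feed yields two synchronized outputs, a full-quality master and a low-bandwidth proxy.

```python
def encode(frame, profile):
    # Stand-in for a codec: tag the frame with its target profile.
    return (profile, frame)

def ingest_feed(frames):
    """Encode each incoming frame twice, in parallel profiles."""
    hires, proxy = [], []
    for f in frames:
        hires.append(encode(f, "hires"))   # full-quality master for final output
        proxy.append(encode(f, "proxy"))   # low-bandwidth stream for browse and edit
    return hires, proxy

hires, proxy = ingest_feed(["f0", "f1", "f2"])
print(len(hires), proxy[0])
```

Because both streams share frame positions, an edit decision made against the lightweight proxy maps directly onto the hi-res master for final output.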
Another trend to watch is remote access, providing people in remote locations with full workflow access and participation capability. And, in the midst of all the innovation, there will be more of the old: continued work on systems to maximize bandwidth and the continued addition of intelligence to workflow orchestration. The fast-paced sports world with its instant replay and highlight clips may have started it all, but we’ve come a long way. And we surely have a lot further to travel.
Reality show benefits
Specific benefits of an integrated server-based reality show workflow include:
- More efficient media access to editing suites provides the production team with more time to deliver a better edit.
- Greater flexibility in the production process frees editorial decisions from previous operational restrictions.
- More efficient workflow allows stronger collaboration among the production team and improves output.
- Large edit jobs are more easily tackled, and turnaround time is reduced.
- The need to duplicate and store media on tape is reduced if not eliminated, and the need to spool through tapes to locate shots is almost obsolete.
- Availability of spare channels offers redundancy for recording both main streams (A and B) and can be used to capture additional feeds should they be required.
- With a nonstop recording system, editors can retrieve video while the system is still recording. Once retrieved, the content is transferred into editing stations so editors can finalize their stories, cutting editing time by as much as 50 percent.
—Nicolas Bourdon is marketing and communications director at EVS.