Editing systems

Today's job description for a professional editor has expanded to include a variety of tasks that go well beyond cutting and splicing images and sound together. The technology required to do the job has also evolved, and it continues to enable new capabilities and improve productivity.

Nonlinear editing (NLE) software has certainly matured over the years, with real-time processing, multiformat timelines and even shared-storage editing no longer a novelty. These features are now standard fare on most editing software packages, whether they are on a Macintosh or PC platform.

However, to accommodate the needs of today's highly competitive post-production environment, NLE software has had to get smarter about how it manages files (and metadata), and it must enable editors to get even more involved in the content creation process. Editors also have to be able to work on both platforms, Mac and PC, with equal competence.

In addition to handling all flavors of SD and HD files natively on a timeline, editing systems have to become more integrated into the production workflow. This means an editor is now handling graphics templates and preprocessed audio files along with the video clips.

The fundamental values of shared-storage editing — collaborative workgroups and faster time to air — are not going away, so an editing system that delivers them well and allows third-party technology to be seamlessly integrated into a production environment is critical. All editors on the network must be able to work equally at all times, with no bottlenecks in connectivity or delayed access to media.

Frame-level asset description

Along with numerous advantages, shared-storage environments bring a number of challenges. With thousands of clips or assets on a centralized server, how do you locate exactly the right one when you need it? Editing software must include an asset management layer, or database-driven file organization, that first lets you browse low-res clips to find the one you need and then lets you retrieve that clip immediately. If it's late-breaking footage coming in from the field, you have to be able to begin working with it as soon as it is ingested into the network.

This means not only simple asset management searches, but also the ability to locate a clip using frame-level asset description. This makes the logging process much faster and clip searches much easier. Reporters can make logging notes out in the field and attach them to a specific clip. Editors back at the station can then use those notes to find and position a clip within the story. Leveraging this metadata makes editors' jobs easier and gets the story to air faster.

This concept also means more than just finding the clip; it's finding what's in the clip down to the frame level. Clips in news are very short, so finding a clip is half the battle, but for longer-form material, like a live sports event, finding a three-hour clip is only a small piece of the puzzle. You have to find a specific segment of that game within the three-hour telecast. That's more of a challenge, but it is a feature an increasing number of editors are requesting.
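To make that concrete, here is a minimal sketch — in Python, using hypothetical field names rather than any particular vendor's schema — of how frame-level log notes attached to a clip let software jump straight to the segment an editor needs:

from dataclasses import dataclass, field
from typing import List


@dataclass
class LogNote:
    # A field note tied to a specific frame range within a clip.
    in_frame: int
    out_frame: int
    text: str


@dataclass
class Clip:
    clip_id: str
    description: str
    notes: List[LogNote] = field(default_factory=list)

    def find_segments(self, keyword: str) -> List[LogNote]:
        # Return only the logged segments whose notes mention the keyword,
        # so the editor can go straight to those frames.
        keyword = keyword.lower()
        return [n for n in self.notes if keyword in n.text.lower()]


# Example: locating one play inside a three-hour sports telecast.
game = Clip("game_0412", "Full telecast, home opener")
game.notes.append(LogNote(162000, 162450, "Go-ahead goal late in the third"))
game.notes.append(LogNote(298500, 298800, "Post-game interview, head coach"))

for seg in game.find_segments("goal"):
    print(game.clip_id, seg.in_frame, seg.out_frame, seg.text)

The same record can travel with the clip, so the notes a reporter makes in the field are the ones the editor searches back at the station.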

Tight MAM integration

The best way to achieve this level of functionality is to extend the organization's media asset management system to content acquisition, even if this is in a camcorder on a remote assignment. This is emerging technology, and it's tricky to do, but the rewards are invaluable to editors asked to produce more content all the time. The key is to get editors involved from the beginning of the content creation process. As videographers are shooting the segment in the field for that night's newscast, they should be thinking about how to deliver that footage to the editors in a way that is most useful to them.

Likewise, when editors receive the footage, they should be able to immediately know where to look to find out what's on the clip, how much of it to use and where in the story it should go. By taking advantage of the incorporated metadata — referencing both the clip and specific places within the clip — there's very little guesswork involved. A forward-looking editing system should allow this metadata to remain attached to the clip for its entire life cycle.

Real-time editing

It is also crucial that any editing software be able to process effects in real time. Real time doesn't mean placing an effect on the timeline and then going back to the clip and hitting “play.” The software should let you make an edit on the fly, without first marking in and out points. Some NLEs on the market today are actually slower to use than tape-to-tape editing systems. Where's the convenience and speed we have come to expect from nonlinear editing?

The widest possible variety of compression types must also be supported natively in any usable NLE, because you often don't know what type of footage you're getting from day to day. Newer formats such as AVC-Intra and JPEG2000 will have to share equal space on the timeline with DV and MPEG-2 (and MPEG-4) material. NLE systems and software should be agile enough that users don't have to worry about preconverting or transcoding before they begin working.

Of course, the use of advanced codecs like AVC-Intra and JPEG2000 requires more powerful workstation processing, so today's dual quad-core processors will eventually give way to faster CPUs. Future workstations — and the use of GPUs instead of dedicated ASICs — will certainly have to be up to the task. More hard drive storage and usable RAM will continue to be important as well.

The same goes for aspect ratios. The conversion to DTV in the United States has not made all 4:3 material go away. Quite the opposite is true now and will continue to be so for a long time. In fact, many editors in the industry who are over 40 will probably be working with a mix of 4:3 and 16:9 content for the rest of their careers. Aspect ratio agility has to be just as well developed as codec agility. To be useful, both formats must coexist on the same timeline, without conversion and in both low and high resolution. The NLE software has to let users apply aspect ratio conversion at certain times and not at others, and it has to do so without editors spending time rendering or even thinking twice about it.
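The arithmetic behind a non-distorting aspect ratio conversion is simple; the hard part is having the NLE apply it (or skip it) per clip, automatically and without renders. A rough sketch of the fit calculation, assuming square pixels for simplicity:

def fit_frame(src_w: int, src_h: int, dst_w: int, dst_h: int):
    # Scale a source frame to fit inside a destination raster while
    # preserving its aspect ratio; the leftover area becomes black bars.
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)


# 4:3 material (square-pixel 768x576) placed in a 1920x1080 raster:
print(fit_frame(768, 576, 1920, 1080))   # (1440, 1080): 240-pixel pillars left and right
# 16:9 HD placed in a square-pixel 4:3 SD raster:
print(fit_frame(1920, 1080, 768, 576))   # (768, 432): 72-line letterbox top and bottom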

Nonlinear editing is not solely about editing anymore. And because virtually all of the processing is now software-based, new features and functionality for nonlinear editing can be added at low cost.

Today's editors are being called upon to be more than just editors, so the tools they use must benefit the entire workflow, not just a single task. Editors have become the linchpin of the overall production workflow. The editing software now includes its own transcoding engine and other capabilities that used to be performed by specialized boxes.
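For a sense of what that means in practice, the kind of proxy or transcode step that once required a dedicated box can now be a single software call. In this sketch, ffmpeg stands in for whatever engine a given product actually bundles; the codec and bitrate choices are illustrative, not a recommendation:

import subprocess

def make_proxy(source: str, proxy: str) -> None:
    # Generate a low-bitrate H.264 proxy that editors can browse over the network.
    subprocess.run(
        ["ffmpeg", "-i", source,
         "-c:v", "libx264", "-b:v", "3M",   # small, widely decodable video
         "-c:a", "aac", "-b:a", "128k",     # matching compressed audio
         proxy],
        check=True,
    )

make_proxy("ingest/late_breaking.mxf", "proxies/late_breaking.mp4")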

In the end, that's good news for everyone, because both money and time are saved in the process.

Ed Casaccia is director of servers and storage product marketing for Grass Valley.