In the past, video editing has always been tied to a place: the edit bay. The Moviolas and Steenbecks of film were joined by linear tape editing suites. In the days of quad and C-format, these were large and very expensive. Non-linear editing, typified by Avid, transformed video editing from a linear process into something much more like film editing. Even terms like “bins” were borrowed from film.
Non-linear editing was split into “offline”, editing a low-resolution proxy, and “online”, conforming and finishing the broadcast-resolution files. The divide between the two can blur, but the distinction remains between the shot selection and rough cut, which need only a proxy, and the final cut of the high-resolution video.
Although a rough cut can be made on a laptop, the need for storage systems and professional monitoring has kept the “online” in the post house. However, both Adobe and Avid are now talking about editing in the cloud, with Adobe Anywhere and Avid Interplay Sphere. I recently had a closer look at both products.
Collaboration lies at the heart of program making. A team of creatives cooperate and meld their skills to produce the final result. Close integration between the team can speed the production workflow. Take the example of editing. The commissioning editor, the executive producer and all manner of folks from the production team may need to see the progress of the edit. Even if the editor doesn’t want everyone looking over their shoulder as they work, at the very least the director may want to be involved. If a cut needs discussion, it must be rendered out and conveyed to the participating team members, typically as a low-resolution proxy. The rendering and delivery all take time and slow the impetus of creativity.
There is a general trend in program making to increase shooting ratios. With tapeless cameras, the temptation is to leave a camera running “in case”—never miss a shot. This is especially the case with the reality show genre. The consequence is hundreds of hours of material to be logged and shots selected, and that is before the real edit starts. This leads to teams of editors with a senior lead editor, all working on the same source files.
What if they could all access a central store of the clips? You could log on set, select shots nearby, and edit back at headquarters. It has been possible for some years to log frame-accurately with proxy files, but wouldn’t it be easier to use the broadcast resolution files? With proxies, the edit list must then be conformed to the original files, extra stages that add time.
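The conform step described above amounts to relinking each event in the edit list from its proxy file back to the camera-original media. A minimal sketch of that relinking is below; the filenames, paths and proxy-naming convention are hypothetical, and real conform tools match on reel names and timecode rather than simple filename substitution.

```python
# Sketch of the proxy -> conform step: an edit decision list is cut
# against proxy files, then relinked to the broadcast-resolution
# originals. All names and paths here are illustrative assumptions.
edit_list = [
    {"clip": "A003_C012_proxy.mp4", "in": 100, "out": 250},
    {"clip": "A003_C015_proxy.mp4", "in": 40,  "out": 180},
]

# Map each proxy back to its camera-original counterpart.
originals = {
    "A003_C012": "/san/rushes/A003_C012.mxf",
    "A003_C015": "/san/rushes/A003_C015.mxf",
}

def conform(events, originals):
    """Relink proxy events to hi-res media, keeping frame-accurate in/out points."""
    conformed = []
    for ev in events:
        key = ev["clip"].replace("_proxy.mp4", "")
        conformed.append({**ev, "clip": originals[key]})
    return conformed

for ev in conform(edit_list, originals):
    print(ev["clip"], ev["in"], ev["out"])
```

Editing the broadcast-resolution files directly would remove this relinking stage entirely, which is part of the appeal of a shared central store.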
Direct-attached storage, such as Thunderbolt, or shared storage in the post house is still only easily accessible to people on the premises. Could the cloud offer a solution for collaborative editing?
The cloud is a fluffy term, with no formal meaning. To many providers, it is the provision of a service: software as a service (SaaS), infrastructure as a service (IaaS), and platform as a service (PaaS). The service may be a public offering like Amazon Web Services or a private provision, in-house or from a service provider.
From the business perspective, "as a service" becomes OPEX, rather than the CAPEX of buying hardware and software; hence, the attraction for many media businesses.
The sheer size and streaming rates of broadcast-resolution files have restricted the use of the cloud, but that is finally changing as the cost of bandwidth becomes affordable for video.
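A quick back-of-the-envelope calculation shows why those file sizes have been the sticking point. The bitrates and uplink speed below are illustrative assumptions, not figures from any particular facility:

```python
# Back-of-the-envelope: storage and transfer time for one hour of video
# at different bitrates. All bitrates below are illustrative assumptions.
BITRATES_MBPS = {
    "H.264 proxy": 2,
    "XDCAM HD422": 50,
    "ProRes 422 HQ (1080i)": 220,
}

UPLINK_MBPS = 20  # assumed Internet uplink speed

for name, mbps in BITRATES_MBPS.items():
    gb_per_hour = mbps * 3600 / 8 / 1000  # megabits/s -> gigabytes per hour
    transfer_ratio = mbps / UPLINK_MBPS   # hours of transfer per hour of footage
    print(f"{name}: {gb_per_hour:.1f} GB/hour, "
          f"{transfer_ratio:.1f}x real time over a {UPLINK_MBPS} Mb/s link")
```

Under these assumptions a proxy moves faster than real time, while the broadcast-resolution mezzanine file takes many times longer than its own duration to upload, which is exactly the economics that cloud editing has had to wait out.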
Adobe Creative Cloud
The release of Creative Suite 6 also saw Adobe selling software on a rental model. Users can pay by the month to use the Creative Suite, including Premiere Pro, the editing application. This means no more worries about upgrading software: it is always the latest version, on tap. You can fire up a workstation for the duration of a project and then leave it dark in between, if that is cost-effective. This is very different from the usual model of buying software, then buying upgrades every year or so, all from a capital budget. The monthly software cost can be charged to a project as an operational expense.
At IBC2012, Adobe showed an early iteration of the next step forward, editing in the cloud, dubbed Adobe Anywhere. This innovative approach to an editing architecture splits the NLE down the middle. The Adobe Mercury engine, which performs compositing and streaming, sits on a central server with high bandwidth access to shared storage. The server uses NVidia GPU acceleration to get the required performance.
The remote client is Premiere Pro CS7, yet to be released. The client provides the user interface, familiar to Premiere Pro CS5 and CS6 users. Relieved of the compositing and rendering tasks, the client can be a modestly resourced laptop. User commands are transmitted to the server, and the rendered view is streamed back to the client and displayed in the program monitor window. To all intents and purposes, it looks like a regular copy of Premiere Pro; it’s just that the media files may be in another country. For multi-layer sequences, the server does the compositing and sends a single stream to the viewing window.
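The split described above can be sketched in a few lines. This is not Adobe's implementation, just a toy model of the idea: a thin client forwards edit commands, and the server, which owns the media, composites every layer and returns a single rendered view. All class and method names are hypothetical.

```python
# Toy model of a split NLE: the client sends edit commands; the server
# owns the media, composites all layers, and streams back ONE rendered
# view, so client bandwidth doesn't grow with timeline complexity.
# Every name here is hypothetical, not an Adobe API.
from dataclasses import dataclass, field

@dataclass
class RenderServer:
    """Holds the media and the sequence; does all the heavy lifting."""
    sequence: list = field(default_factory=list)

    def apply(self, command: dict) -> bytes:
        if command["op"] == "add_clip":
            self.sequence.append(command["clip"])
        elif command["op"] == "scrub":
            pass  # nothing to change; just re-render at the new playhead
        # Composite every layer server-side, return a single frame/stream.
        return self.render_frame(command.get("frame", 0))

    def render_frame(self, frame: int) -> bytes:
        layers = ",".join(self.sequence)
        return f"frame {frame} composited from [{layers}]".encode()

@dataclass
class EditClient:
    """Thin UI: forwards commands, displays whatever the server sends back."""
    server: RenderServer

    def send(self, command: dict) -> str:
        return self.server.apply(command).decode()

server = RenderServer()
client = EditClient(server)
client.send({"op": "add_clip", "clip": "interview.mxf"})
client.send({"op": "add_clip", "clip": "title.png"})
print(client.send({"op": "scrub", "frame": 120}))
```

However many layers are on the timeline, the client only ever receives one composited picture, which is why a modest laptop suffices.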
Just recently, I was sitting in the Adobe offices in England, getting a demo of this interesting product. We were playing video clips stored on a server in Germany, over a Wi-Fi connection, and editing in England. With several layers of video on the timeline, effects applied, I could finger-dance on the ‘J’, ‘K’ and ‘L’ keys to scrub a clip with no lag or buffering. I was told that the Mercury engine adapts the stream to the available bandwidth. At a still frame, a high-resolution JPEG is sent to the client.
The first time I saw this kind of editing was with Forbidden’s FORscene. That uses a proprietary codec, with intelligent caching designed for frame-accurate web browsing of video. Performance is good — very responsive with no apparent lag, unlike some AVC proxy editors I have seen from other vendors.
There is no release date yet for Anywhere, but I guess NAB2013 would be a good event to launch at, although I was told that it won’t be released until Adobe is happy with its performance and capabilities.
Who is it aimed at? It’s early to say, but it must be large media companies that can run the central server systems. I understand that the performance bottleneck is the bandwidth out of the disk array rather than the rendering in the Mercury engine. Some early work is using the Omneon Mediagrid for clip storage. A high-performance NVidia GPU card can serve several clients, although no performance figures are available yet. Considering the cost of GPUs, a server farm plus basic laptop clients is going to be less costly than equipping every editor with a top-of-the-line laptop.
Another visit I made recently was to Avid to catch up with developments in Interplay in the quiet of the Avid offices (I can never take in the subtleties of technology on a show floor). Avid has taken a different route with its "cloud" editing, addressing the specific needs of newsgathering with its Interplay Sphere. It uses Media Composer and ties in with the rest of the Avid ISIS/Interplay infrastructure. It's available for Windows and Mac (the Mac version will have upload capabilities in the near future). Sphere allows reporters and producers to gain instant access to all their media assets: footage, graphics, animations and audio sources, all from the field.
When I worked at a TV station, the reporters all came back to base with footage to be edited for the evening news. With newscasts running around the clock, and 24-hour news channels, the return to base became inefficient. Staying on location meant a second team editing stories back at base.
Products like Sphere remove geographical constraints. News can be cut when and where it is happening. This cuts costs and cuts time to air. Tie Sphere to bonded cellular backhaul, and there is less need for expensive DSNG trucks to cover minor stories that don’t warrant the expense — a factor that would have kept such stories from being covered.
The recent Olympics saw much innovation in remote production, with NBC, as an example, using FORscene as part of a collaborative workflow that spanned two continents for the duration of the Games. The restriction on using the cloud for video has been, and still is, connection speed. High-resolution video needs a fast connection, and for the economics to add up, it has to be price-competitive with legacy methods.
Remote editing has its place, especially for applications like news. But finishing, where the NLE must be hooked up to a broadcast-grade picture monitor, remains the forte of the post facility. Will we ever get a laptop display good enough for finishing? Given the small number of potential sales for such a device, I doubt it. However, the latest high-pixel-density laptop displays are much improved over those of 15 years ago. Who knows what the future holds.
There is no doubt that the edit bay, with good video and audio monitoring and controlled lighting, is an easier place to critically assess picture quality — much better than a Wi-Fi-equipped coffee bar.
Editing has come a long way from the Moviola, and the laptop, with Wi-Fi to the cloud, presents a whole new way to edit, freed from the necessity of being tied to a place. However, it must be remembered that underlying the technology, it all comes down to codecs, their efficiencies, and the cost and availability of Internet connections. I anticipate we will see more developments at NAB.
Note: Avid text updated Jan 4