Peeling an Orange With a Hairbrush

Lighting has come by accident to include responsibility for most of the visual material that now adorns production sets.


[Photo caption: Lighting was one of the primary stage elements for the Eurovision Song Contest 2007.]

Lighting has come by accident to include responsibility for most of the visual material that now adorns production sets. To some extent, it's our own fault for leaving a relatively simple digital control protocol like DMX lying about where others were bound to trip over it, but it really wasn't something we lighting folk deliberately set out to make a grab for.

Certainly it gives the LD control over more of the visual design elements of a production, but if we were working with even half-competent production designers in the past, the visuals of a production were already part of a collaborative process that was greater than the sum of its parts.

THE LAST TO KNOW

Whilst giving the LD even more creative input into the production, designing the video content injects an entirely new and expanded range of disciplines and design decisions into a process already fraught with pressure. Other than actually framing the shots and shooting the production, lighting has always been the last design component to be put in place.

While sets, costumes and props generally have lead times measured in weeks and months, lighting is considered the one visual element that can be changed to deal with production contingencies or whims, even during shooting or live-to-air broadcasting. The addition of video elements into the lighting process has certainly increased the demand for on-the-fly adaptation and redesign.

LED EFFECTS

Despite the confused public optimism that any day now, the LED will be replacing our high-brightness (if incredibly energy inefficient) incandescent lamps, the major application for the LED in production remains as a point source of light, facing towards the audience and/or the camera. Since the blue LED came on the scene to join its red and green brethren, it was inevitable that clusters of R, G & B LEDs would be brought together to create color changing effects. With a simple 8-bit control protocol like DMX512 available, it was to be expected that it would be used to give 24-bit (16,777,216) color-mixing capabilities to the LED clusters.
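The arithmetic behind that 24-bit figure is simple: three 8-bit DMX slots, one each for red, green and blue, combine into a single 24-bit color value. A minimal sketch (the packing function and its name are illustrative, not from any real fixture profile):

```python
def rgb_to_24bit(r: int, g: int, b: int) -> int:
    """Pack three 8-bit DMX slot values (R, G, B) into one 24-bit color number."""
    for v in (r, g, b):
        if not 0 <= v <= 255:
            raise ValueError("DMX slot values are 8-bit (0-255)")
    return (r << 16) | (g << 8) | b

# Full red + full green + full blue sits at the top of the 24-bit range:
# 2**24 - 1 = 16,777,215, hence 16,777,216 possible mixed colors.
assert rgb_to_24bit(255, 255, 255) == 16_777_215
```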

Less predictable was the use of dozens, then hundreds, and now thousands of these color-changing effects devices to display low-resolution, real-time video. Initially, the LED effects devices were used for standalone lighting effects. Then, as the DMX control systems developed more sophisticated matrix functions, the LED devices were used to extend the visuals from the full-resolution video elements in the set. Now, the LED devices have become video display elements (pixels and groups of pixels) in their own right.

Although DMX512 has been used to control scenery motion, atmospheric effects, lasers, pyrotechnics, projectors and even the occasional coffee pot, bending it to carry full-motion video is quite an ask. With a capacity to update as often as 44 times per second, DMX can certainly handle refreshing progressive 24, 25 or 30 frame images, but any image of even moderate resolution is going to cause huge data headaches.
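That 44-per-second ceiling falls straight out of the DMX512 line rate: 250 kbit/s, 11 bits per slot (a start bit, eight data bits and two stop bits), a start code plus 512 data slots, and the minimum break and mark-after-break between frames. A back-of-envelope check:

```python
# Back-of-envelope check of the "44 updates per second" figure for a full
# 512-slot DMX512 universe, using the minimum inter-frame timings.
BIT_TIME_US = 1e6 / 250_000       # 4 microseconds per bit at 250 kbit/s
SLOT_TIME_US = 11 * BIT_TIME_US   # 44 microseconds per 11-bit slot
BREAK_US, MAB_US = 88, 8          # minimum break and mark-after-break
SLOTS = 1 + 512                   # start code plus 512 data slots

frame_us = BREAK_US + MAB_US + SLOTS * SLOT_TIME_US
refresh_hz = 1e6 / frame_us       # roughly 44 full-universe refreshes per second
print(round(refresh_hz, 1))
```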

We are seeing on our screens right now such visual extravaganzas as the NBC game show, “1 vs. 100,” where each of the 100 contestants sits in front of a matrix of LED devices that change color and display patterns as the game progresses. The system is very responsive and very complex, employing tens of thousands of channels of DMX to control a matrix of approximately 160x80 devices. It’s visually very impressive, even though this matrix has a picture resolution similar to that of your five-year-old cell phone.

TOO MUCH DATA

If we consider something as moderate as a Windows 95-style VGA screen at only 640x480 pixels, the numbers are quite daunting. A screen at that resolution has 307,200 pixels, each of which requires three slots of 8-bit DMX data (for R, G and B). Driving this would require a controller producing 921,600 DMX channels, or some 1,800 DMX data universes.

Even a DMX-over-Ethernet protocol such as ArtNet (which can theoretically encapsulate 256 DMX data streams) would require eight ArtNet networks to transport the data. The lighting console to drive such a matrix would require the processing capability of 29 of today’s fully tricked-out, top-of-the-range grandMAs or more than 100 ETC Congos.

A resolution comparable to the widescreen (1,680x1,050) desktop monitor that this article is being typed on translates to 41 ArtNets and 162 grandMAs, rather than a $40 video card and a single DVI link cable. Clearly, driving DMX512-controlled LED elements from a lighting console is not a sensible way to deal with more than a paltry few thousand pixels of video.
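The arithmetic above generalizes to any RGB pixel matrix. A short sketch reproducing the article's figures (the function name is illustrative; the 256-universe ArtNet capacity is the theoretical figure cited above):

```python
import math

# Channels, DMX universes and ArtNet networks needed to drive an RGB pixel
# matrix over DMX512, per the article's arithmetic.
CHANNELS_PER_UNIVERSE = 512
UNIVERSES_PER_ARTNET = 256  # theoretical ArtNet capacity cited in the text

def dmx_cost(width: int, height: int) -> tuple[int, int, int]:
    channels = width * height * 3  # one R, G and B slot per pixel
    universes = math.ceil(channels / CHANNELS_PER_UNIVERSE)
    artnets = math.ceil(universes / UNIVERSES_PER_ARTNET)
    return channels, universes, artnets

print(dmx_cost(640, 480))    # the VGA case: 921,600 channels, 1,800 universes
print(dmx_cost(1680, 1050))  # the widescreen case: 41 ArtNet networks
```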

DEDICATED APPROACH

A more reasonable approach is to employ dedicated video processing systems that record, store, manipulate and replay video in formats more suitable to video projection and display systems. Although varying in format from dedicated hardware to desktop software packages, systems such as SAMSC’s Catalyst, MA’s grandMA Video, Green Hippo’s Hippotizer, VJ from Arkaos, Martin’s Maxedia, RADlite from Radical Lighting and High End Systems’ Axon/DL2 all fall into this category. Something they have in common is an interface allowing them to be controlled by DMX512 data.

Once again, we come up against lighting consoles driving technologies that are shoehorned into an interface designed to do an entirely different task. There is nothing even slightly rational or intuitive about using two adjacent channels on a lighting console to set z-depth on a cubic morph and image rotation speed or left-edge image overlap. No matter how you map them, 200-plus media server control channels just don’t lay out logically on a console with 96 faders, four scroll wheels and a trackball. This doesn’t mean there aren’t dozens of bright young lighting programmers who can drive a WholeHog console with each hand to program a powerful media server without taking their eyes off the stage. But once again, it raises the question as to why a system broadcasting 512-byte packets of 8-bit data, with no error correction, should be used for such a task.
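To make the mismatch concrete, here is a hypothetical slice of a media-server DMX personality. Every offset and parameter name below is invented for illustration, not taken from any real product's channel map, but the shape is typical: unrelated parameters landing on adjacent console channels.

```python
# Hypothetical media-server DMX personality (invented for illustration).
# Adjacent channel numbers carry completely unrelated parameters, which is
# exactly what makes them awkward to lay out on fader-based lighting consoles.
PERSONALITY = {
    1: "library / folder select",
    2: "clip select",
    3: "play mode",
    4: "z-depth (cubic morph)",
    5: "image rotation speed",    # sits right next to z-depth on the desk
    6: "left-edge image overlap",
    # ...a real server continues like this for 200-plus channels per layer
}

def describe(channel: int) -> str:
    """Return the parameter a given DMX channel controls in this sketch."""
    return PERSONALITY.get(channel, "undocumented")
```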

Despite the way the development process has stumbled blindly into its present state, and the way we have acquired responsibility for designing powerful video elements in many productions, classy and subtle video designs are beginning to appear on our stages and hopefully, soon, in our studios. It’s just hard to understand that we can get the production design so right while getting the interface design so wrong, again.