Creating the Picture, One Pixel at a Time

After a year as a lowly studio lighting technician, when I was finally admitted upstairs into the inner sanctum of the control room, I was impressed by all of the consoles, monitor bays and oscilloscopes that belonged to the video engineers. At the time, I was working very hard to take what I had learned in lighting for the stage and parlay it into a career in TV, and so decided to ignore all that video stuff while I got on with learning my trade. Although I did, of course, have to learn a great deal about video control on the way to becoming a competent lighting director, I had always thought of video (like audio) as somebody else's business.

In later years, as an inveterate knob-twiddler with time on my hands while waiting to broadcast late-night news and sports programs, I eventually figured out how to drive every one of the video processing, compositing, effects, storage and captioning devices in the neighborhood. I even learned to drive the Ampex HS-100 video disc while it was still alive. At the time, I had no intention of putting this knowledge to any use.

Somewhere along the way, lighting also became video, a process that continues to evolve in unexpected directions. In hindsight, it was a natural step for LSD's 1998 Icon M fixture, with its DLP-based configurable gobo, to give rise to the High End Systems DL device--a robotic luminaire merged with a video projector. The most recent of these, the DL 2, takes the idea a step further by including a video camera and an infrared light source for capturing video images in the dark.

What continues to make no obvious technical sense is the now-established trend to produce video servers and processors that are controlled by a DMX512 datastream from a lighting console. Certainly, it makes sense to have the capacity to tightly integrate two of the most important visual elements of a production. However, the decision to map an open-ended panoply of video transport and processing functions into an 8-bit protocol with a 512-byte packet size, an indeterminate delay between packets, and no form of error detection or correction still leaves me slightly bewildered.
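To put those constraints in concrete terms, here is a minimal sketch--in Python, purely for illustration, with the serial-line plumbing of breaks and mark-after-break left out--of what a single universe amounts to: a start code followed by at most 512 eight-bit slots, with no checksum anywhere in sight.

    # Illustrative only: one DMX512 universe is a start code plus up to 512
    # eight-bit slots, sent with no error detection or correction.
    NULL_START_CODE = 0x00      # standard start code for dimmer/level data
    SLOTS_PER_UNIVERSE = 512

    def build_dmx_frame(levels):
        """Pack a list of 0-255 channel levels into a single DMX512 frame."""
        if len(levels) > SLOTS_PER_UNIVERSE:
            raise ValueError("a universe carries at most 512 slots")
        if any(not 0 <= v <= 255 for v in levels):
            raise ValueError("each slot is a single 8-bit value")
        return bytes([NULL_START_CODE]) + bytes(levels)

    # A mere 170 RGB pixels (510 channels) effectively fills a universe.
    frame = build_dmx_frame([0] * 510)
    print(len(frame))  # 511 bytes: start code + 510 level slots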

That the controlling DMX datastream is generated by a console designed to deal with 8-bit dimmer levels and some arbitrarily assigned 8- and 16-bit control functions for robotic luminaires makes this decision even more indecipherable. It seems akin to undertaking neurological microsurgery with the controls of a Bobcat excavator. Nevertheless, the idea has caught fire. Now, no music, awards or big event production would dare go to air without screens displaying the outputs of several Hippotizer, RADlight, Maxedia, ArKaos, Catalyst, etc., media servers to swirl, pixelate and otherwise munge video material, usually in synchronization with the music.

THE ROAD TO LEDs

The next step along the road has been the adoption of the LED as a light source. Despite some bold attempts to give the contrary impression, LEDs remain Next Year's Light Source, as they have for well over a decade. It really is going to be a while before we're lighting major productions with LED keylights and fill lights. While we wait for that day to arrive, LEDs have reached a point where they now come in some good base colors for RGB mixing, and with a brightness sufficient for them to be easily seen as point sources behind panels and embedded in scenic elements.

Now there are many of these LED fixtures sitting around in production departments and rental warehouses, and while we may be unable to actually illuminate much with them, they are very responsive, DMX-controlled sources with no appreciable thermal or optical inertia. Also, in many cases the LEDs can be controlled individually or in quite small clusters.

Inventive lighting designers have taken to placing arrays of these fixtures in direct view of the audience and sweeping colored patterns across them, as part of the overall color and texture of their designs. If we add tubes, ribbons, spheres, cubes, floor panels and lengths of fabric, all fitted with small clusters of separately DMX-addressable LED sources, we begin to see a complete design environment that can be controlled, a point at a time, for color and intensity. This, of course, is also the recipe for a video screen.
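A hedged sketch of what that recipe might look like in practice: treat the array of clusters as a very low-resolution framebuffer and patch each cluster to three consecutive DMX channels. The addressing scheme below is invented purely for illustration and doesn't describe any particular product.

    # Hypothetical patch: a grid of RGB clusters mapped onto DMX universes,
    # 170 clusters per universe (the last two slots go unused).
    SLOTS_PER_UNIVERSE = 512
    CHANNELS_PER_CLUSTER = 3                                  # red, green, blue
    PIXELS_PER_UNIVERSE = SLOTS_PER_UNIVERSE // CHANNELS_PER_CLUSTER   # 170

    def patch_address(row, col, width):
        """Return (universe, first_channel) for the cluster at (row, col)."""
        index = row * width + col                             # left to right, top to bottom
        universe = index // PIXELS_PER_UNIVERSE
        channel = (index % PIXELS_PER_UNIVERSE) * CHANNELS_PER_CLUSTER + 1  # 1-based
        return universe, channel

    def render(pixels):
        """Convert a 2-D array of (r, g, b) tuples into per-universe DMX slot data."""
        width = len(pixels[0])
        universes = {}
        for row, line in enumerate(pixels):
            for col, (r, g, b) in enumerate(line):
                universe, channel = patch_address(row, col, width)
                data = universes.setdefault(universe, [0] * SLOTS_PER_UNIVERSE)
                data[channel - 1:channel + 2] = [r, g, b]
        return universes

    # A 16 x 12 array of clusters: 192 "pixels", 576 channels, two universes.
    frame = [[(255, 0, 0)] * 16 for _ in range(12)]
    print(sorted(render(frame).keys()))  # [0, 1]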

Matrix lighting controllers of varying levels of sophistication have been around for decades. They have been used in applications as diverse as electronic signage, sweeping staircases in musicals and those big color chase effects in the thousand-parcan rock 'n' roll concert rigs of the late 1980s. Indeed, many of today's high-end lighting consoles have very sophisticated matrix control capabilities as part of their effects repertoire.

The situation before us today has become possible as the result of the convergence of bright, highly responsive light sources, independent pixel-level control over color and intensity, high-speed video processing systems and affordable control networks that can address thousands of subpixels at something approximating video frame rates. LDs are beginning to do such things as matching the content of a full-definition video screen in the set with low-resolution echoes of the same images that diffuse across an entire stage.
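One plausible way to produce that low-resolution echo--sketched here on the assumption that the source frame is simply a grid of RGB values--is a box average of the full-resolution picture down to the geometry of the LED array.

    # Illustrative downsample: average blocks of a full-resolution frame into
    # one value per LED cluster. The frame format and array size are assumed.
    def downsample(frame, out_w, out_h):
        """Average a frame of (r, g, b) tuples down to out_h rows x out_w columns."""
        in_h, in_w = len(frame), len(frame[0])
        out = []
        for oy in range(out_h):
            row = []
            y0, y1 = oy * in_h // out_h, (oy + 1) * in_h // out_h
            for ox in range(out_w):
                x0, x1 = ox * in_w // out_w, (ox + 1) * in_w // out_w
                n = (y1 - y0) * (x1 - x0)
                sums = [0, 0, 0]
                for y in range(y0, y1):
                    for x in range(x0, x1):
                        for c in range(3):
                            sums[c] += frame[y][x][c]
                row.append(tuple(s // n for s in sums))
            out.append(row)
        return out

    # e.g. a 640 x 480 frame reduced to a 16 x 12 array of LED clusters
    frame = [[(x % 256, y % 256, 128) for x in range(640)] for y in range(480)]
    echo = downsample(frame, 16, 12)
    print(len(echo), len(echo[0]))  # 12 16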

Artistic Licence's Colour-Tramp was probably the first control system designed specifically to turn an LED array into a near real-time, low-resolution video display. During the development of Colour-Tramp, which currently controls up to 20 DMX universes (10,240 channels), Artistic Licence realized that actually getting the channel allocations correct for such a complex system was a significant problem. To this end, they have developed the Visual-Patch system, which uses a video camera, in conjunction with RDM (Remote Device Management, the bi-directional addition to the DMX512-A protocol), to switch on each device in turn and map its address into the video image.
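In outline, that camera-assisted approach might look something like the sketch below: discover each responder over RDM, light it on its own, and note where it shows up in the camera image. The helper functions named here (rdm_discover, identify_on, identify_off, grab_frame) are hypothetical placeholders, not Artistic Licence's actual interface.

    # Hypothetical sketch of camera-assisted patching over RDM.
    def brightest_point(image):
        """Return the (x, y) of the brightest pixel in a 2-D list of luminance values."""
        best, best_xy = -1, (0, 0)
        for y, row in enumerate(image):
            for x, value in enumerate(row):
                if value > best:
                    best, best_xy = value, (x, y)
        return best_xy

    def build_pixel_map(rdm_discover, identify_on, identify_off, grab_frame):
        """Map each discovered fixture's UID to its position in the camera frame."""
        pixel_map = {}
        for uid in rdm_discover():          # enumerate responders on the RDM network
            identify_on(uid)                # drive just this fixture to full
            pixel_map[uid] = brightest_point(grab_frame())
            identify_off(uid)
        return pixel_map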

Such systems will have to remain low resolution for quite a while, as achieving even a standard definition (640 x 480) display, with its nearly 1 million subpixels (307,200 each of red, green and blue), would require a processor pumping out 1,800 universes of DMX512. Besides, I'm just too busy at the moment to come down with my L'il DMXter to find out why that pixel in the stage floor is stuck.