It's not hard to monitor video. What could be simpler? Plug a BNC video cable and RCA audio cable into a monitor, and — glory be — it works! There was a time when it was that easy, in the era when you could predict with concrete certainty that video came in two flavors: NTSC and PAL, with an occasional dash of SECAM, and audio was analog. Steadily over the last 30 years, the world of analog certainty has become the fuzzy world of digital confusion.
First, video became analog components (several subtly different flavors), and then the signals became digital. In 1978, we saw HDTV for the first time in North America at a SMPTE conference at the St. Francis Hotel in San Francisco. Francis Ford Coppola was there examining NHK's research developments, and I peered over his shoulder at what would become the future of this industry and a major force in my career. At the time, it was made up of analog components, but we figured out how to first digitize and then compress the content to fit smaller pipelines.
The point is that the proliferation of signal formats and carriers (digital and analog) has steadily made monitoring content more complicated. Oh, and don't forget the interconnection and scanning standards developed for the computing industry. That certainly does not make it any easier, with VGA, SVGA, XGA and a host of alphabet soup connection standards. If you really want to be confused, add compressed formats, which conveniently, or inconveniently, often use the same BNC for interconnection.
Scaling content to the screen
As the industry has strayed further from the simplicity of analog on BNC, it has had the good fortune that hardware and software were developed to enable both professional video hardware and CE solutions to monitor multiple scanning standards. For the first couple of decades of the modern desktop computer era, putting a format on-screen was a matter of scanning the beam faster or slower and adjusting the deflection hardware until the geometry worked out. When displays became digital, a new product was needed (the video scaler), which fits the content to the screen instead of the screen to the content. Though a simple concept, this has a profound impact on monitoring today.
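What a scaler does conceptually is easy to sketch. The following is a minimal illustration, not any product's actual algorithm: fit a source raster into a fixed display raster, preserving aspect ratio by centering and leaving letterbox or pillarbox bars. The resolutions are illustrative (720 × 540 is the square-pixel equivalent of a 4:3 SD frame).

```python
# Minimal sketch of what a video scaler does: fit source content to a
# fixed display raster without distorting its aspect ratio.
# Illustrative only; real scalers also filter, deinterlace, etc.

def fit_to_screen(src_w, src_h, dst_w, dst_h):
    """Return the scaled size and offsets that center the source
    on the destination raster, preserving aspect ratio."""
    scale = min(dst_w / src_w, dst_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    x_off = (dst_w - out_w) // 2   # pillarbox bars if > 0
    y_off = (dst_h - out_h) // 2   # letterbox bars if > 0
    return out_w, out_h, x_off, y_off

# A 4:3 SD frame mapped onto a 1920x1080 panel gets pillarboxed:
print(fit_to_screen(720, 540, 1920, 1080))  # (1440, 1080, 240, 0)
```

The same arithmetic, run with a window's dimensions as the destination instead of the whole panel, is what makes the multi-image processors discussed below possible.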
Some monitors had scalers built-in by the late 1980s, allowing any single signal to be adapted to the technology of the display. About a decade ago, products emerged that took multiple signals and scaled each to fit in a window on the screen. The first such device I saw, perhaps 20 years ago, was designed and manufactured in Europe; I remember the price being astronomical and the quality poor. But monitors above 20in were called projectors then.
By about the middle of the last decade, systems leveraged scalers designed largely for CE applications to map flexible windows onto a single output, at higher total resolution than was common for the day. Feeding the output of such a processor to a flat-panel display (plasma at first, later LCD) permitted content to be displayed at nearly the same apparent resolution as a small monitor in the ubiquitous monitor wall, which was often a 9in diagonal screen.
Think about the resolution for a moment. Using the rule of thumb for 525-line signals, optimum viewing distance is six to eight times the picture height. A 9in 4:3 monitor has a picture about 5.4in tall, so that works out to roughly 32in to 43in. Any closer and you see scan lines; any farther away and you miss some of the potential screen resolution. Across a 42in monitor, you can fit about five virtual screens of this size; using rack-mounted monitors, you would get no more than four. Using multi-image processing, more monitors can be displayed in the same surface area.
Let's not confuse this with displayed resolution, however, because fully displaying the native resolution of a 720 × 480 picture in each of five virtual 9in monitors would require a display approaching 4K across. The key is that unless you are about 30in or closer to the 42in display, you don't perceive the loss of real resolution, assuming perfect image scaling.
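The arithmetic above is worth checking for yourself. Here is a back-of-the-envelope sketch: the 4:3 geometry, the six-to-eight-picture-heights rule and the 42in 16:9 panel come from the discussion above; everything else is simple trigonometry on the screen diagonal.

```python
# Back-of-the-envelope check of monitor-wall geometry.
import math

def picture_dims(diag, aspect_w=4, aspect_h=3):
    """Width and height of a screen from its diagonal and aspect ratio."""
    h = diag * aspect_h / math.hypot(aspect_w, aspect_h)
    return h * aspect_w / aspect_h, h

w9, h9 = picture_dims(9)              # 9in 4:3 monitor: 7.2in x 5.4in
print(round(6 * h9), round(8 * h9))   # viewing distance range: 32 43

w42, h42 = picture_dims(42, 16, 9)    # 42in 16:9 panel: ~36.6in wide
print(int(w42 // w9))                 # virtual 9in monitors across: 5

# Native 720-pixel-wide SD in each of the five virtual monitors:
print(5 * 720)                        # 3600 pixels, approaching 4K
```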
This convenient fact allows us to sit back at a comfortable viewing distance and see good approximations of what a wall of CRT monitors would show; it has revolutionized monitoring. I built the first all-flat-panel monitor wall, with multiple 16in LCD monitors that had essentially the same resolution as 720p HD signals, for an ABC “Monday Night Football” production truck in 1999. It was a technical success, meaning the goal of reduced weight and power consumption compared with CRT options was achieved. It would be silly to build it that way today, because about 10 to 12 flat screens would allow the same results with far more flexibility and less wasted space between monitors. And maybe my hair would be less gray as well.
The importance of multiple inputs
From the view of a system designer, it is important to build systems that allow multiple inputs to be mapped to more than one output, with internal routing of signals to allow duplication of inputs. The ability to scale images to arbitrary virtual monitor sizes creates flexible composite displays that give production professionals tremendous monitor wall layout options. Systems can offer internally generated clocks, map external tallies to the virtual monitors, display audio from discrete and embedded sources, accept HD and SD sources from multiple standards, decode ratings and closed captions, and accept background images to fill the screen. These capabilities make a virtual monitor wall much more powerful than the CRT approaches, which were the sole option a few years ago.
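One way to picture such a system's configuration is as a list of windows, each binding an input to a position on an output, with the same input free to appear on several outputs at once. The sketch below is hypothetical; the names and fields are illustrative and do not reflect any vendor's API.

```python
# Hypothetical sketch of a multi-image processor's layout table: each
# window routes one source to a rectangle on one output head, and a
# source may be duplicated across windows. Illustrative names only.
from dataclasses import dataclass

@dataclass
class Window:
    source: str          # input feed routed to this virtual monitor
    output: str          # which display head the window is drawn on
    x: int               # top-left corner on that output's raster
    y: int
    w: int               # window size the scaler must fit content into
    h: int
    tally: bool = False  # external tally state mapped to this window

layout = [
    Window("CAM 1", "wall-left", 0, 0, 720, 405),
    Window("CAM 1", "wall-right", 0, 0, 360, 203),  # duplicated input
    Window("PGM", "wall-left", 720, 0, 1200, 675, tally=True),
]

# Internal routing reduces to the set of (source, output) pairs in use:
routes = {(win.source, win.output) for win in layout}
print(sorted(routes))
```

The point of the duplicated "CAM 1" entry is the internal routing mentioned above: one physical input feeding windows on two different outputs.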
It doesn't stop there; many of these capabilities are starting to show up in stadium-size display processors. Don't be surprised if the largely static mapping of inputs onto an output plane becomes dynamic over time, allowing complex productions to be created in part by manipulating the canvas that a complicated multi-image processor addresses.
John Luff is a broadcast technology consultant.
Send questions and comments to: firstname.lastname@example.org