Graphics and effects - TvTechnology

Graphics and effects

A look at the underlying technology of today’s new graphics platforms and the wide array of features and benefits available


Graphics and effects systems have come a long way in the past 20 years. (Left) Shown here is the Chiron II, the first character generator from Chyron, in use at ABC more than 30 years ago. The system occupied an entire rack by itself.

(Right) Shown here are the HyperX, Duet LEX, Duet LE and C-Mix systems from Chyron, installed at WDBJ-TV in Roanoke, VA. The systems all fit into one rack.

The need for speed

The graphics department at a television station used to be staffed by artists skilled in “flat art,” work created with pen, pencil, ink, pastel, oil or other traditional media. Commercial artists in general were highly trained in the graphic arts and schooled in the special needs of commercial work. Portfolios traveled in carriers sized for large-format art. It was an unhurried craft, yet every television station needed a skilled person who could produce broadcast art quickly. “Quickly,” however, meant hours rather than minutes, so a broadcast’s graphic content had to be established early in the day to have any chance of getting the finished art in time to shoot it with a camera, whether live to air or stored on tape or a still store.

The time pressure of broadcasting, like that of newspapers, magazines and other modern media, created a need that software stepped up to fill admirably. Beginning in the 1980s, computers gained enough horsepower to run sophisticated, graphically intensive software. Special-purpose systems were expensive and required long periods of training to make the transition from “flat art” to computer art, but the results were astonishing. Often the art created electronically for television was far more sophisticated in its manipulation of existing media than anything that could be done by conventional means.

Over the last two decades, there has been a process of “democratization” in technology, and nowhere is it more obvious than in graphics and effects systems. Electronic touch-up of images has reached a level of sophistication that allows incredible productivity and creativity even as the cost of hardware and software has plummeted. Photoshop and other software packages are now ubiquitous, used for marketing communications, media production, photo retouching and even home use. Barely a decade ago, the scale of computer needed to be effective in the graphic arts (for television or other purposes) put it in the realm of specialty hardware. Today, mail-order PCs have more than enough capability, and consumers retouch their own digital photos as a matter of course rather than paying professionals for expensive assistance (sometimes with predictable results, of course).

A new generation of commercial artists has been trained. These artists have no less skill than their counterparts from the 1970s, but they are just as comfortable with a mouse and tablet as with a pen, pencil and brush. They bring skills that were unavailable to their predecessors, many of whom have nonetheless made the transition to technology quite effectively. The parallel to film editors is striking: in both cases, technology that was initially restrictive of the creative arts has evolved into powerful tools, freeing creative professionals to do things they only dreamed would be possible.


Today’s systems are more compact. Shown here is the Duet HyperX CG from Chyron, installed at KBS-TV in Korea. The system occupies only 4RU.

From a technical standpoint, this transition has been largely about educating computer software and hardware professionals in technologies for which they initially had little appreciation. Color science has been developing for roughly 100 years; computer graphics is barely 20 years old. Translating the science of color reproduction into software has been an iterative process extending over a long time. Unfortunately, many early graphics systems did not take colorimetry into account, or at least not to a sufficient degree. A second problem was that graphics output cards were often adapted from designs intended simply to feed a CRT, with poor results for broadcast use.

Today, a number of hardware solutions are available with SMPTE 259M SD and SMPTE 292M HD outputs, including the requisite reference inputs. Software, too, has become much more sophisticated in dealing with the issue, and often offers a choice between ITU-R Rec. 709 colorimetry and SMPTE 240M. Obvious caution should be exercised if you need to create a graphic that must serve both print and broadcast purposes; the file will have to be output specifically for the intended medium.
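The practical difference between the two colorimetry standards shows up in, for example, the luma coefficients used when deriving Y' from R'G'B'. The coefficient values below come from the published standards; the helper function itself is just an illustrative sketch, not code from any particular graphics product:

```python
# Luma (Y') coefficients defined by the two standards mentioned above.
# Rec. 709 is the current HD colorimetry; SMPTE 240M was an earlier HD
# standard with slightly different weights, so the same RGB pixel
# produces a different luma value under each.
REC_709 = (0.2126, 0.7152, 0.0722)
SMPTE_240M = (0.212, 0.701, 0.087)

def luma(r, g, b, coeffs):
    """Weighted sum of gamma-corrected R'G'B' values (0.0-1.0) giving Y'."""
    kr, kg, kb = coeffs
    return kr * r + kg * g + kb * b

# A saturated green pixel comes out noticeably different under each standard.
green_709 = luma(0.0, 1.0, 0.0, REC_709)      # 0.7152
green_240m = luma(0.0, 1.0, 0.0, SMPTE_240M)  # 0.701
```

A graphic encoded with one set of coefficients and decoded with the other will show subtle color shifts, which is why it matters that broadcast software now lets the operator choose.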

More power

Effects for broadcast use began at about the same time that serious character generators and computer graphics systems were first available. However, early effects systems needed much more horsepower than early desktop computers could supply, and often the most effective systems used mainframes, in at least one case on the campus of a major university. It didn't take long for that to evolve, and as early as the late 1980s, PC effects software was reasonably credible.

Today, rendering farms of microprocessor computers not unlike high-end desktop machines create motion picture scenes in very high resolution using software capable of modeling even human and animal movements with remarkable results. Simply flying through a logo? Any desktop can do it with off-the-shelf shrink-wrapped software packages.

One might logically ask where this is all going. Clearly, the power of desktop computers continues to increase, roughly following Moore's Law. Microsoft has issued a release of Windows (Windows XP Professional x64 Edition) with 64-bit addressing and support for hyperthreaded processors, and graphics- and effects-intensive applications are among the intended targets of these advanced products. One audio mixing application claims a speed gain of 30 percent simply from being compiled for the new operating system. Addressable memory increases from 4GB to 128GB, allowing much more complicated modeling without resorting to virtual memory. It is entirely logical that applications that can take advantage of hyperthreading and the larger memory space will become both more powerful and faster.
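The memory argument is easy to make concrete. As a back-of-envelope sketch, consider how many uncompressed HD frames fit within each memory ceiling; the frame format and channel depth here are illustrative assumptions, not figures from the article:

```python
# How many uncompressed HD frames fit in RAM under the 32-bit 4GB ceiling
# versus XP x64's 128GB? Assumes 1920x1080 RGBA frames held as 32-bit
# floats, a common working format for compositing and effects.
width, height = 1920, 1080
bytes_per_pixel = 4 * 4                          # 4 channels x 4 bytes each
frame_bytes = width * height * bytes_per_pixel   # ~33MB per frame

frames_in_4gb = (4 * 2**30) // frame_bytes       # 32-bit address-space limit
frames_in_128gb = (128 * 2**30) // frame_bytes   # XP x64 physical-memory limit

print(frames_in_4gb, frames_in_128gb)  # 129 vs 4142 frames
```

At 30fps, that is the difference between roughly four seconds of frames held in memory and well over two minutes, which is why effects work benefits disproportionately from the larger address space.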

It is interesting to note that for the last 15 years or so, graphics professionals have often preferred Apple computers because of their better support for graphics applications. Apple's recent announcement that it will switch to x86 platforms adds momentum to the Intel architecture, though Apple still plans to couple its hardware tightly to its fine OS X operating system. With more capable processors, one might logically expect even more compelling reasons for keeping graphics professionals on Mac platforms.

So we are back to the beginning. The key to graphics has always been professional training, using the available tools to promote creativity and efficiency for the business. Recent graduates of art schools, universities and trade schools are given the best technology along with the best training in art, which still yields superior employees: creative people working in a process that remains time-bound.

John Luff is senior vice president of business development for AZCAR.