Trends in broadcast graphics
Three trends are changing the look of broadcast graphics for weather, sports and news coverage: more data is available than ever, the use of live data has escalated sharply, and virtual sets are enjoying a miniboom. Graphical compute power has increased dramatically, allowing for the creation of extremely realistic graphic representations of data and the on-air delivery of a virtually unlimited number of real-time data streams.
More data – and more precise, cheaper data – is available than ever before. Two or three years ago, weather maps were accurate to within a kilometer; today, commercially available terrain data accurate to within three meters can be acquired for about the same price. In sports, the fly-throughs used for the 2002 Winter Olympic Games were so realistic that most viewers probably assumed they were live. They were, in fact, data reconstructed from satellite and other photos by Harris’ RealSite software running on an SGI Onyx 3200 system.
The growing use of weather and terrain data is based on the increased ability to do complex texture mapping. First, all the polygons that compose the 3-D mountain range, ocean or building are created. This number can vary from several hundred thousand polygons to several million per scene. Then the geospatial texture data, which can amount to several gigabytes, is mapped onto that geometry. This process has become more affordable because of the rapid increase of texture memory in high-performance computing.
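The memory arithmetic behind that process is simple to sketch. The following illustrative Python snippet (the function names and figures are this sketch's own, not any vendor's) estimates whether an uncompressed geospatial texture fits in hardware texture memory:

```python
# Illustrative sketch: will a raw RGBA geospatial texture fit in
# hardware texture memory? Figures are examples, not vendor specs.

def texture_bytes(width_px, height_px, bytes_per_texel=4):
    """Raw size of an uncompressed RGBA texture in bytes."""
    return width_px * height_px * bytes_per_texel

def fits_in_texture_memory(width_px, height_px, memory_bytes):
    return texture_bytes(width_px, height_px) <= memory_bytes

# A 16,384 x 16,384 RGBA terrain image is a full gigabyte of texels --
# too large for a 128-MByte card, which is why techniques that page
# texture in and out of memory matter.
print(texture_bytes(16384, 16384))                            # 1073741824
print(fits_in_texture_memory(16384, 16384, 128 * 1024 ** 2))  # False
```

At four bytes per texel, a single high-resolution terrain image can outgrow even a server-class texture memory, which is what drives the software techniques described next.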
As well, software techniques like clip-mapping and image-based rendering can create “virtual” texture space in addition to the hardware memory. Three years ago a PC graphics card had a couple of megabytes of texture memory, the Silicon Graphics Octane workstation had 16 MBytes of texture memory and the Silicon Graphics Onyx graphics server had 64 MBytes of texture memory. Today, texture memory on the typical PC card is 32 MBytes, compared with 128 MBytes on an Octane2 system and 1 GByte on an Onyx system.
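The core idea of clip-mapping can be sketched in a few lines: keep only a fixed-size window of the huge texture resident, centered on the viewpoint, and page the rest in from disk as the camera moves. This Python sketch uses invented tile and parameter names and is not any vendor's API:

```python
# Minimal sketch of the clip-mapping idea: only a fixed "clip region"
# of the full texture stays resident, centered on the viewpoint.
# All names and sizes here are hypothetical, for illustration only.

def resident_tiles(center_x, center_y, clip_size, tile_size, level_tiles):
    """Return the set of (tx, ty) tile coordinates kept in memory.

    center_x/center_y: viewpoint position in texels.
    clip_size: resident window, in texels, around the viewpoint.
    tile_size: texels per square tile on disk.
    level_tiles: tiles per side at this mipmap level.
    """
    half = clip_size // 2
    first_x = max(0, (center_x - half) // tile_size)
    last_x = min(level_tiles - 1, (center_x + half) // tile_size)
    first_y = max(0, (center_y - half) // tile_size)
    last_y = min(level_tiles - 1, (center_y + half) // tile_size)
    return {(tx, ty)
            for tx in range(first_x, last_x + 1)
            for ty in range(first_y, last_y + 1)}

# Only the tiles around the camera stay in texture memory; the rest of
# the multi-gigabyte image stays on disk until the viewpoint moves.
tiles = resident_tiles(center_x=5000, center_y=5000,
                       clip_size=2048, tile_size=512, level_tiles=32)
print(len(tiles))  # 25
```

The design point is that the resident set grows with the clip window, not with the source image, so a gigabyte-scale terrain texture can be flown through on hardware with far less physical texture memory.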
Geospatial imaging revolves around the computational size of the data, but news data sources – the tickers you see everywhere – are streams of data served from a variety of sources. For instance, the news services and stock exchanges provide live data feeds, and content aggregators can provide continuously updating sports scores.
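A ticker built on such a feed is essentially a formatter over a stream of records. This hedged Python sketch invents a simple comma-delimited score format for illustration; real providers each have their own wire formats:

```python
# Hypothetical sketch of a data-driven sports ticker. The feed format
# ("AWAY,score,HOME,score" per record) is invented for illustration.

def format_scores(feed_lines):
    """Turn raw score records into a single crawl string for air."""
    items = []
    for line in feed_lines:
        away, away_pts, home, home_pts = line.strip().split(",")
        items.append(f"{away} {away_pts} @ {home} {home_pts}")
    return "  |  ".join(items)

feed = ["DAL,21,NYG,14", "GB,7,CHI,10"]
print(format_scores(feed))  # DAL 21 @ NYG 14  |  GB 7 @ CHI 10
```

In a live system the feed lines would arrive continuously from the aggregator and the crawl string would be re-rendered each time a record updates; the formatting step itself stays this simple.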
Increased use of live data
Where is all that extra data being used? Virtually any station or cable channel that’s involved in information programming – sports, news or weather – has at least one ticker, and maybe several.
Years ago, a ticker was used primarily for severe weather alerts, and information had to be typed in manually by the Chyron operator.
Today, that process is completely automated. The wealth of available forecasting data is beginning to be used for microforecasts and to direct television viewers to the station’s Web site, which runs on the same processing and serving hardware.
The day can’t be far off when micronews, similar to local traffic reports, will be delivered the same way. Once the weather department has paid for the 3-m- or 1-m-resolution data, why not turn around and use it in a news story? If police are chasing a bank robber, the station could broadcast a fly-through of the buildings and streets to make the news more compelling.
Sports coverage is another example of increased use of live data. In the old days, if sports producers wanted to show the television audience a clock, they would focus a camera on the clock on the scoreboard. Today, the digital output of the timer that drives the clock becomes a graphic.
The Olympic biathlon is another example. In addition to the clock that counts down the athletes’ skiing time, a graphic now represents the number of shooting targets the athletes have hit, as they hit them. Lock-down cameras and radar guns transmit the data to workstations and servers that instantly deliver the information as a graphical representation on the home television screen.
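The scoreboard-clock example above reduces to formatting a timer's raw digital output for air. This small Python sketch assumes the timer emits elapsed milliseconds; the field layout is an assumption for illustration:

```python
# Illustrative sketch: turn a timer's raw digital output (assumed here
# to be elapsed milliseconds) into an on-air clock string.

def clock_text(elapsed_ms):
    """Format milliseconds as M:SS.t for a broadcast clock graphic."""
    minutes, rem = divmod(elapsed_ms, 60_000)
    seconds, tenths_ms = divmod(rem, 1000)
    return f"{minutes}:{seconds:02d}.{tenths_ms // 100}"

print(clock_text(754_300))  # 12:34.3
```

Because the graphic is driven by the timer's data rather than a camera pointed at the scoreboard, it stays legible at any screen size and can be restyled to match the broadcast's look.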
Sportvision is the current leader in live sports data; Orad and Princeton Video Images (PVI) are also major providers. Sportvision’s RACEf/x software, branded as FoxTrax for the NASCAR on FOX 2001 telecasts, is but one example of technology driving increased use of live data in sports.
More and more broadcasters are looking for ways to broadcast to niche audiences because the competition is so fierce. We know that CNN delivers national and international news much better than a local television station. What’s left for local stations to differentiate themselves with is local news. Even with the trend toward centralcasting (where there would be one operations center for, say, 37 affiliate stations), all this realistic, accurate, available data, combined with a virtual set, allows the local station to affordably present local news in dynamic ways.
Virtual sets from leading software companies vizrt, Orad, Brainstorm and Discreet are enjoying a miniboom across the United States at local broadcast stations and cable channels, and around the world. For production and live coverage of the 2002 Winter Olympic Games, Wige Data, one of the world’s leading providers of sports data processing and result services, delivered vizrt’s viz [virtual studio] software package powered by Silicon Graphics Onyx2 systems to German public television broadcasters ARD and ZDF. Covering all of Germany, the stations used the virtual sets to air interviews and on-site commentary.
Because much more texture memory is available on high-performance computers, virtual sets can now be far more realistic. All that extra memory can be used to create the carpeting, the desk or, in the case of the New York Stock Exchange, a banister on a balcony that doesn’t exist.
With the increased use of ever more realistic virtual sets, the vocabulary of “on location” could go away. Correspondents will still be reporting “live,” but the location of the “live” reporter will be in a studio, using a texture-mapped, photo-realistic representation created for the occasion or served from the growing archives of terrain data and local 3-D fly-throughs.
A few years ago the vice president of news at Fox made an insightful comment. He pointed out that viewers have been trained by their remote controls to surf. In the five seconds that viewers switch to your channel, you can get them to stop surfing only by having something compelling to see. That “something” is a graphical element – the more realistic, the better.
The increased quantity and quality of data and the increased streams of data are all converging. The technology, certainly, is here. The trend, and the challenge, is to use that data to present a much more immersive experience to the television viewer to prevent him or her from pressing those up and down buttons.
Shawn Underwood is the director of product management, Visual Computing, at SGI.