David Austerberry /
06.01.2010 12:00 PM
New technology helps directors meet the demands of sports
Sporting events draw a big investment in new technologies like 3-D.

While TV audiences may be migrating to online and mobile entertainment, one area where television excels is the live event, specifically sports. One of the genres where 3-D is expected to be most successful is sports. Sporting events also attract large revenues for subscription television. Put all that together, and it is easy to see why sports receive a big investment and are a focus for innovation in production techniques.

Many programs go through post production as an essential part of the workflow: selecting shots, adding VFX, color correction and finishing. The sports director has no such luxury to refine the program. He or she can select from the camera angles available on the monitor stack, use on-the-fly switcher effects like DVE moves and run replays of key action from video servers.

This places a special emphasis on what the camera delivers to the truck. To this end, point-of-view (POV) cameras give sports fans a perspective on the game that they could never see from a seat in a stadium or standing by the green.

Viewers expect unusual views of their favorite sport. They expect the tactics to be explained with graphic aids like telestrators, and they want a constant stream of statistics. Finally, they want the latest format: HD, 5.1 and even 3-D.

To meet these needs, the sports production must use custom cameras and camera support systems, logging systems, and specialized graphics systems. International events present especially complex demands, as the various broadcasters taking the feeds have different requirements, from 3-D down to a few SD highlights for use in a newscast.

For a large international event, the issue with video is to provide each broadcaster with what it wants, and there are conflicting requirements, such as whether the broadcaster's home team is playing, whether the broadcaster wants a clean feed or if it needs the house graphics. On the audio side, the complexity comes from providing facilities for the different commentary that each broadcaster will want to add. Even a national event may have to deal with some of these issues if more than one language is spoken in that country.

Acquisition

To present the director with the optimum choice, the acquisition of video and audio is key. Sharp, well-exposed and shaded pictures with great composition go a long way to easing the task of the director, and technology can help. For the audio, careful selection of microphone type, polar diagram and surround configuration, plus good placement, can all add up to create an exciting sound design.

Much of sports coverage cannot be achieved with fixed cameras, and a wireless link frees the camera of all tethers.

Wireless

Wireless cameras have been around for a long time, but the technology has been steadily improving. Cameras and links are expected to support HD and to reliably backhaul signals from fast-moving vehicles and helicopters.

Deep interleaving is a newer technique that helps the link ride through brief losses of signal. These dropouts could occur from ground to air, when the transmitter is temporarily obscured by bridges or other obstacles, or from air to ground, where obstacles like trees may obscure the helicopter.

Sports that take place over a large area, such as marathons and motor racing, have in the past been difficult to cover wirelessly without setting up relays. Borrowing concepts from the phone industry, diversity reception can be extended to cellular diversity: receive sites are arranged as cells, and the camera can move freely between them.

Recent developments in modulation use COFDM, but with parameters modified from regular DVB-T to provide optimum performance for contribution circuits (DVB-T being optimized for distribution), much as DVB-T2 improves performance over the original COFDM systems. Typically, 19Mb/s can be delivered over a wireless link. To achieve the low latency needed in sports coverage, an I-frame-only video codec is necessary.
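To put the link figures above in context, here is a short sketch of the per-frame bit budget at 19Mb/s and of why I-frame-only coding keeps the coding delay low. The 12-frame GOP used for comparison is an illustrative assumption, not a figure from the article.

```python
# Bit budget per frame at the quoted 19 Mb/s link rate, and the coding
# delay of I-frame-only vs. a long-GOP codec that must buffer frames
# for inter-frame prediction (GOP length here is an assumption).
LINK_MBPS = 19
FPS = 25

bits_per_frame = LINK_MBPS * 1_000_000 / FPS       # 760 kbit per frame
iframe_delay_frames = 1                            # each frame coded and sent alone
long_gop_delay_frames = 12                         # e.g. buffering a 12-frame GOP

print(f"per-frame budget: {bits_per_frame / 1000:.0f} kbit")
print(f"I-frame-only coding delay: {iframe_delay_frames / FPS * 1000:.0f} ms")
print(f"long-GOP coding delay: {long_gop_delay_frames / FPS * 1000:.0f} ms")
```

At 25fps, coding each frame independently adds only one frame (40ms) of delay, while buffering a GOP adds roughly half a second, which is why I-frame-only codecs suit live contribution links.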

Cameras, auto-focus and image stabilization

The move to HD displayed on large screens means that focus errors, camera movement and vibration are more obvious. Many sports applications need a long focal length to achieve closeups from remote camera positions. A long focal length means a small depth of field, making focus adjustment critical. In a football stadium, a camera could be 100m from the penalty box, and a focal length of 900mm would be needed to get a head and torso shot of a player. This is an angle of view of less than one degree, so it takes very little camera motion to move the image by a pixel.
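A quick check of these figures, assuming a 2/3in broadcast sensor of about 9.6mm horizontal width (the sensor size is an assumption, not stated above):

```python
import math

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view of a rectilinear lens, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# 900 mm lens on an assumed 9.6 mm-wide sensor: about 0.6 degrees.
aov = angle_of_view_deg(9.6, 900)

# Field width covered at 100 m -- roughly 1 m, i.e. a head-and-torso shot.
subject_width_m = 2 * 100 * math.tan(math.radians(aov / 2))

print(f"angle of view: {aov:.2f} deg, field width at 100 m: {subject_width_m:.2f} m")
```

The numbers line up with the text: an angle of view just over half a degree frames about a metre of subject at 100m.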

Many phenomena can cause visible movement, such as wind on a scaffolding platform, but even the most solid stadium can move when the crowd gets excited. The camera platform in a stadium is typically high up on the roof, and on maximum focal length, vibration can be easily visible to the viewer.

Auto-focus is a standard feature of consumer camcorders. Some more experienced operators may feel auto-focus is not for them, but maintaining sharp focus on long-focal-length shots of moving subjects would test even the most experienced operator. Here technology can help.

Image stabilization is also commonplace in consumer camcorders, and for aerial shots, large gyro-stabilized mounts can remove the effects of helicopter motion. To apply the concepts to large box lenses, different design concepts are called for, but several field lenses are now available with image stabilization.


This has made image stabilizers de rigueur in some sports coverage. However, vibrating platforms are not the only movement that can impair the potential sharpness of an HD image.

One technique is to detect shake causing yaw and pitch, and to shift a group of lenses within the zoom assembly in the vertical and horizontal axes. The shifted lens group corrects the path of rays through the lens and removes the effects of the lens vibration.
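To illustrate the scale of the disturbance the shifting lens group must cancel, here is a rough calculation; the vibration angle and pixel pitch are illustrative assumptions, not figures from the article.

```python
import math

def image_shift_um(focal_length_mm, vibration_deg):
    """Image displacement from an angular disturbance: roughly f * tan(theta)."""
    return focal_length_mm * math.tan(math.radians(vibration_deg)) * 1000

# At 900 mm focal length, even an assumed 0.005 deg of platform vibration
# moves the image about 79 um -- many pixels on a ~5 um-pitch HD sensor.
shift = image_shift_um(900, 0.005)
print(f"image shift: {shift:.0f} um")
```

Because the displacement scales with focal length, the same tiny angular jitter that is invisible on a wide shot becomes gross image movement at 900mm, which is why box lenses need their own stabilization designs.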

POV cameras mounted on vehicles or carried by the operator are subject to a different type of motion. This is at a different frequency range from the vibration of a stadium and calls for a different type of correction.

POV

One area that television excels at is the POV camera. These can give fans the view from inside a goal, from a cricket stump or from the bottom of a diving pool.

POV systems use miniature cameras with small lenses, fitted into a housing appropriate to the sport. The technology has advanced as subminiature HD cameras have become available. Applications include static cameras and remotely positioned cameras with pan-and-tilt control.

A typical POV application is a close-up view from inside a football or ice hockey goal. The camera must be small and resistant to a direct hit from a hockey puck. During the World Cup, such cameras will provide unobscured pictures of goals from within the net, using a miniature camera that captures a 125-degree horizontal field of view without curvature distortion. This allows the entire goal mouth to be televised from the rear of the net, and a motorized mount returns the camera to its normal position if it is struck by an oncoming ball.

Slow-mo and hyper-motion cameras

Most sports coverage now finds slow-motion replay essential. There are three ways to implement it. The first is to use regular cameras, record to a video server and interpolate frames on playback. This is a low-cost solution, but high-quality slow motion needs a high-speed camera: either a three-phase camera for 3X recording or an ultra-motion camera for much higher speeds.

Slow-motion cameras are based on regular television cameras but can shoot at two or three times the normal frame rate and output a three-phase HD-SDI signal, with each phase being at 25fps or 30fps.

Hyper- or ultra-motion cameras are more often derived from high-speed cameras designed for applications like crash analysis and weapons testing. Early cameras offered VGA or SVGA resolution, but the latest versions use full 1920 × 1080 sensors better suited to broadcast operations. The camera records to local memory (which typically stores around 6000 frames) at 10X or 20X speed. Once an event has been captured, the recording is transferred as regular HD-SDI at 25fps/30fps. This can be recorded to a video server and then played out frame by frame as required.
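The memory and speed figures above imply how much real time one ultra-motion burst can cover; a quick sketch using the quoted numbers at a 25fps base rate:

```python
# Timing implied by ~6000 frames of local memory, shot at 10x or 20x
# the 25 fps base rate, then played out as regular 25 fps HD-SDI.
BASE_FPS = 25
MEMORY_FRAMES = 6000

for speed_factor in (10, 20):
    capture_fps = BASE_FPS * speed_factor        # 250 or 500 fps at the sensor
    real_time_s = MEMORY_FRAMES / capture_fps    # action covered by one burst
    playback_s = MEMORY_FRAMES / BASE_FPS        # time to replay the full burst
    print(f"{speed_factor}x: {real_time_s:.0f} s of action, "
          f"{playback_s:.0f} s played at normal rate")
```

So one burst covers only 12 to 24 seconds of play, which is why the operator must trigger capture around a key event rather than record continuously.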

The World Cup coverage is a good example of the current camera techniques. From stereo to POV, many innovative techniques will help bring fans to the edges of their seats. Alongside the regular field cameras, there will be a cable camera suspended over the pitch; jibs to provide mobile shots of play, the bench and prematch interviews; and POV cameras covering the goal. All of these provide the director with the widest range of views from strategic shots to closeups of the action, both live and as replays.

Audio

Audio plays a vital part in any sports production. Spectator noise helps convey the emotion of the crowd, shotgun mics pick up the sound of play, such as bat against ball or foot against ball, and the commentator provides the description. An HD production can also carry surround sound, and there are many techniques for acquiring it, from sound-field (coincident) microphones to double mid-side (M/S) arrays.

The 2010 World Cup is using both the sound field (coincident) and the ORTF technique (spaced array). The ORTF array is a development of the IRT cross, but rectangular rather than square. (See Figure 1.) Using four cardioid capsules, the array gives good 360-degree imaging with the optimum spatial separation.

Switching, 3-D and mobile

Most events in a stadium need at least two mixes: one for broadcast and one for the stadium displays. The broadcast mix could also be needed in two versions: with house graphics and clean.

The demands of different delivery formats mean that different mixes could also be required for broadcast. Although 4:3 and 16:9 can be handled by a protected center zone of the screen, important events may also be covered in 3-D, which currently needs a separate mix of the 3-D cameras.
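The protected center zone mentioned above can be computed directly. A minimal sketch for a 1920 × 1080 raster (the function name is illustrative):

```python
def center_cut_4x3(width, height):
    """Width and x-offset of the centered 4:3 zone of a 16:9 raster.

    Framing kept inside this zone survives a center-cut conversion
    to 4:3 SD without losing important action.
    """
    safe_width = height * 4 // 3
    x_offset = (width - safe_width) // 2
    return safe_width, x_offset

w, x = center_cut_4x3(1920, 1080)
print(f"4:3 protected zone: {w} px wide, starting at x={x}")
```

For 1920 × 1080 this gives a 1440-pixel-wide zone with 240-pixel side margins, which is why directors keep key action away from the outer edges of the 16:9 frame.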

3-D needs both different cameras and camera positions. The switching style must also be different, as cuts between sources must take into account the Z-axis to avoid unpleasant parallax clashes for the viewers. Current practice does not lend itself to deriving a 2-D feed from a stereo mix because of the different composition, switching and framing.

3-D also needs different graphics treatment. 2-D graphics can be keyed as a layer over the action. In 3-D, the Z-axis of the action and the graphics must be taken into account; plus the graphics can be true 3-D, meaning they can have a Z-axis component.

Mobile-specific feeds do not have all the issues of 3-D, but they still have a different requirement from the HD (protected 4:3) mix. Composition should be tighter, and graphics should be simpler and larger relative to screen dimensions to make them more readable on small screens.

With 3-D and mobile being subscription content, it is easier to make the business case for the additional cost of operating separate switching chains.

Logging

The logging workstations that allow operators to tag clips with keywords are essential to the director's toolset. They are one of the many applications that have been developed to enable the coverage that fans come to expect. Using the keyword metadata on the timeline, editors can quickly access material to pull together highlights packages.

There are many ways to implement logging, but touch screens or shot boxes are popular. The logger can tag players' names, or elements of the play like “goal” or “penalty,” against the clip timeline. Without logging, the process of highlight creation could not keep up with the pace of play.
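A minimal sketch of the idea, using hypothetical clip and tag structures; the article does not describe any particular logging system:

```python
from dataclasses import dataclass

@dataclass
class LogEntry:
    """One keyword tagged against the live-record timeline (hypothetical)."""
    timecode_s: float   # offset into the live record, in seconds
    keyword: str        # e.g. "goal", "penalty", or a player name

# Entries a logger might tap in during play (illustrative data).
log = [
    LogEntry(754.0, "penalty"),
    LogEntry(1312.5, "goal"),
    LogEntry(1312.5, "Player 9"),
]

def find(entries, keyword):
    """Return timecodes tagged with a keyword, for fast highlight pulls."""
    return [e.timecode_s for e in entries if e.keyword == keyword]

print(find(log, "goal"))
```

The point of the metadata is exactly this lookup: instead of scrubbing hours of material, the editor jumps straight to every tagged timecode when cutting the highlights package.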

Summary

Broadcast sports technology is in a state of constant evolution to provide fans with more compelling coverage. Some would say that the only advantage of being at the stadium is the atmosphere, as television can provide so many angles on the play, along with all the analysis from the experts.

Some of the innovation comes from small companies that build custom equipment like camera supports. Other innovation is driven by the major vendors, which are looking to stereoscopic 3-D to open new markets, from CE products through to broadcast equipment.


