Phoenix Television, based in Hong Kong, installed its first virtual studio system from Orad in 2000, which at that time required an SGI Onyx computer. To get the most out of the virtual environment, producers at Phoenix were keen to use it dynamically rather than just as a static space. From early on in the project, they used robotic pan-and-tilt heads from Vinten Radamec, which were linked into the Orad software to ensure that camera moves were translated exactly into matching moves in the virtual set.
In mid-2009, Phoenix moved to a new headquarters building in Hong Kong, which allowed it to build a large, multi-area studio with a newsroom, giving it a great deal of operational flexibility. The new studio complex is equipped with the latest Orad ProSet virtual studio software, working in conjunction with a sophisticated camera robotics system from Vinten Radamec. This comprises two powered pedestals providing fully automated control of pan, tilt and elevation, along with six robotic pan-and-tilt heads on conventional tripods or pedestals.
In the near future, the installation will be upgraded with the new Vinten Radamec Fusion FP188VR robotic pedestal with the FHR120VR pan-and-tilt head. Together, these provide fully automated control of pan, tilt and elevation as well as high-precision positioning around the studio floor, allowing cameras to be placed anywhere in the studio while ensuring that the virtual environment system can match the angles precisely.
An important benefit of the Vinten Radamec system in live virtual environments is that its camera mounts can be used either robotically, driven automatically to the precise position required, or manually, in which case sensors track the operator's actions and report all positioning data to the control system many times per second. That gives the director freedom to move the camera during production, knowing that the virtual environment will remain in sync. An interface box on the head connects to digital lenses from Canon and Fujinon, capturing zoom and other lens data to add to the positioning information passed to the virtual environment. Once calibrated, the Orad software knows exactly what each camera is seeing at any instant and adjusts the field of view of the virtual set accordingly.
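The per-tick update described above can be sketched in miniature: head position plus lens data are bundled into one message, with the lens focal length converted to a field of view for the renderer. This is a hypothetical illustration, not the actual Vinten Radamec or Orad protocol; the field names, the 9.6 mm sensor width and the pinhole-lens model are all assumptions for the sketch.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One update from the robotic head and lens encoders (names hypothetical)."""
    pan_deg: float
    tilt_deg: float
    elevation_m: float
    focal_length_mm: float

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 9.6) -> float:
    # pinhole model: FOV = 2 * atan(sensor_width / (2 * focal_length))
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def encode_sample(s: TrackingSample) -> dict:
    # package head and lens data into a single message per update tick,
    # so the virtual set always receives position and zoom together
    return {
        "pan": s.pan_deg,
        "tilt": s.tilt_deg,
        "elevation": s.elevation_m,
        "fov": horizontal_fov_deg(s.focal_length_mm),
    }

sample = TrackingSample(pan_deg=12.5, tilt_deg=-3.0, elevation_m=1.4,
                        focal_length_mm=4.8)
msg = encode_sample(sample)
```

Sending position and zoom in the same message avoids the two drifting apart between updates, which is what keeps the rendered field of view matched to the real lens.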
This level of accuracy also enables what users call “augmented reality,” which uses physical sets but inserts virtual graphics into part of the scene, even within dynamic camera moves. Augmented reality has become a characteristic part of Phoenix broadcasts and is also widely used in remote studios in Beijing, Shenzhen and Taipei.
Virtual graphics can be placed onto real objects at the same time as virtual backgrounds are in use, which means the system must also calculate the parallax between foreground and background objects, making precise positioning and zoom data from each camera all the more important.
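The parallax effect the system must reproduce can be shown with simple geometry: when a camera moves sideways, a nearby object shifts through a much larger angle than a distant one. A minimal sketch, with assumed example depths of 2 m for a foreground prop and 20 m for a virtual backdrop:

```python
import math

def apparent_angle_deg(lateral_offset_m: float, depth_m: float) -> float:
    """Horizontal viewing angle of a point at the given lateral offset and depth."""
    return math.degrees(math.atan2(lateral_offset_m, depth_m))

def parallax_deg(depth_m: float, camera_shift_m: float) -> float:
    # angular shift of a point straight ahead when the camera moves sideways
    return apparent_angle_deg(camera_shift_m, depth_m)

near = parallax_deg(depth_m=2.0, camera_shift_m=0.5)   # foreground prop
far = parallax_deg(depth_m=20.0, camera_shift_m=0.5)   # distant virtual backdrop
# near is roughly ten times far: foreground and background must be
# shifted by different amounts or the composite looks pasted-on
```

Because foreground and background layers shift by different angles for the same camera move, the renderer can only composite them convincingly if it knows the camera's position and zoom precisely at every frame.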
While the sophisticated use of the virtual environment opens up the possibility of highly polished, visually slick content, there is another important constraint. As an independent broadcaster, Phoenix is keen to put as much of its budget on-screen as possible, and so it operates even its live transmissions with minimal crew. This requirement called for tight integration of four systems: the newsroom system from Dayang, playout automation from Pebble Beach, camera positioning from Vinten Radamec, and graphics and virtual scenery from Orad. Shots and camera moves called by the director have to happen with a single key press on whichever device is most appropriate at the time; everything else has to be automatic.
In the new installation at Phoenix, both the graphics system and the Vinten Radamec robotics are slaved to the newsroom automation system for principal control, but they communicate directly with each other for positioning information, ensuring that Orad aligns its graphics with the most up-to-date and precise location data.
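The control topology described above can be sketched as a tiny object model: the automation master fans a single key press out to both slave systems, while the graphics system reads positioning directly from the robotics rather than via the master. All class and method names here are hypothetical illustrations, not any vendor's actual API.

```python
class Robotics:
    """Stand-in for the camera robotics slave (hypothetical API)."""
    def __init__(self):
        self.position = None

    def recall_shot(self, preset: dict) -> dict:
        self.position = preset          # drive head/pedestal to the stored shot
        return self.position

class Graphics:
    """Stand-in for the graphics/virtual-set slave (hypothetical API)."""
    def __init__(self, robotics: Robotics):
        self.robotics = robotics        # direct link for positioning data

    def load_scene(self, scene: str):
        # align the virtual set using the robotics system's latest position,
        # read directly rather than relayed through the automation master
        return (scene, self.robotics.position)

class Automation:
    """Newsroom automation master: one key press fans out to both slaves."""
    def __init__(self):
        self.robotics = Robotics()
        self.graphics = Graphics(self.robotics)

    def key_press(self, preset: dict, scene: str):
        self.robotics.recall_shot(preset)
        return self.graphics.load_scene(scene)

rundown = Automation()
result = rundown.key_press(preset={"pan": 30.0, "tilt": -2.0}, scene="news_open")
```

Keeping the positioning link direct, rather than routing it through the master, means the graphics always see the freshest camera data even while the automation system is busy sequencing the rundown.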
The result of this partnership between camera robotics and the virtual environment is seamless integration, ensuring that audiences are never distracted by mismatches between real and virtual elements and can stay focused on the content.
It also allows the system to be used for multiple programs in rapid succession, confirming the required camera angles as new sets are being loaded.
While virtual studio technology has been available for a decade or more, its benefits were slower to be recognized, and so widespread use is only now growing rapidly.