Serving Up ITV, Part 1

Once the deployment of digital television (DTV) advances enough to warrant meaningful data broadcasting and other services – such as interactive television – content developers and broadcasters will need a means to store, manage and process that data as it makes its way through the DTV system.

Information that is neither audio- nor video-centric, yet is intended to be carried in ATSC-compliant MPEG-2 streams, will require hardware and software that can integrate and synchronize not only the video and audio elements but also this supplemental data, in a format the broadcaster can use efficiently and effectively. In turn, the receiving devices will need tools to use this supplemental information in a way that viewers will find attractive, entertaining and valuable.

Video servers and their associated subsystems are a natural medium for storing and playing out the video/audio content, as well as the metadata or associated subchannel information needed for broadcast television transmissions. This process is now recognized as "data broadcasting."

The potential enrichment of program content – conveyed in time coincidence with conventional broadcast programming – is only one of the many features and aspects of DTV for which standards are already in place. Applications for future enhancements in the field of data broadcasting continue to be developed. This installment of my column will look at aspects of auxiliary digital services heretofore only marginally possible in the analog NTSC broadcast domain.

DATA BROADCAST FRAMEWORK

Data broadcasting technology is structured by, and described within, the ATSC A/90 data broadcast framework. In the United States, portions of the software environment that builds on A/90 are described as the Digital TV Application Software Environment (DASE). Its European counterpart is the Multimedia Home Platform (MHP), developed within the Digital Video Broadcasting (DVB) project. Both activities are intended to enable and unify technology aimed at standardizing interactive digital television content and behavior.

Table 1: DASE levels
DASE working groups are focused on providing three fundamental levels of applications, described in Table 1. Each of these DASE levels, intended to provide progressively greater capabilities and services, will become possible once DTV deployment accelerates and the technology is sufficiently incorporated into set-top boxes and/or receivers.

Over time, television receivers will be outfitted with additional integral or outboard component features that will enhance the entertainment and information capabilities of digital television broadcasting. These devices – possibly referred to in the future as digital appliances – will combine such subsystems as personal video recorders (PVRs), Internet or wireless communication interfaces for two-way interactivity, complete computer-centric control systems, and sophisticated viewer-sensitive selection systems that target the individual user (in much the same way the Internet selects and processes information according to a user’s profile).

DASE SYSTEM INTERCONNECT

The DASE System Interconnect incorporates the fundamentals of the digital television receiver – from the demodulator and broadcast (MPEG-2) transport through user control, display and audio – as well as the layer of applications (the DASE system) that supplements those platform services. Fig. 1 depicts how the DASE system interfaces to the DTV-user system.

The architecture of MPEG-2 was fundamentally created to carry the data necessary to enable the feature sets envisioned for data broadcasting. These applications – such as near video-on-demand, interactivity and the appearance of directed information channeling to the viewer – are possible because of the various system levels embodied in the MPEG-2 transport mechanism.

Future developments in MPEG-4, -7 and -21 may eventually offer even more capabilities as applications are developed and systems are deployed. How this architecture is assembled, in terms of MPEG-2 systems and DASE, will be the focus of the remainder of this installment.

The encapsulation and structure of information used for data broadcasting rely primarily on MPEG-2 systems and on extensions to the Digital Storage Media-Command and Control (DSM-CC) specification (ISO/IEC 13818-6). In the same fashion that the Internet has attracted millions of users in part because of its customizable feature sets, objectives for DTV audiences also include the ability to customize the viewing experience on a user-by-user, and channel-by-channel, basis.

The systems necessary to implement user-selectable controls are defined in the DSM-CC specification, whose original aim was to provide VCR-like controls for two-way services such as VOD. DSM-CC’s download protocol is used to deliver files either on a one-time basis or on a carousel basis; the files can be sent as a synchronized, non-streaming download or streamed asynchronously.
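
To make the download idea concrete, here is a rough sketch in Python of how a file might be split into numbered blocks that a receiver can reassemble. It is illustrative only: the block size, class names and fields are invented, and the real DSM-CC messages (such as its download data blocks) follow a specific binary syntax defined in ISO/IEC 13818-6.

```python
# Illustrative only: a simplified stand-in for DSM-CC-style file segmentation.
# Real DSM-CC (ISO/IEC 13818-6) defines download control and data messages with
# their own binary syntax; this sketch only mimics the idea of splitting a file
# into blocks that a receiver can reassemble.

from dataclasses import dataclass
from typing import List

BLOCK_SIZE = 4066  # illustrative block payload size, not a normative value


@dataclass
class DownloadBlock:
    module_id: int      # which module (file) this block belongs to
    block_number: int   # position of this block within the module
    payload: bytes      # the actual slice of file data


def segment_module(module_id: int, data: bytes) -> List[DownloadBlock]:
    """Split one module's data into numbered blocks for delivery."""
    return [
        DownloadBlock(module_id, i, data[off:off + BLOCK_SIZE])
        for i, off in enumerate(range(0, len(data), BLOCK_SIZE))
    ]


def reassemble(blocks: List[DownloadBlock]) -> bytes:
    """Receiver side: order blocks by number and concatenate the payloads."""
    return b"".join(b.payload for b in sorted(blocks, key=lambda b: b.block_number))


if __name__ == "__main__":
    original = bytes(range(256)) * 100           # stand-in for an application file
    blocks = segment_module(module_id=1, data=original)
    assert reassemble(blocks) == original        # round-trip check
    print(f"{len(blocks)} blocks of up to {BLOCK_SIZE} bytes")
```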

The carousel concept is based on short snippets of data being sent periodically and repeatedly, so that receiver appliances can collect sufficient data to respond to viewer actions without having to wait through a lengthy download before any of the information is usable.
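
A minimal sketch of that repetition, again with invented module names and cycle counts, shows why a receiver that tunes in partway through a rotation still ends up with the complete set of modules.

```python
# Minimal data-carousel sketch: modules are emitted repeatedly in rotation so a
# receiver joining at any time eventually collects them all. Module names and
# cycle counts are illustrative, not taken from any standard.

from itertools import cycle, islice

modules = {
    "weather.html": b"<html>weather page</html>",   # hypothetical content
    "scores.html": b"<html>sports scores</html>",
    "logo.png": b"\x89PNG...",
}


def carousel(modules: dict, cycles: int):
    """Yield (name, payload) pairs round-robin for a fixed number of cycles."""
    order = list(modules.items())
    yield from islice(cycle(order), cycles * len(order))


# A receiver that starts listening partway through still assembles a full set.
received = {}
for i, (name, payload) in enumerate(carousel(modules, cycles=3)):
    if i < 2:          # pretend the receiver missed the first two transmissions
        continue
    received.setdefault(name, payload)

assert set(received) == set(modules)
print("receiver recovered all modules despite joining late")
```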

The depth and complexities of MPEG-2 provide avenues for the deployment of interactive television broadcasting. Under the MPEG-2 umbrella are two packet-oriented coding formats described in MPEG-2 systems: the program stream and the transport stream. Each is optimized for its particular application or environment.

Historically, MPEG-2 program streams were intended for storage systems (including video servers) and for VOD and interactive multimedia applications. Today their main application is the DVD. Program streams are designed for reliable environments; their large, variable-sized packets require low-loss, low-error transmission.

MPEG-2 transport streams, on the other hand, carry multiple MPEG-2 programs and applications in environments that are hostile – meaning conditions are potentially lossy, transmission quality varies and the susceptibility to accumulated errors is high (as with microwave, satellite or cable systems). The transport stream is the coding format the ATSC selected for program emission.
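
A transport stream is a sequence of fixed 188-byte packets, each beginning with a 0x47 sync byte and a four-byte header that carries, among other things, a 13-bit packet identifier (PID) and a 4-bit continuity counter. The sketch below builds and parses just that fixed header; adaptation fields, scrambling and PES packetization are ignored, and the example PID is arbitrary.

```python
# Minimal sketch of the fixed 4-byte MPEG-2 transport packet header
# (sync byte 0x47, 13-bit PID, 4-bit continuity counter). Adaptation
# fields, scrambling and PES payload structure are deliberately ignored.

SYNC_BYTE = 0x47
PACKET_SIZE = 188


def build_packet(pid: int, counter: int, payload: bytes) -> bytes:
    header = bytes([
        SYNC_BYTE,
        (pid >> 8) & 0x1F,        # error/PUSI/priority flag bits left at 0
        pid & 0xFF,
        0x10 | (counter & 0x0F),  # adaptation_field_control=01 (payload only)
    ])
    body = payload[: PACKET_SIZE - 4]
    # Pad to the fixed size; real streams stuff via the adaptation field.
    return header + body + b"\xFF" * (PACKET_SIZE - 4 - len(body))


def parse_header(packet: bytes):
    if packet[0] != SYNC_BYTE:
        raise ValueError("lost sync")
    pid = ((packet[1] & 0x1F) << 8) | packet[2]
    counter = packet[3] & 0x0F
    return pid, counter


pkt = build_packet(pid=0x1FFE, counter=7, payload=b"data broadcast bytes")
print(len(pkt), parse_header(pkt))   # -> 188 (8190, 7)
```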

An MPEG-2 program is a set of component streams or elements – generally compressed audio and video – that share a common time base. A program multiplexer assembles this collection of compressed audio and video elements, created by passing baseband (digital) media through video and audio encoders, into bitstreams that are properly coded according to the MPEG-2 systems standard.

It is this program multiplexer that adds clock references and time stamps to the encapsulated elementary streams, generating an MPEG-2 transport stream. For DTV emission, one or more sets of transport streams, created from multiple program multiplexer outputs, are multiplexed (muxed) to form a compliant stream (e.g., ATSC) that meets the needs of the transmission format it is intended to serve.
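
Those clock references and time stamps are counted against MPEG-2’s system clock: presentation time stamps (PTS) tick at 90 kHz, while the program clock reference (PCR) is derived from a 27 MHz clock and carried as a 90 kHz base plus a small extension. The sketch below converts ordinary seconds into those units; wrap-around handling and the actual bit-level encoding into PES headers and adaptation fields are omitted, and the frame rate chosen is just an example.

```python
# Sketch of how a program multiplexer might stamp time onto elementary
# streams: presentation time stamps (PTS) count a 90 kHz clock, and the
# program clock reference (PCR) counts a 27 MHz system clock, expressed as
# a 33-bit base at 90 kHz plus a 9-bit /300 extension. Bit-level encoding
# and wrap-around handling are omitted.

SYSTEM_CLOCK_HZ = 27_000_000
PTS_CLOCK_HZ = 90_000


def seconds_to_pts(t: float) -> int:
    """Presentation time in seconds -> 33-bit PTS ticks at 90 kHz."""
    return int(round(t * PTS_CLOCK_HZ)) & ((1 << 33) - 1)


def seconds_to_pcr(t: float):
    """Time in seconds -> (pcr_base at 90 kHz, pcr_extension 0..299)."""
    ticks = int(round(t * SYSTEM_CLOCK_HZ))
    return (ticks // 300) & ((1 << 33) - 1), ticks % 300


# Example: stamp video frames at 29.97 fps starting one second into the stream.
frame_interval = 1001 / 30000
for n in range(3):
    t = 1.0 + n * frame_interval
    print(f"frame {n}: PTS={seconds_to_pts(t)}, PCR={seconds_to_pcr(t)}")
```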

The final stream comes together in this emission mux, which will further accept streams from data servers, PSIP generators and/or conditional-access generators – generally at the time of transmission formatting. It is at this stage that any scheduling or synchronization between MPEG-2 programs, in the form of transport streams, and data (for data broadcast applications) needs to occur. Often a third-party data management entity is employed, whose task is to automate, schedule and control the activities of the emission mux so that the synchronization of outbound data and any return-channel data management is harmonized.
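
As a highly simplified picture of that final stage, the sketch below interleaves fixed-size packets from several sources into a single outgoing stream. The weighted round-robin policy, the source names and the bandwidth split are all invented for illustration; a real emission mux schedules against bitrates, buffer models and PCR timing.

```python
# Simplified picture of an emission mux: interleave 188-byte packets from
# several inputs (program muxes, data server, PSIP generator) into one
# outbound stream. The weighted round-robin policy and source names are
# illustrative only.

from itertools import count


def packet_source(label: str):
    """Stand-in for a program mux or data/PSIP server emitting TS packets."""
    for n in count():
        yield f"{label}:{n}".encode().ljust(188, b"\xFF")


def emission_mux(sources: dict, total_packets: int):
    """Weighted round-robin interleave: 'weight' packets per source per turn."""
    out = []
    gens = {name: packet_source(name) for name in sources}
    while len(out) < total_packets:
        for name, weight in sources.items():
            for _ in range(weight):
                if len(out) >= total_packets:
                    break
                out.append(next(gens[name]))
    return out


# Hypothetical split of the roughly 19.39 Mbps ATSC payload among the inputs.
sources = {"program_mux_1": 6, "program_mux_2": 4, "data_server": 2, "psip": 1}
stream = emission_mux(sources, total_packets=26)
print([p.split(b":")[0].decode() for p in stream[:13]])
```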

The data management tool may also be the mechanism that records – for billing purposes – the specifics of all the transport stream activities, since it is at this point in the process that all the elements are combined into a single transport stream, ready for input to a transmission-specific device (e.g., a fiber or microwave link, or an 8-VSB modulator).
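
That record keeping can be pictured as simply counting what passes through the mux point. The sketch below tallies packets and bytes per input and produces a usage summary; the class, field names and report format are invented, not taken from any product or standard.

```python
# Illustrative activity recorder: tally per-source packet and byte counts at
# the mux point so usage can later be reported for billing. The field names
# and CSV-style report are invented for this sketch.

import time
from collections import defaultdict


class ActivityRecorder:
    def __init__(self):
        self.stats = defaultdict(lambda: {"packets": 0, "bytes": 0})
        self.started = time.time()

    def record(self, source: str, packet: bytes):
        self.stats[source]["packets"] += 1
        self.stats[source]["bytes"] += len(packet)

    def report(self) -> str:
        elapsed = max(time.time() - self.started, 1e-9)
        lines = ["source,packets,bytes,avg_bits_per_sec"]
        for source, s in sorted(self.stats.items()):
            rate = int(s["bytes"] * 8 / elapsed)
            lines.append(f"{source},{s['packets']},{s['bytes']},{rate}")
        return "\n".join(lines)


recorder = ActivityRecorder()
for source in ("program_mux_1", "data_server", "psip"):
    for _ in range(100):
        recorder.record(source, b"\x47" + b"\x00" * 187)   # dummy 188-byte packet
print(recorder.report())
```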

In part two, we will continue with the integration of data servers for DTV broadcasting and explain how future profiles and architectures for interactive services are being configured.

Karl Paulsen

Karl Paulsen is the CTO for Diversified, the global leader in media-related technologies, innovations and systems integration. Karl provides subject matter expertise and innovative visionary futures related to advanced networking and IP technologies, workflow design and assessment, media asset management, and storage technologies. Karl is a SMPTE Life Fellow, an SBE Life Member and Certified Professional Broadcast Engineer, and the author of hundreds of articles focused on industry advances in cloud, storage, workflow, and media technologies. For over 25 years he has continually featured topics in TV Tech magazine, penning the magazine’s Storage and Media Technologies and Cloudspotter’s Journal columns.