Technology in Transition: MPEG testing and monitoring


By John Luff

A few years ago, I did a series of seminars with Michel Proulx, director of product development for Miranda Technologies, on two topics: SDI systems and “Quantum Video 101 for Analog Engineers.” The goal was to help engineers and production professionals understand and feel comfortable with video they could no longer “see.” We saw lots of blank stares and a fair number of nods of comprehension, and we heard a few cogent questions. One of the questions we heard in more than one city was how to “do video” without scopes. That was a difficult question then; today the tools of the technical trade have begun to catch up markedly.


Achieving the right MPEG-2 quality and bit rate is more important than ever due to the increasing use of DVD, streaming video and multichannel digital transmissions. Shown here is Snell & Wilcox’s Mosalina, which provides MPEG-2 users with a means of assessing picture compression quality.

Not long ago, the MPEG testing and monitoring equipment available clearly was not ready for operational environments, and it sometimes cost more than the encoder it was monitoring. Today, the tools have become much more approachable, more affordable and considerably more intuitive. Of course, we also are more comfortable now with video that can be seen only after it has been reassembled to simulate the original signal.

For the purposes of this article, I want to define “testing” as verifying compliance, validity and compatibility. Similarly, “monitoring” can be defined as verifying acceptability, suitability and quality.

Decoding MPEG

MPEG typically travels over an interconnect, usually DVB ASI. ASI carries 270 Mbits/s of data, coded NRZ as opposed to the NRZI coding of SMPTE 259M, and can contain any number of services multiplexed together. Unlike analog or baseband digital video, looking at the carrier cannot tell you anything about the content. SMPTE 259M at least can be quickly decoded to show the sync information, and it is not much more complex to turn the video itself from what appears to be random bits back into the original picture.

MPEG reception requires several steps. Sync must be identified, program tables must be decoded, and elementary streams must be extracted. Only then can the proper bits be directed to a decoder to reconstruct the original signal, or at least a representation of the original signal, for MPEG is not about reproducing reality as closely as possible. MPEG is entirely about reproducing a level of quality that most viewers will perceive as an acceptable rendition of the original for the intended use.
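To make the first two of those steps concrete, here is a minimal sketch in Python, an illustration rather than any vendor's implementation: it locks to the 188-byte transport packets by finding the recurring 0x47 sync byte, then reads the Program Association Table on PID 0 to discover which PID carries each program's PMT. The capture file name is hypothetical, and the sketch assumes packets without adaptation fields and a PAT that fits in a single packet.

TS_PACKET = 188
SYNC_BYTE = 0x47

def find_sync(data, packets_to_check=5):
    """Return the offset where 0x47 recurs every 188 bytes, or None."""
    needed = (packets_to_check - 1) * TS_PACKET + 1
    for offset in range(TS_PACKET):
        if offset + needed > len(data):
            break
        if all(data[offset + i * TS_PACKET] == SYNC_BYTE
               for i in range(packets_to_check)):
            return offset
    return None

def parse_pat(data, offset):
    """Map program_number -> PMT PID from the first complete PAT section."""
    programs = {}
    for off in range(offset, len(data) - TS_PACKET + 1, TS_PACKET):
        pkt = data[off:off + TS_PACKET]
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        starts_section = bool(pkt[1] & 0x40)        # payload_unit_start_indicator
        if pid != 0x0000 or not starts_section:     # the PAT always rides on PID 0
            continue
        payload = pkt[4:]                           # assumes no adaptation field
        section = payload[1 + payload[0]:]          # skip pointer_field
        section_length = ((section[1] & 0x0F) << 8) | section[2]
        loop = section[8:3 + section_length - 4]    # program entries, minus the CRC
        for i in range(0, len(loop), 4):
            prog = (loop[i] << 8) | loop[i + 1]
            pmt_pid = ((loop[i + 2] & 0x1F) << 8) | loop[i + 3]
            if prog != 0:                           # program 0 points to the network PID
                programs[prog] = pmt_pid
        break
    return programs

# Hypothetical usage with a captured file:
# data = open("capture.ts", "rb").read()
# print(parse_pat(data, find_sync(data)))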

Monitoring, in the context of this article, is then about establishing that qualitative assessment of the content. It is done by testing a range of material that has been encoded, decoded and displayed before an audience in a formalized testing regime. (ITU-R Rec. BT.500, formerly CCIR Rec. 500, defines the method for performing this testing consistently.) The key is to find an electronic measure of quality that can be correlated to real-world viewers under these controlled conditions.

Monitoring modes

Several manufacturers have developed the technology required to do this, in one of two modes. One uses a set of standardized sequences, runs them through the encoder and then looks at the results after decoding in a closed-loop system. This technology produces good correlations to the real world, but of course it cannot be used on live material, for which no difference signal can be computed in a closed loop. This approach is quite good for testing in the manufacturing environment and for establishing the global parameters of an MPEG system (bit rate, GOP structure, coding standards). You might liken this type of equipment to automated measurement equipment in the analog world, which requires either full-screen test patterns or the presence of an inserted test signal in the vertical interval.
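As a bare-bones illustration of the closed-loop idea (and only an illustration; commercial systems use measures tuned to track BT.500 viewer scores far better), the sketch below computes PSNR between a reference frame from a known sequence and the same frame after the encode/decode round trip, assuming both are available as aligned 8-bit luma arrays and that numpy is on hand.

import numpy as np

def psnr(reference, decoded, peak=255.0):
    """Peak signal-to-noise ratio in dB between two aligned 8-bit frames."""
    diff = reference.astype(np.float64) - decoded.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")                 # frames are identical
    return 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical usage with luma arrays from a standardized test sequence:
# score_db = psnr(original_frame, frame_after_codec)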

The other approach looks for the signature of compression artifacts in an open-loop system. In developing this method, expert viewers are shown a wide range of material and judge the presence of various coding “errors.” The correlation is, of necessity, not as strong, but if the samples are statistically large enough and the range of content tested is varied enough, the correlation can be statistically valid even if the accuracy on an individual scene is not as good. This approach is more applicable to online testing in a program environment and will establish when something is stressing the system in unexpected ways, or perhaps when part of the system has failed. In this case, the analogous test instrument would be a waveform monitor, which you can use to judge the quality of a signal if you understand signals in general and how to interpret the display.
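One artifact signature an open-loop analyzer might look for is blockiness at the 8x8 DCT grid. The sketch below is an illustration, not any product's method: it compares luma differences across 8-pixel column boundaries with differences everywhere else, on the assumption that a ratio well above 1 points to visible blocking.

import numpy as np

def blockiness_ratio(luma, block=8):
    """No-reference blockiness cue: boundary vs. non-boundary horizontal gradients."""
    luma = luma.astype(np.float64)
    diffs = np.abs(np.diff(luma, axis=1))          # |x[row, col+1] - x[row, col]|
    cols = np.arange(diffs.shape[1])
    on_boundary = (cols % block) == (block - 1)    # differences straddling a block edge
    boundary_mean = diffs[:, on_boundary].mean()
    elsewhere_mean = diffs[:, ~on_boundary].mean()
    return boundary_mean / max(elsewhere_mean, 1e-6)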

Both of these approaches required considerable research and testing to establish the validity of the technology. Under the right circumstances, each can be an extremely valuable tool for the subjective measurement of quality. Each has its strong points, and the reader is advised to contact the manufacturers for in-depth reviews of their products and the methods of use they recommend. These approaches test the content, however; they say nothing about its adherence to the uniform standards needed for successful interfacing and communication between equipment.

Testing

The other half of the equation, testing as defined above, is more about ensuring that the bit stream complies with the specifications for the physical layer (bit rate, levels, jitter, rise times), as well as with the requirements for MPEG syntax. MPEG is a decoder-centric specification: bit streams can be produced by any means, as long as they can be decoded by a compliant decoder. The encoder must follow all the rules for assembling the MPEG syntax. It must use all the management tools correctly, inserting valid data in all the tables and properly referencing all of the elementary streams. Time stamps must be valid, and the use of the compression algorithms must match those specified for the profile and level in use.
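One mechanical check of this kind is sketched below, purely as an illustration: verifying that the 4-bit continuity counter on each PID increments by one for every payload-carrying packet, since a jump usually means packets have been lost or the multiplexer has mis-assembled the stream. It assumes aligned 188-byte packets and ignores the duplicate-packet and discontinuity-indicator exceptions the standard allows.

def count_continuity_errors(ts_bytes):
    """Count continuity_counter jumps per PID across a captured transport stream."""
    last_cc = {}                                   # PID -> last continuity_counter seen
    errors = 0
    for off in range(0, len(ts_bytes) - 187, 188):
        pkt = ts_bytes[off:off + 188]
        if pkt[0] != 0x47:
            continue                               # lost sync; a real tool would re-lock
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        has_payload = bool(pkt[3] & 0x10)          # adaptation_field_control payload bit
        cc = pkt[3] & 0x0F
        if pid == 0x1FFF or not has_payload:       # skip null and payload-free packets
            continue
        if pid in last_cc and cc != (last_cc[pid] + 1) % 16:
            errors += 1
        last_cc[pid] = cc
    return errors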

For instance, MP@ML is permitted to run at up to 15 Mbits/s. A stream identified as MP@ML, but using 4:2:2 coding at 30 Mbits/s, would break the syntax rules and be labeled as not compliant, even though many decoders might be able to produce a perfectly acceptable picture from it.
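The rule amounts to a simple lookup, sketched below with the commonly cited MPEG-2 maxima (verify the figures against the standard before relying on them): the bit rate a stream declares, or is measured to use, must fit the ceiling for the profile and level it claims.

MAX_BIT_RATE_MBPS = {
    ("Main", "Main"): 15.0,        # MP@ML, 4:2:0
    ("Main", "High"): 80.0,        # MP@HL
    ("4:2:2", "Main"): 50.0,       # 422P@ML
}

def is_rate_compliant(profile, level, bit_rate_mbps):
    """True if the bit rate fits the ceiling for the claimed profile and level."""
    limit = MAX_BIT_RATE_MBPS.get((profile, level))
    return limit is not None and bit_rate_mbps <= limit

print(is_rate_compliant("Main", "Main", 30.0))   # False: the 30 Mbits/s example above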

MPEG testing can be done without reference to the content of the picture. It suffices to decode the syntax, compare it to the standards, and display the results in ways that users can interpret. Often alarms are set to notify the user when particularly important parameters fall outside the expected range.
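Once the syntax has been decoded into a handful of measurements, alarming is little more than comparing each value to a configured limit, as in the sketch below; the threshold values shown are illustrative, not normative.

ALARM_LIMITS = {
    "pat_interval_ms": 500.0,        # illustrative: expect the PAT at least twice a second
    "pmt_interval_ms": 500.0,
    "continuity_errors_per_min": 0,
}

def raise_alarms(measurements):
    """Return the names of measurements that exceed their configured limits."""
    return [name for name, value in measurements.items()
            if name in ALARM_LIMITS and value > ALARM_LIMITS[name]]

# Hypothetical usage:
# print(raise_alarms({"pat_interval_ms": 820.0, "continuity_errors_per_min": 0}))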

Only a few years ago, a number of manufacturers were building larger and much more expensive MPEG testing hardware. Today, for a couple of thousand dollars, one can buy a system that can be controlled completely over an IP link. One overseas manufacturer brought its system to my office for a demo in the form of nothing more than an Internet address. I connected to a box on the other side of the world, and we worked through the company's offerings in some detail without my ever leaving my desk. Other manufacturers have concentrated on comprehensive hardware solutions with integral displays. Exploring the range of offerings is not easy, though, as they now number in the dozens.

At the end of the day, you should consider having both testing and monitoring capability if you use MPEG professionally. Without all the right tools, it is difficult indeed to know just how good, and valid, your MPEG bit stream might be.

John Luff is senior vice president of business development for AZCAR. To reach him, visit www.azcar.com.

Send questions and comments to: john_luff@primediabusiness.com
