Concatenation Needs Much More Elucidation

You might not have noticed that manufacturers can’t make up their minds these days. Yes, my rant this month is about digital concatenation.

I mean, suppose you want to use a Panasonic HDTV recorder. Well, you’ve got some choices. There’s good old D-5 HD. Then you’ve got your DVCPRO HD, your AVC-Intra 100, your AVC-Intra 50, and your AVCHD. Oh, did I forget to mention the MPEG-2 on the DMR-E700BD?

It ain’t just Panasonic. Over at Sony, you’ve got your MPEG-4 in HDCAM SR, your Sony-proprietary system in plain-vanilla HDCAM, MPEG-2 in XDCAM HD and HDV, and, like Panasonic, AVCHD. Never mind the different data rates of MPEG-2 in XDCAM HD alone.


Anyhow, in some sense, that ain’t an issue. Nellie, my last functional neuron, tells me that D-1 was called D-1 on account of it being SMPTE’s first digital recording format. Methinks Digital Betacam was one of the first that used data compression.

That was so scary to us video engineers that Sony ran demos showing a hundred generations of re-recording. If there were any flaws, they sure weren’t jumping out of the picture.

The same thing happened with every new compressed format, SD or HD. Here’s a hundred generations of Ampex DCT; there’s a hundred generations of JVC’s Digital-S. No problem.

Sony did a slightly different song and dance for HDCAM. This time they had not just compression but also filtering. What was once 1920-pixel luma changed to 1440, and 960 chroma dropped to 480. They had to show that wasn’t such a big deal perceptually.

Then Panasonic did something similar with DVCPRO HD. In 1080i mode, what was 1920 dropped to 1280, and what was 960 dropped to 640; in 720p mode, what was 1280 dropped to 960, and what was 640 dropped to 480. Yeah, you could see the differences, but they weren’t the end of the world.


Too bad time marches on. When HDCAM came out, you could count the number of TV stations transmitting HDTV on the fingers of one hand. Heck, methinks you could do it on the tip of one finger, and you wouldn’t even need that one finger to count the number of broadcast HDTV sets in homes.

That ain’t exactly the case today. You might have a producer shooting in HDV, editing on maybe the Apple intermediate codec, squirting that into DVCPRO HD for transfer to the play-out server (which might be JPEG2000), broadcasting that as MPEG-2, and having it turned around by a satellite service as AVC. Did I mention the viewer’s TiVo? How about the 270 Mbps studio-to-transmitter links? Eeps!

We ain’t talking a hundred generations of the same thing anymore. We’re talking the real world, and it definitely ain’t pretty.

Never mind compression. How about just checking out that filtering? Sony made a 1440-luma and 480-chroma choice in HDCAM. Panasonic chose 1280 luma and 640 chroma in 1080i DVCPRO HD. You might prefer one to the other; I ain’t going to try to sway you. But if you record in either and transfer to the other, now you’re looking at 1280 luma and 480 chroma. And I ain’t done yet.
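If you like seeing the arithmetic, here’s a little Python sketch of how the filtering concatenates. The format numbers are the ones quoted above (1080-line formats, horizontal samples per line); the rest is just my crude model: once a format throws samples away, no later format gets them back, so a chain bottoms out at the per-axis minimum.

```python
# Crude model of filter concatenation: effective resolution after a
# chain of formats is the per-axis minimum. Figures are the horizontal
# sample counts quoted in the column for 1080-line formats.

FORMATS = {
    # name: (luma samples per line, chroma samples per line)
    "full 4:2:2":        (1920, 960),
    "HDCAM":             (1440, 480),
    "DVCPRO HD (1080i)": (1280, 640),
}

def through_chain(chain):
    """Effective horizontal luma/chroma resolution after each format."""
    luma = min(FORMATS[name][0] for name in chain)
    chroma = min(FORMATS[name][1] for name in chain)
    return luma, chroma

# HDCAM recorded, then transferred to DVCPRO HD (or vice versa):
print(through_chain(["HDCAM", "DVCPRO HD (1080i)"]))  # (1280, 480)
```

Run it either direction and you get the same 1280-luma, 480-chroma answer, which is the whole point: the chain keeps the worst of everything.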

You could also argue the fine points of 1080i versus 720p. But suppose you shot something on a nice Panasonic Varicam at 720p, and it got cross-converted and transferred at some point in its life to HDCAM. Now it’s 960 luma and 480 chroma. If it went the other way, it’s still 960 and 480 but maybe with interlace artifacts, too. And I still ain’t done.

At some point, you can pretty much guarantee the signal is going to be filtered to 4:2:0, maybe in HDV, maybe in transmission to the home. Now it’s 960x720 in the luma and 480x540 in the chroma (yes, one has more rez horizontally and the other vertically). No, it ain’t over yet.
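For the curious, here’s where the horizontal-versus-vertical asymmetry comes from. This little sketch (my simplification, not anybody’s spec text) shows the chroma-plane dimensions for the two common subsampling patterns: 4:2:2 halves chroma only horizontally, while 4:2:0 halves it both ways.

```python
# Minimal sketch of chroma-plane dimensions under common subsampling
# patterns. 4:2:2 keeps full vertical chroma resolution; 4:2:0 halves
# chroma in both directions.

def chroma_plane(width, height, scheme):
    """Return (chroma_width, chroma_height) for a luma plane of width x height."""
    if scheme == "4:2:2":   # half horizontal, full vertical
        return width // 2, height
    if scheme == "4:2:0":   # half horizontal, half vertical
        return width // 2, height // 2
    raise ValueError(f"unknown scheme: {scheme}")

print(chroma_plane(1920, 1080, "4:2:2"))  # (960, 1080)
print(chroma_plane(1920, 1080, "4:2:0"))  # (960, 540)
```

Stack a 480-sample-chroma format like HDCAM on top of a 4:2:0 step, and you end up with the 480-by-540-ish chroma mongrel described above.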

Go to your friendly neighborhood Best Buy or Circuit City or Discount Digital, and you’ll find TV screens that say they are 1920x1080. Heck, for all I know, if you take a magnifying glass and the necessary time, you can probably count 1920x1080 pixels on the screen. This ain’t about lying; it’s about scaling.

The 1920x1080 that appears on that screen ain’t the 1920x1080 that you’re transmitting. It’s a subset. The picture is blown up a little. And, if you’re familiar with scaling technology, you know a little is harder to do than a lot. Just try designing a good scaler that’ll go from 1919x1079 to 1920x1080. I’ll wait here for you.
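Don’t believe me about near-unity scaling being the hard case? Here’s a quick sketch (mine, simplified to one dimension) of why. Each output pixel maps back to a source position at a fraction of the input-to-output ratio; a polyphase scaler needs one filter phase per distinct fractional position. Blow up 2:1 and the phases repeat almost immediately. Go from 1919 to 1920 and the phase creeps by a hair per pixel and never repeats until the far edge.

```python
from fractions import Fraction

# Why near-unity scaling is awkward: output pixel i maps to source
# position i * (in_size / out_size). A polyphase resampler needs one
# filter phase per distinct fractional part of that position.

def distinct_phases(in_size, out_size):
    ratio = Fraction(in_size, out_size)
    phases = {(i * ratio) % 1 for i in range(out_size)}
    return len(phases)

print(distinct_phases(960, 1920))   # 2 phases: a trivial 2:1 blow-up
print(distinct_phases(1919, 1920))  # 1920 phases: one per output pixel
```

Exact fractions are used so the phase count is honest; floating point would fudge it. The 2:1 case needs two phases; the 1919-to-1920 case needs 1,920 of them.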


So, on the half-shell, what started out as 1920x1080 could end up at a TV set as some crummy version of less than 960x720 luma and 480x540 chroma. And that’s only if you’re lucky, on account of compression concatenation.

When you compress HD to 18 Mbps, that’s a compression ratio on the north side of 55:1. Let me put that another way. You are throwing away more than 98 percent of the information at the camera chips.
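Here’s the back-of-envelope check on that, in Python. The assumptions are mine, not gospel: 8-bit 4:2:2 video, 1920x1080 at 30 frames per second, counting active samples only.

```python
# Back-of-envelope check on the "north side of 55:1" figure.
# Assumptions: 8-bit 4:2:2, 1920x1080, 30 frames/s, active samples only.

width, height, fps = 1920, 1080, 30
bits_per_sample = 8
samples_per_pixel = 2          # 4:2:2 -> one luma + one (shared) chroma sample
uncompressed_bps = width * height * fps * samples_per_pixel * bits_per_sample
compressed_bps = 18_000_000    # the 18 Mbps quoted above

ratio = uncompressed_bps / compressed_bps
discarded = 1 - compressed_bps / uncompressed_bps
print(f"{ratio:.0f}:1 compression, {discarded:.1%} of the bits discarded")
```

That works out to roughly 55:1 and better than 98 percent of the bits gone, which is where the numbers above come from.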

All of those hundred-generation tests show that ain’t so bad, if you’re using the same compression codec. But that ain’t going to happen all the way from the camcorder to the home TV. So, if the camcorder’s tossing 98 percent of the info, and the edit system is tossing a mere 90 percent, that sounds pretty good, eh?
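Here’s a crude sketch of the difference (a toy model of mine, not how rate-distortion actually composes). If the second codec’s kept 10 percent were a guaranteed subset of the first codec’s kept 2 percent, nothing new gets lost. If the two keep-sets are unrelated, the survivors multiply, and you’re down to a fifth of a percent.

```python
# Toy model of concatenated lossy codecs. "Nested" means the second
# codec keeps a subset of what the first kept (nothing new lost);
# "independent" means the keep-sets are unrelated, so survival
# fractions multiply.

keep_camcorder = 0.02   # camcorder keeps 2% of the original info
keep_editor = 0.10      # edit codec keeps 10% of what it receives

nested = min(keep_camcorder, keep_editor)    # best case: 2% survives
independent = keep_camcorder * keep_editor   # unrelated: 0.2% survives

print(f"nested: {nested:.1%}, independent: {independent:.1%}")
```

Real codecs land somewhere between those two extremes, but with the zoo of codecs listed below, nobody should bet on the nested case.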

Well, it would be if the 90 percent was a guaranteed subset of the 98 percent. But it ain’t. With Motion JPEG, MPEG-2, MPEG-4 Part 2, AVC, ProRes 422, VC-1, VC-2, VC-3, JPEG2000, and DVCPRO HD as just a few of the codecs out there, the likelihood of anything being a subset of anything else is pretty danged low.

Now, then, I ain’t complaining about all the choices or the quality of any one of them. But how about testing the whole chain? Your perfectly good system could still screw things up.