You might not have noticed that HD-SDI ground loops are possible. Yes, this is another rant about the Age of Digital TV, which started with the invention of the on-off switch.
“But, Mario, seriously, aren’t we still a year away from the Age of Digital TV?”
It’s pretty serious, all right, but you need to take another look at the calendar. There ain’t one single broadcaster required to start transmitting digitally next February. Every full-power U.S. broadcaster was required to be transmitting digitally as of May 2003, most of them a year before that, and a few as long ago as May 1999. As for Class A and other low-power stations, they don’t need to transmit digitally even next February, which is how come they’re griping that NTIA didn’t require the digital-adapter boxes to include NTSC tuners.
Now, then, if you ain’t talking specifically about broadcasting, you’ve got your D-1 digital videotape recorder going back to the 1980s, your digital frame sync and TBC maybe a decade before that, and your digital character generator dating from the ’60s. So we’ve been in the Age of Digital TV for one whole heck of a lot longer than next February, but that doesn’t mean we know how to deal with it yet.
AGE OF DIGITAL TV
I ain’t even talking about too many fat folks on-screen on account of we don’t use automatic format description (look it up) or lack of lip-sync due to anything from a chip camera to a plasma TV to video effects to an MPEG decoder. You might think those delay issues are in the encoder, but, if you make the pictures noisy enough, good luck matching time stamps. Welcome to the Age of Everything on Television as Out-of-Sync as a Fellini movie.
Anyhow, I said I wasn’t talking about that, so I’d better change the subject. I have a deep, dark confession to make. I miss EIA-250.
There! I’ve said it!
Yes, I admit that, back in the days of EIA-250, I used to complain about it, but it gave us a base level of quality that we just ain’t got these days. I mean, suppose you saw video hum. You whipped out your EIA-250, looked up low-frequency video noise, saw you were out of spec, and fixed the ground or stuck in a hum-buck coil. Shazang! No more video hum.
These days, the same piece of coax cable, with the same connections and the same ground potential, is maybe carrying HD-SDI instead of analog NTSC. You ain’t going to see hum. But maybe you ain’t going to see anything, if the hum is messing with your eye pattern. Or, worse, maybe you’ll see pictures for a while, and then they’ll disappear.
Heck, it doesn’t even have to be hum. Run NTSC through any hunk of coax shorter than 300 feet or so, and you’ll be hard pressed to notice anything wrong. If it’s really thin coax, maybe you’ll notice a little softness in the picture and crank up the EQ. EIA-250 provides all the guidelines you need, and VITS gives you the necessary test signals even while your cables and amps are in service.
If you’re running HD-SDI, on the other hand (where—hint to my secret identity—I still have five fingers), you’ve got to start rubbing your rabbit’s foot at 100 feet or so. Never mind ground potential. Never mind kinked cables or tees or those nice, evenly spaced cable ties that can have the effect of wiping out higher frequencies (yes, Virginia, sloppy cabling can be better for your HD-SDI than neat, perfectly even lacing, and I am not making that up). No, never mind that other stuff. Just try running through a 50-ohm barrel or two.
Quick question: What is the impedance of your BNC barrels? Really? Are you sure? If you’re dealing with HD-SDI, you’d better be.
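Here’s a back-of-the-envelope way to see why that barrel matters. A 50-ohm barrel in a 75-ohm SDI line is an impedance mismatch, and a mismatch reflects signal back down the cable. This little sketch is just generic transmission-line math, not tied to any particular connector:

```python
import math

# Illustrative numbers: a 50-ohm barrel spliced into a 75-ohm HD-SDI path.
# Real connectors add their own discontinuities on top of this.
Z_LINE = 75.0    # ohms, nominal SDI coax impedance
Z_BARREL = 50.0  # ohms, the wrong barrel

# Reflection coefficient at the mismatch boundary
gamma = (Z_BARREL - Z_LINE) / (Z_BARREL + Z_LINE)   # -0.2

# Return loss in dB (bigger is better)
return_loss_db = -20 * math.log10(abs(gamma))       # about 14 dB

print(f"reflection coefficient: {gamma:.2f}")
print(f"return loss: {return_loss_db:.1f} dB")
```

One wrong barrel gives you roughly 14 dB return loss all by itself, which is already in the neighborhood of the whole return-loss budget the HD-SDI standard expects from an output, before you’ve accounted for anything else in the path.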
Just in case you’re thinking of going optical to avoid both the ground potential and the cable losses, have you considered cleaning fiber connections? Have you got the test equipment? How about the cleaning stuff?
Someone sent me a proposed SMPTE recommended practice on cleaning SMPTE hybrid camera-fiber connectors. There are some scary pictures of how dirty the fibers can get. There are even scarier pictures of how dirty they remain even after they’ve been “cleaned.” I won’t even mention electric-to-optical and optical-to-electric modems and their power supplies. Whoopsie! I just did.
I can see it’s time to change the subject again. NTSC composite color video came in just one flavor. HD comes in 1080 and 720 lines (active, of course), interlaced and progressive scanning, and a bunch of frame rates.
The standard says it can be 24.00, 30.00, and 60.00, but, from a practical standpoint, U.S. broadcast HD is 29.97 or 59.94 fps, and, if you want that “look of film” (which I’ve got to call it on account of “film look” being someone’s trademark), maybe you’ll shoot 23.976.
Now, then, there are 60 seconds in a minute and 60 minutes in an hour, so, if you’re at 29.97 fps, and you count every frame as if you were at 30, your timecode clock will run slow by 3.6 seconds per hour, which can add up pretty quickly in a TV station. That’s why drop-frame timecode was invented. You just skip the first two frame numbers of every minute except every tenth minute (no actual frames get thrown away), and your timecode clock comes pretty danged close to matching astronomical or atomic time.
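That counting rule turns into a short frame-count-to-timecode conversion. Here’s a sketch in Python (a hypothetical helper I’ve named myself, written for the 29.97 case; for 59.94 the same structure works with the drop count doubled):

```python
def frames_to_dropframe(total_frames):
    """Convert a 29.97 fps frame count to SMPTE drop-frame timecode.

    Drop-frame skips timecode numbers 00 and 01 at the start of every
    minute except minutes divisible by 10. No frames are discarded;
    only the labels jump.
    """
    FPS = 30    # nominal rate used for counting
    DROP = 2    # frame numbers skipped per dropped minute
    FRAMES_PER_MIN = FPS * 60 - DROP               # 1798 in a dropped minute
    FRAMES_PER_10MIN = FRAMES_PER_MIN * 10 + DROP  # 17982 (minute 0 has no drop)

    tens, rem = divmod(total_frames, FRAMES_PER_10MIN)
    if rem < FPS * 60:
        # First minute of the 10-minute block: nothing dropped
        minute_in_block = 0
        disp = rem
    else:
        extra, frame = divmod(rem - FPS * 60, FRAMES_PER_MIN)
        minute_in_block = extra + 1
        disp = frame + DROP  # skip over frame numbers 00 and 01

    minutes = tens * 10 + minute_in_block
    hh, mm = divmod(minutes, 60)
    ss, ff = divmod(disp, FPS)
    return f"{hh:02d};{mm:02d};{ss:02d};{ff:02d}"

print(frames_to_dropframe(107892))  # one real hour at 29.97 fps
```

The semicolons are the conventional way a display flags drop-frame, and one real hour of frames (29.97 × 3600 = 107,892) comes out to exactly 01;00;00;00, which is the whole point.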
If you’re at 59.94 fps, you just drop twice as many frames—same principle, simple math. But answer me this: If you’re shooting at 23.976 fps, what do you drop for drop frame? You ponder that one for a while. Let SMPTE know when you’ve got it figured out. Digital HDTV is great. I’d just like correctly shaped folks who speak in sync and don’t disappear every now and then.