No more hum bars

Countries are making a mistake by not adopting 720p/60 and 1080i/60 as the video frame rate standards

After the Japanese surrendered in 1945, they switched to driving on the left side of the road. Driving on the left allowed them to protect their burgeoning vehicle industry, with Detroit unable to switch the steering wheel position.

This same industrial-political mindset was reflected in the TV broadcasting industry. When I started working in broadcasting, each country was vying to protect its own electronics industry from vendors outside its normal economic circles. But one decision about broadcast video standards was not political: frame rate. Had the frame rate not been locked to the country's power-grid frequency, hum bars would have been a certainty on early receivers.
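The reason a mains-locked frame rate tamed hum bars comes down to simple beat-frequency arithmetic, sketched below (the function name and figures are illustrative, not from the column):

```python
# Power-supply ripple at the mains frequency modulates picture brightness.
# If the field rate equals the mains frequency, the resulting hum bar is
# stationary on screen; otherwise it rolls at the difference (beat) frequency.

def hum_bar_roll_hz(field_rate_hz: float, mains_hz: float) -> float:
    """Beat frequency between the field rate and the mains ripple."""
    return abs(field_rate_hz - mains_hz)

# A 60-field signal displayed on a 50 Hz grid: the bar rolls 10 times a second.
print(hum_bar_roll_hz(60.0, 50.0))  # 10.0
# A 50-field signal on the same grid: the bar stands still, far less visible.
print(hum_bar_roll_hz(50.0, 50.0))  # 0.0
```

A stationary bar is easy for the eye to ignore; a bar crawling through the picture ten times a second is not, which is why early standards tied field rate to the local grid.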

Power supplies improved over the years, and by the early 1960s, most video operations were running with sync pulse generators on crystal lock rather than line lock. Visible hum bars were gone, barring a receiver fault.

The pre-World War II 405/50 standard in the UK was revived after the war, and although there was an opportunity to choose 525/60 for color operations, 625/50 was selected. The decision to go PAL instead of NTSC was a European political statement. Some people in BBC engineering sacrificed their careers by backing NTSC. France, always an oddball in the business world, went SECAM with the Soviet Union as its buddy.

There were benefits with PAL in the early days. Color phase corrections in the receiver were automatic, to a certain extent. However, the benefits didn't last long because improvements in circuit electronics were arriving just about everywhere in the chain. SECAM was a fringe-lunatic standard that made studio equipment more expensive and complicated; even mixing two of its FM chrominance signals together was a problem.

But standardizing on PAL was not the last move by the UK to protect its own, now non-existent, receiver industry. The UK also moved the aural carrier 0.5 MHz further out than the rest of Europe, so the studio video signals were unchanged but the transmitted signal became PAL-I rather than PAL-B/G. South America didn't want to be left out of the signal race, so it created the PAL-M and PAL-N standards.

None of this national individualism worked, naturally. Japan was the first of the Asian manufacturers to show it was hardly fazed by the protectionism.

One would have hoped that the fight for business protection would have been seen as a disaster and, ultimately, more expensive for everyone. But there is now a great danger, perhaps for a different reason, of prolonging the standards problem. And, again, it is Europe that is missing the chance.

Europe is beginning to endorse HDTV. The push comes with the availability of larger displays at reasonable prices, which will allow the additional resolution to be appreciated. European engineers remember the disaster of the HD-MAC standard created many years ago and adopted by no one, so they are cautiously pushing for 720p and 1080i. The pressure is coming from the pay-per-view providers rather than the national broadcasters, and that is probably the right way. Those with the more popular content should make the gamble — not that you have ever read about content in this column.

But now Europeans are perpetuating a mistake. They are going for 720p/50 and 1080i/50. Hollywood has already sold out with 24-frame standards when it had the perfect opportunity to improve motion performance over film; now Europe is selling out too.

There is absolutely no reason why the standards adopted should not be 720p/60 and 1080i/60. It would get rid of the flicker we all see for the first few days of watching European television, and it would make simple international program-sharing a reality at last. Now we will have to wait (10 years, maybe) for second-generation HDTV, when the United States will probably go for 1080p/60. And that's no ho-hum.

Paul McGoldrick is an industry consultant based on the West Coast.
