Reader Feedback

Editor: Barb Roeder's recent article, “Getting Started with Streaming Media,” contains the following paragraph:

“Today's streaming media can be delivered using several different methods. True streaming media requires a specialized server such as RealServer, Windows Media Server or the QuickTime Streaming Server (QTSS). Your content is then delivered and viewed in real time, whether it originates from a live event or an archived media file. The caveat to true streaming is that the bit rate of the movie must match the bandwidth of the connection or buffering will occur and playback will be interrupted.”

I have been streaming Real Player video files from ordinary servers (including a free server) for some time now. As long as my connection speed is adequate, I don't experience buffering or playback interruption. I have heard this claim made about Real Player so many times; even a presenter at an Adobe seminar insisted on it, maintaining that what I was already doing was impossible!

I suspect Real Player wants people to believe this, for obvious reasons, and has perpetuated the myth. You can't blame them for trying, but if you look hard enough in the Real Producer manual it is quite clear that the only drawbacks of using a non-dedicated server are that you lose some of the more refined capabilities of Real Server, and that buffering and interrupted playback are “more likely,” but hardly inevitable. I have an ADSL line, and I can watch an hour-long video presentation I created and encoded for 28K, 56K or faster connections and never experience an interruption. When I had a 56K modem, sometimes I did and sometimes I didn't, including on dedicated servers.

So I think it's true to an extent, but it's hardly the black-and-white issue most people seem to think. It's more a matter of your connection speed than of whether the server is dedicated.
Steve Sakellarios

Barb responds:

Any video can be served from an ordinary web server, an approach sometimes referred to as HTTP streaming. HTTP is the protocol used to transport web pages over the Internet. The specialized streaming servers mentioned in the article use Real Time Streaming Protocol (RTSP), at least as a first choice, to deliver streaming media. RTSP has lower overhead and less error correction, so real-time delivery is better. The error correction and feedback communication built into HTTP guarantee the delivery of the data, just not the timing of that data. Hence, we wait for web pages if the connection is slow, but we always get all the information unless the connection is broken altogether. If the player has to wait for packets because of poor connection performance, buffering occurs during playback.
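The timing point above can be sketched in a few lines of code. This is a purely illustrative simulation (not any vendor's buffering algorithm): playback stalls whenever the link cannot deliver bits as fast as the player consumes them, and never stalls when the link has headroom.

```python
# Illustrative sketch only: a player that prebuffers a few seconds of
# video, then plays one second per wall-clock tick while the link keeps
# downloading. A stall happens whenever the buffer runs dry.

def count_buffer_stalls(stream_kbps, link_kbps, duration_s, prebuffer_s=5):
    """Simulate one-second ticks; return how many ticks stalled."""
    buffered_s = float(prebuffer_s)   # seconds of video already downloaded
    played_s = 0.0
    stalls = 0
    while played_s < duration_s:
        # Each wall-clock second, the link delivers this much video time.
        buffered_s += link_kbps / stream_kbps
        if buffered_s >= 1.0:
            buffered_s -= 1.0         # play one second of video
            played_s += 1.0
        else:
            stalls += 1               # buffer ran dry: playback pauses
    return stalls

# A 56Kb/s stream over a 1.5Mb/s link never stalls; over a congested
# 40Kb/s link it stalls repeatedly, whatever protocol carries it.
print(count_buffer_stalls(56, 1500, 600))      # 0
print(count_buffer_stalls(56, 40, 600) > 0)    # True
```

This matches both letters: the deciding factor in the model is the ratio of link speed to stream bit rate, not the kind of server at the far end.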

As Steve says, “So long as my connection speed is adequate” there is no buffering during streaming from an HTTP server. With an ADSL line capable of up to 6Mb/s, chances are you would never see poor performance, because most streams are designed for 300Kb/s to 500Kb/s even in the broadband market.
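The arithmetic behind that claim is worth making explicit. Using the figures above (the `headroom` helper is just for illustration):

```python
# Back-of-the-envelope headroom check using the figures quoted above.

def headroom(link_kbps, stream_kbps):
    """How many times faster the link is than the stream's bit rate."""
    return link_kbps / stream_kbps

# A 6Mb/s ADSL line vs. a 500Kb/s broadband-market stream: 12x headroom,
# so even a plain HTTP server rarely falls behind real time.
print(headroom(6000, 500))        # 12.0

# The same line vs. a 56Kb/s modem-rate stream: over 100x headroom.
print(headroom(6000, 56) > 100)   # True
```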

So how much ERP do I really need?

Aloha, I read the article “Engineering KICU's path to DTV” by Jim Boston and David Lingenfelter. I have a simple question: What is the approximate ratio of DTV to NTSC ERP needed to maintain existing city grade coverage patterns for UHF stations?
Ken Wooley

Jim responds:

Ken,

The FCC used the Grand Alliance guidelines in assigning DTV power, setting each station 12dB below its NTSC power level for UHF-to-UHF transitions, but didn't apply the rule to VHF-to-VHF moves. The difference in power between NTSC and DTV reflects the difference in the signal-to-noise ratios at which viewing becomes impaired. It is generally recognized that NTSC viewing becomes objectionable at 28dB S/N. With DTV, the signal hits the error-cliff edge at a little above 15dB, a difference of approximately 12dB to 13dB.
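A quick worked check of those figures, using the standard decibel-to-power conversion:

```python
# Worked check of the dB figures above: the gap between the two
# impairment thresholds, and the linear power ratio a 12dB cut implies.

def db_to_ratio(db):
    """Convert a decibel power difference to a linear power ratio."""
    return 10 ** (db / 10)

ntsc_snr_db = 28    # NTSC viewing becomes objectionable below this S/N
dtv_cliff_db = 15   # approximate DTV error-cliff threshold

delta_db = ntsc_snr_db - dtv_cliff_db
print(delta_db)                  # 13, i.e. in the 12- to 13dB range cited

# A 12dB reduction is roughly a 16x cut in transmitted power.
print(round(db_to_ratio(12)))    # 16
```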

One reason the FCC is concerned with UHF is that it increased the time factor for considering UHF coverage from FCC (50,50) to FCC (50,90), meaning 50 percent of sites receive the signal 90 percent of the time. As you know, UHF stations tend to have larger grade A contours than VHF stations because of the high powers usually radiated, but much smaller grade B contours. (Note: Grade A and B coverage is not used in DTV measurements.) Also, a UHF station's radio horizon is usually about the same as its optical horizon, whereas VHF signals can have RF horizons well beyond the optical. Thus the FCC tended to be a little more liberal with VHF power levels.
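To give a feel for the horizon point, here is a small sketch using the common 4/3-effective-earth-radius approximation, under which the radio horizon extends roughly 15 percent beyond the optical one. The coefficients and the 300m tower are illustrative assumptions, not figures from the article.

```python
import math

# Common rule-of-thumb horizon distances (km) for an antenna h meters
# above average terrain, using the 4/3-earth approximation. Illustrative
# assumptions only; actual propagation depends on terrain and refraction.

def optical_horizon_km(h_m):
    return 3.57 * math.sqrt(h_m)

def radio_horizon_km(h_m):
    # The 4/3 effective earth radius pushes the RF horizon out ~15%.
    return 4.12 * math.sqrt(h_m)

# For an assumed 300m tower:
print(round(optical_horizon_km(300), 1))   # 61.8 km
print(round(radio_horizon_km(300), 1))     # 71.4 km
```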
Jim Boston
The Evers Group