Today's broadcasters demand flexibility, efficiency and scalability
JOHNSTON, IOWA: One positive effect of the poor economy over the last few years is that, for lack of funds, IPTV has been unable to complete its conversion to DTV. I call that a positive effect because, early in the conversion, much of the hardware being sold was close to prototype and exhibited some odd behavior.
During the last few years, the manufacturers have had time to refine designs and improve code algorithms. As a result, the hardware available today, including MPEG-2 encoders, is much better developed, more stable and efficient. Five years ago, when I came to IPTV, we were talking about a four-stream digital multicast and some HD. Now there are stations sending an HD stream and an SD stream and maybe a few barker channels as well as data. So the technology has moved ahead as all things computer-based do.
I won't quote Moore's Law in this situation because it really doesn't apply, and every time I hear it used to explain the trend in the computer industry, it makes me cringe. I read an interview once with Gordon Moore in which he explained that Moore's Law was an observation that he made in 1965 about the exponential growth in the number of transistors on silicon. At the time, Intel produced RAM chips and he determined that they would become a low-profit commodity. As a result, Intel got out of the RAM business. Moore's Law is a frightening trend that continues today and results in almost everything becoming a disposable commodity. Moore's Law and Murphy's Law will intersect some time in the future and the collision will be horrendous. But I digress.
A SERVER NAMED BOB
A long time ago, I wrote an article for Digital Journal called "A Server Named Bob," in which I recounted being at a restaurant pondering what IPTV should do for video servers in the DTV era when a waiter walked up and said, "I'm Bob, and I'll be your server."
If only it were that easy. For the last year or so, the team at IPTV has been looking at DTV encoders. As concerned as we were about making the right decision on servers, encoders are even more critical to the process and more difficult to pin down. The IPTV plan is to do multicast during the day, high definition during primetime and opportunistic datacasting all the time. Seems pretty simple, so how hard can it be?
Well, we've started writing the specifications for an encoder RFP, and it is a lot more complex than I ever thought it would be. The idea of replacing hardware every two to three years works when you're talking about cheap PCs. But when you're looking at an investment of several hundred thousand dollars, not many broadcasters see that as a three-year investment. It is critical, therefore, to purchase hardware that is going to last and can be upgraded. One of the requirements we are specifying is that all upgrades be software-based. The idea of having to pull encoders to handle updates (and you know there will be lots of them) is out of the question.
One critical consideration is the barker channels I mentioned earlier.
One of the realities that emerged once receivers started appearing in the marketplace was their lack of tolerance for dynamic allocation of streams within a digital channel. Once a DTV tuner has gone through its boot-up process, it does not like to see changes in the number of streams associated with the RF channel.
This problem was anticipated in the ATSC standard, which defines the Directed Channel Change Table (DCCT). The theory is that the station, through PSIP, can instruct the receiver to switch virtual channels within an RF channel.
A station switching from a multicast to an HD program can instruct the receivers watching the multicast to change to the HD stream. Unfortunately, the DCCT implementation in a receiver is optional and as of this time, I am not aware of any CE manufacturer that is implementing the DCCT.
According to ATSC A/65B, "If Directed Channel Change is not supported by a DTV receiver, there is no visible impact on the main broadcast program perceived by the viewer."
However, when the program that the receiver is tuned to and the virtual channel that is associated with it simply vanish, receiver reaction is unpredictable and varies depending on the hardware. Some receivers simply lock up and the consumer must reboot; not exactly the kind of transition that we as broadcasters want the viewer to experience.
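The intended behavior is easy to sketch. Here is a toy model of a directed channel change; the class and field names are my own invention for illustration, not the A/65 bit-stream syntax:

```python
# Toy model of a directed channel change. Field names are hypothetical,
# not the A/65 table syntax.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DirectedChannelChange:
    from_channel: str  # virtual channel being vacated, e.g. "11.2"
    to_channel: str    # virtual channel where the program continues, e.g. "11.1"

def retune(tuned: str, dcc: Optional[DirectedChannelChange]) -> str:
    """A receiver that honors the DCCT follows the redirect; one that
    ignores it simply stays put, as the standard allows."""
    if dcc is not None and dcc.from_channel == tuned:
        return dcc.to_channel
    return tuned
```

A compliant receiver tuned to the multicast on "11.2" would be steered to the HD stream on "11.1"; a receiver that ignores the table stays where it is — which is precisely the problem when "11.2" then vanishes.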
So how do we deal with it? With the barker channel, or trickle stream. The idea here is that, as far as the tuner is concerned, the number of virtual channels doesn't change; only the data rate within each channel does. So the SD and HD encoders have to be able to move dynamically from full bit-rate encoding to a very low bit rate. The figures I have seen quoted for barker channels are in the neighborhood of 300 kbps to 1.5 Mbps, which is enough to put up a simple graphic instructing the viewer that the program content is on another virtual channel.
The problem here is that during primetime, when everyone is running HD, a viewer surfing through channels may go through 20 or 30 barker channels.
In addition, if a station is running a four-virtual-channel multicast during the HD broadcast, it is giving up at least 1.2 Mbps for barker channels. Giving up 10 percent of the data capacity is a fairly steep price to pay for what amounts to a trouble slide. I certainly hope to see something along the lines of 100 kbps or less in the future. Better yet, I'd like to see the DCCT implemented so that the full channel can be used for meaningful content.
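The arithmetic is worth doing once. Assuming the nominal 19.39 Mbps 8-VSB transport payload and the low end of the quoted barker range:

```python
# Back-of-the-envelope cost of parking four multicast services on
# barker channels during an HD broadcast.
ATSC_PAYLOAD_BPS = 19_392_658  # nominal 8-VSB transport payload rate
BARKER_BPS = 300_000           # low end of the quoted barker range
N_BARKERS = 4                  # four-virtual-channel multicast parked

overhead_bps = N_BARKERS * BARKER_BPS               # 1,200,000 b/s given up
share_of_payload = overhead_bps / ATSC_PAYLOAD_BPS  # about 6% of raw payload
```

Against the raw payload that is about 6 percent; measured against the capacity actually left for program video once audio, PSIP and other fixed services are carved out, the cost climbs toward the 10 percent figure above.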
Another significant concern regarding the encoders is how they play with others. In theory, you'd like each encoder to dynamically analyze the content that it is encoding and process it, using the minimum data capacity necessary to supply the specified quality of service.
The encoders then would feed their signals into a statistical multiplexer, which combines them along with data into an MPEG-2 transport stream. However, in order for this to work, the multiplexer must be able to talk with the encoders and the datacast system to determine the amount of space available for opportunistic data.
If you've ever looked at the data capacity of a transport stream, you've no doubt noted that the number of null packets available on a frame-by-frame basis is very dynamic, and huge swings in the available capacity are the norm. The simple solution would be to set all the encoders, including the "opportunistic" one, at a fixed rate, and as long as the sum of all the encoder capacities doesn't exceed the capacity of the transport stream, everything is fine.
Unfortunately, this leaves a lot of unused packets in the stream that are not benefiting the viewer because the content stream doesn't need them for the quality of service selected, and the datacast can't use them because their availability cannot be guaranteed. Statistical multiplexing should allow the station to send virtually no null packets by centrally managing all the streams and dynamically allocating the capacity and assigning the packets to whichever services need them.
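The allocation logic can be sketched in a few lines. This is my own simplification, not any vendor's algorithm: each interval, content streams are granted what their encoders request, opportunistic data absorbs the slack, and null packets would appear only when nothing can use the leftover capacity.

```python
# Minimal per-interval statistical-multiplexing sketch (a simplification,
# not a real statmux implementation).
def statmux(demands, capacity_bps):
    """Grant each content stream its requested rate (never exceeding the
    transport stream), then hand all remaining capacity to the datacast."""
    granted, remaining = {}, capacity_bps
    for service, need in demands.items():
        take = min(need, remaining)  # cap grants at what is left
        granted[service] = take
        remaining -= take
    granted["opportunistic_data"] = remaining  # datacast soaks up the slack
    return granted

# During an HD show the video encoder asks for more; the datacast gets less.
alloc = statmux({"hd": 14_000_000, "audio_psip": 1_000_000}, 19_392_658)
```

Every bit of the 19.39 Mbps payload ends up assigned to some service, which is the whole point: the null packets a fixed-rate scheme would waste become guaranteed-deliverable data.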
This can be, and is being, done by most manufacturers to varying degrees, but we all know that nothing comes without a cost. To achieve this level of control, the services manager must have extensive information about the content as it is being encoded so that it can make the allocations. That takes time, and time is latency; usually, the more dynamic and flexible the service, the more delay through the system. For the most part, fixed delay does not create real problems beyond the live-broadcast issues we have been dealing with since satellite newsgathering began. But it does have to be recognized and planned for in the design of the system.
This is certainly not the exhaustive list for encoders and doesn't even touch on the MPEG-4 and WM9 encoders that we also need to include in our study. But it does give us a feel for the complexity and the tight relationship between the systems. I wish my server Bob could help me now.