I attended a SMPTE Section meeting in the U.K. recently and got involved in a discussion with several broadcasters about our road maps for the future. The discussion focused primarily on the potential for implementing UHD in our studio environment, regardless of whether we will ever broadcast it terrestrially. We discussed the many aspects of UHD, including larger color space, high dynamic range and higher frame rates, in addition to the increased resolution. I talked about how at Iowa Public Television we work in an uncompressed environment when creating shows in our studio, and how that has simplified our infrastructure. But as I look forward to implementing UHD, even at a relatively small facility, the laws of physics and the laws of economics seem to be conspiring against the concept of the uncompressed backbone.
In addition, next-generation broadcasting will almost certainly be based on IP packetized data transmission to allow for tighter integration of terrestrial broadcasting within portable, mobile and fixed devices that will be acquiring content from broadband services of all types. That tight integration is not just at the receiver, by the way; we have to realize that in the future, the content we create will not be delivered exclusively by our over-the-air services. Ideally, my station must be able to create content in a manner that allows for its seamless distribution over a myriad of different methodologies and systems.
IT BOILS DOWN TO MONEY
This is where it becomes so incredibly challenging to understand and make informed decisions about the networks involved in the acquisition, creation and distribution of content and the interaction between these networks.
The decisions for acquisition and creation are pretty straightforward and essentially boil down to how much money you can invest. As I said earlier, we were able to construct an infrastructure at IPTV that can handle uncompressed HD and any lower data rate. We didn’t spend a lot of time worrying about error correction, concealment or capacity. We designed the infrastructure with a tremendous amount of headroom and capacity and were meticulous in how the plant was put together. We put in the time, effort and money on the front end, and the dividend is that the signals traveling through the facility are kept pristine.
UHD shown in comparison with other popular digital video formats
The transmission aspect is a little trickier in that we need to compress for ATSC broadcast, and do it in such a way that the quality of the video and audio content is not degraded. Strictly speaking, that is an impossible task, since no form of lossless compression can provide the reduction necessary to squeeze a 1.5 Gbps video stream and its associated 5.1-channel audio stream into the data bandwidth available in a 6 MHz broadcast channel. Couple that with the fact that my station carries an HD stream, two SD streams, an audio-only stream and some datacasting applications, and obviously some compromises are required.
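The arithmetic makes the point plainly. Taking the 1.5 Gbps figure above, and assuming the roughly 19.39 Mbps of transport payload an ATSC 6 MHz channel provides (a standard figure, though not stated in the text), the required reduction works out to roughly 77:1, and that is before the channel is shared with the SD, audio and datacasting services:

```python
# Back-of-the-envelope sketch of the compression ratio ATSC demands.
# 19.39 Mbps is the nominal ATSC 8-VSB payload for a 6 MHz channel.
UNCOMPRESSED_BPS = 1.5e9    # uncompressed HD video stream
ATSC_PAYLOAD_BPS = 19.39e6  # total transport payload of the channel

ratio = UNCOMPRESSED_BPS / ATSC_PAYLOAD_BPS
print(f"Required compression ratio: about {ratio:.0f}:1")
# And that assumes one service gets the whole channel; with an HD
# stream, two SD streams, audio and datacasting sharing it, the
# effective ratio for the HD service is higher still.
```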
Add to that the fact that unlike our internal networks, where we have complete control over the environment, once the signal leaves the antenna we have no control over the delivery network, and the challenges grow. Pile on the idea that in addition to the terrestrial delivery network we currently rely on, we will also be relying on wired and wireless networks over which we have even less control, and that the audience using these services is the one that is growing. It becomes pretty mind-numbing to try to figure out how to ensure that every viewer accessing the content is getting the best quality possible.
In the professional space we manage the quality and reliability of our networks through service level agreements (SLAs) that specify the characteristics of the service and any guarantees associated with it. SLAs can be a real challenge for those of us who have been in the business a long time, because the terminology and metrics used typically describe data packets, not programs. Our interest is in the content, which ultimately means the information carried within all of those packets, and deciding what constitutes an acceptable packet-loss or jitter figure is difficult.
A modern compression codec uses both spatial and temporal compression algorithms to create a stream. That stream is then packetized for transmission, but because of the nature of the compression engines used, the value of the data varies from packet to packet, so the impact of packet loss depends on which packets are lost as well as how many.
In an extreme example, if all of the packets that are lost are null packets, the impact to the content is negligible, but if a few packets that are lost contain the data necessary to recreate the video or audio service, the impact is enormous. I am obviously overstating the case to make the point that it is very difficult to use data metrics as a way of measuring the quality of delivered content.
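The asymmetry can be made concrete with a short sketch. The severity weights below are purely hypothetical, invented for illustration, not a real quality metric; the point is only that two loss events with identical packet counts can have wildly different content impact. (In an MPEG transport stream, null packets really are discardable stuffing, while a lost reference frame or program table can break the service.)

```python
# Hypothetical severity weights per packet type -- illustrative only.
# Losing a null (stuffing) packet costs nothing; losing a packet
# carrying a reference frame or the program tables can break decoding.
SEVERITY = {
    "null": 0.0,      # stuffing, discarded by the decoder anyway
    "audio": 1.0,
    "b_frame": 1.0,   # damages a single frame
    "p_frame": 3.0,   # errors propagate until the next I-frame
    "i_frame": 10.0,  # reference frame: the whole GOP is damaged
    "psi": 10.0,      # program tables: the decoder may lose the service
}

def content_impact(lost_packet_types):
    """Sum the illustrative severity scores of a set of lost packets."""
    return sum(SEVERITY[t] for t in lost_packet_types)

# Same packet-loss count, very different impact on the content:
print(content_impact(["null"] * 5))        # 0.0  -- negligible
print(content_impact(["i_frame", "psi"]))  # 20.0 -- service-breaking
```

A raw packet-loss percentage in an SLA treats both of those events identically, which is exactly why data metrics alone are a poor proxy for delivered content quality.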
The reason this is so important is that we are seeing this technology move into the professional space, and we need to educate ourselves, the manufacturers and the service providers regarding what we need and why. If I create an IP network within my facility, I will again have complete control over it and will design it with the capacity and overhead necessary to ensure reliability. If I have to link that network to others outside my control in order to use cloud services, then I had better fully understand the capabilities and limitations of the services being offered, to ensure the quality not only of the packets on the network but of the content included in them.
Bill Hayes is the director of engineering for Iowa Public Television. He can be reached via TV Technology.