In a short amount of time, file-based content delivery has taken the industry by storm.
Five years ago, file-based content delivery was making inroads with broadcasters, but it was still a somewhat unusual way to deliver video. Now it is the norm. Thanks to the deployment of key enabling technologies, less expensive networks and a rise in file-based delivery to the consumer, the approach is coming of age.
For decades, content was delivered to the broadcaster by one of two methods. Network programming was delivered on a network satellite, and syndicated programming and commercials were delivered by overnight courier. The network programming feed made sense because there was a single source distributing the same content to many stations. The overnight courier was an efficient way to do point-to-point delivery, where many different commercials were being sent from many originating points to many different broadcasters. While satellite transponder and shipping costs were not insignificant, they were, and in many cases they still are, the most cost-effective way to distribute content.
Almost as soon as the Internet came into being, broadcasters and content distributors began looking into ways to deliver content over this new medium. But financial and technical hurdles stood in the way. When it was developed, the Internet excelled at moving text from one place to another. First, because even a small video clip is many times larger than the average text file, the cost of bandwidth was one of the major stumbling blocks to moving video over IP. Second, there was the last-mile problem: you might be able to get high-speed connectivity between two cities, but it could be impossible to get a high-speed link over the last mile from a telephone company central office to the broadcast facility.
Today, bandwidth costs are falling, and while cost may still determine whether a particular project moves forward, in many cases affordable connectivity for video distribution is available. Let us look at some key enabling technologies that have made file-based content delivery practical.
Digital video
It almost goes without saying that digital video technology is the keystone of file-based content delivery. From where we sit today, digital video is a given. But it was not clear at the time that the industry would be successful in creating a single, ubiquitous digital video standard. Without its development, file-based content delivery would be impossible.
Video and audio compression
Without the development and deployment of interoperable, efficient video and audio compression, file-based content delivery would be just a dream. Compression ratios of 10:1, then 100:1 and now beyond have allowed broadcasters to lower their transmission costs with an acceptable impact on quality. The industry has been extremely well served by the development of compression and would not be where it is today without it.
Extensible Markup Language
A comparatively new invention is Extensible Markup Language (XML). XML has allowed broadcasters and content distributors to include metadata about the files they are sending. The industry has recognized what a critical role metadata plays in the content distribution chain. The adoption of XML allows us to use many of the software tools developed in the computer industry for the construction and manipulation of metadata.
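As a small illustration, an XML metadata sidecar for a delivered clip might look like the snippet below, read here with Python's standard-library ElementTree. The element names (title, house_id, duration) are hypothetical examples, not drawn from any particular broadcast metadata schema.

```python
# Minimal sketch: reading hypothetical clip metadata with the Python
# standard library's ElementTree XML parser.
import xml.etree.ElementTree as ET

xml_doc = """
<clip>
  <title>Evening News Open</title>
  <house_id>NEWS-0427</house_id>
  <duration unit="seconds">30</duration>
</clip>
"""

root = ET.fromstring(xml_doc)
print(root.findtext("title"))             # Evening News Open
print(root.find("duration").get("unit"))  # seconds
```

Because XML is self-describing, the same off-the-shelf parsers work whether the metadata travels inside a wrapper or alongside the media file.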
Material eXchange Format
The Material eXchange Format (MXF) is a wrapper that allows a content delivery service to wrap together video, audio, closed captioning and metadata in a single container. The container can be streamed from the content delivery service to the broadcaster and played directly, or it can be saved as a file. The power of MXF is in its object model — the way metadata is arranged in the wrapper. A receiving application can decode the metadata in the MXF header and quickly determine if the content meets delivery specifications without detailed analysis of the contents of the file. Furthermore, the items in the file and the way they are referenced in the metadata have been standardized to increase interoperability.
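The quick header check described above can be sketched as follows. Real MXF parsing requires a KLV decoder, and the field names and delivery-spec values here are hypothetical; the point is only that a receiver can compare already-decoded wrapper metadata against a specification before touching the essence.

```python
# Hedged sketch: validate decoded wrapper metadata against a delivery
# spec. Field names and values are illustrative, not from the MXF standard.
DELIVERY_SPEC = {"codec": "MPEG-2", "width": 1920, "height": 1080}

def meets_spec(header_metadata: dict) -> bool:
    """True if every required spec field is present with the right value."""
    return all(header_metadata.get(k) == v for k, v in DELIVERY_SPEC.items())

print(meets_spec({"codec": "MPEG-2", "width": 1920,
                  "height": 1080, "duration": 1800}))   # True
print(meets_spec({"codec": "DV", "width": 720, "height": 480}))  # False
```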
Error correction
Error correction has been around for many years. In fact, error correction has been available for files sent across the Internet since the Internet's inception. However, the industry is now adopting standardized ways of implementing error correction that allow interoperability among different implementations.
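A toy example shows the forward-error-correction idea: one XOR parity block protects a group of data packets, so a single lost packet can be rebuilt with no retransmission. Production systems use stronger codes (Reed-Solomon and similar), but the principle is the same.

```python
# Toy FEC: one XOR parity block over equal-length packets lets the
# receiver rebuild any single lost packet without asking for a resend.
def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

packets = [b"PKT1", b"PKT2", b"PKT3"]
parity = xor_blocks(packets)          # sent alongside the data packets

# Suppose packet 1 is lost in transit; XOR the survivors with the parity.
recovered = xor_blocks([packets[0], packets[2], parity])
print(recovered)  # b'PKT2'
```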
Peer-to-peer networking
Peer-to-peer (P2P) networking has been used on the consumer side of the Internet for quite some time. P2P gets around a number of issues with broadcasting over the Internet. It can be difficult to send the same file to hundreds or thousands of receivers simultaneously, because traditionally, the sender must establish a one-to-one conversation with each receiver. Clearly, a traditional broadcaster would be overwhelmed if it had to electronically talk to each receiver. P2P allows portions of a file to be sourced from any number of senders, so the load of sending the file can be distributed beyond a single source. Several professional implementations of P2P have been created, and adoption of this technology on the broadcast side is increasing.
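The chunk-from-anywhere idea can be sketched in a few lines. The "peers" here are plain dictionaries standing in for remote hosts, not a real network protocol; the point is that each fixed-size chunk may come from any peer that holds it, and the receiver reassembles by offset.

```python
# Sketch of P2P delivery: split a file into chunks, fetch each chunk
# from whichever simulated peer holds it, and reassemble in order.
CHUNK = 4
original = b"FILEBASEDDELIVERY!"
chunks = {i: original[i:i + CHUNK] for i in range(0, len(original), CHUNK)}

# Two peers, each holding a different subset of the chunks.
peer_a = {k: v for k, v in chunks.items() if k % 8 == 0}
peer_b = {k: v for k, v in chunks.items() if k % 8 != 0}

reassembled = b"".join(
    (peer_a.get(offset) or peer_b[offset]) for offset in sorted(chunks)
)
print(reassembled == original)  # True
```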
One-way IP transmission
Because of the unavailability and high cost of connectivity, satellite transmission can still be the best way to reach some facilities. For file-based distribution over satellite, engineers came up with clever ways to modify existing IP technologies. IP was designed on the assumption that there would be a two-way conversation between the sender and the receiver. The predominant method of error recovery requires that the receiver notify the sender that a packet has been lost so that the sender can retransmit the lost information. Without a return path, this and many other protocols designed to work over IP will fail. A few smart people have figured out how to make file-based delivery work using IP either without a return path or with only a low-speed return path. Without these modifications, file-based delivery of content over one-way links would be impossible.
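One classic no-return-path technique is the data carousel: the sender loops over the file's packets indefinitely, and a receiver that missed a packet on one pass simply picks it up on a later pass. The sketch below simulates this with random loss; real one-way protocols typically combine a carousel with forward error correction.

```python
# Sketch of a data carousel: the sender repeats every packet forever,
# so a receiver with no return path eventually collects the whole file.
import itertools
import random

random.seed(1)                              # deterministic simulated loss
packets = {n: f"payload-{n}".encode() for n in range(5)}

received = {}
for n in itertools.cycle(packets):          # endless carousel of packet numbers
    if random.random() < 0.3:               # simulate a 30% loss rate
        continue
    received[n] = packets[n]
    if len(received) == len(packets):       # file complete; stop listening
        break

print(sorted(received) == sorted(packets))  # True
```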
Large-scale fiber deployments
Over the last 15 years, the telecom industry has laid tens of thousands, perhaps hundreds of thousands, of miles of fiber-optic cables. No one knows exactly how much; the carriers keep this a closely guarded secret. At this point, fiber passes by most homes and businesses in metropolitan areas, and fiber is installed in all major league and most college sports venues. Some, but not all, last-mile problems have been solved by a massive investment in fiber by the telecom industry.
File-based content delivery is increasing. Of course, the effective bandwidth of an overnight courier truck loaded with tapes is difficult to beat. Even so, demand for file-based delivery is growing.
Several factors are driving this growth. First, file-based delivery allows just-in-time delivery of content. For example, if an advertiser wants to change a commercial, it can do so closer to air using file-based delivery technology. Second, there are costs associated with filling the overnight courier truck with tape that go beyond the shipping itself; the costs of running a national tape duplication facility are significant. Finally, depending on the technology used, file-based content delivery ensures that the copy created at a broadcaster's facility is either a bit-perfect copy of the original, or it does not arrive at all. This eliminates tapes that are rejected for duplication-related technical errors.
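The "bit-perfect or not at all" property usually rests on a published checksum: the receiver accepts the file only if its own digest of the received bytes matches the sender's. SHA-256 is one common choice; the payload below is, of course, just a stand-in for real media.

```python
# Sketch of delivery verification: accept the file only when the
# receiver's SHA-256 digest matches the digest the sender published.
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

sent = b"30-second spot, final cut"
published_digest = sha256(sent)

received = sent                        # a clean transfer
assert sha256(received) == published_digest

corrupted = sent[:-1] + b"?"           # a single damaged byte
print(sha256(corrupted) == published_digest)  # False
```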
File-based content delivery is not a panacea. Traditional delivery mechanisms will probably coexist with file-based delivery for quite some time. But the technical advances described in this article coupled with falling prices for bandwidth mean that file-based content delivery is here to stay.
Brad Gilmer is executive director of the Video Services Forum, executive director of the Advanced Media Workflow Association and president of Gilmer & Associates.
Send questions and comments to: email@example.com