I have been reading a recent report by the EBU on peer-to-peer (P2P) Internet file delivery. Broadcast (one-to-many distribution) from a transmitter has a fixed cost no matter how many receivers are tuned in, but the Internet is generally unicast, or one-to-one delivery. That means the media server must run a separate streaming process for every single Internet browser viewing the stream. Typically, a server can handle a few thousand streams, so a large webcast to millions of viewers potentially needs hundreds or thousands of servers, plus all the associated firewalls at the server farm and routers throughout the network.
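The server arithmetic above can be sketched in a few lines. The figures here are illustrative assumptions for the sake of the calculation, not numbers from the EBU report:

```python
# Back-of-envelope for unicast scaling: assumed figures, not report data.
viewers = 2_000_000          # a large webcast audience (assumed)
streams_per_server = 5_000   # "a few thousand streams" per media server (assumed)

# Ceiling division: every viewer needs a stream, so round server count up.
servers_needed = -(-viewers // streams_per_server)
print(servers_needed)  # 400 servers, before counting firewalls and routers
```

At these assumed figures the webcast already needs hundreds of servers; push the audience or the stream bitrate up and the count climbs into the thousands.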
However, there is a halfway house: multicasting. Each router can split a single stream onto many paths, potentially saving server resources. But multicasting has never been deployed with much success, especially across the many separate networks that make up the Internet. Unicasting proved simpler, but it does not scale well.
The content pirate solution is P2P, in which a browser pulls content from other client PCs on the network. With no central server, it has been difficult to prosecute the pirates. It was this technique that the EBU decided to investigate back in 2006 to see if it could be used as a way to deliver broadcasters' files.
P2P is not without problems. Most domestic Internet connections are highly asymmetric, with upload data rates about 10 percent of the download rate. P2P also does not work well across different networks, and many countries have a number of network operators.
The EBU investigation identified the efficiency advantages of P2P over unicast, but found that many of the features needed by broadcasters, such as audience measurement and content distribution control, are difficult to implement. Latencies to access content can be high (up to 20 seconds), and it is also difficult to maintain QoS with P2P. Best-effort delivery is all very well for pirated content, but broadcasters need QoS control.
P2P accounts for half of the Internet traffic in some countries, even though it has been associated with piracy, and ISPs receive no direct revenue for the service. It could be said that all Internet users subsidize the P2P users. One downloader relies on many uploaders to serve content. If the number of downloaders equals the number of uploaders, then the average download bandwidth is limited by the average uplink bandwidth, which is not very much on asymmetric links.
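The uplink limit follows from simple conservation: every bit one peer downloads must first be uploaded by another. A minimal sketch, using assumed link speeds for a typical asymmetric connection:

```python
# Illustrative P2P bandwidth limit; link speeds are assumed values.
downlink_mbps = 10.0
uplink_mbps = 1.0   # roughly 10 percent of the download rate

# With equal numbers of uploaders and downloaders, the pool of upload
# capacity is shared one-to-one, so the average download rate cannot
# exceed the average uplink rate, however fast the downlinks are.
peers = 1000
total_upload_capacity = peers * uplink_mbps
avg_download_rate = min(downlink_mbps, total_upload_capacity / peers)
print(avg_download_rate)  # 1.0 Mb/s, capped by the uplink, not the downlink
```

The swarm only approaches the full downlink rate when there are many more uploaders than downloaders, which is exactly the condition a live broadcast audience does not provide.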
Not all the problems with P2P are insurmountable, and research and development continues. Will it someday break out of the pirate mold, or will other technologies like edge caching and multicast provide the solution to lowering the cost of Internet distribution?
The EBU report describes possible receiver devices beyond the PC. There is no reason why a hybrid broadband-broadcast receiver could not also act as a P2P client with suitable additional processing power.
The report concludes that P2P could supplement, but not replace, conventional cable, satellite and terrestrial broadcast, or Internet multicasting and content distribution networks.
As network capacity expands to support 100 Mb/s and beyond, the economics of different methods will change. Maybe P2P will become a real commercial proposition to reliably deliver video material. I encourage all of you to read this report, available from the EBU as Technical Report 009; it is a thorough review of what could be a game-changing technology.
The report also touches on spectrum issues with unicasting to 3G phones. These newer technologies all promise great ways to deliver niche and long-tail content, but as a way to deliver live TV to the masses at a low cost, broadcast is a tough technology to beat.
Send comments to: email@example.com