Big Data can help stop IPTV and cable operators being dumb pipes

Premium content will remain the primary competitive weapon for pay TV operators, both against each other and against the ever-rising threat from OTT upstarts. But not all operators have much top premium content, with the cream often lapped up by just a select few. In Germany, for example, Sky Deutschland has bagged all the rights to the top-tier Bundesliga, which is now arguably Europe’s strongest football division. Many operators therefore need another string to their bow, and that can only mean offering the best quality of experience.

But with OTT services getting better all the time as bandwidth increases and QoS mechanisms improve, IPTV and cable operators in particular need to invest in measures to ensure they stay ahead. The one advantage they have is a managed end-to-end network, and they must capitalize on this to the fullest to deliver high-quality services right to the end device in the home.

There are two separate aspects to quality of experience (QoE): essentially, the content we watch and the way it is presented on the screen. Operators may not be able to offer all the premium or blockbuster content they would like, but they can at least endeavor to help users find what they would most enjoy watching from the available catalogue, which is becoming increasingly vast. At the same time, operators have the potential to control the viewing experience by monitoring key points across the end-to-end network path, as well as in the device.

Both these challenges involve access to relevant data in real time, and so the solution in each case lies in the realm of what we now call Big Data. We are not talking about the same data for each of the two QoE categories, but both can be addressed as a part of a broad Big Data strategy. This can put some clear water between IPTV or cable operators and their OTT competitors, which have incomplete access to relevant data for both monitoring viewing quality and generating timely recommendations for users.

On the recommendation front, the goal is to create that “positive surprise” which goes beyond the Amazon-style approach: because you like program A, and other people who like program A also like program B, you might like program B. This approach, based on so-called collaborative filtering, will surface some content of interest, but the results are mostly predictable, failing to generate the out-of-the-blue surprise that can really boost the experience, and with it the subscriber’s confidence in and loyalty to the service. Simply inserting a few random guesses is not good enough either, because too often it will recommend content the user has absolutely no interest in, which decreases satisfaction with the service. The trick is to assimilate different sources of information about customers and their preferences, in the context of what they are viewing at the time and perhaps also what they are doing on secondary devices such as smartphones, tablets or PCs.
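
To make the contrast concrete, here is a minimal sketch of that collaborative filtering baseline in Python, using made-up viewing histories: programs are scored purely by how often they co-occur in other users’ histories, which is exactly why the results skew towards popular, predictable picks.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical viewing histories: user -> set of programs watched.
histories = {
    "u1": {"A", "B", "C"},
    "u2": {"A", "B"},
    "u3": {"A", "C", "D"},
    "u4": {"B", "D"},
}

# Count how often each pair of programs appears in the same user's history.
co_counts = defaultdict(int)
for programs in histories.values():
    for a, b in combinations(sorted(programs), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(watched, n=3):
    """Score unseen programs by how often they co-occur with what was watched."""
    scores = defaultdict(int)
    for (a, b), count in co_counts.items():
        if a in watched and b not in watched:
            scores[b] += count
    return sorted(scores, key=scores.get, reverse=True)[:n]

# A viewer of program A gets the crowd favorites -- useful, but predictable.
print(recommend({"A"}))  # ['B', 'C', 'D']
```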

This requires having access to all that data, extracting the right meaning and then deriving pertinent recommendations, all in real time. It is a difficult challenge, but one with significant prizes, not just through upselling content but also by making ad targeting more effective. There is now accumulating evidence that if users are shown ads that they find entertaining, or that are for products and services they are interested in, or preferably both, they are more likely to engage or even make a purchase.
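
As a rough sketch of what that real-time blending might look like, the scores from a collaborative filter could be re-ranked with a boost for programs whose metadata overlaps with signals from the user’s current context. All tags, scores and the boost weight below are illustrative assumptions, not any operator’s actual model.

```python
def rerank(base_scores, context_tags, program_tags, boost=0.5):
    """Blend collaborative-filtering scores with real-time context signals.

    context_tags: tags inferred from what the user is doing right now
    (genre of the current program, topics on a second screen, time of day).
    """
    blended = {}
    for program, score in base_scores.items():
        overlap = len(context_tags & program_tags.get(program, set()))
        blended[program] = score + boost * overlap
    return sorted(blended, key=blended.get, reverse=True)

# The user is watching a football match and browsing match reports on a
# tablet, so football-tagged programs get nudged up the list.
base = {"B": 1.4, "C": 1.5, "E": 1.2}  # scores from the filter above
tags = {"B": {"drama"}, "C": {"football", "live"}, "E": {"football"}}
print(rerank(base, {"football"}, tags))  # ['C', 'E', 'B']
```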

Similarly for the viewing experience, operators can bring a lot of data together to build a much better picture of what the end user is actually seeing on the screen. In the past, cable operators especially have tended to overprovision to cope with occasional capacity crunches, while IPTV providers have likewise tended to overestimate the bandwidth needed to deliver acceptable service and, as a result, have confined their catchment areas to subscribers closer to their DSLAMs than was really necessary. The objective here, which Big Data can help meet, is to establish a unified system for monitoring, management and troubleshooting. Until now this has typically been a fragmented process, especially in the case of IPTV, split between several operational systems covering, for example, fault management, network management and customer service. While the major monitoring vendors are trying to bring all this together, their systems do not have access to all the relevant data, a constraint that also applies to OTT providers.

Existing monitoring, fault and network management systems fail to take account of the increasingly complex inter-relationships between components across their respective domains; modeling those relationships is what could give operators much greater visibility into the quality the end user is actually getting. The first step is to identify all the elements in the network, from set-top boxes to IP routers, that can provide relevant information in real time. There is also information that can help with longer-term quality assurance planning, from customer call records for example. Assimilating all this is quite a task, but it can be done within a common Big Data-based network operations center.
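
To illustrate the assimilation step, the sketch below normalizes hypothetical readings from set-top boxes, DSLAMs and IP routers into one record format and correlates threshold breaches per subscriber across the delivery path. The element names, metric names and thresholds are assumptions chosen for the example.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Metric:
    """One reading from any element type, normalized into a common record."""
    subscriber: str
    element: str   # e.g. "stb", "dslam", "ip_router" -- illustrative names
    name: str      # e.g. "packet_loss_pct", "jitter_ms", "buffer_underruns"
    value: float

# Illustrative thresholds beyond which a reading suggests degraded QoE.
THRESHOLDS = {"packet_loss_pct": 1.0, "jitter_ms": 30.0, "buffer_underruns": 1}

def flag_degraded(stream):
    """Correlate threshold breaches per subscriber across the delivery path."""
    issues = defaultdict(list)
    for m in stream:
        limit = THRESHOLDS.get(m.name)
        if limit is not None and m.value > limit:
            issues[m.subscriber].append((m.element, m.name, m.value))
    return issues

stream = [
    Metric("sub42", "ip_router", "packet_loss_pct", 2.3),
    Metric("sub42", "stb", "buffer_underruns", 3),
    Metric("sub7", "dslam", "jitter_ms", 12.0),
]
# sub42 shows loss upstream *and* underruns at the set-top box: one incident,
# visible only because both readings land in the same store.
for sub, problems in flag_degraded(stream).items():
    print(sub, problems)
```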

It is then possible to troubleshoot more quickly when problems do occur, while also optimizing the network, with less need for overprovisioning, by taking account of how users are behaving. It is here that the two sides of the Big Data equation come together, for knowledge of what customers are doing, and especially predictions of what they are about to do, will help ensure the network can cope with traffic loads and maintain viewing quality. And that knowledge will be generated partly from data about what users are viewing and even what they have just been recommended.
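
One simple way to picture that feedback loop, with entirely hypothetical numbers: forecast the near-term load on a network segment from the streams playing now, plus recommended content that some fraction of viewers is predicted to tune into.

```python
# All numbers hypothetical: per-stream bitrates and an assumed take-up rate
# for content that subscribers on this segment have just been recommended.
STREAM_BITRATE_MBPS = {"sd": 3, "hd": 8}

def forecast_load_mbps(current_streams, expected_tune_ins, takeup_rate=0.3):
    """Near-term load on a segment: streams playing now, plus recommended or
    scheduled content discounted by the predicted take-up rate."""
    now = sum(STREAM_BITRATE_MBPS[q] for q in current_streams)
    soon = takeup_rate * sum(STREAM_BITRATE_MBPS[q] for q in expected_tune_ins)
    return now + soon

# 40 HD and 25 SD streams playing; 60 subscribers were just recommended an HD
# match about to kick off: 320 + 75 + 0.3 * 480 = 539 Mbps to provision for.
print(forecast_load_mbps(["hd"] * 40 + ["sd"] * 25, ["hd"] * 60))  # 539.0
```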

It is clear, then, that total QoE should be addressed via a single coherent Big Data strategy rather than in separate silos. The task may be too great for some smaller operators, but for them we are likely to see cloud-based Big Data services emerging and evolving.