EBU knocks heads together over HD

It seems that the EBU (European Broadcasting Union) has decided it is time to take the bull by the horns over ultra HD.

After all, it is now six years since the first production and transmission of ultra HD was demonstrated by Japanese broadcaster NHK in Tokyo, and still we seem no closer to consensus over standardization of the critical parameters such as image format, frame rate and codec type. There are many options on the table, and convergence has been hampered by confusion over what specifications are required or desirable to deliver the ultimate quality of experience for varying screen sizes.

There are also the constraints of cost and available bandwidth, with the future evolution of HD dependent not just on increased network capacity, but also on improved compression ratios, which is why the emerging HEVC (High Efficiency Video Coding) is important.

Pressure on bandwidth will come not just in distribution, but also in contribution, given that pictures captured for ultra HD at 3840 x 2160, at the 300 fps that has been proposed as a unifying figure, would generate streams at 52 Gb/s. This will certainly give cause to think again about sending uncompressed raw video, as some broadcasters have been doing, and there will be renewed demand for improvements in compression at the contribution stage.
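As a rough sanity check on that figure, here is a minimal back-of-envelope calculation. The bits-per-pixel value is an assumption (10-bit, 4:2:2 sampling, a common contribution format) and the sum ignores blanking and audio, which is presumably where the last couple of Gb/s in the quoted figure come from:

```python
# Back-of-envelope raw bit rate for an ultra HD contribution stream.
# Assumption (not stated in the article): 10-bit samples with 4:2:2 chroma
# subsampling, i.e. an average of 20 bits per pixel, and no blanking overhead.

def raw_bitrate_gbps(width, height, fps, bits_per_pixel=20):
    """Uncompressed video bit rate in Gb/s."""
    return width * height * fps * bits_per_pixel / 1e9

print(raw_bitrate_gbps(3840, 2160, 300))  # ~49.8 Gb/s, in the region of the 52 Gb/s quoted
print(raw_bitrate_gbps(1920, 1080, 50))   # ~2.1 Gb/s for 1080p50, by way of comparison
```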

The EBU has attempted to bring order to the mounting chaos of future HD standards, clouded further by the 3-D issue, by setting up its Beyond HD group. But, realizing that it was not much use just debating the future of HD behind closed doors among its European members, the EBU has reached out to manufacturers including Sony and Panasonic, as well as non-members, notably NHK itself.

It met with these three in Geneva recently, along with BSkyB, to discuss issues of harmonization, which have been made more urgent by a clear move among manufacturers to push ahead and market new TVs next year. They are all desperate for products that will raise margins after the relative failure of 3-D so far, while there is a limit to the premium they can charge for smart TVs now that Internet connectivity is almost taken as a given and is not much of a selling point.

The EBU did well by doing its homework first and thrashing out key issues while identifying what sort of roadmap made sense for HD given the display technologies, compression algorithms and bandwidth that were likely to become available over the next decade. The first task was to define what “Beyond HD” was. For some, it begins with 1080p, since current HD services are normally either 720p or 1080i, which both represent compromises. 720p with progressive scan is optimal for sport and fast-moving action, but sacrifices resolution, which can result in sub-optimal quality for content with a lot of detail but not necessarily fast action, such as art documentaries and some nature programs.

1080i can look juddery for fast action because, with interlacing, alternate lines only change with every second frame, but it gives higher picture resolution than 720p. So 1080p, with progressive scan, combines the best of both and would be regarded by most people as the pinnacle of quality at present, but it is only starting to be deployed.

However, the EBU decided that the industry was already on the way to 1080p, and so defined “beyond HD” as the future beyond that, likely to emerge over a four- to 10-year time span, starting with some variant of ultra HD at 3840 x 2160 resolution. That is double in each direction, or four times the pixel count of 1080p's 1920 x 1080.

But this then raised the next big question: how much resolution is desirable, under what circumstances, and how much is affordable in terms of bandwidth and investment?

The first point to note, as the EBU did, is that frame rate has to increase almost in proportion with the resolution, so the toll on bandwidth is even greater than many broadcasters may originally have anticipated.

Frame rate has to increase because, as the resolution gets higher, the jump of picture elements between successive frames becomes more perceptible. Yet proposed deployments of 1080p are actually set to reduce the frame rate in order to avoid increasing bandwidth too much, which would make the whole exercise pointless. In fact, 1080p needs a frame rate of at least 50 fps, and ultra HD, or 4K as it is often called, will require 100 fps. Some trials have been looking at a higher frame rate of 120 fps for 4K, which immediately introduces a conversion problem if content shot at one rate then has to be displayed on TVs that support another. For this reason, there are proposals to capture video at the high rate of 300 fps, partly because this can readily be cut down both to 120 fps, by dividing by five and then multiplying by two, and to 100 fps, by dividing by three. It would surely make more sense to standardize on 100 fps, but that remains to be seen.
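The attraction of 300 fps is that it relates to both candidate display rates by simple integer patterns. A quick sketch of that arithmetic (illustrative only; real standards converters would use motion-compensated processing rather than plain frame dropping and doubling):

```python
from fractions import Fraction

# Ratio between a 300 fps capture rate and the candidate display rates.
# 300 -> 100: an exact factor of 3, so keep every third frame.
# 300 -> 120: a 5:2 ratio, so drop to 60 (divide by five) and then show
#             each remaining frame twice (multiply by two).
def conversion_ratio(capture_fps, display_fps):
    return Fraction(capture_fps, display_fps)

print(conversion_ratio(300, 100))  # 3
print(conversion_ratio(300, 120))  # 5/2
print(conversion_ratio(300, 50))   # 6 -- today's 50 fps also divides cleanly
```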

The next question is over resolution itself, with the starting point being a law called Rayleigh’s criterion, which defines the smallest distance that can be resolved by an imaging system, determined by the wavelength of the light being received and the diameter of the objective lens, in this case the pupil of the human eye. While this varies slightly according to the color content of the image and the individual concerned, it is around 1/60th of a degree. Under normal viewing, the visual angle is within 30° horizontally and rather less vertically, so, doing the math, that comes to a maximum of 1800 picture elements across the width of the screen. That is just covered by 1080p with its 1920 pixels across the horizontal.
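The sum behind that 1800 figure is simply the viewing angle divided by the smallest resolvable angle, using the values quoted above:

```latex
% Horizontal picture elements needed so that adjacent pixels remain
% just resolvable at the limit of visual acuity.
N_{\text{horizontal}} \approx \frac{\theta_{\text{viewing}}}{\theta_{\text{min}}}
                      = \frac{30^{\circ}}{(1/60)^{\circ}} = 1800
```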

At first sight, then, there seems to be no call for anything beyond 1080p at all. But this reckons without the impending revolution in display types, with huge wall-sized screens now coming over the horizon. It is true that the smallest angle that can be resolved depends not on the screen size but on the angle of viewing. On that basis, a giant screen viewed from 100ft does not need any more picture elements than, say, a tablet close up, although each pixel would have to be proportionately larger.

In practice, though, large screens do require more picture elements because there are situations where they may be viewed from closer up than the normal optimum distance. For example, wall-sized displays will comprise multiple smaller panels, each of which can function as an independent TV, in which case they will sometimes be viewed from closer range and require smaller picture elements than they otherwise would. Further to that, these large screens will enable immersive viewing, where the horizontal viewing angle will be much greater than the current 30° limit. That is why we will need ultra HD.
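Putting numbers on that, the same 1/60° acuity figure gives a required horizontal pixel count for any viewing angle. The wider angles below are illustrative choices rather than figures from the EBU discussion, and the simple proportional scaling ignores the geometry of a flat screen at very wide angles:

```python
# Horizontal picture elements needed for a given horizontal viewing angle,
# assuming the eye resolves about 1/60 of a degree (the figure quoted above).
ACUITY_DEG = 1 / 60

def pixels_needed(viewing_angle_deg):
    return round(viewing_angle_deg / ACUITY_DEG)

for angle in (30, 60, 100):
    print(angle, pixels_needed(angle))
# 30  -> 1800  covered by 1080p's 1920 pixels
# 60  -> 3600  needs ultra HD's 3840 pixels
# 100 -> 6000  heading towards 8K territory (7680 pixels)
```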

It may well be, though, that we will not need to go much further, and there may never be a call for the next level up, which is 8K at 7680 x 4320. Or, certainly, never beyond that for viewing on two-dimensional screens. But, even then, the bandwidth implications are considerable and, as the EBU has pointed out, often misunderstood. Even without upping the frame rate, ultra HD at 4K generates 4X as much data as 1080p, which, in turn, is double 720p or 1080i. 8K brings another fourfold increase, and, if combined with a frame rate of 300 fps, as may come to pass in a decade or more, the bandwidth consumed would be 192X greater than that of current HD services. That is why the EBU talks of the dramatic financial impact of “Beyond HD,” which, therefore, will have to be plotted carefully. It is true that for distribution we have the emerging HEVC, but that will only bring an immediate 50 percent or so improvement in encoding efficiency over H.264. So, while welcome, this will merely provide mild pain relief for congested networks.
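The 192X figure drops out of multiplying the pixel-count and frame-rate ratios together. A quick sketch, taking 1080i/720p at 50 fps as the baseline for today's HD services (the baseline rate is an assumption, but it is what the factor of 192 implies):

```python
# Bandwidth multiplier relative to today's HD services, counting only
# pixel count and frame rate (wider color gamut and higher bit depth
# would push it up further). Assumed baseline: 1080i/720p at 50 fps,
# i.e. half the data of 1080p50, as described above.
BASE_PIXELS = 1920 * 1080 / 2
BASE_FPS = 50

def multiplier(width, height, fps):
    return (width * height / BASE_PIXELS) * (fps / BASE_FPS)

print(multiplier(3840, 2160, 50))    # 8x   -- ultra HD (4K) at today's frame rate
print(multiplier(7680, 4320, 50))    # 32x  -- 8K at today's frame rate
print(multiplier(7680, 4320, 300))   # 192x -- 8K at 300 fps, the figure quoted above
```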

On top of that, there is scope for increasing the range of colors in line with the higher resolution, and, more generally, the bit depth of each pixel, which would rack up bandwidth further. There is also 3-D (another subject altogether, perhaps for a further blog), and then, finally, the EBU talks about “beyond stereo.”

There the mind really boggles.