Since its inception, video has been running over coaxial cable. In the early days, the PL-259 connector, sometimes known as the UHF connector, was used. Shortly after the conversion to color, the BNC connector became the standard electrical interface for professional video. Since then, the BNC has survived the conversion from analog to digital, and even made it through the explosion of digital video formats. But, after some 50 years, is it time for the professional video industry to start looking at a different interface for video signals?
To put things in perspective, it may help to look back at the history of the BNC and the signals it has carried. A number of video professionals first encountered BNC connectors carrying analog NTSC signals. As the industry progressed, it became clear standards were needed for interfacing professional video signals as digital information rather than as analog voltages. The industry developed the short-lived parallel digital video interface, but advances in technology quickly led to development of the Serial Digital Interface, or SDI. SDI had a number of key design criteria, one of which was that it be carried on a BNC connector. Early on, SDI was “just NTSC on a digital interface.” Video was still a 525-line signal presented to the viewer in an interlaced mode at a 29.97Hz frame rate (625 lines and 25Hz for PAL). The interface delivered 720 active pixels per line of horizontal resolution.
With the move to HD, two competing formats, 720p and 1080i, both could be delivered using the BNC connector. At this time, the frame geometry of professional formats exploded. The aspect ratio moved from 4:3 to 16:9. Permissible frame rates expanded from the traditional 29.97Hz and 25Hz to 24Hz, true 30Hz, 50Hz for Europe, and 60Hz for the U.S. The bit depth per pixel (the number of bits assigned to describe the value of each pixel in the raster) went from eight bits to 10 bits and then 12 bits. In short, the industry sought to increase image quality along several axes at once: horizontal resolution, vertical resolution, bit depth and frame rate.
A number of years ago, the world of SDI and IP technology started coming together in the area of video transport. As terrestrial IT network technology improved, and as transport providers began to be able to guarantee high quality and high availability of IP networks, professional video users began experimenting with IP streaming of professional video feeds. This ultimately led to the creation of the SMPTE 2022 family of standards for the transport of professional video over managed IP networks. While this technology has been successful, it has essentially operated as an SDI “cable extender,” transporting a bit-for-bit copy of the SDI signal hundreds or thousands of miles away. This approach has been successful, and this group of standards is deployed in thousands of units all over the world.
SMPTE 2022 is intended for use outside the studio; indeed, there are no widely known applications of it inside a facility. But, inside the facility, at the beginning of 2012, video is still delivered via the BNC connector, just as it was when the U.S. withdrew from Vietnam and the first Space Shuttle launched. Is it time to look at how IP and serial digital video might be used together within the studio?
Before talking about what it would take to make this change, let's take a quick look at why it might be compelling to merge IP and digital video. First and foremost is cost. IP technology is widely deployed on a scale that dwarfs the entire professional video market. With huge deployment comes not only lower infrastructure cost, but increased spending on R&D. Put simply: The more IP technology is sold, the more dollars a company has available to put into research. Another benefit of moving to an IP infrastructure is that there are many people who understand IP technology well. In fact, many professional video engineers have become knowledgeable about IP technology.
Almost every piece of professional video equipment connects to an IP network in one way or another. Engineers who know how the worlds of IT and professional video come together may find themselves in a good position. It may be easier for video engineers to obtain a solid education in IP technology given the huge number of courses available. Another advantage of IP networks is that they are self-routing. Information contained in the header of each IP packet tells the network equipment how to route the packet. There are business cases where having this self-routing capability opens new business opportunities not available when using externally routed video networks. Finally, IP networks are ubiquitous. From a functional standpoint, how far is the technology from letting a news person plug an IP camera into a network connection in his or her hotel, or on a street corner, and stream that video anywhere in the world?
Doing this, however, requires one significant, difficult and costly change: It will require a break in the connection between video format geometry and the electrical and physical interface — the BNC connector. Right now, just about every professional video standard written mixes together the concept of the video format and the BNC electrical interface. The industry needs to start thinking about separating the two, and doing so sooner rather than later.
Separating video format from the electrical interface means talking about video formats in terms of pixels per line, lines per frame and so on, while excluding any discussion of how these signals appear on a wire or connector.
The following parameters could be used to specify the video raster format:
- Video width: typically expressed as the number of pixels per line. Many people may not realize that the total number of pixels per line is not the same as the number of active pixels per line; both need to be specified. For example, 1080p60 has 2200 total pixels per line, but only 1920 of those are active on any given line.
- Lines per frame: Again, there are two parameters here — the total number of lines per frame, and the number of active lines per frame. The two are not the same. 1080p60 has 1125 total lines per frame, but only 1080 of those lines are active lines.
- Pixel bit depth: Most HD specifications have a bit depth of 10, but some of the higher-resolution formats have 12 bits. Of course, manufacturers have been experimenting with higher bit depths for some time now.
- Aspect ratio: This is the ratio of picture width to picture height. The two aspect ratios encountered in most TV applications are 4:3 and 16:9.
- Scan type: The options are interlaced, progressive or segmented frame.
- Frame rate: Knowing how often complete frames are presented to the viewer is critical. Many different values have appeared over the years, reflecting the history of the particular format, including 24Hz, 25Hz, 29.97Hz, 30Hz, 50Hz and 60Hz. Again, manufacturers have been experimenting with frame rates as high as 240Hz.
To be clear, this video format information is already well standardized. But, the standards mix video format information with electrical interface specifications.
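To make the separation concrete, the parameters listed above can be captured as a simple data structure that says nothing about wires or connectors. The following is a minimal sketch in Python; the class and field names are illustrative, not drawn from any SMPTE standard. The derived serial rate assumes 4:2:2 sampling, where each pixel carries one luma sample and one (alternating Cb/Cr) chroma sample, so each pixel costs two samples at the stated bit depth.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RasterFormat:
    """A video raster format described independently of any
    electrical or physical interface (illustrative names)."""
    total_pixels_per_line: int    # includes blanking
    active_pixels_per_line: int
    total_lines_per_frame: int    # includes blanking
    active_lines_per_frame: int
    bit_depth: int                # bits per sample
    aspect_ratio: tuple           # (width, height)
    scan_type: str                # "progressive", "interlaced" or "psf"
    frame_rate_hz: float

    def serial_rate_bps(self) -> float:
        # 4:2:2 sampling: two samples per pixel, bit_depth bits each.
        return (self.total_pixels_per_line * self.total_lines_per_frame
                * self.frame_rate_hz * 2 * self.bit_depth)

# 1080p60 as described in the text: 2200x1125 total, 1920x1080 active
hd_1080p60 = RasterFormat(2200, 1920, 1125, 1080, 10, (16, 9),
                          "progressive", 60.0)
print(hd_1080p60.serial_rate_bps())  # 2.97e9 bits/s
```

Reassuringly, the derived figure of 2.97Gb/s is exactly the familiar 3G-SDI link rate, showing that the electrical interface speed is a consequence of the raster format, not part of its definition.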
Focusing on standardizing how video bits are mapped onto IP networks then allows smart systems engineers to concentrate on developing new products and systems that efficiently transfer not only streaming video and audio, but also large video files plus the myriad of other IP-enabled technologies needed in modern studio production, such as camera lens settings, IFB, prompter text and control, intercom, and more.
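To give a feel for the scale of the mapping problem, here is some back-of-the-envelope arithmetic for carrying one uncompressed frame over an IP network. The 1400-byte payload is an assumed usable payload per packet for illustration only, not a value from any standard, and only the active picture is counted.

```python
import math

def packets_per_frame(active_pixels, active_lines, bits_per_pixel,
                      payload_bytes=1400):
    """Rough count of IP packets needed for one uncompressed frame.
    payload_bytes is an assumed per-packet payload (illustrative)."""
    frame_bytes = active_pixels * active_lines * bits_per_pixel // 8
    return math.ceil(frame_bytes / payload_bytes)

# 1080p active picture, 10-bit 4:2:2 (20 bits per pixel)
n = packets_per_frame(1920, 1080, 20)
print(n)       # packets per frame
print(n * 60)  # packets per second at 60Hz
```

A single uncompressed 1080p60 stream works out to a few thousand packets per frame and a couple of hundred thousand packets per second, which is why an efficient, standardized mapping of video bits onto IP, rather than ad hoc encapsulation, matters so much inside the facility.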
The time has come to seriously look at moving beyond the BNC as the do-all electrical interface for professional video applications.
Brad Gilmer is executive director of the Video Services Forum, executive director of the Advanced Media Workflow Association and president of Gilmer & Associates.
Send questions and comments to: email@example.com