Timing in hybrid facilities

Timing requirements within broadcast facilities have changed from the days when analog NTSC was the only game in town. Now digital and analog reside side by side, and if you're lucky enough to have a completely digital facility, timing problems are a thing of the past ... or are they?

Analog timing review

In most broadcast and production facilities, the various video sources must be timed so that switching between video signals does not cause a disturbance to receiving equipment such as monitors, VTRs or the transmitter. Correct timing also keeps the NTSC signal a valid RS-170A waveform.

Analog NTSC facilities typically have a master sync pulse generator (SPG) to provide the main reference for every piece of equipment that needs to be synchronized. This reference is usually an NTSC "color black" signal. Most equipment, such as cameras, VTRs and graphics systems, can "gen-lock" to this signal and provide the engineer with timing adjustments that allow the output waveform to be correctly timed to the reference. The timing reference location is usually at the input to a routing switcher or production switcher, as this is both an easy place to select and measure many signals and the place where the signals need to be in time. The typical delays incurred within an analog facility are fairly short, on the order of a microsecond for a production switcher down to a few tens of nanoseconds for a distribution amplifier. Signals passing through different paths encounter varying delays, which are compensated for by trimming cable lengths when the delays are short or by using adjustable delay amplifiers when more delay is needed.
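
As a back-of-the-envelope illustration, the cable-length side of that tradeoff is easy to calculate. The short Python sketch below assumes a velocity factor of 0.66, typical for coaxial cable but dependent on the cable actually installed:

    # Sketch: how much coax adds a given delay?
    # Assumes a velocity factor of 0.66 (cable-dependent assumption).
    C = 299_792_458          # speed of light, m/s
    VELOCITY_FACTOR = 0.66   # fraction of c; check the cable datasheet

    def cable_length_for_delay(delay_ns: float) -> float:
        """Return the length of cable, in meters, that adds delay_ns of delay."""
        return delay_ns * 1e-9 * C * VELOCITY_FACTOR

    # A DA's few tens of nanoseconds is easily trimmed with cable;
    # a production switcher's microsecond is not.
    print(cable_length_for_delay(30))     # ~5.9m for a 30ns trim
    print(cable_length_for_delay(1000))   # ~198m -- use a delay amplifier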

The process becomes more complicated in larger facilities when a source, such as a camera, has to be timed to many locations at the same time. Sophisticated systems have been employed to automatically re-time a source as its final destination changes. It is more usual, however, to add delays in the various paths so that the timing requirements are met simultaneously at the various locations.

Timing simply means synchronizing the three basic parameters of the signal to match a reference: first, vertical sync or field timing; second, horizontal sync or line timing; and third, color sync or subcarrier timing. In addition, the correct subcarrier-to-horizontal (SC/H) relationship should be maintained, especially if any editing is required. Timing is most often measured by externally referencing a waveform monitor to the house reference and setting the waveform monitor/vectorscope to display the appropriate parameter using the timing reference (usually color black). The source to be timed is then selected and adjusted until the new waveform matches the reference. Care must be taken when making adjustments to allow for amplitude, width and shape differences between analog waveforms, and to verify that these parameters are themselves within specification.
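
To put the three adjustments on a common scale, the following Python sketch (illustrative only, using the standard NTSC constants) expresses a single timing error in lines, microseconds and degrees of subcarrier:

    # Standard NTSC constants
    F_SC = 3_579_545.0    # color subcarrier, Hz
    LINE_US = 63.5555     # one horizontal line, microseconds

    def describe_offset(offset_ns: float) -> str:
        """Express a timing error in lines, microseconds and subcarrier degrees."""
        us = offset_ns / 1000.0
        lines = us / LINE_US
        sc_degrees = offset_ns * 1e-9 * F_SC * 360.0
        return f"{lines:.6f} lines = {us:.3f}us = {sc_degrees:.1f} deg of subcarrier"

    # One nanosecond is about 1.3 degrees of subcarrier phase, which is
    # why subcarrier timing is the most delicate of the three adjustments.
    print(describe_offset(1.0))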

NTSC signals include the basic parameters needed to easily measure or compare two video signals. The first of these is the vertical sync sequence, which includes the pre- and post-equalizing pulses and the serrated vertical pulse itself, plus the remaining lines in the vertical blanking period. The next is the horizontal blanking period, which includes the horizontal sync pulse, the front and back porches, and the color burst. The last parameter to check is the color subcarrier. The subcarrier burst phase is viewed in the vector display mode, along with the decoded video information plotted as R-Y against B-Y. The burst phase is adjusted to be coincident with the reference axis (-(B-Y)). The phase of the color burst relative to the video color information is also a critical measurement to prevent color errors at the decoder. The position of video within the line and field is also important, but does not itself disqualify the signal as valid (unless the video encroaches into the vertical or horizontal blanking periods); it would only produce a positional offset at the display device. However, positional offsets are very important if there is a need to switch between two versions of the same source, such as a direct source against the same source through a DVE device in a production switcher. Special equipment is also available to measure a source's SC/H phase relationship, either via a numeric readout or an indicator dot on a vector display.
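
For illustration only, the SC/H relationship can be reduced to a simple calculation: the phase of the subcarrier at the 50 percent point of the leading edge of sync. The sketch below is a single-line simplification; real SC/H meters average over many lines and resolve the color-framing ambiguity against the four-field sequence:

    import math

    F_SC = 3_579_545.0   # NTSC color subcarrier, Hz

    def sch_phase_deg(sync_edge_time_s: float, sc_zero_crossing_time_s: float) -> float:
        """Subcarrier phase at the 50% point of the sync leading edge.

        Simplified single-line sketch; times are measured from any
        common reference, e.g. the start of a capture.
        """
        dt = sync_edge_time_s - sc_zero_crossing_time_s
        return math.fmod(dt * F_SC * 360.0, 360.0)

    # Ideally SC/H phase is 0 degrees; a drift toward 90 degrees makes
    # color framing ambiguous when editing.
    print(sch_phase_deg(1.0e-6, 0.0))  # phase 1us after a subcarrier zero crossing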

There are many other parameters that can and should be checked when comparing two video signals, but it is the timing parameters that matter here. Three other areas, though, deserve a mention: frame synchronizers, audio timing and component analog video (CAV). Frame synchronizers are sometimes used to re-time video within a facility, especially when a production switcher's output must be fed back as a source into a routing switcher or into a master control location. This is normally acceptable unless many frame-delayed paths are used, at which point lip sync between video and audio becomes a problem. (One or two frames of video-to-audio delay are normally not noticeable, but a delay of three or more frames is noticeable to the average viewer.) Newer frame synchronizers can apply an identical delay to a companion audio channel to help reduce the lip sync problem, but this may require additional audio routing.
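
Translating the frame counts into milliseconds is trivial; the sketch below uses the exact NTSC frame rate, with the three-frame threshold taken from the rule of thumb above:

    NTSC_FRAME_MS = 1000.0 * 1001.0 / 30000.0   # ~33.37ms per frame

    def lip_sync_report(frames_of_delay: int) -> str:
        ms = frames_of_delay * NTSC_FRAME_MS
        noticeable = frames_of_delay >= 3   # rule of thumb from the text
        return f"{frames_of_delay} frame(s) = {ms:.1f}ms, noticeable: {noticeable}"

    for n in (1, 2, 3, 4):
        print(lip_sync_report(n))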

Analog audio timing on its own is not a problem, as the delays are short. In fact, it is difficult even to measure them, since an analog audio signal contains no absolute reference against which a delay can be measured. Because the delays are short compared with the periods of the audio frequencies involved, path delays are not usually critical when compared with the related video signal. Stereo operation does require monitoring the audio phase between the two channels, but comparing the timing of two independent signals is typically not needed.
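
Where stereo phase does need watching, a basic correlation measurement between the two channels is sufficient. A minimal sketch, assuming the two channels are available as equal-length lists of samples:

    import math

    def stereo_correlation(left: list[float], right: list[float]) -> float:
        """Normalized correlation: +1 in phase, 0 unrelated, -1 out of phase."""
        num = sum(l * r for l, r in zip(left, right))
        den = math.sqrt(sum(l * l for l in left) * sum(r * r for r in right))
        return num / den if den else 0.0

    # A value near -1 warns that the channels will cancel when summed to mono.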

CAV facilities also deserve a mention, as they have the same vertical and horizontal timing requirements as NTSC facilities but trade the need to time three video paths (R, G and B, or Y, R-Y and B-Y) against the need to perform subcarrier timing. The tradeoff is usually worth it, because component processing preserves the color bandwidth and avoids the losses incurred during NTSC encoding and decoding. Component analog equipment such as routers, production switchers, DAs and cabling does, however, cost more to purchase and install.

Digital timing requirements

Frame synchronizers and DVE equipment started the digital video revolution and provided features not otherwise available with analog equipment. Early equipment was analog both in and out, and sometimes composite digital within. This equipment was treated in the same way as analog equipment as far as timing was concerned.

The early adoption of composite digital as a video format initially made some economic sense. However, it was soon overtaken by the falling cost of LSI circuits that could just as easily process component digital as composite digital. Because component digital offers higher-bandwidth component signals and incurs none of the losses introduced by NTSC encoding and decoding, it has become the de facto standard for most production and television broadcast facilities around the world. Defined by CCIR 601 and SMPTE 125M, it is also sometimes referred to as 4:2:2, or incorrectly as D1 (D1 is a tape format).

In order to discuss the requirements of digital timing, a quick review of the component digital signal is in order. The luminance component is sampled at 13.5MHz and each band-limited color difference component at 6.75MHz, all at 10-bit resolution. The 720 luminance (Y) samples and the 360 co-sited samples each of B-Y (Cb) and R-Y (Cr) are time multiplexed into a parallel stream at 27MHz. The output then becomes a serial stream at 270Mb/s with the sequence Cb, Y, Cr, Y, Cb, Y, Cr ... (These parameters are the same for both 525 and 625 systems.) This sequence is repeated for all active lines in the picture. The sampling sequence is also maintained through the horizontal and vertical blanking intervals to maintain compatibility with the timing of analog signals.
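
The multiplex itself is simple enough to show in code. The sketch below (a hypothetical helper, assuming the samples are already available as lists of 10-bit values) interleaves one active line:

    def multiplex_line(y: list[int], cb: list[int], cr: list[int]) -> list[int]:
        """Interleave one active line as Cb, Y, Cr, Y, Cb, Y, Cr, Y ...

        Expects 720 Y samples and 360 each of Cb and Cr (co-sited pairs).
        The result is the 1440-word active portion of the 27MHz stream.
        """
        assert len(y) == 720 and len(cb) == 360 and len(cr) == 360
        out = []
        for i in range(360):
            out.extend((cb[i], y[2 * i], cr[i], y[2 * i + 1]))
        return out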

To synchronize the video sample stream on each line, a sequence of four synchronizing words is added to define the start and end of the video line. These sequences are referred to as SAV and EAV (start of active video and end of active video), and each four-word group consists of the hexadecimal sequence 3FF, 000, 000, XYZ. The XYZ word provides additional data bits that define the start or end of line, Field 1 or Field 2, and whether the line is active or in vertical blanking. This is all the synchronizing data that is sent or needed.
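
On the receiving side, locating the sync boils down to scanning the 10-bit word stream for the 3FF, 000, 000 preamble and capturing the XYZ word that follows. A minimal sketch:

    def find_trs(words: list[int]) -> list[tuple[int, int]]:
        """Return (index, xyz_word) for every SAV/EAV found in a 10-bit stream."""
        hits = []
        for i in range(len(words) - 3):
            if words[i] == 0x3FF and words[i + 1] == 0 and words[i + 2] == 0:
                hits.append((i, words[i + 3]))
        return hits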

Checking digital video timing

Let's look at what this means for digital timing measurements. First, a digital video waveform monitor often offers the option of suppressing or displaying the SAV/EAV words. When displayed, this sequence appears as large positive pulses on either side of the horizontal blanking period and gives a visual reference for H timing. (See Figure 1.) The pulses appear with some associated ringing because the single word of 3FF produces a signal with a risetime (74ns) that is illegal in the analog domain, causing the analog reconstruction filters in the waveform monitor to ring. Digital H timing can be performed in the same way as analog H timing by using the SAV pulse as the point of reference.

Other than the visual SAV/EAV reference pulses during vertical blanking, no apparent information is provided on vertical timing. If we examine the vertical blanking period, all that is visible is the series of SAV and EAV pulses of the H blanking periods and a DC level corresponding to reference black (040 hex). (See Figure 2.) There are no vertical reference pulses. The initial thought might be to use the first line of video as the timing reference, but this can cause measurement errors if the video information itself has been delayed by a line and is not in the correct vertical location within the digital stream. In addition, the digital signal does not provide any visual clue as to which field interval you are looking at. (The digital standard does not include half lines, so if you see video with a half line it usually means that the signal was converted from analog.)

Luckily there is a solution to this dilemma. Most waveform monitors include a readout of the selected field and line (in line-select mode). The waveform monitor obtains this information from the SAV and EAV data by decoding the Field and Active bits within the XYZ word of SAV. The extracted information is then used by the waveform monitor to determine which line and field to indicate in the display.
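
The bit assignments make that decoding a one-liner per flag: in the 10-bit XYZ word, bit 8 is F (field), bit 7 is V (vertical blanking) and bit 6 is H (set for EAV, clear for SAV), with bits 5 through 2 carrying error-protection bits. A sketch:

    def decode_xyz(xyz: int) -> dict:
        """Extract the timing flags from a 10-bit SAV/EAV XYZ word."""
        return {
            "field2":  bool(xyz & 0x100),  # F: 0 = Field 1, 1 = Field 2
            "v_blank": bool(xyz & 0x080),  # V: 1 during vertical blanking
            "is_eav":  bool(xyz & 0x040),  # H: 1 = EAV, 0 = SAV
        }

    # Example: the SAV of an active line in Field 1 is 0x200
    print(decode_xyz(0x200))  # {'field2': False, 'v_blank': False, 'is_eav': False}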

How then do you compare the vertical timing of two signals? First, with the waveform monitor on external reference, select the first line of video on a signal, say Field 1, line 20 (or a vertical interval line if VITS or other data is present), and note its location on the display. Then switch to internal reference and see whether the display changes. If it does not, the signals are in vertical time. This method does not work with a black signal unless some VITS is present, as all of the lines will look the same. Some waveform monitors (such as the Tektronix WFM 601M) provide another way to look at the SAV/EAV information directly as data. (See Figure 3.) In this mode the data words are displayed as binary, decimal or hexadecimal values with the field, line and sample number indicated. The active and blanking lines are also flagged, allowing the top line of any picture (even color black) to be found. Examining this data against external sync does allow the vertical timing to be checked, but it does not provide an easier method of finding the vertical relationship between two signals.
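
Given captures of the two word streams, the comparison can be automated: find the first active-line SAV in each and compare positions. A sketch building on the hypothetical find_trs and decode_xyz helpers above:

    def first_active_sav(words: list[int]) -> int:
        """Index of the first SAV whose V bit indicates an active line."""
        for index, xyz in find_trs(words):
            flags = decode_xyz(xyz)
            if not flags["is_eav"] and not flags["v_blank"]:
                return index
        raise ValueError("no active-line SAV found")

    # If the two captures were started on the same reference edge, the
    # difference between the two indices, in words, is the timing offset
    # between the signals.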

Digital equipment offers both pluses and minuses when it comes to timing a facility. The plus is that most digital production switchers have automatic timing circuits that keep signals timed to the switcher's reference as long as they arrive at the input within the capture window. This window can be anywhere from a few microseconds to a full line. The minus is the need for vertical timing, because of the delays encountered in digital processing. It is also possible for vertical timing errors to be ignored completely by some equipment: the video is passed through and synchronized horizontally, with new sync (SAV/EAV) added from the switcher's external reference. If the video entering the switcher is one line off vertically, it will be one line off coming out. In addition, the video at the output of the production switcher is itself delayed by about one line from the reference because of the processing delays and auto-timing circuits used within the production equipment. This can produce signals that are both one line late in time and have the video position off by a line as well. Careful attention to the delays involved, keeping both vertical timing and video placement correct, will allow a facility to produce video without information shifts when switching between sources.
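
Whether a given source will auto-time then reduces to a window check. The sketch below assumes the offset against the switcher reference has already been measured, and that the capture window is centered on the reference (products differ on both counts):

    LINE_US = 63.5555   # one 525-line duration, microseconds

    def will_autotime(offset_us: float, window_us: float = LINE_US) -> bool:
        """True if the input lands inside the switcher's capture window.

        window_us varies by product, from a few microseconds up to a
        full line; a full line, centered on the reference, is assumed here.
        """
        return abs(offset_us) <= window_us / 2.0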

Hybrid complications

The problem of vertical delays becomes even more of a concern in a hybrid facility. When converting analog NTSC to digital component video, the main concern is the decoding delay. A notch-filter decoder adds little delay, which can usually be accommodated by the auto-timing circuits of digital production switchers. To provide better NTSC decoding, a comb filter is usually chosen. Depending on the type of comb filter employed, one or two lines of delay can be introduced at the digital output relative to the input. This poses a significant problem if the analog source must also be timed into analog equipment at the same time as it is available to digital equipment.

Figure 4 shows a facility with a camera source being used by an analog system and a digital switcher simultaneously. The delays encountered can quickly add up: in this example, two lines are incurred in the NTSC decoder plus one more line through the production switcher. Here there is a solution to the dilemma: replace the NTSC-to-component serial digital converter with an RGB-to-component serial digital converter. This has short delays compared with the NTSC decoder and uses the full bandwidth of the RGB signals to produce the serial digital video without incurring NTSC losses. As a rule, always use the component analog outputs of a device (when available) for conversion to digital; this gives the best results and the fewest timing problems.
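
The delay bookkeeping for a path like Figure 4's is easily scripted. The sketch below simply sums per-stage delays, quoted in lines, and reports a running total (stage names and values follow the example in the text):

    LINE_US = 63.5555   # one 525-line duration, microseconds

    def total_path_delay(stages: dict[str, float]) -> None:
        """Sum per-stage delays given in lines and print the running total."""
        total = 0.0
        for name, lines in stages.items():
            total += lines
            print(f"{name:28s} +{lines:g} line(s) -> {total:g} ({total * LINE_US:.1f}us)")

    # The NTSC-to-digital path of Figure 4, per the text
    total_path_delay({
        "comb-filter NTSC decoder": 2.0,
        "digital production switcher": 1.0,
    })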

A blanking test signal that produces a half-amplitude pulse on the first and last active lines in the picture, plus two parallel lines on both the left- and right-hand edges of the picture, helps with timing checks and video placement problems. (See Figure 5.) The outside lines define the exact edge of the digital signal and should not be lost when the signal passes through a digital video channel. The inner lines on the left- and right-hand sides mark the edges of the NTSC analog picture. Digital pictures are wider than analog pictures by a few samples on each side (six samples in 525/60 systems). This difference in active line width causes black edges on analog-originated pictures. A related problem is that these black edges on a digital picture should be black, not "lower than black." When an NTSC signal is converted to digital, a black clip should be employed to remove the blanking-level step from the converted luminance signal. (See Figure 1.) Because there is no setup in digital, the setup step at the edges of the analog signal becomes a negative pulse if it is not clipped during the conversion to digital. Another solution is to reduce the horizontal blanking width of the source to remove the black edges from the original signal prior to conversion. With existing source material, this could be achieved by a small amount of picture expansion through a DVE device to fill in the black edges.
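
The black clip itself is a one-line operation on the 10-bit luminance samples, as this sketch shows (64 decimal is 040 hex, digital black):

    DIGITAL_BLACK = 0x040   # 10-bit luminance black level

    def black_clip(y_samples: list[int]) -> list[int]:
        """Clamp any luminance value below digital black up to black.

        Removes the negative "lower than black" edges left by NTSC setup
        when an analog signal is converted without clipping.
        """
        return [max(s, DIGITAL_BLACK) for s in y_samples]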

A hybrid facility can consist of many different paths, and these should be timed to match the longest path, typically the NTSC-to-digital path. The facility in Figure 4 is just one example of the problems involved.

Video networked timing requirements

The advent of video networking offers solutions to audio and video timing problems. Video and audio information stored on file servers carries tags that indicate the correct video-to-audio relationship of the files to each other. Because video file servers do not store sync information, the server simply has to play out the audio and video correctly synchronized to each other, with the correct sync inserted, when needed. Any processing required is performed either directly on the data or by playing out the video and audio to a production suite for combining and processing, then re-recording. All equipment used should maintain the correct relationship between signals during the play and record processes, so timing problems should be few. The timing problem with file servers is normally one of file transfer bandwidth rather than of individual signal timing.

The challenges of timing an entire facility as it changes from analog to digital, and then to a networked facility, can be overcome, provided all of the problems are understood from the outset.