The Superheterodyne Concept and Reception

The first television receivers were designed to receive BBC transmissions from the Alexandra Palace transmitter in London. This was, incidentally, 405-line all-electronic television, then regarded as HDTV by those who saw it. As there was only one transmitter in all of the United Kingdom, extremely simple receivers were practical. This simple receiver topology encouraged home constructors to build their own sets, and many did.

All later television receivers were of the superheterodyne type. In fact, all radios, all television receivers, terrestrial and satellite, and all radar sets employ the superheterodyne principle, invented in 1918 by Edwin Howard Armstrong while he was serving in the U.S. Army in France. The concept came as a solution to the problem of trying to amplify weak "wireless" signals with the primitive triode vacuum tubes then available. He hoped to detect approaching aircraft by hearing the impulse noise of their ignition systems; that would require a great deal of amplification ahead of the detector tube. Tubes in those days were ill-suited to amplifying high-frequency signals, so Armstrong reasoned, "Why not heterodyne the high-frequency signal to a much lower frequency where it could be efficiently amplified as much as one might wish?"

This is the fundamental concept of all superheterodyne receivers.

The concept might have worked out given more time, but World War I ended too soon. Major Armstrong went home and invented the superhet radio receiver, then went on to invent the super-regenerative detector (the circuitry of the proximity fuse used in World War II) and, in 1933, wideband frequency-modulation radio broadcasting.

His FM transmitting tower is still in use at Alpine, N.J. atop the Jersey Palisades. After 9/11, TV stations operated from that historic tower to serve the New York area.

Today we don't use vacuum tubes in receivers, but all radio and TV receivers use Armstrong's superheterodyne principle. The strengths and weaknesses of this invention are important to the future of terrestrial TV broadcasting, so please read on; you can quickly become an expert on superheterodyne receivers and amaze your boss.

INTERFERENCE REVIEW

This column has said a lot about signal distortion, especially IM3 (third-order intermodulation) distortion. This is because third-order distortion is responsible for most interference between signals; it is inherent in amplifying devices and always harmful. But there is a very useful order--second-order distortion--that is also inherent in active devices, and without it, we would not have radio or television.

Second-order distortion produces the sum and difference frequencies of pairs of frequencies present at the input. Let us call the signal frequency Fs, and for the frequency of the local oscillator (LO), we'll use Fo. Second-order distortion produces two new frequencies: Fo + Fs and Fo - Fs.

The difference frequency is usually the useful component in the output of the frequency mixer. You might wonder how the feeble radio signal can produce any kind of distortion. Alone, it cannot. The much stronger power from the LO drives the mixer into nonlinearity and causes it to generate the sum and difference frequencies. The signal carrier and its sidebands are shifted in frequency, but not otherwise altered by the mixer; the modulation of the signal is unaffected by this frequency-conversion process.
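If you like to see arithmetic as code, here is a minimal Python sketch of the mixer products (an illustration only; the 800 kHz and 845 kHz values anticipate the broadcast-band example below):

```python
# Second-order mixer products: the sum and difference of the signal
# frequency Fs and the local oscillator frequency Fo.
def mixer_products(fs_khz: float, fo_khz: float) -> tuple[float, float]:
    return fo_khz + fs_khz, abs(fo_khz - fs_khz)

total, diff = mixer_products(fs_khz=800.0, fo_khz=845.0)
print(total, diff)  # 1645.0 kHz (sum), 45.0 kHz (difference: the useful IF)
```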

Armstrong's superheterodyne receivers were revolutionary in 1923. They were so sensitive they could be used with a loop antenna instead of the usual long-wire antenna needed for TRF and regenerative receivers. The only tubes available were triodes and these could only amplify signals far below the broadcast radio frequencies. Early superhets used an intermediate frequency (IF) between 30 kHz and 90 kHz. And herein lies a problem.

Consider a receiver tuned to 800 kHz by setting its LO to 845 kHz. The useful mixer output was at 45 kHz, and this was amplified in a cascade of tuned IF amplifiers.

But alas, what if there were a second station at 890 kHz? Since 890 - 845 = 45 kHz, it too would be heterodyned to the IF and heard. The only way around this is to provide RF selectivity ahead of the mixer: an RF filter tuned to the desired signal, attenuating the undesired one.

The higher the IF, the greater the attenuation of the undesired signal. But until pentode amplifier tubes were introduced, an IF above 90 kHz was not practical. This form of interference was named "image response," a name that still causes a lot of confusion in the TV world but remains in use.
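The image relation is just as simple: with the LO above the desired signal, the vulnerable frequency sits at the desired frequency plus twice the IF. A short Python sketch with the numbers above (and, for comparison, the 455 kHz IF that later became standard in AM radios):

```python
# Image response: with the LO above the desired signal, a station at
# Fs + 2*IF also produces the IF at the mixer output.
def image_frequency(fs_khz: float, if_khz: float) -> float:
    return fs_khz + 2 * if_khz

print(image_frequency(800.0, 45.0))   # 890.0 kHz -- the "image" of 800 kHz
print(image_frequency(800.0, 455.0))  # 1710.0 kHz -- a higher IF pushes the
                                      # image out where the RF filter can kill it
```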

ADDING PICTURES

Jumping from radio to TV, the first pre-World War II RCA television receivers were superheterodynes with a picture IF of 12.75 MHz. The sound IF was 8.25 MHz.

The frequencies used for television broadcasting were 44 to 90 MHz.

The picture carrier was 1.25 MHz above the lower channel edge, the sound carrier 0.25 MHz below the upper channel edge, and each channel was 6 MHz wide. The lowest picture carrier frequency was 45.25 MHz. With a picture IF of 12.75 MHz, the LO was tuned to 58 MHz. Such receivers were vulnerable to image interference from a signal at 58 + 12.75 = 70.75 MHz (there were no TV channels between 72 and 78 MHz then). An aural carrier at 71.75 MHz would have produced a signal just outside the IF bandpass, at 13.75 MHz, which would not have reached the second detector.

After World War II and up to 1952, television receivers used a higher IF, near 26 MHz for the picture and 4.5 MHz lower for the aural IF, which by then used Armstrong's frequency modulation. This increase in the receiver IF was forced by the introduction of the high VHF band, 174-216 MHz. Suppose the earlier IF had been retained in the post-World War II period. The Channel 7 picture carrier, 175.25 MHz, plus the picture IF of 12.75 MHz would put the LO at 188 MHz. The receiver would then have responded to an interfering signal at 188 + 12.75 = 200.75 MHz, which is in Channel 11, so interference would have resulted from such a low IF. The receiver IF had to be moved up again. With a picture IF of 25.75 MHz, the LO to receive Channel 7 becomes 201.00 MHz and the image frequency is 201.00 + 25.75 = 226.75 MHz, safely above Channel 13 (and one reason the high VHF band could not have been extended upward).
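All of these cases follow the same two-step arithmetic (LO = picture carrier + picture IF; image = LO + picture IF), which a small Python helper makes explicit:

```python
# LO and image frequency for a given picture carrier and picture IF
# (LO above the carrier, the usual TV convention).
def lo_and_image(picture_mhz: float, pix_if_mhz: float) -> tuple[float, float]:
    lo = picture_mhz + pix_if_mhz
    return lo, lo + pix_if_mhz

print(lo_and_image(45.25, 12.75))   # (58.0, 70.75)   pre-war lowest channel
print(lo_and_image(175.25, 12.75))  # (188.0, 200.75) Ch 7, old IF: image in Ch 11
print(lo_and_image(175.25, 25.75))  # (201.0, 226.75) Ch 7, ~26 MHz IF: image above Ch 13
```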

When the UHF band was opened in 1952, the receiver IF had to be increased again, this time to 45.75 MHz for the picture. It can be shown that, with this new IF, a signal on Channel 29 will cause image interference to a receiver tuned to Channel 14, so the UHF taboos, n+14 and n+15, were adopted by the FCC to prevent image interference. The FCC did not mandate the new IF, but all manufacturers voluntarily adopted it to avoid marketplace disasters.
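Where do n+14 and n+15 come from? Mapping the image back to a UHF channel number shows it (a sketch assuming the standard UHF plan: Channel 14 starts at 470 MHz, channels are 6 MHz wide, and the picture carrier sits 1.25 MHz above the lower edge):

```python
PIX_IF = 45.75  # MHz, the post-1952 picture IF

def image_channel(n: int) -> float:
    """UHF channel number on which the picture-carrier image of channel n falls."""
    picture = 470 + 6 * (n - 14) + 1.25  # MHz, picture carrier of channel n
    image = picture + 2 * PIX_IF         # LO + IF, with LO = picture + IF
    return 14 + (image - 470) / 6        # convert back to a channel number

print(image_channel(14))  # ~29.46: the image of Channel 14 lands in Channel 29,
                          # and the full 6 MHz image band straddles n+14 and n+15
```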

You might be asking, "Why didn't they go to a sufficiently high IF in 1952 as they had done in 1946?" Let's do the numbers.

The UHF band in 1952 extended to Channel 83 (884-890 MHz). The IF would have had to be above Channel 13--say, 230 MHz. The limitations of vacuum tube technology at that time prevented so high an IF except in military radars, a few of which had 200 MHz IFs late in World War II.

Even today, the cost of an IF amplifier flat over 6 MHz and centered around 230 MHz would be unacceptable. So the FCC did what it had to do: establish the UHF image taboos we still know as n+14 and n+15.

DUAL TUNER DESIGNS

This brings us to double-conversion tuners like the one Zenith designed for testing its 8-VSB DTV modulation technology. It was the outstanding interference rejection of that double-conversion tuner that led the FCC to conclude that the UHF taboos did not apply to DTV. The performance of that double-conversion tuner became the basis for the DTV planning factors that enable DTV broadcasting in vacant channels within the existing broadcast spectrum.

A double-conversion tuner is a cascade of two single-conversion tuners. The first frequency conversion results in the desired signal being upconverted to the first IF. The first IF signal is filtered to reject image frequency interference. Zenith chose 915 MHz for its first IF, a frequency above the UHF broadcast band.

To tune Channel 2 (54-60 MHz, center frequency 57 MHz), the first LO was tuned to 915 + 57 = 972 MHz. The image frequency would be 972 + 915 = 1,887 MHz, up there in the microwave region. This was easily rejected by a low-pass filter at the tuner input port, because the image lies more than 2:1 above the filter's cutoff frequency (800 MHz).

After the first IF filter, the desired signal was heterodyned to the receiver's second IF of 44 MHz by a second LO operating at a fixed frequency of 915 + 44 = 959 MHz. Most of the signal amplification is done at this lower IF.
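As a sanity check, Zenith's frequency plan reduces to three lines of arithmetic (the 915 MHz and 44 MHz IFs are from the text; the helper itself is just an illustration):

```python
FIRST_IF = 915.0   # MHz, chosen above the UHF broadcast band
SECOND_IF = 44.0   # MHz, where most of the gain is applied

def double_conversion_plan(channel_center_mhz: float) -> tuple[float, float, float]:
    lo1 = FIRST_IF + channel_center_mhz  # first LO, above the incoming signal
    image1 = lo1 + FIRST_IF              # first image, up in the microwave region
    lo2 = FIRST_IF + SECOND_IF           # fixed second LO
    return lo1, image1, lo2

print(double_conversion_plan(57.0))  # Channel 2: (972.0, 1887.0, 959.0)
```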

However, to my knowledge, no TV manufacturer uses a double-conversion tuner, probably because of cost, and because the need for such interference rejection has not yet become apparent in the marketplace.

To be fair, tuner designers know that the choice of the first IF is always a compromise. It must be above the UHF band, but no matter what frequency is chosen, there lurks the danger that somewhere that frequency is in use, though not by broadcasters, and that the undesired signal will pass through the first IF filter and cause interference.

REDUCING TUNER COSTS

The only way to reduce tuner costs is by integrating the circuitry. In the case of a tuner, the active elements, transistors for amplification and diodes for mixers, are readily integrated along with interconnections. What cannot be integrated are the inductors for tuned circuits, so tuner designers seek inductorless circuit approaches.

The first IF filter in the Zenith double-conversion tuner was a bandpass filter at 915 MHz realized as a surface acoustic wave (SAW) filter, which can be fabricated as an IC on a piezoelectric substrate, usually lithium niobate. This is a two-chip solution to the tuner problem, but the market demands a "tuner-on-a-chip" solution for cost reasons. Single-chip solutions are known.

One of these is called the zero-IF superheterodyne, also known as a direct-conversion receiver. DTV signals cannot be detected by a simple envelope detector; they must be detected synchronously, which requires an LO locked in frequency and phase to the received signal's pilot carrier.

Tektronix first employed synchronous detection of TV signals in its famous NTSC measurement-grade demodulators in 1973. All DTV receivers have circuitry to produce the needed carrier locked to the pilot carrier, so why not feed the locally generated pilot carrier frequency into the mixer?

The output of this mixer would be a baseband (demodulated) DTV signal. Its spectrum would extend from DC up to 5.38 MHz.

What about the IF filter? There is no need! Its function is replaced by the low-pass (anti-aliasing) filter used to filter the baseband signal before it reaches the analog-to-digital converter. But without an IF filter, what about first adjacent channel signals--what rejects them? First, let's deal with the upper adjacent channel interference problem.

The pilot frequency of that undesired DTV signal will be at 6 MHz in the baseband signal. The low-pass filter should attenuate this undesired pilot at 6 MHz and all of its sidebands, which extend up to 11.38 MHz.

The lower adjacent channel presents a very different problem. Its pilot is 6 MHz below the local oscillator frequency, so it too appears at 6 MHz in the baseband signal; the mixer cannot distinguish a frequency 6 MHz below the LO from one 6 MHz above it, and folds both onto the same baseband frequency. The low-pass filter should remove both adjacent channel pilots, but the sidebands of the lower adjacent channel DTV signal, lying 0.62 to 6 MHz below the LO, fold right on top of the spectrum of the demodulated signal and cannot be removed. This is a critical drawback of the direct-conversion topology for our 8-VSB DTV signal, which was brought to my attention by my friend, Dr. Oded Bendov.
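A few lines of arithmetic show the asymmetry (offsets are relative to the LO, which sits on the desired pilot; the 5.38 MHz sideband width is from the text):

```python
def baseband_offset(rf_offset_mhz: float) -> float:
    """Where a component offset rf_offset_mhz from the LO lands at baseband."""
    return abs(rf_offset_mhz)  # the mixer cannot tell +f from -f; it folds them

print(baseband_offset(+6.0))    # upper adjacent pilot -> 6.0 MHz (filterable)
print(baseband_offset(+11.38))  # top of upper adjacent sidebands -> 11.38 MHz
print(baseband_offset(-6.0))    # lower adjacent pilot -> also 6.0 MHz
print(baseband_offset(-0.62))   # top of lower adjacent sidebands -> 0.62 MHz,
                                # folded onto the desired 0-5.38 MHz spectrum
```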

But why worry? There is at least one direct conversion tuner-on-a-chip IC commercially available for the European DVB-T signal. Is it only a matter of time before someone introduces a direct conversion tuner-on-a-chip IC for use in North America? Would it work?

Yes, except where there is a lower adjacent channel also in use in the locality. Would such a DTV tuner meet the requirements set forth by the FCC? I have no idea; I'm not a lawyer. Perhaps this column will alert some of the people who might be working on such a tuner-on-a-chip for 8-VSB based upon direct conversion.

Is there any other way to build a DTV tuner-on-a-chip that would eliminate image interference? I believe there is! Special mixers have been built as monolithic microwave ICs with image-rejection mixer circuitry on the chip. In a conventional frequency converter, the LO frequency is above the desired signal frequency, so it is below the image frequency.

The difference-frequency spectrum at IF is therefore inverted: the pilot is at the high end of the IF spectrum and the sidebands are inverted too, so what you have at IF is a lower-sideband signal with its pilot at the top of the band.

However, the image-frequency signal at the mixer output is not inverted, so it should be possible to separate the desired signal at IF from the image-frequency interference. An IC designed for our 8-VSB system could offer a powerful solution at very low cost, a cost that should follow Moore's Law downward, 2:1 every 18 months.
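The column doesn't name a circuit, but the classic architecture that exploits this inversion difference is the Hartley image-rejection mixer: mix with quadrature LOs, shift one branch 90 degrees, and combine so the image cancels. Here is a minimal numpy/scipy sketch with illustrative single-tone frequencies (not a tuner design):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1_000_000                 # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)

f_lo = 100_000                 # local oscillator frequency
f_if = 20_000                  # intermediate frequency
desired = np.cos(2 * np.pi * (f_lo - f_if) * t)  # wanted signal, below the LO
image = np.cos(2 * np.pi * (f_lo + f_if) * t)    # image, above the LO

def hartley_if_output(x):
    """Quadrature mix, low-pass, then 90-degree shift and combine (Hartley)."""
    i = x * np.cos(2 * np.pi * f_lo * t)
    q = x * np.sin(2 * np.pi * f_lo * t)
    b, a = butter(5, 2 * f_if / (fs / 2))   # keep only the difference terms
    i_lp, q_lp = filtfilt(b, a, i), filtfilt(b, a, q)
    return i_lp - np.imag(hilbert(q_lp))    # desired adds, image cancels

print(np.max(np.abs(hartley_if_output(desired))))  # ~1.0: the desired survives
print(np.max(np.abs(hartley_if_output(image))))    # ~0.0: the image is rejected
```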

Is anyone out there working on an image rejection tuner for 8-VSB? I hope so. Is anyone out there working on a zero IF or direct conversion tuner for 8-VSB? I hope not.