I remember the first time I ever saw a fiber optic cable. I was in the eighth grade, and for assembly the school had invited a person from Ma Bell to speak about technology and careers. The telephone guy brought with him large cables with hundreds of pairs of copper wire. He said that most phone calls required one pair of wires from the caller to the receiver. He then talked about how the cables were all color-coded and the importance of not being color blind if you wanted to be a “cable splicer”.
I was only mildly impressed because I’d discovered a gold mine where the local phone installers cleaned out their trucks. It was a place filled with technological wonders for a 12-year-old: a pile of old phones, mysterious grey Ma Bell boxes and all the wire I could ever want.
Then the phone company guy pulled out a small-diameter cable and a flashlight. Placing the flashlight on a lens he had connected to one end of the cable, he showed how we could see light coming out of the other end. He called this new cable Fiber Optics. I was amazed. Light moving through a wire.
While I was thinking this might be a new kind of flashlight, he talked of how this small cable could carry thousands of phone calls, many more than the huge copper cable he had just shown us. Voice over light. Really?
Years later, that same fiber optic technology has matured, and today fiber is becoming a common interconnection for both video and high-speed IT signals. Video engineers need to be as familiar with fiber connections as they are with coax. Let’s examine some common fiber optic installation practices.
Fiber cables consist of two components: the core, which carries the light, and the cladding, which surrounds the core. The cladding is made of various compounds, but it serves two functions: one, to trap the light within the glass, and two, to form a strong physical protective cover for the delicate fiber. The cladding may be covered with additional layers depending on the intended application. The optical signal attenuates, much as you expect with copper. With fiber there’s no ‘resistance’ as in copper, but the signal becomes weaker as it moves through the glass medium by scattering and absorption. See Figure 1.
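Because that loss accumulates linearly in dB with distance, it is easy to estimate. Here is a minimal sketch; the 0.35 dB/km figure is a typical illustrative value for single-mode fiber, not one from this article.

```python
# Sketch: optical attenuation accumulates linearly (in dB) with distance.
# The 0.35 dB/km coefficient below is an illustrative assumption for
# single-mode fiber, not a figure from the article.

def fiber_loss_db(length_km: float, atten_db_per_km: float) -> float:
    """Total attenuation in dB over a run of fiber."""
    return length_km * atten_db_per_km

# An 8 km run at 0.35 dB/km loses 2.8 dB of optical power.
print(fiber_loss_db(8.0, 0.35))
```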
Like coax, attenuation through the cable is a function of wavelength. That’s one reason most fiber systems operate at 850nm, 1300nm or 1550nm. At these wavelengths, the light is invisible to the human eye. Because of this fact, never look into a fiber optic cable or connector. You won’t be able to ‘see’ if a signal is present, but if one is, you may permanently damage your eyesight.
Signal reliability is highly dependent on received signal strength. With digital signals, Bit Error Rate (BER) remains the key measurement. BER can be considered the digital counterpart of signal-to-noise ratio (SNR), which is a measure of performance for analog systems.
If you’ve ever tuned an analog receiver, you may be familiar with SNR. Typically one tunes the various system stages for maximum SNR. In digital, we strive for lowest BER, which is highly dependent on signal strength. However, too much signal strength will also increase BER. See this relationship in Figure 2. As shown, it is possible to have too much of a good thing.
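To see just how strongly BER depends on signal quality, the textbook relationship for an ideal binary receiver ties BER to the Q-factor (an SNR-like quantity): BER = ½·erfc(Q/√2). This sketch is for illustration only; it is not a formula from the article.

```python
import math

# Textbook relationship for an ideal binary optical receiver:
# BER = 0.5 * erfc(Q / sqrt(2)), where Q is an SNR-like quality factor.
# Shown only to illustrate how steeply BER falls as signal quality rises.

def ber_from_q(q: float) -> float:
    return 0.5 * math.erfc(q / math.sqrt(2))

# A small change in Q moves BER by orders of magnitude:
for q in (3, 6, 7):
    print(f"Q = {q}: BER ~ {ber_from_q(q):.1e}")
```

A Q of 6 lands near the classic 1e-9 BER target; dropping to Q = 3 costs roughly six orders of magnitude.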
Okay, we know the system needs enough signal, but not too much. So what do we measure? The answer is that optical systems are typically designed and tuned by measuring “optical power”. Sound familiar? It should.
Optical power is expressed in decibels (dB), a logarithmic scale, just like sound. A change of 10dB means a power factor of 10; 20dB means a power factor of 100. And, similar to audio, absolute optical power is measured in dBm, or dB referenced to 1mW.
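That dB arithmetic can be sketched in a few lines of code:

```python
import math

# Sketch of the dB arithmetic described above.
# dBm is absolute power referenced to 1 mW; a plain dB value is a ratio.

def mw_to_dbm(p_mw: float) -> float:
    return 10 * math.log10(p_mw)

def dbm_to_mw(p_dbm: float) -> float:
    return 10 ** (p_dbm / 10)

print(mw_to_dbm(1.0))    # 1 mW  ->  0 dBm
print(mw_to_dbm(10.0))   # 10 mW -> 10 dBm (a factor of 10 is 10 dB)
print(dbm_to_mw(-10.0))  # -10 dBm -> 0.1 mW
```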
Let’s run through a simple example.
In this TV plant, we have an optical transmitter located in the basement of the building with a 0dBm output level. Three floors above, master control has an optical receiver capable of operating over an input level range of -10dBm to -30dBm. Over this distance, perhaps 400ft, the cable will have virtually no loss. So, a 0dBm transmit power will result in a signal level of 0dBm at the receiver. That’s 10dB too hot for the receiver. The solution is an optical attenuator installed at the receiver input.
Why not just dial back the transmitter? You could, but suppose that same signal has to be fed to the transmitter site some 5 miles away. Reducing the optical transmitter to perhaps -15dBm solves the MC receiver problem, but it could also leave the transmitter location with too weak a signal. In most cases, it’s better to pad a receiver input than to dial back a transmitter output.
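The link-budget reasoning above can be sketched numerically. The 0dBm transmit level and the -10 to -30dBm receiver window come from the example; the 12dB of assumed path loss on the 5-mile run (fiber, splices, connectors) is an illustrative figure I’ve picked, not one from the article.

```python
# Sketch of the worked example: a 0 dBm transmitter feeding receivers
# that operate from -10 dBm down to -30 dBm. The 12 dB path loss on the
# long run is an illustrative assumption, not a figure from the article.

RX_MAX_DBM, RX_MIN_DBM = -10.0, -30.0

def pad_needed_db(arriving_dbm: float) -> float:
    """Attenuation needed to bring a too-hot signal down to the top
    of the receiver's operating window (0 if already in range)."""
    return max(0.0, arriving_dbm - RX_MAX_DBM)

# Short in-building run: negligible loss, so 0 dBm arrives at MC.
print(pad_needed_db(0.0))       # a 10 dB pad at the MC receiver input

# Remote site with an assumed 12 dB of path loss:
remote_dbm = 0.0 - 12.0         # -12 dBm: comfortably in range
backed_off = -15.0 - 12.0       # -27 dBm: only 3 dB above the floor
print(remote_dbm, backed_off)
```

Padding the near receiver keeps the far receiver well inside its window; backing off the transmitter instead eats almost all of the remote site’s margin.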