It's not easy being green, part IV

In the first three parts of this series, we examined some of the key components of efficient broadcast center operation. While the goal was to emphasize “green,” the practical result is saving money. Being green and being efficient are not divergent principles.

Broadcasters are among the heavier commercial users of electricity. Studio lights, transmitters, 24-hour operation, large staffs and kilowatt-hungry equipment all mean that TV stations, and to a lesser extent production houses, require large amounts of power. When electricity cost two cents per kilowatt-hour, few were concerned with leaving the studio lights on. However, now that the cost of electricity has risen to a nationwide average of 10 cents per kilowatt-hour, everyone is looking for ways to save on power costs.
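To put that fivefold rate increase in perspective, here is a back-of-the-envelope calculation. The 20kW lighting load and around-the-clock schedule are assumed figures for illustration, not measurements from any particular facility:

```python
# Annual cost of running a continuous electrical load at two rates.
# The load and hours below are illustrative assumptions.

def annual_cost(load_kw: float, hours_per_day: float, rate_per_kwh: float) -> float:
    """Energy cost in dollars for one year of operation."""
    return load_kw * hours_per_day * 365 * rate_per_kwh

LIGHTING_KW = 20.0   # assumed studio lighting load
HOURS = 24           # 24-hour operation

cost_at_2_cents = annual_cost(LIGHTING_KW, HOURS, 0.02)
cost_at_10_cents = annual_cost(LIGHTING_KW, HOURS, 0.10)

print(f"At $0.02/kWh: ${cost_at_2_cents:,.0f} per year")   # $3,504
print(f"At $0.10/kWh: ${cost_at_10_cents:,.0f} per year")  # $17,520
```

The same habits that were tolerable at two cents per kilowatt-hour now cost five times as much, which is why even lighting discipline matters.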

Photo: Cooling broadcast racks is becoming increasingly complicated. As devices get smaller, their heat output becomes more concentrated. See Broadcast Engineering’s November 2008 KMBC-TV/KCWE-TV showcase article on building HD news facilities.

Having to maintain a 24-hour operation places video facilities and stations in a unique situation. They can’t just turn off the lights and go home at 5 p.m. Instead, operations continue around the clock. So where can these facilities save money while still providing their audiences the high-quality programming viewers have come to expect?

Develop a plan

As the saying goes, if you don’t know where you’re going, any road will take you there. We can’t afford to wander around the machine room, turning off servers, processors or other equipment in the hope of saving electricity. Instead, engineers need to develop a systematic examination process to:

• Discover where the power is being consumed;

• Calculate how much power is being used by rooms and by racks;

• Note any especially critical or heat-producing device, which may need to be relocated; and

• Calculate the operational efficiency of key power-consuming devices (UPS systems, cooling/heating units, lighting, etc.).

Armed with this information, a carefully targeted plan can be developed to improve overall efficiency.
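The audit steps above can be sketched in a few lines of code. The rooms, devices and wattages below are hypothetical placeholders for real nameplate ratings or clamp-meter readings, and the 1kW “hot rack” threshold is an assumed value you would tune to your own facility:

```python
# A minimal sketch of a rack-by-rack power audit. All entries are
# hypothetical; substitute measured or nameplate figures.

from collections import defaultdict

# (room, rack, device, watts)
inventory = [
    ("machine room", "rack 1", "video server", 850),
    ("machine room", "rack 1", "router", 400),
    ("machine room", "rack 2", "storage array", 1200),
    ("master control", "rack 3", "UPS", 300),
]

per_rack = defaultdict(int)   # watts by (room, rack)
per_room = defaultdict(int)   # watts by room

for room, rack, device, watts in inventory:
    per_rack[(room, rack)] += watts
    per_room[room] += watts

for (room, rack), watts in sorted(per_rack.items()):
    print(f"{room} / {rack}: {watts} W")

# Flag racks that may need relocation or extra cooling attention
HOT_RACK_THRESHOLD_W = 1000   # assumed facility-specific threshold
hot_racks = [key for key, watts in per_rack.items() if watts > HOT_RACK_THRESHOLD_W]
```

Even a simple spreadsheet-style tally like this answers the first three questions on the list: where the power goes, how much each room and rack draws, and which racks deserve a closer look.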

Keep in mind that many power consumption improvements may be possible at little or no cost. Examples include adding motion detectors in rooms that are seldom used and replacing incandescent lighting with more efficient CFL fixtures.

The biggest return on efficiency is often in reviewing the air-conditioning system. This is where we’ll next turn our attention.

Air handlers, chillers, power distribution and backup power are the mechanical and electrical systems required to keep broadcasters operating 24/7, and together they typically account for more than one-half of a building’s electrical bill.

Cooling systems

In our last article, we looked at the basics of machine room cooling. Typically this involves a computer room air-conditioner or special ducting, often combined with a raised floor as the cool-air plenum. We’ll now look closer at potential improvements.

Many of today’s control rooms and machine rooms are more a design of chance than of plan. The TV station may have been expertly designed 15 years ago, but after many equipment and technology changes, there is less plan and a lot more chance. Let’s look at what typically happens.

Figure 1. Be sure to prevent airflow leakage by sealing areas where cables break a plenum. This usually occurs at the floor or ceiling. Courtesy The Green Grid.

Figure 1 above illustrates the basic cooling approach for a machine room. It’s called hot-aisle/cold-aisle cooling. This cooling example uses a raised floor as the room’s cool air plenum. The floor must be raised a minimum of 12in. The more racks, the more the floor must be raised. Data centers with hundreds of servers may require a 24in or higher raised floor. Broadcast spaces seldom have this many racks or concentrated heat generators, so 12in may be sufficient. Your HVAC consultant can help you make this decision. Unfortunately, some older buildings may not permit raising the machine room floors more than about 12in. New spaces won’t suffer this limitation.

Remember that thousands of cables will also occupy this sub-floor space, so the area is not just for cooling. In fact, without proper cable management, airflow can be severely disrupted, creating hot spots and the potential for failure.

Just a quick note about maintenance in these spaces. If you’re currently using a raised floor system as a cool air plenum, when was the last time you had it cleaned? Some facilities may want to consider calling out Mike Rowe from the Discovery Channel to help with this “Dirty Job.”

Figure 2. Note the racks are located front-to-front and rear-to-rear. This creates a hot-aisle/cold-aisle cooling system. The key to an efficient system is to keep the hot and cold air separated. Courtesy Liebert Corporation.

Figure 2 above shows the racks mounted rear-to-rear and front-to-front. Because most equipment draws in air from the front and exhausts it to the rear, cooling air is applied to the front of the racks. The hot zone occurs at the back of the racks, where the air is ducted back to the air-conditioner for cooling. Experienced broadcast engineers may find this configuration unusual. However, it works well for concentrations of modern solid-state and hard drive-based equipment. Remember, the hot-aisle/cold-aisle approach works only if every row of racks is configured in this manner. Also, avoid intermixing other equipment. Placing a UPS system between a pair of racks, or adding storage between racks, will complicate the planned airflow.

The bottom line is that the hot and cold air flows must be kept separated. Unfortunately, this is easier said than done in actual practice. Note again Figure 2. The perforated tiles indicated in front of the racks mark the cool side. Air is drawn into the equipment by the devices’ fans and exhausted through their rear or side panels. Overhead air handlers scoop up the hot air and return it to a chiller or other heat exchanger; the cooled air is then recycled back through the machine room’s floor vents.

Sounds easy enough. Unfortunately, few facilities execute the plan effectively. Here’s why.

Maximizing the efficiency of the air-conditioner requires that the hot and cold air be kept separated. Unfortunately, engineers often attempt to solve a small problem and end up creating a larger one. Here are some common mistakes and their solutions:

• Using a floor fan to move air toward a particular device running a bit hot merely reduces available cool air for other devices. Instead of adding a fan, consider moving the equipment to another location.

• Another mistake is removing rack filler panels so cool air can get in to a particular device. All rack fronts should be totally sealed with blank panels. Without them, the hot exhaust air will mix with the cool, front-of-rack air.

• Storing equipment or cable below the raised floor is another mistake. Practice good cable management. Don’t allow a large bundle of cables to block airflow at the top or bottom of the rack.

• Adding more raised-floor perforated tiles to increase spot cooling is wrong. The typical perforated floor tile is 25 percent open, but some tiles provide more than 50 percent openings. Intermixing these tiles can result in overheating of some devices and under-cooling of others. And, without careful examination, you might not even notice your mistake.

Figure 3. An example of a cable cutout grommet.

• Another mistake is failing to seal airflow openings at cable entrances and exits. Use brush grommets wherever cables pierce any airflow chamber. (See Figure 3 above.)

Remember that more air is not necessarily better. Keep in mind that you have a fixed amount of resource (air) to use. If more of the air is directed to cool rack three because it has an especially hot device, you have less volume of air for the rest of the racks. Balance the airflow so all areas that need cooling receive it. Here are some ways to achieve that.

Consider where the equipment is located, especially gear that runs hot or may be heat-sensitive. From a cooling standpoint, the best location is 24in to 40in above the floor, about the middle of the rack. Fill the higher and lower areas with less sensitive equipment.

Don’t put critical or heat-producing equipment at the top of a rack. While this sounds like a good idea because the device will be near the air-return plenum, don’t forget about the heat-generating equipment located below it. All the heat from those devices rises, making that one device harder to keep cool.

Focus on developing a balanced airflow. Putting too many cooling tiles in the front of some racks will reduce the available airflow to other racks. Recall that the cooled air is a finite resource. Spread it out through the machine room along the fronts of all racks.
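The finite-resource point can be made numerically. This sketch assumes a fixed air-conditioner output and treats each tile’s share of the airflow as proportional to its open area, a simplification of real underfloor pressure dynamics; the total CFM and tile mix are assumed values:

```python
# A rough numeric sketch of the finite-airflow problem. The CRAC delivers a
# fixed total volume of air; each perforated tile's share scales with its
# open area. All figures are assumed values for illustration.

TOTAL_CFM = 8000  # assumed fixed output of the computer-room air-conditioner

# (rack served, tile open-area fraction): three standard 25% tiles, plus one
# high-flow 50% tile dropped in to spot-cool rack 3
tiles = [("rack 1", 0.25), ("rack 2", 0.25), ("rack 3", 0.50), ("rack 4", 0.25)]

total_open = sum(frac for _, frac in tiles)
shares = {rack: TOTAL_CFM * frac / total_open for rack, frac in tiles}

for rack, cfm in shares.items():
    print(f"{rack}: {cfm:.0f} CFM")
# With four identical 25% tiles, each rack would receive 2,000 CFM. The
# high-flow tile pulls rack 3 up to 3,200 CFM, but every other rack drops
# to 1,600 CFM: the "fix" starves its neighbors.
```

The numbers show why swapping in a single high-flow tile is never a local change; it redistributes the entire room’s supply.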

Finally, simply setting the thermostat to a colder temperature is a recipe for increased maintenance and electrical costs. Trying to maintain a 60-degree cool area means the air-conditioner will continually cycle on and off, which creates at least two problems. First, the compressor will wear out sooner because of the increased on/off cycles. Second, the humidity won’t be properly controlled because the air-conditioner isn’t on long enough to remove the moisture. A properly designed air-conditioner will run continually. This relates back to a point made earlier in this series: don’t oversize the air-conditioning system. Repeated on/off cycles waste energy and shorten the life of the cooling equipment.

In the next column, we’ll examine developing clean power systems for broadcast and content-producing facilities.

Additional resources:

Guidelines for energy-efficient datacenters

The green data center

Fundamentals of data center power and cooling efficiency zones

EPA data