
Hardware Today: When the Weather Inside Is Frightful

By Drew Robb
Posted Dec 19, 2005


Despite all the advisories and plain old common sense about the importance of maintaining a steady temperature in the data center, the Uptime Institute had a nasty surprise when it measured cooling in 19 computer rooms. The institute concluded that most server rooms cannot properly handle their installed equipment loads.

Frosty the snowman may be frolicking outside, but many server rooms are a little too toasty.

Even more surprising, the racks in the study averaged only 2.1 kW of power consumption, far below the 12 kW to 14 kW of heat a fully populated rack of servers produces.

Researchers noted several other surprising findings: 10 percent of cooling units had failed without anyone noticing; air intake temperatures at 10 percent of the racks exceeded maximum reliability guidelines; and rooms were using 2.6 times more cooling capacity than they required, yet were still 10 percent filled with hot spots. Hot spots accounted for 25 percent of one room, which had 10 times more cooling than its heat load required. Further, according to Robert Sullivan, a staff scientist at the Uptime Institute, only 28 percent of the available supply of cold air was directly cooling equipment. The other 72 percent bypassed the computer gear completely.
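
One way to see how a room can be over-provisioned yet still riddled with hot spots is to combine the institute's figures: if only 28 percent of the supply air actually reaches equipment intakes, a room with 2.6 times the required cooling capacity is effectively delivering less cooling than the load demands. The short Python sketch below runs that back-of-envelope arithmetic; the percentages come from the findings above, but multiplying them together is an illustrative simplification of my own, not the institute's methodology.

```python
# Back-of-envelope check: installed cooling capacity vs. cooling actually
# delivered. The percentages come from the Uptime Institute findings quoted
# above; combining them this way is an illustrative simplification, not the
# institute's own methodology.

installed_capacity = 2.6      # rooms had 2.6x the cooling capacity they needed
delivered_fraction = 0.28     # only 28% of the cold air reached equipment

effective_cooling = installed_capacity * delivered_fraction
print(f"Effective cooling delivered: {effective_cooling:.2f}x the required load")
# -> about 0.73x: less cooling than the load demands, despite the apparent
#    overcapacity, which is consistent with rooms still showing hot spots.

# The same gap shows up at the rack level.
avg_rack_kw = 2.1             # average measured rack power draw in the study
full_rack_kw = (12, 14)       # heat produced by a fully populated rack
print(f"A full rack produces {full_rack_kw[0]}-{full_rack_kw[1]} kW, roughly "
      f"{full_rack_kw[0] / avg_rack_kw:.0f}-{full_rack_kw[1] / avg_rack_kw:.0f}x "
      f"the {avg_rack_kw} kW average measured")
```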

Clearly, then, cooling is not a core strength of server room administrators. To educate them, Uptime Institute issued some basic guidelines in a paper entitled, "How to Meet '24 by Forever' Demands of Your Data Center."

Similarly, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) released detailed instructions to help resolve heating and cooling issues. Published by ASHRAE Technical Committee 9.9, "Thermal Guidelines for Data Processing Environments" offers vendor-neutral advice and standards that can be applied to server rooms and data centers.

Some important points worth paying attention to:

Walking Down the Aisle

Arrange racks using a hot-aisle/cold-aisle configuration. This includes a 14-foot cold-aisle-to-cold-aisle separation when cabinets are 42 inches (or less) deep or a 16-foot separation for larger cabinets. See the documents referenced above for an in-depth rundown on spacing of servers, perforated tile placement, and more.
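
As a quick illustration of the spacing rule just described, here is a tiny Python helper (a hypothetical function of my own, not part of any standard) that maps cabinet depth to the recommended cold-aisle-to-cold-aisle pitch and its equivalent in standard 2-foot raised-floor tiles.

```python
def recommended_cold_aisle_pitch_ft(cabinet_depth_in: float) -> int:
    """Cold-aisle-to-cold-aisle separation in feet, per the guideline above.

    Cabinets up to 42 inches deep call for a 14-foot pitch; deeper cabinets
    call for 16 feet. (Hypothetical helper for illustration only.)
    """
    return 14 if cabinet_depth_in <= 42 else 16

for depth_in in (36, 42, 48):
    pitch_ft = recommended_cold_aisle_pitch_ft(depth_in)
    print(f"{depth_in}-inch-deep cabinets: {pitch_ft} ft pitch "
          f"({pitch_ft // 2} standard 2-ft floor tiles)")
```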

Minimize Bypass Air

Bypass airflow is a significant source of waste in most computer rooms: it is cold air that never directly cools any equipment, wafting around the room instead of being channeled where it is needed.

The Uptime Institute found that 58 percent of the cold air was lost through unsealed cabling holes in the raised floor, which let it circulate directly into the hot aisle. Another 14 percent of the available cold air escaped through incorrectly placed perforated tiles.
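
Those two losses account for essentially all of the bypass air Sullivan describes: 58 percent plus 14 percent is the 72 percent that never cools anything, leaving the 28 percent that does. The short sketch below tallies the figures and adds an illustrative what-if (my assumption, not a measured result) showing how much delivered air sealing the cable cutouts alone would recover.

```python
# Accounting for the bypass airflow figures cited above.
losses = {
    "unsealed cable cutouts in the raised floor": 0.58,
    "incorrectly placed perforated tiles": 0.14,
}

bypass = sum(losses.values())
delivered = 1.0 - bypass
print(f"Bypass airflow: {bypass:.0%}; delivered to equipment: {delivered:.0%}")

# Illustrative what-if: sealing the cable cutouts returns their 58% to the
# cold aisle, leaving only the perforated-tile losses as bypass.
delivered_if_sealed = 1.0 - losses["incorrectly placed perforated tiles"]
print(f"Delivered if cutouts were sealed: {delivered_if_sealed:.0%}")
```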

"Use blanking panels to prevent hot air from escaping through empty spaces in the rack and into the cold aisle, and vice versa," says Fred Stack, vice president of marketing at Liebert (a part of Emerson Network Power).

Consider Variable Capacity Cooling Units

ASHRAE has determined that the maximum cooling load occurs less than 5 percent of the time. Cooling systems should therefore be able to operate effectively at varying loads without the compressor cycling on and off, which shortens the life of an air conditioner. Fortunately, variable capacity cooling units are available. There are also systems that permit unit-to-unit communication to prevent machines from unknowingly competing with each other (e.g., one unit heating while another cools). This is even more critical in rooms with high-density loads, where hot spots are more prevalent.
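
To make the short-cycling point concrete, here is a toy Python simulation, built entirely on made-up constants for the room's thermal mass, setpoint, and deadband, that compares a fixed-capacity unit cycling its compressor around a setpoint at part load with a variable capacity unit that simply modulates its output to match the load.

```python
# Toy comparison of fixed vs. variable capacity cooling at part load.
# Every constant here (thermal mass, setpoint, deadband, loads) is a
# made-up illustrative value, not a vendor or ASHRAE figure.

HEAT_LOAD_KW = 5.0        # IT heat load: a part load for the unit below
UNIT_CAPACITY_KW = 20.0   # rated cooling capacity of the unit
SETPOINT_C = 22.0
DEADBAND_C = 1.0          # thermostat hysteresis for the fixed unit
ROOM_KJ_PER_C = 5000.0    # lumped thermal mass of the room
STEP_MIN = 1.0            # simulation time step, minutes
SIM_MIN = 8 * 60          # simulate an eight-hour shift

def simulate(variable_capacity: bool) -> tuple[int, float]:
    temp_c = SETPOINT_C
    compressor_on = False
    compressor_starts = 0
    for _ in range(int(SIM_MIN / STEP_MIN)):
        if variable_capacity:
            # Modulate output toward the load; no on/off cycling needed.
            cooling_kw = min(UNIT_CAPACITY_KW,
                             max(0.0, HEAT_LOAD_KW + (temp_c - SETPOINT_C)))
        else:
            # Fixed capacity: the compressor cycles around the deadband.
            if not compressor_on and temp_c > SETPOINT_C + DEADBAND_C:
                compressor_on = True
                compressor_starts += 1
            elif compressor_on and temp_c < SETPOINT_C - DEADBAND_C:
                compressor_on = False
            cooling_kw = UNIT_CAPACITY_KW if compressor_on else 0.0
        # Net heat changes the room temperature through its thermal mass.
        temp_c += (HEAT_LOAD_KW - cooling_kw) * STEP_MIN * 60.0 / ROOM_KJ_PER_C
    return compressor_starts, temp_c

for variable in (False, True):
    starts, final_temp = simulate(variable)
    label = "variable" if variable else "fixed"
    print(f"{label:>8} capacity: {starts:2d} compressor starts in 8 h, "
          f"final temp {final_temp:.1f} C")
```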

Think of the Future

A server might be deployed for only a couple of years, but the server room holding it is built to last decades. So don't design the cooling system by thinking only of today. Five years ago, for example, the average rack consumed about 1 kW. Today, a rack with six IBM BladeCenter servers has a load of 24 kW. And who knows what the cooling demands will be five years from now, when multicore chips are commonplace and rack density is likely to be compressed even further?
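
For a sense of what a 24 kW rack means for the cooling plant, the sketch below applies the common sensible-heat approximation for air at standard density (q in BTU/hr equals 1.08 times CFM times the temperature rise in degrees Fahrenheit) to the rack loads mentioned above; the 20-degree rise across the rack is my own illustrative assumption, not a figure from the article.

```python
# Supply airflow needed to carry away rack heat, using the standard
# sensible-heat approximation for air at sea-level density:
#   q [BTU/hr] = 1.08 * CFM * delta_T [F],  with 1 W = 3.412 BTU/hr
# The 20 F temperature rise across the rack is an illustrative assumption.

BTU_PER_WATT_HR = 3.412
SENSIBLE_HEAT_FACTOR = 1.08   # BTU/hr per CFM per degree F

def required_cfm(heat_watts: float, delta_t_f: float = 20.0) -> float:
    """Cubic feet per minute of cold air needed to absorb heat_watts."""
    return heat_watts * BTU_PER_WATT_HR / (SENSIBLE_HEAT_FACTOR * delta_t_f)

for label, kw in [("1 kW rack of five years ago", 1.0),
                  ("2.1 kW average rack in the Uptime study", 2.1),
                  ("24 kW blade rack cited above", 24.0)]:
    print(f"{label}: {required_cfm(kw * 1000):6.0f} CFM")
```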

"Tomorrow's servers will vastly exceed the heat emissions of servers manufactured today," says Stack. "It's important that any cooling solution you apply today is scalable and allows for more capacity to be added."

Racks Need Special Attention

Racks obviously require close attention where cooling is concerned. There is so much packed into a tiny space that hot spots are hard to avoid. Fortunately, vendors are now paying far more attention to rack cooling. Understand, however, that regardless of rack design, the racks are in big trouble if the cooling system isn't functioning properly.

"Racks have been both a problem and a blessing — the first racks created massive heat problems, as they had power at the bottom, storage next, and CPUs at the top," says Clive Longbottom, an analyst with U.K.-based Quocirca.

According to the Uptime Institute, the tops of racks typically don't receive nearly enough cool air. As hot air rises, the bottom of the rack tends to get too much cold air, while the top ends up short. This explains why equipment in the top third of a rack tends to fail twice as often as gear lower down.

Add Supplemental Cooling

The solution to hot spots and top-of-the-rack failures is supplemental cooling. As it is situated close to the load, supplemental cooling consumes 17 percent less energy than do traditional sources. It complements raised floor systems, which cool from below.

The Liebert XD Piping System, for example, delivers cooling to the edge of the rack from above. It can draw hot air directly from the enclosure and cool it before discharging it into the room. Alternatively, it can draw discharged hot-spot air from the hot aisle through the cooling coil and distribute it into the cold aisle. Such units can be added quickly to resolve a specific issue and redeployed elsewhere later as heat patterns change or new racks are added.

Don't Let Green Give You the Blues

The refrigerants traditionally used in cooling systems have come under fire in recent years as environmental hazards. As a result, green refrigerants have come on the market. Detracting from their appeal, however, these refrigerants have earned a reputation for lowering cooling efficiency by up to 15 percent.

"The day is coming when green refrigerants must be employed in the field" said Ron Spangler, product manager at Liebert. "Many organizations want to standardize on green refrigerants today or they want a simple way to retrofit in the field with minimal performance impact."

To resolve this issue, Liebert just announced four new models of the Liebert DS system with a new condenser that eliminates the performance penalties associated with switching to green refrigerants. The Liebert DS is a cooling system used to control temperature, air quality, and humidity in controlled environments, such as data centers and computer rooms. It is available in 53 kW, 70 kW, 77 kW, and 105 kW downflow units.

More Than an Afterthought

Cooling is little more than an afterthought in many server rooms, and IT departments often take the lazy approach, relying on inadequate metrics, such as overall room temperature or the presence of a few perforated floor tiles, to reassure them. The result is soaring electricity bills and hot spots all over the room. But with CPU density climbing every year, server room cooling merits more attention.

"Servers are being packed closer and closer together these days," says Longbottom. "The old approach of just trying to keep the temperature in the room at a steady level is no longer valid."
