– Wally Phelps, chief technologist and airflow evangelist for the AdaptivCool division of DegreeC (www.adaptivcool.com), says:

Advanced web applications and the exploding use of smart mobile devices drive an insatiable need for centralized computing and database power. To meet this demand in the data center, power and cooling must stay in lockstep. Adding more IT load without adequate cooling to support the added heat load can lead to disastrous results.

To correctly assess the potential for improving capacity, or to determine whether new cooling units are a must, a few simple measurements can be taken. The first is IT KW divided by cooling tons. The second is DeltaT (warm-air AC return temperature minus cold-air AC supply temperature). The third is typical and worst-case rack intake temperatures.
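The three measurements can be sketched in a few lines of code. This is a minimal illustration with assumed readings; the variable names and sample values are hypothetical, not from the source:

```python
# Hypothetical site readings; all names and values are illustrative assumptions.
it_load_kw = 300.0    # total IT load, e.g. from PDU/UPS metering
cooling_tons = 120.0  # capacity of the running cooling units

return_temps_f = [72.0, 74.0, 71.0]  # warm-air return temperature per AC unit
supply_temps_f = [58.0, 59.0, 57.0]  # cold-air supply temperature per AC unit

# Measure 1: IT KW per cooling ton
kw_per_ton = it_load_kw / cooling_tons

# Measure 2: average DeltaT across all cooling units (return minus supply)
delta_ts = [r - s for r, s in zip(return_temps_f, supply_temps_f)]
avg_delta_t = sum(delta_ts) / len(delta_ts)

# Measure 3: typical (median) and worst-case rack intake temperatures
rack_intakes_f = [73.5, 75.0, 76.2, 83.1, 74.8]
typical_intake = sorted(rack_intakes_f)[len(rack_intakes_f) // 2]
worst_intake = max(rack_intakes_f)

print(f"IT KW / ton: {kw_per_ton:.2f}")
print(f"Average DeltaT: {avg_delta_t:.1f} F")
print(f"Typical intake: {typical_intake:.1f} F, worst: {worst_intake:.1f} F")
```

With these sample readings the site lands at 2.5 KW per ton with a 14 °F average DeltaT, and one rack intake above 82 °F flags a hot spot worth investigating.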

For typical cooling systems, the results of the above readings will fall into several categories:

1. High Efficiency / Over Capacity
IT KW / Cooling Ton >3.0
Average DeltaT (of all cooling units) 14-18°F
Rack inlet temperature – most racks at 72-78°F, a few racks > 82°F

2. High Efficiency / Maximum Capacity
IT KW / Cooling Ton 2.5 – 3.0
Average DeltaT (of all cooling units) 10-13°F
Rack inlet temperature – most racks at 70-78°F, a few racks > 82°F

3. Medium Efficiency / Medium Capacity
IT KW / Cooling Ton 2.0 – 2.5
Average DeltaT (of all cooling units) 7-9°F
Rack inlet temperature – most racks at 72-78°F, a few racks > 82°F

4. Low Efficiency / Low Capacity
IT KW / Cooling Ton <2.0
Average DeltaT (of all cooling units) <7°F
Rack inlet temperature – most racks at 70-75°F, a few racks > 82°F

Category 1 indicates that more cooling units may be required to support more IT load. Category 2 shows that the cooling system can support more IT load, but cooling redundancy could become an issue if not addressed. Data centers that fall into categories 3 and 4 are candidates for improved efficiency through better airflow and cooling distribution. Typical results at these types of sites show increases of 20-40% in cooling capacity while addressing any existing thermal or redundancy issues through dynamic airflow technology.
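To put the 20-40% figure in concrete terms, here is the arithmetic for a hypothetical category-3 site (the numbers are assumed for illustration, not drawn from the source):

```python
# Assumed example: a category-3 site with 100 tons of installed cooling
# running at 2.0 IT KW per ton supports 200 KW of IT load.
installed_tons = 100.0
kw_per_ton_before = 2.0
supported_kw_before = installed_tons * kw_per_ton_before

# A 30% improvement in effective cooling capacity from better airflow
# distribution (the middle of the quoted 20-40% range) raises the
# supportable IT load without adding any cooling units.
improvement = 0.30
supported_kw_after = supported_kw_before * (1 + improvement)

print(f"Before: {supported_kw_before:.0f} KW, after: {supported_kw_after:.0f} KW")
```

In this sketch the same 100 tons of cooling goes from supporting 200 KW to 260 KW of IT load, which is why airflow improvements are often cheaper than new cooling units.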

The biggest challenge to keeping up with data center cooling is matching cooling distribution to newer high-density IT equipment, which can dissipate 10-20 KW or more in the same space that previous low-density racks occupied. The biggest challenge to keeping up with data center power is adequate planning and budgeting when significant power increases require additional utility feeds, transfer switches, and UPS capacity. Power equipment is expensive relative to cooling, and utility feeds can have long lead times.