By consolidating data centers, relocating them strategically, and investing in green technologies, companies can fight rising energy costs.

Robert Offley, CEO of CentriLogic, says:

Lately, there’s been some renewed discussion on the Web concerning data center waste and costs. By some estimates, data centers are consuming around 2% of all electricity used nationwide. While data centers are still perceived as being inefficient, consuming billions of kilowatt hours every year, the issue is not about corporate waste. Most IT departments are doing everything possible to save money by leveraging various strategies and technologies such as IT outsourcing, virtualization and the cloud to reduce infrastructure spending and utility consumption. The “green data center” movement has introduced several new practices, such as free air cooling, to cut back – sometimes by more than 50% – on energy use.

Data centers are considered “energy hogs” because our digital world is exploding. The impact of the social web, cloud computing and the ever-growing move toward automating manual processes in even the smallest of companies means terabytes upon terabytes of data for servers to analyze and store. Every day, we rely on data centers to power the apps we require for work, to manage our personal business and to consume information and entertainment.
Fortunately, there’s at least a partial fix to the data explosion as it relates to energy use. From a facilities standpoint, consolidation makes ample sense. Ideally, the IT industry needs to focus on locating equipment within purpose-built data centers to achieve the greatest efficiencies in infrastructure and utilities usage. Across North America there are currently thousands of small, retrofitted data centers that are 10 or 20 years old or older; these facilities aren’t being used to full or even half capacity. By outsourcing equipment to purpose-built facilities through services like co-location, virtualization and cloud computing, and replacing hardware with newer, energy-efficient equipment, we’ve got a start toward a greener global computing infrastructure.

Data center operators work hard to reduce power usage effectiveness (PUE), the core measure of data center energy efficiency, in these aging centers. PUE is the ratio of the total energy a facility draws from the utility to the energy actually delivered to its IT equipment. It’s not uncommon for a smaller or older data center to have a PUE of between 2.0 and 3.0, meaning that it consumes two or three kilowatts for every kilowatt that reaches the servers. The PUE target for new data centers is 1.2, indicating a nearly one-to-one relationship between the power purchased and the power delivered to the computing equipment. Data centers with lower PUE ratios are able to pass the associated cost savings on to end users.
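To make that arithmetic concrete, here is a minimal sketch of the PUE calculation in Python; the facility figures are hypothetical assumptions for illustration, not measurements from any data center mentioned in this article.

    # PUE sketch with illustrative, made-up figures.
    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power usage effectiveness: total energy purchased divided by energy delivered to IT gear."""
        return total_facility_kwh / it_equipment_kwh

    # An older retrofitted site: 2,500,000 kWh purchased to deliver 1,000,000 kWh to the servers.
    legacy_pue = pue(2_500_000, 1_000_000)    # 2.5
    # A purpose-built site hitting the modern target for the same IT load.
    modern_pue = pue(1_200_000, 1_000_000)    # 1.2

    overhead_avoided_kwh = 2_500_000 - 1_200_000   # cooling and power-conversion losses eliminated
    print(f"Legacy PUE {legacy_pue:.1f}, modern PUE {modern_pue:.1f}, "
          f"overhead energy avoided: {overhead_avoided_kwh:,} kWh per year")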

New data centers are being designed from the ground up with energy efficiency in mind, and the first step is to find the cheapest source of reliable power. Semi-rural areas with low-cost power, such as parts of North Carolina and Quincy, Washington, are drawing the likes of big tech companies including Google, Microsoft, and Apple. Our company has also strategically chosen to open two purpose-built data centers in Western New York: one in Buffalo and another in Rochester. Western New York is an ideal location for data centers because of its immediate proximity to adequate power and pre-existing copper and fiber network infrastructure. The area’s attractiveness as a location for data centers was further validated in 2010 when Yahoo! opened a 155,000 square-foot data center in Lockport, New York.

Fortunately, the focus on IT outsourcing and the increased adoption of hybrid hosting and cloud computing infrastructures have made it both acceptable and viable for companies to locate their data centers farther from major metropolitan areas. While companies used to require that the data center be housed near downtown areas and company headquarters, Hurricane Sandy is just one example of why that thinking is faulty. When you add the risk of political and terrorist attacks to natural disasters that can bring an urban area to its knees, there’s an even stronger case for relocating data centers away from large cities. Deciding factors in choosing a location include the savings the local utilities will provide, the quality of the high-speed fiber connections and the talent pool in those areas.

Beyond efficient power and risk management, operators should also consider reliable access to renewable energy when designing facilities. Our company is looking into the US West Coast as a future site because of the relatively plentiful access to geothermal and wind energy. If we can use renewables as the primary power source and the grid as the backup, we know we will be able to cut 50% or more from our energy bills, savings that we can pass along to our customers.

Finally, a critical piece of the greener data center movement is making optimal use of new cooling practices and energy-efficient hardware. Free air cooling, an integrated heat-exchange system that uses outside air to cool the inside of the data center, means that a facility can meet up to 90% of its cooling needs without running energy-hungry mechanical chillers. There is also hot aisle containment, which improves the airflow in a data center by isolating hot exhaust air and preventing it from recirculating throughout the room.
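As a back-of-the-envelope illustration of the free air cooling idea, the sketch below estimates how much of the year a site could rely on outside air; the temperature threshold and the randomly generated climate data are assumptions for illustration only, not figures from any real facility.

    # Hypothetical estimate of annual free-cooling availability.
    import random

    FREE_COOLING_MAX_C = 18.0   # assume outside air at or below 18 C can cool the room

    # Stand-in for a real hourly outdoor temperature record (8,760 hours in a year).
    random.seed(42)
    hourly_temps_c = [random.gauss(9.0, 10.0) for _ in range(8760)]

    free_hours = sum(1 for t in hourly_temps_c if t <= FREE_COOLING_MAX_C)
    print(f"Estimated free-cooling availability: {free_hours / 8760:.0%} of the year")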

Modern data centers use purpose-built equipment, including servers, UPS systems, air handling and cooling units, and backup generators, to maximize efficiencies throughout all areas of operations. Instead of retrofitting old hardware, companies have the opportunity to purchase and provision hardware that has been designed specifically for data centers, machines that are getting faster and “cleaner” every year. Of course, virtualization is now table stakes in enterprise IT; data centers are buying a fifth as many physical servers as they did five or 10 years ago, or fewer.

Building the modern-day data center will require capital investment, of course. Yet considering that the average life of a data center is 20 years, the payback will come sooner rather than later. Let’s say that a traditional data center is spending $100,000 per month on its utility bill. If it could cut those costs in half through some of the strategies mentioned in this article, it would save $600,000 in just one year! Over time, those numbers add up to many positive benefits, and not just from a financial perspective.
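Carrying that example through to a payback period, here is a minimal sketch; the $100,000 monthly bill and the 50% reduction come from the paragraph above, while the capital cost figure is a hypothetical assumption rather than a number cited in this article.

    # Simple payback sketch. The bill and the 50% cut come from the example above;
    # the capital cost of the efficiency upgrades is an assumed figure.
    monthly_utility_bill = 100_000      # dollars, before the efficiency work
    reduction = 0.50                    # fraction of the bill eliminated
    capital_investment = 3_000_000      # assumed upgrade cost, for illustration

    annual_savings = monthly_utility_bill * reduction * 12    # $600,000 per year
    payback_years = capital_investment / annual_savings

    print(f"Annual savings: ${annual_savings:,.0f}")
    print(f"Payback period: {payback_years:.1f} years, against a roughly 20-year facility life")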

Corporate America is not known for having a preponderance of tree huggers, but reducing energy use also means reducing pollution and preserving the beauty of open space and wilderness areas for this and succeeding generations. That’s a future that is worth investing in today – and modern data center strategies will help get us there.

Robert Offley is CEO of CentriLogic