Zahl Limbuwala, CEO of Romonet, says:

We all agree that cutting energy costs and cutting carbon are both vital in the datacenter – but the way that datacenter user costs are measured and portioned out needs to radically change if we want to fully realize the benefits of the latest energy efficient technologies.

For me, the usual way that costs are divided out is badly flawed – and this is something that occurs in businesses across the world. It fails the user who might want to reap the returns of buying the latest energy efficient IT and it fails the industry as a whole because it discourages people from using IT in a more innovative, responsible and energy efficient manner. In turn, this means that datacenters (and the IT equipment within them) are often far more wasteful and carbon intensive than they should otherwise be – and that’s no good for any of us.

So what’s going wrong? Imagine you’re a manager running an enterprise datacenter for a large bank. Within your datacenter, you are servicing multiple internal clients – from the bank trading floor to customer services. Each client ‘buys’ services from you and you have to decide how much to charge them in return – and this is where the problems start.

In the past, this ‘charge-back’, as it’s called, has been relatively straightforward. Finance will typically use physical space as a proxy for how much datacenter capacity a client uses. If the metering is granular enough, they may then look at the electricity meters to see how much energy each client has used, and multiply this figure by a single averaged multiplier (normally Power Usage Effectiveness, or PUE) to account for the shared overheads – which can be very large. It’s a model similar to the one used to figure out costs in a serviced office. In a shared office space, a tenant pays for how many square feet they use. Then they pay a variable fee for the utilities consumed and a portion of the fixed costs of the common areas shared with other tenants – the phone lines, kitchen area, cleaning services, etc.
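The conventional model described above can be sketched in a few lines. This is a minimal illustration only – the function name, tariff, PUE and cost figures are all assumptions, not real rates:

```python
# Sketch of the conventional charge-back model: floor space acts as a
# proxy for the shared fixed costs, while metered energy is scaled by a
# single averaged PUE to cover cooling and other shared overheads.
# All numbers are illustrative assumptions.

def conventional_chargeback(space_sqft, metered_kwh, pue, tariff_per_kwh,
                            total_space_sqft, shared_fixed_costs):
    space_share = space_sqft / total_space_sqft        # serviced-office style
    energy_cost = metered_kwh * pue * tariff_per_kwh   # metered use x averaged PUE
    fixed_cost = space_share * shared_fixed_costs      # overheads split by space
    return energy_cost + fixed_cost

# e.g. 500 sq ft of a 5,000 sq ft hall, 10,000 kWh metered,
# site PUE of 1.8, $0.10/kWh, $50,000 of shared fixed costs:
bill = conventional_chargeback(500, 10_000, 1.8, 0.10, 5_000, 50_000)
print(bill)  # 1,800 of energy + 5,000 of fixed overhead
```

Note that the PUE multiplier and the space-based split are both site-wide averages – neither reflects how any individual client actually uses the shared plant, which is exactly the weakness discussed below.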

Up until fairly recently, that’s worked fine because everyone has been buying roughly the same kind of IT equipment and using it in the same kinds of way. But over the past few years, things have significantly changed in the datacenter because the technology’s advanced.

Modern virtualization technology can vastly increase the utilization and therefore energy efficiency of datacenter servers and today, a single piece of IT kit can perform the job of multiple machines. This means that physical space is no longer an adequate indication of how much of the shared fixed costs are being consumed by any one particular client within a datacenter – costs can be vastly different depending on the technology you use and how you are using it. This also means that how many – or few – boxes a particular datacenter client buys is no real indication of how energy efficient – or otherwise – their behavior is.
Unfortunately, whilst the technology has certainly advanced, the method a datacenter manager uses to measure and then divide up datacenter costs has remained unchanged.

To explain this point further, let’s return to our fictitious bank datacenter. Let’s say there are two internal business units within the bank using the same datacenter. One of these clients is happy to invest more in modern compute servers and software virtualization technology – and therefore will be more energy efficient – and they’re happy to pay a premium in the upfront capital costs. Let’s also say that there’s another department that doesn’t care so much about their energy efficiency or the overall energy bill. They’re just interested in buying the cheapest technology they can and if they need to buy more capacity – they’ll just buy more boxes later down the line. Now, these two departments might take up roughly the same physical space – so they’d be charged similar amounts on that level. Their electricity costs would be divided out depending on how much each uses – so far, so fair. But what about the share of those fixed costs? Here’s where the unfairness arises.

Whilst the department using the more energy efficient kit will have a lower overall utilization of that overhead (because they are achieving more with much less) they won’t be given the benefit of that lower impact because fixed costs are shared out equally. The more energy efficient department won’t be rewarded for their low carbon behavior and conversely, the energy hungry department won’t be penalized for underutilizing their IT and wasting expensive shared resources.
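Putting illustrative numbers on the two departments makes the unfairness concrete. These figures are assumptions chosen only to show the shape of the problem, assuming both departments deliver the same workload from the same floor space:

```python
# Two departments, same floor space, same workload delivered.
# Under the conventional model the shared fixed costs are split
# equally (by space), regardless of how efficiently each department
# uses the capacity. All numbers are illustrative assumptions.

shared_fixed_costs = 100_000   # annual overhead for the shared plant
tariff, pue = 0.10, 1.8

departments = {
    "efficient": 40_000,       # kWh: virtualized, highly utilized servers
    "wasteful": 120_000,       # kWh: cheap, underutilized boxes
}

for name, kwh in departments.items():
    energy_cost = kwh * pue * tariff
    fixed_share = shared_fixed_costs / 2   # equal split: the flaw
    print(f"{name}: energy {energy_cost:,.0f}, fixed share {fixed_share:,.0f}")
```

The efficient department draws a third of the energy, yet both pay an identical share of the overhead – so most of the reward for buying better kit never reaches the people who paid for it.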

If a department can’t see a cost benefit in choosing energy efficient kit, they’re unlikely to keep choosing it in the long term. If a department doesn’t think it will be penalized for carbon hungry activity, then they’re unlikely to change either. If we want to influence buyers to make more cost and energy efficient buying decisions, we must get the charge-back methodology right to incentivize the right behavior.

So, what’s the answer? In order to present any sort of meaningful or fair charging mechanism and to get the market moving towards producing more energy efficient equipment, we need to be able to differentiate between a client’s fixed costs (building the capacity) and their variable costs (using the capacity). Those clients with better and more highly utilized efficient equipment must pay less of the bill for the overheads because, proportionally, they are actually using less. We must fix the system to ensure that whether you are running an enterprise datacenter or a co-location center, costs are allocated proportionally and fairly. Only then will we see businesses understand the real cost benefits of using the latest energy efficient technologies – and only then will we see longer term behavior start to change.
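One way the proportional model argued for here could look in practice – a sketch only, assuming metered energy is used as the measure of each client's actual consumption of the shared plant:

```python
# Hedged sketch of a proportional fixed-cost allocation: the shared
# overheads are divided by each client's share of actual resource
# consumption (here, metered energy) rather than by floor space.
# Figures reuse the illustrative two-department example above.

def proportional_fixed_share(client_kwh, all_clients_kwh, shared_fixed_costs):
    return shared_fixed_costs * client_kwh / sum(all_clients_kwh)

usage = [40_000, 120_000]   # efficient vs wasteful department, in kWh
for kwh in usage:
    share = proportional_fixed_share(kwh, usage, 100_000)
    print(f"{kwh:,} kWh -> {share:,.0f} of fixed costs")
```

Under this split the efficient department pays a quarter of the overhead instead of half, so the premium it paid for better kit shows up directly on its bill – the incentive the conventional space-based model fails to deliver.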