Originally posted on Data Center Frontier by Bill Kleyman

Cost of Downtime is a popular number and a useful metric to have when making the case for additional resources. (Photo courtesy of Emerson Network Power)

Today’s business is tightly coupled with the capabilities of its data center. We’re seeing more users, more virtual workloads, and far more use cases demanding greater density and compute. There’s no slowdown in data growth in sight, and data center requirements will only continue to grow. Through it all, organizations have been working hard to improve data center economics with better underlying ecosystem technologies.

When working with data center technologies, there are numerous considerations around the business, the workloads, and the users. Data center administrators must use the right tools to help them deliver a truly optimal data center experience. Professionals in the field constantly struggle with core management and control challenges like:

  • Which DCIM tools are the most critical?
  • How do I control and optimize my PUE? (See the sketch after this list.)
  • What should my average rack density be?
  • What happens during an outage, and what does it cost? (Also sketched below.)
  • How can I speed deployment?
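
Two of those questions, PUE and the cost of an outage, lend themselves to quick back-of-the-envelope arithmetic. The sketch below is illustrative only and is not from the original article; the function names and every input figure are hypothetical placeholders you would replace with your own facility's numbers.

```python
# Illustrative sketch only -- all figures below are hypothetical placeholders.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.
    A value of 1.0 would mean every watt goes to IT equipment."""
    return total_facility_kw / it_equipment_kw

def downtime_cost(revenue_per_hour: float,
                  productivity_loss_per_hour: float,
                  recovery_cost: float,
                  outage_hours: float) -> float:
    """Simple cost-of-downtime estimate: lost revenue plus lost productivity
    for the duration of the outage, plus one-time recovery costs."""
    return (revenue_per_hour + productivity_loss_per_hour) * outage_hours + recovery_cost

if __name__ == "__main__":
    # Hypothetical facility drawing 1,500 kW at the meter, 1,000 kW at the racks.
    print(f"PUE: {pue(1500, 1000):.2f}")  # -> 1.50

    # Hypothetical 2-hour outage: $50k/hr lost revenue, $10k/hr lost
    # productivity, and $25k in one-time recovery costs.
    print(f"Downtime cost: ${downtime_cost(50_000, 10_000, 25_000, 2):,.0f}")  # -> $145,000
```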

To read the full article, visit the Data Center Frontier website.