By Tom Cook, CEO, Permabit

Change is hard to explain, especially when you are in the middle of it.  Take the massive shift to cloud computing occurring today.  For quarters on end, major infrastructure vendors missed their numbers and claimed they were merely experiencing “temporary slowdowns” or “regionally soft demand.”  Rather than challenge these claims, financial and industry analysts fell in line, suggesting turnarounds were “right around the corner.”  But the turnarounds didn’t happen.  While the infrastructure giants were sailing directly into the wind, Amazon quietly expanded Amazon Web Services (AWS), Microsoft reinvented itself, VMware repositioned, and open source heavyweights such as Red Hat realigned products and service offerings around hybrid cloud business models.  The hybrid cloud exploded right before our eyes, and with it has come the Age of IT Efficiency.

IDC reports 82% of enterprises have a hybrid cloud strategy and are actively moving workloads to the cloud.  How did the shift to cloud happen so fast?  Simple.  Just follow the money.

Jeff Bezos of Amazon has said, “Your margin is my opportunity.”  As the shift to cloud took hold, an entire ecosystem (not just Amazon) aligned to support it. Open, software-defined infrastructure vendors like Red Hat, SuSE and Canonical, white box hardware vendors such as Foxconn and Quanta, public cloud providers like Amazon, Microsoft and Google, and services companies like Mirantis emerged to provide low-cost, highly efficient compute, network and storage solutions. As a result, hybrid cloud demand surged.  In fact, IDC estimates the hybrid cloud will account for 80% of IT spend by 2020. And owing to higher utilization rates, lower pricing, and greater density, hybrid cloud solutions cost a fraction of proprietary hardware products.

Hybrid cloud, or more specifically open hybrid cloud, is on its way to becoming the leading enterprise IT architecture, ushering in the Age of IT Efficiency.

Hybrid Cloud Changes Everything

The flight to IT efficiency started just 10 years ago with Amazon Web Services. Today, large public cloud services deliver extreme IT agility at significantly lower cost when compared to yesterday’s data centers.  Because of this flexibility and efficiency, many smaller organizations have moved entirely to the cloud to meet their day-to-day IT business needs.

While larger organizations use the public cloud for some projects, they face challenges around data proximity, security, and long-term project/investment lifecycles that make on-premises (and/or privately hosted) data center infrastructure the correct fit for other applications.

As Fortune 1000 companies explore hybrid cloud, they discover that they can substantially lower their costs if they simply “do it like Amazon.” Like Amazon, they can build hybrid cloud data centers that derive efficiency from four key technologies: Virtualization, Open Source Software, White Box Hardware and Data Reduction.

Virtualization drives up utilization rates by supporting more workloads per server.  The net effect is that cloud IT organizations are able to increase the density of their data centers, saving substantial costs in real estate while gaining a tremendous amount of operational elasticity. The same hardware used at 2 p.m. for one workload can be repurposed at 3 p.m. for another.

Open source software, and open collaboration via open source frameworks, has established a huge ecosystem of developers (spanning industries and academia) driving innovation in the massive scale-out infrastructure of the cloud data center. These projects focus on scalability, reliability, security and manageability. OpenStack and Linux itself are two great examples of open source projects that contribute tremendously to cloud implementations.

The availability of commoditized “white box” hardware facilitates the cloud revolution.  In the past, traditional IT environments required “branded hardware” to ensure IT had the reliability and performance it needed to operate. Today, as industry analyst Howard Marks of DeepStorage.net notes, “If you care about whose equipment is in your cloud… you’re doin’ it wrong!” Advancements in both commodity hardware components and software have enabled cloud IT organizations to use lower-cost white box hardware in the largest data centers on the planet. And every year the cost of those components drops as competitive market forces and technical efficiency gains drive better economics. This has enabled cloud data centers to build extremely cost-effective and efficient IT infrastructures.

The final frontier of the hyper-efficient data center is data reduction.  These data centers combine fast direct-attached storage with modern object-based cloud storage (low-cost, bulk data storage). Software-defined hybrid cloud deployments benefit substantially from data reduction that combines inline deduplication, compression, and fine-grained thin provisioning to increase data center density and dramatically decrease compute and storage costs in the hybrid cloud. The net result of increased density is that more data is stored in less space and consumes fewer resources, reducing total costs.
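To make the mechanics concrete, here is a minimal sketch of how inline deduplication and compression interact, assuming fixed 4 KB blocks and SHA-256 fingerprints. It is illustrative only, not a description of any particular product’s implementation; real systems vary in block size, fingerprinting and metadata handling.

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # assumed fixed block size; real products vary


class ReducingStore:
    """Toy inline data-reduction store: dedupe identical blocks, compress unique ones."""

    def __init__(self):
        self.blocks = {}   # fingerprint -> compressed unique block (physical storage)
        self.volume = []   # logical volume: ordered list of block fingerprints

    def write(self, data: bytes):
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            fp = hashlib.sha256(block).hexdigest()
            if fp not in self.blocks:                    # dedupe: store each unique block once
                self.blocks[fp] = zlib.compress(block)   # compress before landing on media
            self.volume.append(fp)

    def physical_bytes(self):
        return sum(len(c) for c in self.blocks.values())

    def logical_bytes(self):
        return len(self.volume) * BLOCK_SIZE


store = ReducingStore()
store.write(b"A" * BLOCK_SIZE * 8)   # highly redundant data collapses to one stored block
store.write(b"B" * BLOCK_SIZE * 8)
ratio = store.logical_bytes() / max(store.physical_bytes(), 1)
print(f"logical {store.logical_bytes()} B, physical {store.physical_bytes()} B, ~{ratio:.0f}:1")
```

The point of the sketch is simply that the logical capacity presented to applications grows much faster than the physical capacity consumed, which is exactly where the density gains described above come from.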

From the storage perspective, hybrid cloud data centers are moving to multi-petabyte scale. At that level, savings isn’t just about spending less on HDD or flash.  Instead, the big savings from data reduction come from increased data density. With data reduction, optimizing the density of an existing data center is simple, fast and far more compelling than bearing the cost of new data center space.  This density increase also dramatically cuts the cost of power, cooling and administration. Once the infrastructure is optimized for hybrid clouds with virtualization, open source operating software and white box servers, the next step in efficiency is modular data reduction!

The Data Center Density Challenge

High density data centers are part and parcel of the new hybrid cloud infrastructure landscape.  Data Center Journal’s What Does High Density Mean Today? points out the challenges of high density data centers, including questions about power and cost.  Gartner predicted that by 2015, 50% of data centers would have high density zones enabled by high density storage arrays and servers. Are we already there?

Data Center Journal’s Is Cloud Computing Changing Data Centers? describes the economic drivers behind the data density issue, discussing IT infrastructure budget limitations, the variable cost of today’s capital expenditures for data storage, and business agility needs. It also highlights the power and cooling challenges as data centers continually expand to meet data storage needs that are beginning to reach critical mass.

Recent research from Peerless Business Intelligence highlights the importance of data reduction to high density data centers.  In Cloud Data Center Economics: Efficiency Matters, Lynn Renshaw discusses the need to “rise above” hardware and the physics of space, power and cooling, and to look at the bigger picture of data center costs on a square-foot basis.  While data centers are reaching power and cooling density limits, the cost per square foot of building data centers continues to increase and is becoming prohibitive for most businesses.  Today, there are over a billion square feet of data center space. As we continue to store more information, consume more storage and processor cycles, and use more power, there are limited physical options available to increase data center density.

Renshaw takes us to the obvious next step: leveraging software, specifically data reduction software, to store more data in less space and thereby reduce the square footage demand.  As her “back of the napkin” sample calculation demonstrates, cloud data centers can realize substantial space savings by leveraging data reduction software. Her example shows how a 100,000 square foot facility can save over $74 million in costs. Data reduction software not only reduces the amount of data stored, it also lowers the number of storage arrays required, and as a result the power, cooling and square footage those arrays consume in a data center.

Taking her thesis a step further: data reduction increases data center density and, as a result, reduces the need for new data center construction.  At today’s cost of $3,000 a square foot, that’s a compelling argument!
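As a rough illustration of how that kind of arithmetic works (this is not Renshaw’s actual model; the storage floor-space share and reduction ratio below are assumptions, and only the 100,000 square foot facility and $3,000 per square foot figures come from the text above):

```python
# Back-of-the-napkin floor-space model (illustrative only; storage_share and
# reduction_ratio are assumptions, not figures from Renshaw's paper).
facility_sqft   = 100_000   # facility size from the example above
storage_share   = 0.40      # assumed fraction of floor space occupied by storage racks
reduction_ratio = 2.5       # assumed average data-reduction ratio (dedupe + compression)
cost_per_sqft   = 3_000     # construction cost per square foot cited above

storage_sqft  = facility_sqft * storage_share
reduced_sqft  = storage_sqft / reduction_ratio
freed_sqft    = storage_sqft - reduced_sqft
avoided_build = freed_sqft * cost_per_sqft

print(f"Storage floor space today: {storage_sqft:,.0f} sq ft")
print(f"After {reduction_ratio}:1 reduction: {reduced_sqft:,.0f} sq ft")
print(f"Construction cost avoided: ${avoided_build:,.0f}")
```

Even with these conservative assumed inputs, the avoided construction cost runs into the tens of millions of dollars for a single facility, the same order of magnitude as Renshaw’s figure, before power, cooling and administration savings are counted.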

Renshaw states the obvious: “Cloud growth is inevitable, but let’s do it with a smaller footprint.”

Conclusion

IT infrastructure is at an inflection point and change is all around! We saw infrastructure giants come under extreme business pressure from hyperscale cloud providers that grabbed market share because they delivered lower price points, simplicity and business agility. The Age of IT Efficiency had arrived.

Led by Amazon, “the cloud” evolved rapidly as a business option for data storage and compute. As a result, open software players such as Red Hat, Canonical, and Mirantis (to mention a few) rose in prominence and are seeing rapid growth because they deliver cost and operational efficiency along with higher data density.

The hybrid cloud is now the implementation of choice for IT infrastructure because the combination of data in the public cloud and on premises creates a solution that delivers increased agility at the lowest cost. This has been enabled by white box hardware, virtualization software, open source operating software and data reduction software. IT infrastructure will be open, flexible and highly efficient. The Age of IT Efficiency is now upon us!