By Paul Speciale, Chief Product Officer, Scality

The rapid emergence of applications that create and use unstructured data like video, documents, logs and even social media sentiment has driven an exponential increase in the data we collect, store and use to make critical business decisions. But the underlying architecture of enterprise storage systems has largely remained the same since the days when we measured storage in mere terabytes.

This has created a crisis for IT: it must deliver ever-greater data availability to meet 24/7 business demands while managing ever-growing pools of data of every type across hybrid, multi-cloud environments. Today’s economy demands unprecedented performance from systems and fuels the appetite for data across the enterprise. In some cases, an organization’s existing storage deployments simply can’t handle the data deluge.

IT budget pressures can make it extremely difficult to fund modern technology solutions. The lack of a clear upgrade path for many technologies means that forklift upgrades are often the norm, even when everyone knows they aren’t a great idea. As the data ecosystem continues to evolve, it’s imperative that organizations invest in optimizing their data strategy and systems if they want a fighting chance of keeping up with customers, competitors, and emerging challenges.  

IT must keep the data ship afloat as many organizations shift dramatically toward very large objects with huge scaling requirements, forcing them to rethink how they deploy and use storage. Many augment existing SAN and NAS devices with object storage in an attempt to gain the same flexibility and cloud economics that virtual machines brought to compute. This has led major data players like Facebook, Instagram and Shutterfly to take an object-first approach to storage.

As companies rise to the data transformation challenge, many begin by replacing purpose-built storage controllers with industry-standard, scale-up and scale-out hardware that is readily available and easily procured. Using standard hardware components lets enterprises adopt the software-defined (SD) approach the rest of the data center has already embraced, and then reapply those SD principles to storage with an emphasis on a high degree of scalability.

Object storage can also separate the data, connectivity and management planes so that each can scale and grow as needed without impacting the other elements of the system. Using industry-standard hardware for each of those planes, in combination with innovative storage solutions, offers several advantages, including increased uptime, built-in connectivity and a dedicated supervisor server for centralized management.

This approach to object storage makes it possible to scale capacity and performance while delivering exceptional availability. It lets an enterprise remove bottlenecks anywhere in the system and add capacity only where it is truly needed. The flexibility to support any type of data, from any client, anywhere in the world, also allows object storage to integrate smoothly with existing tier-one SAN or NAS storage, data, applications and users.
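To make the "any type of data, from any client, anywhere" point concrete, here is a minimal sketch of how an application might write and read objects through an S3-compatible API, which most object storage platforms expose. The endpoint URL, credentials and bucket name are illustrative placeholders, not a reference to any specific product or deployment.

```python
# Minimal sketch: writing and reading objects over an S3-compatible API.
# The endpoint, credentials and bucket name below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical on-premises endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

bucket = "video-archive"
s3.create_bucket(Bucket=bucket)

# Store an unstructured object (e.g., a video clip) under a flat key namespace.
with open("clip.mp4", "rb") as f:
    s3.put_object(Bucket=bucket, Key="2024/clip.mp4", Body=f)

# Any authorized client, anywhere, can retrieve it by the same key over HTTP(S).
obj = s3.get_object(Bucket=bucket, Key="2024/clip.mp4")
data = obj["Body"].read()
print(f"retrieved {len(data)} bytes")
```

Because the interface is simply HTTP plus a flat key namespace, the same calls work whether the data lands on standard hardware in the data center or in a public cloud region.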

As organizations of all kinds increasingly turn to a hybrid IT environment that spans both on-premises systems and multiple clouds, the ability to move data easily between hosted and local environments means storage stays in step with enterprise strategy as business objectives change. The economic benefits are manifold as well. Industry-standard hardware lets organizations get the best value from every object storage dollar they spend, and the roll-as-you-go model gives IT teams the ability to scale storage, connectivity and management independently of one another, so IT and departmental budgets aren’t wasted on upgrading every plane simultaneously.
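As an illustration of that hybrid mobility, the sketch below copies an object from a local, S3-compatible endpoint into a public-cloud bucket using the same client library. The endpoints, credentials and bucket names are hypothetical; real deployments would also handle credentials, retries and large objects more carefully.

```python
# Sketch of hybrid data movement: copy an object from an on-premises,
# S3-compatible endpoint to a cloud bucket. All names are placeholders.
import boto3

on_prem = boto3.client(
    "s3",
    endpoint_url="https://objectstore.local.example",  # hypothetical local endpoint
    aws_access_key_id="LOCAL_KEY",
    aws_secret_access_key="LOCAL_SECRET",
)
cloud = boto3.client("s3")  # uses default cloud credentials from the environment

key = "logs/2024-06-01.json"
obj = on_prem.get_object(Bucket="local-archive", Key=key)

# Stream the object body straight into the cloud bucket under the same key.
cloud.upload_fileobj(obj["Body"], "cloud-archive", key)
```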


About the Author

Paul Speciale is Chief Product Officer at Scality, where he leads product management. Before Scality, he was part of several cloud computing and early-stage storage companies, including Appcara, Q-layer, and Savvis. In the storage space, he was VP of Products for Amplidata, focused on object storage, and for Agami Systems, which built scalable, high-performance NAS solutions. Paul has over 20 years of industry experience spanning Fortune 500 companies such as IBM (twice) and Oracle as well as venture-funded startups, and has been a speaker and panelist at many industry conferences. He loves backpacking and cars, and is a long-standing UCLA football and basketball fan; he holds Master’s and Bachelor’s degrees in Applied Mathematics and Computing from UCLA.