New Data Architecture
– Stefan Bernbo, CEO and founder of Compuverde, says:
With billions of devices connected to global networks, the Internet of Things (IoT) is rapidly transforming work and leisure. The phrase was coined in 1999 in reference to connecting Procter & Gamble’s supply chain RFID to the Internet, and in just 15 years it has become a worldwide phenomenon spanning all industries. In fact, 12 billion mobile devices alone will be part of the IoT by 2020. An increasing number of organizations – from start-ups to cities – are exploring the opportunities that this level of connectivity affords and creating processes, products and services designed to increase productivity and improve user experiences.
These billions of connected devices are generating volumes of data on a scale never before seen, all of which must be stored somewhere. Current approaches to storage are proving insufficient for the demands of the IoT. Service providers are considering changes to their data center architecture that will increase storage capacities without breaking the bank.
Current Architecture: Redundant and Slow
The current data center paradigm is founded on appliances that come with proprietary, mandatory software. The software is designed for the hardware and vice versa, and the two come “baked in” together as a package. This configuration is convenient and easy to use.
As with all hardware, appliances will eventually fail. Accordingly, they come with redundant copies of expensive components to guard against any single point of failure. These extra components bring with them higher hardware costs, greater energy usage and additional layers of complexity. When companies, anticipating growth events like the IoT, begin to consider how to scale out their data centers, the costs of this traditional architecture skyrocket.
Traditional appliances have another drawback in how they handle incoming data. They have a vertical construction, in which every request arrives via a single point of entry and is then re-routed. Now imagine a million users connected to that one entry point at the same time. That is a recipe for a bottleneck, and it prevents service providers from scaling to the capacity the Internet of Things will require.
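To make the bottleneck concrete, here is a minimal Python sketch of a single-entry-point layout. The Gateway and StorageNode names are hypothetical, invented purely for illustration; no vendor’s API is implied. The point is simply that every request must pass through the one gateway before it reaches a storage node, so the gateway’s workload grows with total traffic no matter how many nodes sit behind it.

```python
# Hypothetical sketch of a vertical, single-entry-point architecture.
# Gateway and StorageNode are illustrative names, not a real product's API.

class StorageNode:
    def __init__(self, name):
        self.name = name
        self.handled = 0          # requests this node actually served

    def handle(self, request):
        self.handled += 1

class Gateway:
    """The single point of entry: ALL traffic is routed through this one node."""
    def __init__(self, nodes):
        self.nodes = nodes
        self.routed = 0           # requests the gateway had to process

    def route(self, request):
        self.routed += 1          # every request costs work here first
        self.nodes[request % len(self.nodes)].handle(request)

nodes = [StorageNode(f"node-{i}") for i in range(4)]
gateway = Gateway(nodes)

for request in range(1_000_000):  # the "million users" from the text
    gateway.route(request)

print(gateway.routed)               # 1000000 -- the gateway sees everything
for node in nodes:
    print(node.name, node.handled)  # 250000 each -- the back end is fine;
                                    # the choke point is the entry node
```

Adding more storage nodes to this layout changes nothing about the gateway’s load, which is exactly why scaling out a vertical design is so hard.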
Software-Defined Storage: Scalable and Fast
To overcome these drawbacks, organizations are looking for alternative storage approaches. Software-defined storage is one of these alternatives. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates the dependency on server “appliances” with software hard-wired into the system. This option provides the scalability and speed that the IoT demands.
The concept of software-defined storage has only recently come into the limelight, but a variety of devices have been “software-defined” for years. The PC is a perfect example: software can be installed on any hardware platform, allowing the user to custom-tailor both the hardware and the software according to his or her needs. The average PC can use Linux as an operating system if the owner so chooses. This gives the user greater freedom to allocate his or her budget precisely as needed for the task at hand.
Software-defined storage is an option that frees the software from the hardware, allowing administrators to choose inexpensive commodity servers. When coupled with lightweight, efficient software solutions, the use of commodity servers can result in substantial cost savings for online service providers seeking ways to accommodate their users’ growing demand for storage.
Administrators are also freed to consider what their businesses truly need and to select only those components that further their growth goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger data center tailored to the company’s needs.
This method of storage is also more scalable, which serves the diverse needs of data centers across industries. A telco serving one particular area will have different storage needs than a major bank with branches in several countries, and a cloud hosting provider will have different needs still. While appliances might be good enough for many of these cases, fully decoupling the software from the hardware can yield substantial economies of scale.
An additional advantage of software-defined storage is its horizontal architecture, which spreads data and requests evenly across every node in the cluster. This eliminates the bottlenecks of vertical, single-entry-point models. Data is handled faster and more efficiently, and this non-hierarchical construction can be scaled out easily and cost-effectively.
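As a rough sketch of the horizontal idea, the hypothetical Python below lets any client compute an object’s location for itself by hashing its key, so no central router ever has to see the full request stream. Production systems typically use consistent hashing, which relocates little data when nodes are added; the simple modulo scheme here is an assumption made only to keep the sketch short.

```python
# Hypothetical sketch of a horizontal, non-hierarchical layout:
# every client can address any node directly, so no single node sees all traffic.
import hashlib
from collections import Counter

NODES = ["node-0", "node-1", "node-2", "node-3"]

def node_for(key: str) -> str:
    """Deterministic placement: hash the key, map the hash to a node.

    Any client can run this locally, removing the need for a central
    entry point. (Real systems favor consistent hashing so that adding
    a node moves only a small fraction of the data; plain modulo is
    used here purely for illustration.)
    """
    digest = hashlib.sha256(key.encode()).hexdigest()
    return NODES[int(digest, 16) % len(NODES)]

# A million requests spread across the nodes with no router in the middle.
load = Counter(node_for(f"object-{i}") for i in range(1_000_000))
print(load)  # roughly 250,000 per node, reached without any gateway
```

Because placement is a pure function of the key, any node or client can answer “where does this object live?” on its own, and scaling out means little more than adding entries to the node list.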
Distributing the Storage Layers
A distributed approach to storage infrastructure complements the IoT as well. With millions of devices needing to access storage, a model that relies on a single point of entry cannot scale to meet the demand. To accommodate the ballooning ecosystem of storage-connected devices all over the world, service providers, enterprises and telcos need to be able to spread their storage layers over multiple data centers in different locations worldwide. It’s becoming increasingly clear that one data center is not enough to meet the storage needs of the Internet of Things; storage must instead be distributed so that it can run in several data centers globally.
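As a hypothetical sketch of that idea, the placement function below deliberately spreads each object’s copies across distinct data centers. The site names and the replication factor are assumptions chosen for illustration, not a description of any particular product.

```python
# Hypothetical sketch: place each object's replicas in *different* data
# centers, so the storage layer spans sites rather than one building.
import hashlib

DATA_CENTERS = {                  # illustrative site and node names
    "eu-north": ["eu-n-0", "eu-n-1"],
    "us-east":  ["us-e-0", "us-e-1"],
    "ap-south": ["ap-s-0", "ap-s-1"],
}
REPLICAS = 2                      # assumed replication factor

def place(key: str):
    """Return (data_center, node) pairs, one distinct site per replica."""
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    sites = list(DATA_CENTERS)
    start = digest % len(sites)              # rotate the site list per key
    rotated = sites[start:] + sites[:start]  # so load spreads across sites
    return [(dc, DATA_CENTERS[dc][digest % len(DATA_CENTERS[dc])])
            for dc in rotated[:REPLICAS]]

print(place("sensor-reading-42"))
# e.g. [('us-east', 'us-e-0'), ('ap-south', 'ap-s-0')]:
# two copies, guaranteed to land in two different data centers
```

The guarantee that replicas land in different sites is what lets the storage layer survive the loss of an entire data center, which is the resilience the IoT’s global footprint demands.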
Scaling For The Future
Every day, humans and machines create 2.5 quintillion bytes of data. Companies need to find innovative ways to store this avalanche of data while keeping costs down. If they continue to rely on traditional appliances for their storage needs, they will have to buy ever more costly appliances that lack flexibility and are prone to bottlenecks. However, that is no longer the only option. Software-defined storage provides a scalable, horizontal architecture that helps meet the demands of the Internet of Things while delivering cost savings. This storage alternative will help organizations handle current needs while giving them confidence that tomorrow’s storage challenges can be met just as easily and cost-effectively.
About the Author:
Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, Stefan has designed and built numerous enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field at Storegate, a wide-reaching Internet-based storage solution for the consumer and business markets, with the highest possible availability and scalability requirements. Before that, Stefan worked on system and software architecture for several projects at Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.