Now that requirements from 5G, the Internet of Things (IoT) and other next-generation technologies have expanded so extensively, high-speed and low-latency capabilities in and of themselves are no longer considered cutting-edge — they are expectations. To meet these skyrocketing demands, the data center industry has shifted its approach to favor a more distributed topology, putting the edge at the center of the conversation.
The exodus to the edge of the network has stood as a defining strategy in the data center sphere for some time now. However, as new trends and infrastructure approaches continue to emerge, one can't help but wonder — is the edge still a hot topic?
To check in on this question, gauge business sentiments and get in touch with industry thought leadership, the Independent Data Center Alliance (IND-DCA), an industry-led group of independent data center operators with a mission of advancing the market presence of smaller providers, opened up a poll. Here are some of the answers the IND-DCA has received:
When asked if the edge is still a widespread topic of important discussion, Damian Ehrlicher, Chairman and President of ProtectedIT, comments, “Yes, because solving for the edge isn’t just about building a data center and deploying internet bandwidth near your users. The innovation of 5G can’t be realized in the U.S. market until carriers deploy bandwidth to the edge as well as infrastructure. Right now, edge data centers that utilize internal networks are very effective, but until we deploy public internet bandwidth, it’s a moot point to have compute, memory and storage at the edge if the network is still going back to a PoP or carrier hotel.” He continues, “So, the edge problem is still relevant in the U.S., especially since we have been solving this problem the same way for 20 years — through a CDN (Content Delivery Network) service.”
So, where is change happening? Ehrlicher adds, “The innovation is coming with network providers and how they are solving for applications to perform optimally in a hybrid model. Microservices have solved part of the equation, but application stack performance is driving the conversation. How are hybrid models solving for application performance? How are data centers deploying CDN services that allow for business continuity and performance?”
Meanwhile, Phillip Koblence, COO and Co-Founder of NYI, sees it like this: “Like most buzzwords in our industry, the ‘edge’ as a word has become increasingly overused to describe any infrastructure or application deployment, whether particularly relevant or not. There’s no doubt that as capacity requirements increase with more content streaming to mobile devices, 5G networks starting to become a reality, etc., the idea of supporting those applications as close as possible to the end users is a very real need. In that sense, the requirements at the ‘edge’ are only now beginning to take shape, and the demands for data centers (including private and public cloud) and critical infrastructure will continue to grow from urban centers to centralized aggregation and compute nodes.”
He continues, “From our standpoint, the data center space is evolving from a simple commodity requirement (space, power and bandwidth) to a centralized ecosystem. This ecosystem will allow companies to deploy workloads across a spectrum of infrastructure providers including public/private clouds, SaaS providers and traditional data center facilities to achieve seamless connectivity from one to the other. More and more companies are finding that in order to provide relevant offerings, a data center needs to become more of an infrastructure facilitator than a simple facility.”
What are your thoughts on the edge as the industry continues to forge ahead? We’d love to hear them. Click here to join the conversation.
To learn more about the Independent Data Center Alliance, please click here.