By: Russ Elsner, Consulting Architect at ScienceLogic
The pace of today’s business activity requires that IT move faster. Traditional approaches to technology, tools, process, and organization cannot deliver the necessary velocity and agility. A massive wave of automation, achieved largely through software, will be required to meet the demands of the business. Increased use of application programming interfaces (APIs) and massive data center virtualization make such speed possible, and the trend toward greater velocity is not going away. With this pace now the norm, physical IT workers on the data center floor are a dying breed and programmers rule the roost. IT workers must continue to hone their programming skills or face obsolescence.
The virtualization of the data center started with server virtualization, but that was only the beginning. The move toward Software-defined Networking, Software-defined Storage, and the Software-defined Data Center is being referred to as Software-defined Everything (SDE). This trend continues IT's shift from a hardware-centric model toward a software-centric one. The movement to Network Function Virtualization (NFV), in which all network services become virtual and modular, extends the trend further. The distinction between a network skillset and an application skillset will blur: everything becomes software-defined and will be controlled through software orchestration.
Another extremely powerful IT trend that has taken hold over the last few years is the massive growth of API use. APIs allow systems to talk to one another, sharing information across physical and logical boundaries and solving for the interdependencies inherent in many business scenarios. Modern management tools now expose their services and data through APIs. By integrating distinct tools and processes through these APIs to automate common workflows, organizations can greatly increase overall efficiency and agility. The number of public APIs has doubled in the past 18 months, and companies like Netflix receive more than 5 billion daily requests to their public APIs.* In fact, the majority of major IT infrastructures expose data, services, and transactions, creating assets to be shared and reused. The interplay between IT infrastructures and their partner, customer, and consumer constituencies will fuel high demand for programmers able to smartly and efficiently connect the systemic dots.
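To make the integration idea concrete, here is a minimal sketch of the kind of "glue" code that connects two tools through their APIs: translating an alert pulled from a hypothetical monitoring tool's REST API into a ticket payload for a hypothetical ticketing system. All field names and the severity-to-priority mapping are illustrative assumptions, not any specific product's schema.

```python
import json

def alert_to_ticket(alert):
    """Map a monitoring alert (dict) onto a ticketing-system payload."""
    priority_map = {"critical": 1, "major": 2, "minor": 3}
    return {
        "summary": "[{}] {}: {}".format(
            alert["severity"].upper(), alert["device"], alert["message"]),
        "priority": priority_map.get(alert["severity"], 4),  # default: low
        "source": "monitoring-api",
    }

# An alert as it might arrive from the monitoring tool's API:
alert = {
    "device": "core-sw-01",
    "severity": "critical",
    "message": "interface Gi0/1 down",
}
print(json.dumps(alert_to_ticket(alert)))
```

In a real workflow, the translated payload would be POSTed to the ticketing system's REST endpoint; the value of the pattern is that the same few lines of code replace a manual copy-and-paste step every time an alert fires.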
Traditional data centers were populated with many physical IT workers: people charged with racking equipment, cabling, installing software, configuring equipment and services, monitoring performance, and troubleshooting. Next-generation data centers will be short on people and long on automation. The rationale is straightforward. People make mistakes, mistakes cause outages, and outages are expensive. Physical work is also time- and resource-consuming: moving, wiring, and configuring equipment takes time and limits velocity.
Next-generation data centers will be almost completely virtual and largely automated, with very few people allowed on the data center floor. Servers can be spun up virtually, network connections can be established virtually, and disks can be allocated remotely. Entire applications can be deployed and updated through software. People will rarely interact with physical hardware, and operations will be performed remotely.
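The pattern that makes this software-driven provisioning possible is desired-state reconciliation: compare the inventory you want against what actually exists, and compute the actions needed to converge. The sketch below is a minimal, hypothetical illustration of that idea; real orchestrators track far more state, but the core loop looks much the same.

```python
def reconcile(desired, actual):
    """Return (action, vm_name) pairs that move 'actual' toward 'desired'."""
    to_create = sorted(set(desired) - set(actual))   # wanted but missing
    to_delete = sorted(set(actual) - set(desired))   # present but obsolete
    return ([("create", vm) for vm in to_create] +
            [("delete", vm) for vm in to_delete])

# Example: one VM to add, one obsolete VM to retire.
actions = reconcile(desired=["web-01", "web-02", "db-01"],
                    actual=["web-01", "db-01", "cache-99"])
print(actions)  # [('create', 'web-02'), ('delete', 'cache-99')]
```

Run continuously, a loop like this is what lets a data center heal and scale itself without anyone touching the hardware.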
Data center virtualization has triggered a major shift in the IT skills needed to run today's data center: namely, programming. Historically, programming was a skill that only hardcore application developers needed; the closest a network engineer might get to code was issuing IOS commands at a CLI prompt. Today, trends like server virtualization, SDN, and NFV make it possible to spin entire applications up and down automatically.
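The shift from typing commands to writing code can be as simple as generating configuration from data instead of by hand. The snippet below is a hedged sketch: the IOS-style VLAN template is illustrative only, and a real workflow would push the rendered output through a device or controller API rather than printing it.

```python
# Illustrative IOS-style template; not tied to any real device API.
VLAN_TEMPLATE = "vlan {vid}\n name {name}\n"

def render_vlan_config(vlans):
    """Render IOS-style VLAN configuration from (id, name) pairs."""
    return "".join(VLAN_TEMPLATE.format(vid=vid, name=name)
                   for vid, name in vlans)

print(render_vlan_config([(10, "users"), (20, "voice")]))
```

The same data could drive hundreds of devices consistently, which is exactly where a script beats a CLI session.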
As everything becomes software-defined, increasingly programmable, and interconnected, programming capability is not just a critical skill for an IT professional; it has become table stakes.