One of the most critical aspects of Information Technology (IT) that often gets overlooked is the interface for configuring IT infrastructure. The journey from "write-only" interfaces to the highly complex multi-cloud environments we find ourselves in today is a tale of innovation, oversight, and the constant quest for efficiency.
In this article, I’ll delve into the evolution of systems design and distributed systems, focusing on the challenges and opportunities that lie ahead.
The write-only era: A look back
There was a time when configuring IT infrastructure was akin to writing in a diary that you couldn't read. Systems started in an unconfigured state, and administrators applied commands to set up connections, services, and processes. However, there was no way to query the cumulative state of all these configurations.
The original Cisco Internetworking Operating System (IOS) serves as a classic example. Administrators had to rely on memory or documentation to understand the system's state. If multiple people administered the same device, keeping that inherited state synchronized became a high-risk bet on ad hoc communication. This lack of visibility made automation a far-off dream: no one could reliably reason about the system's behavior without a complete teardown and rebuild.
The cloud era: A step forward yet backward
Fast forward to today, and we find ourselves in a highly evolved era of not just cloud computing but multi-cloud environments. Companies are leveraging services from AWS, Azure, Google Cloud, and more, often simultaneously.
However, in the rush to adopt cloud technologies, I believe we've regressed to the write-only paradigm, albeit in a different form.
When you log into a shared AWS account and start deploying services or instances, you're thrust into a labyrinth of system states related to your account, Virtual Private Cloud (VPC), hosts, and so on. This information is scattered across various cloud provider APIs and cluster-level tools like kubectl for Kubernetes infrastructure.
The sheer volume of information, coupled with its high-frequency changes, creates an environment of information overload. In essence, we've returned to a write-only state, masked by the illusion of a read/write environment.
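To make the scattered-state problem concrete, here is a minimal Python sketch of pulling each source of truth into one readable inventory. The source names, resource IDs, and fetcher functions are hypothetical stand-ins for real calls to boto3, kubectl, and similar tools, not actual provider APIs.

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """A point-in-time, readable view of one slice of cloud state."""
    source: str
    resources: dict  # resource id -> human-readable description

def aggregate(fetchers):
    """Pull every registered source into a single queryable inventory."""
    return [Snapshot(name, fetch()) for name, fetch in fetchers.items()]

# Hypothetical stand-ins for real calls to cloud and cluster APIs.
fetchers = {
    "aws-vpc":    lambda: {"vpc-0a1b": "10.0.0.0/16"},
    "kubernetes": lambda: {"pod/web-1": "Running"},
}

for snap in aggregate(fetchers):
    print(snap.source, snap.resources)
```

The point of the sketch is the shape of the fix: until every source is pulled into one queryable view, each individual API remains effectively write-only to the team as a whole.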
The future: Multi-cloud automation and beyond
The next logical step in this evolutionary journey is the development of higher-level automation systems capable of managing this complexity. These systems would continuously query the state of each virtual entity, ensuring real-time situational awareness. This "eye in the sky" would allow administrators to make informed decisions based on the real-time, end-to-end state of the system.
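One way to picture that "eye in the sky" is a continuous reconciliation loop: observe the state of each entity, compare it with the desired state, and repair any drift. The sketch below is an illustrative Python skeleton under those assumptions; the `observe` and `apply` callables are hypothetical hooks, not a real cloud API.

```python
import time

def reconcile_once(desired, observe, apply):
    """Compare desired vs. observed state; repair and report any drift."""
    observed = observe()
    drifted = [key for key, value in desired.items()
               if observed.get(key) != value]
    for key in drifted:
        apply(key, desired[key])  # push the drifted entity back into line
    return drifted

def reconcile_forever(desired, observe, apply, interval=30):
    """The 'eye in the sky': poll, compare, repair, repeat."""
    while True:
        reconcile_once(desired, observe, apply)
        time.sleep(interval)
```

This is the same pattern Kubernetes controllers use internally; the step forward would be applying it uniformly across every cloud domain rather than within a single cluster.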
Once we achieve this level of automation, the possibilities are endless. We could create consistent automation protocols across multiple cloud domains, manage security controls at various abstraction layers, and even pave the way for innovations akin to autonomous vehicles – imagine autonomous multi-cloud security controls in your future infrastructure solutions.
Imagine a world where your multi-cloud environment self-optimizes based on real-time data, where security protocols adapt to emerging threats instantly, and where system administrators can focus on strategic initiatives rather than getting lost in a sea of configurations.
Embracing multi-cloud automation — but not cloud blindspots
Once systems can self-optimize, it's a smart move to bring in checks and balances. Cloud environments are complex ecosystems with numerous services, configurations, and access points, and whether changes to applications and infrastructure come from humans or from automation, they still happen at a rapid pace.
Even with automation, adaptation, and analysis, this complexity and speed can lead to misconfigurations, leaving vulnerabilities that may go unnoticed until exploited. Recent research by Vanson Bourne bears this out: survey results show that nearly half of all breaches in the last year originated in the cloud, costing organizations an average of $4.1 million.
In traditional on-premises environments, organizations had more information and control over their infrastructure and data. In contrast, cloud environments may lack the same level of information, particularly about the traffic flow of applications, data, and workloads, making it challenging for teams to monitor and detect security threats effectively.
So, along with self-optimization, the system needs to continuously verify that security controls are doing what they're meant to do, particularly across the thousands of APIs, applications, and workloads that constantly spin up and down.
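Such a verification check can be sketched as an audit that compares the intent behind each security rule with the behavior actually observed. In this minimal Python illustration, the policy entries and the `probe` function are hypothetical: `probe` stands in for a real reachability test against live infrastructure.

```python
def audit_controls(policy, probe):
    """Return the flows whose observed behavior contradicts policy intent."""
    return [flow for flow, intent in policy.items() if probe(flow) != intent]

# Hypothetical policy: (source, destination, port) -> intended verdict.
policy = {
    ("web", "db", 5432): "allow",  # app traffic should flow
    ("web", "db", 22):   "deny",   # SSH between tiers should be blocked
}

def fake_probe(flow):
    # Stand-in for a live reachability test; here a misconfigured
    # rule lets every flow through.
    return "allow"

violations = audit_controls(policy, fake_probe)
print(violations)  # flows whose controls are not doing their job
```

Run continuously, an audit like this turns "we configured a deny rule" into "we verified the deny rule is still denying," which is exactly the gap misconfigurations exploit.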
Get cloud visibility and speed up automation with Illumio CloudSecure
Illumio CloudSecure allows organizations to visualize cloud workload connectivity from a single view. Security teams can gather insights with the interactive map of application deployments, resources, traffic flows, and metadata.
Teams need to stay on top of fast-changing communications between cloud resources that constantly spin up and down. They need to visualize that connectivity and get a plain-language description of each asset. When cloud assets are under attack, these tools make it easier to quickly understand the connectivity between cloud applications and resources that attackers are using to move through the network. That supports rapidly blocking such movement and protecting the estate.
That's the future we're building towards, and it's closer than you might think.