In the classic opening to the original Star Trek series, William Shatner, playing the iconic Captain James T. Kirk, provides the now-famous voiceover about “space, the final frontier.” He describes their mission:
“To explore strange new worlds.
To seek out new life and new civilizations.
To boldly go where no man has gone before.”
Those of us in the computing trade are now venturing into a new frontier, whether it is the cloud, mobile, or the Internet of Things. This new frontier is a shift from today’s well-understood client-server computing environment. Computing is becoming:
- Heterogeneous, and
- Fast (as in warp factor 10).
The entire stack is morphing as applications, rather than infrastructure, have moved to the forefront of IT. To respond to this, “new stack” players are evolving: a new cohort of server/compute capabilities is emerging in Linux containers from companies like Docker, Mesosphere, and CoreOS; distributed software is now driving storage architectures from Coho Data and Datera; and software architectures instrumented by strong APIs are being inserted into the traditional closed-box hardware network by companies such as Arista, Cumulus, and VMware. To top this off, every decision is increasingly predicated on rent/buy decisions (e.g., SaaS, IaaS).
So why hasn’t security changed? Why are we still inserting 20-year-old perimeter models into a fluid, cyber-challenged environment? In a world of porous borders and cross-cloud applications, why is a network-centric approach still the primary security option?
Enterprises already know about this problem. In a recent Death of the Perimeter poll, Dark Reading editor Marilyn Cohodas points out: “the classic view about what constitutes a network boundary has given way to a new metaphor. For 55% of respondents, the network perimeter has evolved to a seemingly boundless space ‘anywhere and everywhere data is located’ that incorporates what is on the device, in a cloud, or on a server (51%) and how the data gets there and back (4%).”
From a security perspective, if perimeter defense cannot prevent unauthorized access to compute resources, how will it keep the interior of the data center safe? You can try to drive firewalls (physical or virtual) deeper into the interior of the data center, but then you carry all the baggage and complexity of the networking paradigm (e.g., arcane fixed addressing, traffic steering, and limited policy options) into this new space, whether you “automate” the configuration processes or not. Something that is complex and fragile at the perimeter does not improve with proximity to the crown jewels of computing. One must ask: how can you write a firewall rule without an IP address? How can you stretch a chokepoint across three data centers or clouds for an N-tier application?
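The fragility of address-keyed rules is easy to demonstrate. The sketch below is purely illustrative (the names and rule format are hypothetical, not any real firewall's API): a rule bound to a fixed IP silently stops matching the moment an orchestrator reschedules the workload to a new address.

```python
# Hypothetical sketch: why IP-keyed firewall rules are fragile for
# dynamic workloads. All names and structures here are illustrative.

# A traditional rule set keyed on fixed addresses.
allow_rules = {("10.0.1.5", 443)}  # permit HTTPS to the web tier

def is_allowed(ip: str, port: int) -> bool:
    """Return True if traffic to (ip, port) matches an allow rule."""
    return (ip, port) in allow_rules

# While the workload sits at its original address, traffic passes.
assert is_allowed("10.0.1.5", 443)

# The orchestrator reschedules the workload; it comes up at a new IP.
# The rule no longer matches -- the policy did not move with it.
assert not is_allowed("10.0.2.9", 443)
```

The failure is not a misconfiguration; it is structural. The rule describes a location, and the thing being protected no longer has a stable one.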
Space: The final frontier
We’ve been operating this way because there has not been another option. We have been duct taping our security together with the network-centric approach of the client-server era. We need a new option: one that requires thinking about the components within the data center and the cloud as space. To wit, securing the next generation of computing is a “physics” problem. It requires an inside-out perspective. The atomic unit of computing is the workload—a physical server, a virtual machine, or a container—and workloads are the core building blocks of applications. Applications and data are what we are actually securing. The infrastructure is there to support them.
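One way to make the inside-out perspective concrete is to key policy on what a workload *is* rather than where it sits. The following is a minimal sketch under assumed, hypothetical names (the `Workload` class, the label strings, and the policy function are all illustrative, not any vendor's product): the rule references durable labels, so it keeps matching when the workload's address changes.

```python
# Hypothetical sketch of workload-centric policy: rules reference
# labels describing what a workload is, not where it happens to be.
from dataclasses import dataclass, field

@dataclass
class Workload:
    ip: str                       # incidental: may change on reschedule
    labels: frozenset = field(default_factory=frozenset)  # durable identity

def is_allowed(wl: Workload, port: int) -> bool:
    """Policy: production web workloads may receive HTTPS."""
    return {"role:web", "env:prod"} <= wl.labels and port == 443

web = Workload(ip="10.0.1.5", labels=frozenset({"role:web", "env:prod"}))
assert is_allowed(web, 443)

# Move the workload to another subnet, data center, or cloud:
web.ip = "172.16.8.2"
assert is_allowed(web, 443)   # the policy follows the workload
```

Because the rule never mentions an address, the same policy holds whether the workload is a physical server, a VM, or a container, and whichever data center or cloud it lands in.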
Security must offer visibility and control without being burdened with the gorp of the infrastructure. It must mirror the dynamic and distributed computing environment, moving and changing when the computing instance does. Finally, it requires a single security operating model that removes the risks, and lowers the costs, of having one security paradigm for the data center and another for the public cloud.
At the end of the day, putting traditional firewalls into distributed computing is the equivalent of putting an electric motor into a horse and buggy—it will never turn it into a Tesla. If we are going to get a handle on the tremendous risks inherent in computing today, we are going to need a clean piece of paper. To quote John F. Kennedy (who led America into space): “those who look only to the past or present are certain to miss the future.”