Seattle is the city of coffee, Kurt Cobain and cloudy skies, so it was a fitting extension of our moody weather that the theme of GeekWire's recent, inaugural one-day Cloud Tech Summit was "Seattle, Cloud Capital of the World." While fascinating and crucial things are happening in the industry all over the world––and it's critical not to lose that global perspective––it's hard to deny that something cloudy is happening here.
GeekWire is a local tech publication that curated the day's content, inviting speakers from Amazon Web Services, Microsoft Azure, Google Cloud Platform, Apptio, Cloud Foundry and Docker to take the stage. If you were following the sound bites on Twitter, you might have seen multiple cloud providers claiming the same customer: But wait, I thought that was a GCP user? And weren't they an AWS user?…Hold on, how do BMW and Adobe Marketing use Azure, but are also OpenStack users?…What is going on here?!
The most powerful takeaway from the day is that all of those statements can be correct. You can be an AWS user and an OpenStack user. You can be an Azure user and an OpenStack user. It isn't an exclusive relationship: it's multi-cloud.
Multi-cloud was the giant elephant in the room, brought up by Apptio CEO Sunny Gupta and Greg DeMichillie, Google Cloud's director of product management in the office of the CTO, and a topic we could have, and maybe should have, spent all day talking about.
At the Barcelona OpenStack Summit in October 2016, Jonathan Bryce, OpenStack Foundation executive director, said, "The future is multi-cloud." Months later, that's not just the future; it's already happening.
Keeping an eye on the Twitter hashtag #GWCloudTech, I could tell some audience members were confused. Yes, multi-cloud means there will be multiple cloud providers in the marketplace, but that's not really what we're after here. What we're after is the reality that different workloads belong in different homes for reasons like cost, performance, compliance and differing business requirements. Some things need to go public; some things need to go private. What goes where for whom is still a major question that organizations are figuring out for themselves, and in open environments like our OpenStack community, they're happily sharing what they've learned.
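To make the "different workloads, different homes" idea concrete, here is a minimal, purely illustrative sketch of a placement policy. The provider names and rules are hypothetical examples, not anything discussed at the summit:

```python
# Toy placement policy: different workloads land on different clouds
# for different reasons (compliance, hardware, cost). Provider names
# and rules are hypothetical, for illustration only.

def place_workload(requirements):
    """Pick a home for a workload based on its requirements."""
    if requirements.get("compliance") == "data-residency":
        return "private-openstack"    # regulated data stays on-prem
    if requirements.get("gpu"):
        return "public-gpu-cloud"     # burst to specialized hardware
    if requirements.get("cost_sensitive"):
        return "spot-capacity-cloud"  # cheapest available capacity
    return "public-general-cloud"     # sensible default

workloads = {
    "customer-records": {"compliance": "data-residency"},
    "ml-training": {"gpu": True},
    "batch-reports": {"cost_sensitive": True},
    "web-frontend": {},
}

for name, reqs in workloads.items():
    print(f"{name} -> {place_workload(reqs)}")
```

In practice the real decision factors are far messier, but the shape is the same: one organization, several clouds, each chosen deliberately per workload.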
But while we figure out the optimized "what" and "where," it's a matter of fact that none of it can happen without the "how," and the "how" is open APIs. We can't ignore the realities of the landscape, and we'll be hurt if we overlook their technical demands: interoperability and compatibility.
Google's DeMichillie drove this point home over and over again during his fireside chat with Fortune senior writer Barb Darrow: "There's no doubt we're going to live in this world where we have applications running on an on-prem environment and a cloud environment."
Open APIs matter because, as DeMichillie reiterated, they allow you to find the right mix, running what you need to on-prem and in the cloud. As demands change, they allow you to change your mind, move providers, or move technologies, without the costliness and headaches of migrating from scratch. How do you adapt your architecture to emerging technologies, like Kubernetes? With open APIs.
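The portability argument can be sketched in a few lines. In this hypothetical example, two backends implement the same open interface, so the application code never changes when a workload moves between them; all class and method names here are invented for illustration:

```python
# Illustrative sketch: why a stable, open API eases moving between clouds.
# Both backends implement the same interface, so application code written
# against it is untouched when the provider changes. Names are hypothetical.

from abc import ABC, abstractmethod

class CloudAPI(ABC):
    """The open API every provider agrees to implement."""
    @abstractmethod
    def launch_instance(self, image: str, flavor: str) -> str: ...

class OnPremCloud(CloudAPI):
    def launch_instance(self, image: str, flavor: str) -> str:
        return f"on-prem instance: {image}/{flavor}"

class PublicCloud(CloudAPI):
    def launch_instance(self, image: str, flavor: str) -> str:
        return f"public instance: {image}/{flavor}"

def deploy(cloud: CloudAPI) -> str:
    # Application code targets the open API, not a vendor-specific SDK,
    # so switching providers changes one constructor call, not the app.
    return cloud.launch_instance(image="ubuntu", flavor="small")

print(deploy(OnPremCloud()))   # same call...
print(deploy(PublicCloud()))   # ...different home
```

This is the design choice open APIs buy you: the cost of changing your mind stays close to zero, instead of becoming a migration project.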
The open, cross-community collaboration in the Kubernetes and OpenStack Working Groups is a perfect example. The two communities share users, and those users can benefit from access to emerging technologies on both of their cloud platforms. As DeMichillie put it, "Cloud is a really interesting opportunity to not repeat mistakes of the past. Take the openness that Linux brought to the world. We need to make sure that we don't take a step back and we don't make that mistake."
Want to join the conversation around openness and multi-cloud? You can join us at an upcoming OpenStack Seattle MeetUp, or find me for a cup of Seattle coffee at anne at openstack.org, IRC annabelleB, or Twitter @whyhiannabelle.