When a sports team makes the playoffs, a sense of excitement builds in its home city, and people who didn’t follow the team all season come out of the woodwork to jump on the bandwagon and cheer on the local squad.
Something similar seems to be happening in the cloud and broader tech world with container technology right now. Containers are not a new technology, but in the past month or so many of the major tech vendors have made announcements supporting containers, a form of operating-system-level virtualization, and Docker, the open source project (and company) that has become the darling of container management.
Containers are a lightweight application packaging model in which an app, along with all of its dependencies, can be assembled into a container and run on bare metal, in virtual machines, or in public or private clouds. Over the past 18 months, as container technology has reached buzzword status, some have questioned what the rise of containers will mean for hypervisors, which virtualize hardware at a lower level to create virtual machines. Containers can run on VMs, but they don’t have to. So could containers spell the demise, or at least a dramatic reduction in the use, of VMs? That would be bad for VMware, which still makes a lot of money selling hypervisors.
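That packaging model is easiest to see in a Dockerfile, the build recipe Docker uses to assemble an image. The base image, file names and commands below are illustrative assumptions, not drawn from any of the vendors discussed here:

```dockerfile
# Start from a base image that supplies the OS userland.
FROM ubuntu:14.04

# Bake the app's dependencies into the image itself,
# so the container carries everything it needs to run.
RUN apt-get update && apt-get install -y python

# Copy the application code into the image.
COPY app.py /opt/app/app.py

# The command the container runs when it starts.
CMD ["python", "/opt/app/app.py"]
```

Because the dependencies travel inside the image, the resulting container can run unchanged on a developer’s laptop, inside a VM, or on a public cloud host.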
One of the most surprising companies to weigh in on the container madness, then, has been VMware. At VMworld last week, executives from the company argued that VMs and containers can live happily together. There are different use cases for each, says Chris Wolf, VMware’s CTO of the Americas. “I’ve always thought that there is room for containers and VMs to live together for the next several years. I see value in two layers of encapsulation, one at the OS and one at the app and we cannot ignore the enterprise readiness of VM security and VM management tools. Container management and security still needs improvement so why not combine the two worlds?”
At VMworld the company announced plans to support Docker, which automates the creation of containers, in its management software. Basically, Wolf’s argument (which he outlines in a blog post) is that if customers want to use containers and Docker, they should manage them with VMware’s software. Doing so gives users a “single pane of glass” for managing VMs and containers together. For applications that need rapid scale-out and expansion, containers could be a good fit; for a whole variety of more traditional workloads, VMs do just fine. With containers and VMs, it’s not an either/or situation, it’s an “and” situation.
VMware isn’t the only company talking containers lately, though. These days it’s harder to find cloud companies that aren’t talking containers than ones that are.
Microsoft this week affirmed its commitment to containers, particularly for how they can be used in Azure, the company’s public cloud IaaS and PaaS. Microsoft first announced in July that it would work with Docker, and therefore with containers. This week the company announced a new tool for Azure named the Kubernetes Visualizer, which graphically shows how containers are created, managed and used in Azure. Kubernetes is a cluster management tool, open sourced by Google, that allows groups of containers to be managed together.
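Kubernetes groups containers into “pods” that it schedules and manages as a unit, described declaratively in a manifest. The sketch below is purely illustrative; the pod name, labels and image are assumptions, and the exact manifest schema has changed across Kubernetes versions:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: web-pod        # hypothetical pod name
  labels:
    app: web           # label used to select/manage this pod
spec:
  containers:
    - name: web        # one container in the pod
      image: nginx     # public image, used here only as an example
      ports:
        - containerPort: 80
```

The cluster manager then takes responsibility for placing the pod on a machine and keeping it running, which is what makes managing groups of containers tractable at scale.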
Cisco is jumping in, too. The company’s chief technology officer for cloud infrastructure services announced plans to use containers in Cisco’s recently announced Intercloud. In a blog post, Kenneth Owens runs through many of the reasons containers have grown in popularity recently, focusing specifically on their use in a devops environment (one where developers and operators work closely together). Buried in the post was a tidbit of news, though: “Cisco Cloud Services is creating an Intercloud of container and micro-services in a cloud native and hybrid CI/CD (continuous integration/continuous delivery) model across Openstack, VMware, Cisco Powered, and Public clouds. Look for availability early next year.”
Seemingly not wanting to be left out of the party, a blogger at IBM jumped on the container bandwagon too, publishing a post this week that discusses the idea of containers without announcing any news about Big Blue’s container strategy. The company’s PaaS, Bluemix, is built on the open source Cloud Foundry platform, though, which itself makes heavy use of containers.
Other companies, like Red Hat, have been playing the container game for a while. The company’s PaaS, OpenShift, was built on container technology when it launched more than three years ago, and Red Hat began supporting containers in Red Hat Enterprise Linux with version 6. Version 7 supports Docker, and the upcoming release of OpenShift will support Docker as well. “Docker has made it easy to use containers just at the same time that many developers have turned to using containers,” says Joe Fernandes, a product manager for OpenShift at Red Hat. The company also has newer initiatives like Project Atomic, a Linux distribution Red Hat is developing that is optimized for running containers.
IaaS provider CenturyLink released Panamax, an internally developed tool for managing many containers at once. Docker’s partner page lists companies from Rackspace to Google and Canonical, and Docker CEO Ben Golub has said the company is willing to work with just about anyone who will extend the reach of containers.
While the idea of containers has been around for a long time, developer interest in using them has spiked recently, driven by new cloud-first, mobile and web-scale applications that benefit from running in containers. As that has happened, new tools like Docker and Kubernetes have emerged to make containers easier to manage. And now that developers have embraced containers, the big IT vendors are following suit.