The history of cloud: a fairy tale

Opinion
Aug 17, 2017 | 4 mins
Cloud Computing

This playful look at the history of cloud computing starts before the Internet Bubble, covers virtualization, and touches on containers. It identifies the key players and traces how power has shifted over time from the CFO and CIO to an army of developers, as software architectures have gone from treating compute resources as pets, to cattle, to chickens.

Once upon a time, in a magical, faraway land called “The 1990s,” every application had its own set of physical servers. Citizens of this land, who sometimes called themselves “developers,” feared getting fired for not having enough capacity to handle peak loads. New physical servers took months to be delivered, so developers ordered more data center hardware than they probably needed. Because it was so difficult to get new machines, developers treated them like pets: they gave them names and took great care to keep them up and running at all times. Everybody was so excited about the “Internet Bubble” and the land grab that was going on that no one seemed to care about underutilized hardware.

Beginning to share servers

Then one day, people realized that profitability mattered more than cool factor, so the Internet Bubble burst. CFOs became the kings and queens of data center spend and demanded to know why two applications sitting in the same aisle of the data center each needed their own servers running at 20 percent utilization most of the time. They decreed that applications should share hardware, even if they had nothing to do with each other.

Under this regime, some citizens were not as responsible as their new physical server neighbors. When one application had a memory leak, it slowed down every application it shared a server with. How naughty of those noisy neighbors!

The rise of virtual machines

As the land changed its name to “The Late 1990s,” wizards from a tribe that came to call itself “VMware” created a new form of magic called a “hypervisor” to combat those noisy neighbors. It enforced a separation of resources that had not been possible before. The “virtual machines” a hypervisor carved out could be created in minutes and offered better physical server utilization without any additional work from the citizens.

After some time had passed, the citizens realized that they could use the ability to create new virtual machines to their advantage for high availability and for handling peak loads. They stopped treating their virtual machines like pets that needed care and feeding and instead treated them like cattle that they could create and destroy at a whim. No longer did citizens need to patch an existing virtual machine; they could simply create a new one that already had the patches installed and let it replace the old one, which would be killed.
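For the curious, that cattle-style replacement looks roughly like the following minimal Python sketch. The hypervisor client and every method and field on it are hypothetical illustrations, not any real SDK:

    # Immutable-infrastructure sketch: never patch a running VM. Boot a
    # replacement from an already-patched template, then destroy the original.
    # The `hypervisor` object and all of its methods are hypothetical.
    def replace_vm(hypervisor, old_vm_id, patched_template):
        new_vm = hypervisor.create_vm(template=patched_template)  # cattle are cheap to create
        hypervisor.wait_until_healthy(new_vm)                     # confirm the replacement is serving
        hypervisor.swap_into_load_balancer(old_vm_id, new_vm.id)  # shift traffic to the new VM
        hypervisor.destroy_vm(old_vm_id)                          # cattle are cheap to kill
        return new_vm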

But this approach could give citizens only so much flexibility. At the end of the day, their CFO still owned the hardware that the hypervisors, and by extension the virtual machines, ran on. That was true even when they called it a “private cloud.”

Making the switch

That changed when a tribe that had survived the bursting of the Internet Bubble built on the magical hypervisor to let citizens rent virtual machines by the hour. What was once The 1990s became “2006.” The tribe, which called itself “AWS,” gave citizens the ability to pay for only what they used. While some C-suite kings and queens initially worried about how secure it really was to use virtual machines that ran in AWS data centers, a steady stream of improvements eventually rendered that argument moot, especially weighed against how quickly citizens could innovate. Soon, other tribes like Azure and Google joined in, and collectively, they called it “public cloud.”
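In today’s terms, “pay for only what you used” boils down to something like this minimal sketch with AWS’s boto3 SDK. The region, instance type, and AMI ID are illustrative placeholders, and the meter stops once the instance is terminated:

    # Rent a virtual machine, use it, give it back.
    # Assumes AWS credentials are already configured; the AMI ID is a placeholder.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # The meter starts: launch one small instance from a (hypothetical) image.
    reservation = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = reservation["Instances"][0]["InstanceId"]

    # ... innovate for a few hours ...

    # The meter stops: terminate the instance when the work is done.
    ec2.terminate_instances(InstanceIds=[instance_id])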

Today, most royalty and citizenry realize that a little of both techniques is needed to meet the varying needs of a modern application portfolio. New applications, or applications without sensitive data, run great on public clouds, while older applications, or applications with more stable demand, are perfect for private clouds. Cloud management platforms help citizens govern and monitor applications across all of their clouds.

Planning for the future

On the horizon are smaller and more portable forms of magic. Some call these “containers,” while others call them “chickens” (extending the earlier pet and cattle metaphors for application architectures). Not far beyond that lies what some call “serverless,” but that’s just silly; there will still be servers, even if a citizen never has to log into one. Others call this approach “Function-as-a-Service,” or FaaS, and might compare its magic to feathers rather than to pets, cattle or chickens.
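To make the feather metaphor concrete, here is a minimal sketch of a handler in the style of AWS Lambda’s Python runtime. The payload field and the API-Gateway-style response shape are illustrative assumptions:

    # A FaaS "feather": the platform owns, patches, and bills for the servers;
    # the citizen only ever writes this function.
    import json

    def handler(event, context):
        # `event` carries the request payload; `context` carries runtime metadata.
        name = event.get("name", "citizen")  # illustrative payload field
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"And {name} lived happily ever after."}),
        }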

Nobody knows for sure what will happen next, but power has certainly shifted to the citizens, who can innovate more quickly than anyone else in the kingdom. Instead of living in “IT,” they increasingly live in “Line of Business,” where they can have a direct impact on revenue. Those who enable these citizens with better products will live happily ever after.

Contributor

Pete Johnson is a Technical Solutions Architect at Cisco, covering cloud and serverless technologies. Prior to joining Cisco as part of the CliQr acquisition, Pete worked at Hewlett-Packard for 20 years, where he served as the HP.com Chief Architect and as an inaugural member of the HP Cloud team.

The opinions expressed in this blog are those of Pete Johnson and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.