How will the cloud be able to handle the emergence of IoT

Cloud computing and the Internet of Things (IoT) have spent the last several years in a sort of maximum-acceleration race where they’ve lapped the other players several times over and have only one another to measure against.


IoT expansion and cloud capacity

Neither is slowing down, particularly the IoT. According to research firm Gartner, the number of connected IoT devices will hit 20.8 billion by 2020. The world population is expected to reach 8 billion around that time, meaning roughly 2.6 IoT devices for every person on the planet. In 2016, the IoT was growing at a rate of 5.5 million new things getting connected every day.

The ability of cloud computing to ingest and interpret vast amounts of data is the catalyst behind the IoT's rapid expansion. Imagine a municipal power company deploying smart thermostats to all of its customers. If it were to send all that raw data back to its own servers, the chance of a crash would be enormous. But sending the data to the cloud, where it is stored until analysis can be performed, lowers the risk to the company considerably.
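The thermostat scenario above can be sketched in a few lines. This is a minimal, hypothetical illustration (the class name, field names, and batch size are all assumptions, not any utility's real API): the device buffers readings locally and hands off batches to a cloud ingestion endpoint, so the utility's own servers never face the raw firehose.

```python
import json
import time
from collections import deque

class ThermostatUplink:
    """Buffers smart-thermostat readings locally and flushes them to a
    cloud ingestion endpoint in batches (illustrative sketch only)."""

    def __init__(self, device_id, batch_size=3):
        self.device_id = device_id
        self.batch_size = batch_size
        self.buffer = deque()
        self.uploaded_batches = []  # stand-in for the cloud store

    def record(self, temperature_c):
        # Append one reading; flush automatically once a batch fills up.
        self.buffer.append({
            "device": self.device_id,
            "temp_c": temperature_c,
            "ts": time.time(),
        })
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        # Drain whatever is buffered into a single serialized payload.
        if not self.buffer:
            return
        batch = [self.buffer.popleft() for _ in range(len(self.buffer))]
        # In a real deployment this would be an HTTPS POST to the cloud
        # provider's ingestion API; here we just keep the payload.
        self.uploaded_batches.append(json.dumps(batch))
```

Batching is the key design choice: the cloud absorbs bursts of uploads far more gracefully than a utility's own servers would absorb millions of per-second readings.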

But as more and more data is poured into the cloud – from the IoT, from corporations, individuals, governments and every other sort of organization in the world – cracks are beginning to show in the system.

In February of 2017, Amazon's AWS cloud computing division suffered a four-hour outage that knocked offline or significantly slowed down hundreds of thousands of websites that use AWS for hosting databases, images, videos and web services.

Among the major players affected by the outage, Buzzfeed, Pinterest, Spotify and Netflix all went down. In a bit of bitter irony, Amazon could not update its AWS status dashboard because the dashboard was also hosted on AWS.

Four hours might not seem like a long time to the average observer, but consider that in 2015 AWS had a combined downtime of just 2 hours, 30 minutes for the entire year. This sort of outage opens the eyes of many to the fact that so much of the Internet is being kept up and running by a very small number of providers. As of February 2017, AWS was the market leader in public Infrastructure as a Service (IaaS) with 40 percent of the market share. The other Big Three providers – Microsoft, Google, and IBM – combine for about 23 percent. Do the math and we're talking about four entities responsible for roughly two-thirds of the public IaaS market. One of them goes down for four hours and hundreds of thousands of websites are affected. What happens if more than one cloud provider goes down at the same time?

Potential problems between IoT and cloud technology

Devices carrying the IoT label are essentially sensors that collect data and send it elsewhere to be processed. Consider a smart car sending fuel economy data back to its manufacturer. Not only is each car's raw reading being sent, but also contextual factors like road surface, tire quality, outside temperature and average speed. All this data is uploaded to the cloud, where the car manufacturer's business intelligence (BI) tools perform computations and analysis and produce massive data sets. These data sets are then downloaded by the manufacturer for further use.
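The smart-car example boils down to a payload like the one below. This is a hypothetical sketch of what such a telemetry message might look like – the function and field names are assumptions for illustration, not any manufacturer's real schema:

```python
import json

def build_telemetry(vin, fuel_l_per_100km, road_surface, tire_tread_mm,
                    outside_temp_c, avg_speed_kmh):
    """Packages one fuel-economy sample together with the contextual
    factors the manufacturer's cloud-side BI tools would correlate
    against it (hypothetical field names)."""
    return json.dumps({
        "vin": vin,
        "fuel_l_per_100km": fuel_l_per_100km,
        "context": {
            "road_surface": road_surface,
            "tire_tread_mm": tire_tread_mm,
            "outside_temp_c": outside_temp_c,
            "avg_speed_kmh": avg_speed_kmh,
        },
    })
```

Note that the device does no analysis of its own here; it only packages and ships readings, which is exactly the "messenger" role described next.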

The IoT device plays little more than the role of the messenger here – a problem in and of itself, which we will touch on later. You have two highly sophisticated machines in the IoT device and the cloud, but one of them is doing almost all of the heavy lifting, which can leave it showing considerable strain for its efforts.

Despite sharing like-minded technologies at their cores, IoT and cloud computing have several conflicting properties that contribute to this strain.

In general, cloud computing resources are fairly inexpensive in terms of availability, can perform tasks rapidly and are quite flexible to the needs of each user they serve. And user location is irrelevant to using the cloud; as long as you have an Internet connection, you can connect.

Conversely, IoT devices are more expensive (in terms of development and deployment), they are not nearly as flexible, and they are generally stuck in one location.

Reconciling these incompatibilities falls on the shoulders of both IoT designers and cloud programmers. For IoT devices, it means virtualizing the physical sensors into virtual sensors before the data is uploaded to the cloud, making that data more easily distributable. On the cloud side, programmers must give the cloud the means to discover sensors located in different places. However, a new way of analyzing and storing data is on the rise that can mitigate the dependency on cloud storage and enhance the role of IoT in performing computations and making decisions while staying close to the end-user.
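Both halves of that reconciliation – virtualizing physical sensors on the device side and discovery on the cloud side – can be sketched together. Everything here is illustrative (the class names, fields and registry are assumptions, not a standard API): each virtual sensor wraps a driver-specific read behind a uniform interface and carries location metadata, and a cloud-side registry discovers sensors by place.

```python
class VirtualSensor:
    """Wraps a physical sensor behind a uniform interface so the cloud
    can query it regardless of hardware (illustrative sketch)."""

    def __init__(self, sensor_id, kind, location, read_fn):
        self.sensor_id = sensor_id
        self.kind = kind          # e.g. "temperature", "humidity"
        self.location = location  # lets the cloud discover by place
        self._read_fn = read_fn   # driver-specific raw read

    def sample(self):
        # Uniform, self-describing record no matter the hardware below.
        return {
            "sensor": self.sensor_id,
            "kind": self.kind,
            "location": self.location,
            "value": self._read_fn(),
        }

class SensorRegistry:
    """Cloud-side directory: discover virtual sensors by location."""

    def __init__(self):
        self._sensors = []

    def register(self, sensor):
        self._sensors.append(sensor)

    def discover(self, location):
        return [s for s in self._sensors if s.location == location]
```

The point of the abstraction is that the cloud never deals with raw, hardware-specific readings – only uniform records it can distribute and query.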

Fog/edge computing

Instead of simply pulling raw data and sending it off to the cloud to be disseminated and analyzed, a new push has begun for the IoT device to have an enhanced role in storing data and performing analytics on it as well. This is known as both fog and edge computing and is being driven (pun intended) by the likes of self-driving cars that require instantaneous decision-making from their sensors in order to perform correctly.

Self-driving cars will create a new subsection of machine-to-machine communication in the form of vehicle-to-vehicle (V2V) communication. These interactions will need to happen as close to real-time as possible.

To achieve this, fog/edge computing moves data a far shorter distance – from the sensors themselves to a local gateway device such as a switch or a router. This edge device can then perform the necessary processing and analysis and send decisions back to the IoT device more quickly than the cloud can.
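A gateway like the one just described splits traffic into two paths: latency-critical readings get an immediate local decision, while everything else is queued for the cloud's slower analytics. The sketch below is a hypothetical illustration – the threshold, message fields and class name are assumptions, not a real V2V protocol:

```python
class EdgeGateway:
    """Runs on a local switch/router: makes latency-critical decisions
    next to the device and only queues the rest for the cloud
    (illustrative sketch with made-up thresholds)."""

    def __init__(self, obstacle_threshold_m=10.0):
        self.obstacle_threshold_m = obstacle_threshold_m
        self.cloud_batch = []  # non-urgent readings bound for the cloud

    def handle(self, reading):
        # Latency-critical path: answer the vehicle immediately,
        # without a round trip to a distant data center.
        if reading["obstacle_distance_m"] < self.obstacle_threshold_m:
            return "BRAKE"
        # Non-critical path: queue for batch upload and cloud analytics.
        self.cloud_batch.append(reading)
        return "CONTINUE"
```

The design choice is the whole argument for edge computing in miniature: the decision that cannot wait is made one network hop away, and only the data that can wait travels to the cloud.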

The term "fog computing" was first coined by Cisco in 2014. Cisco explained fog computing as a “highly virtualized platform that provides compute, storage and networking services between devices and cloud computing data centers.”

Cisco’s idea is to have fog computing handle simpler tasks for IoT devices while leaving the more major undertakings to its cloud computing components. Other designers are more interested in the IoT devices themselves performing the analysis and rendering decisions without any outside communication at all. An overwhelming majority of IoT devices currently in use do not have capabilities beyond collecting and transmitting data; only new versions of those devices would be capable of performing computational analysis “in-house.”
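The Cisco-style split can be expressed as a simple routing rule. This is a toy sketch, not Cisco's actual architecture – the task fields and the cost cutoff are invented for illustration:

```python
def route_task(task):
    """Decide where a task should run under a fog/cloud split:
    latency-sensitive or lightweight work stays on the fog node,
    heavy analytics go to the cloud. The 'cost' field and the
    cutoff of 100 are hypothetical."""
    if task.get("latency_critical") or task.get("cost", 0) < 100:
        return "fog"
    return "cloud"
```

For example, a sensor-fusion step a self-driving car needs within milliseconds would route to "fog", while a fleet-wide fuel-economy model retraining would route to "cloud".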

The security of data sent from IoT devices to fog computing sites is also in question. One of the biggest selling points of cloud computing is its layered security, yet major breaches still happen with regularity. And while many users favor fog computing because they fear privacy breaches in cloud environments, there are no corporate firewalls in place for IoT devices, meaning they too are at risk of being hacked or hijacked.

Predicting the relationship between the IoT and cloud computing even three years down the road is hazy guesswork at best. What is more certain is that change must come to ensure both technologies perform to the fullest extent of their capabilities in the years ahead.

This article is published as part of the IDG Contributor Network.
