The funny thing about pendulums is that they love to swing. There are plenty of examples, from politics to nutrition, but cycles in computing just might illustrate it best.
The mainframe was a centralized model, accessed through dumb terminals. PCs enabled distributed client/server computing. Now we are swinging back to centralized computing, with relatively dumb smartphones connecting to robust cloud services, and enterprises shuttering their data centers in favor of the cloud.
Most of yesterday’s applications, such as CRM, ERP and UC, are moving toward the cloud. But the swing back to a distributed model is inevitable. If those applications don’t drive it, something else will. The Internet of Things (IoT) just may be the killer app that does it.
Today, cloud-prem models conjure images of servers in data centers and/or colocation facilities working with public cloud infrastructure. But the concept of premises-based equipment is changing from racks of servers to little things. Connected devices are appearing across enterprises, from factory floors to oil fields to warehouses.
IoT devices are often quite dumb. They have very limited processing, power, storage and networking capabilities, which is exactly why so many of them connect to the cloud. IoT is like E.T.: both are information collectors that need to phone home.
Most of today’s IoT conversation focuses on this cloud-centered strategy. Manufacturers use sensors to improve products and create new revenue models. Most agree this will be a big deal. But this centralized model has a problem: latency.
Latency is a relative term. Take satellite communications. The half-second round-trip delay over a geostationary link can be distracting in a voice conversation but is unnoticeable with email. For most terrestrial applications, network latency isn’t a big problem.
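To see where that half second comes from, here is a quick back-of-envelope sketch in Python (the figures are standard physics, not from the article, and ground-network hops are ignored):

```python
# Back-of-envelope: conversational delay over a geostationary satellite link.
# Physics alone sets the floor; ground hops and processing would add more.
SPEED_OF_LIGHT_KM_S = 299_792   # signal propagation, roughly c
GEO_ALTITUDE_KM = 35_786        # geostationary orbit altitude

one_way_s = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S   # ground -> satellite -> ground
round_trip_s = 2 * one_way_s                            # question up, answer back

print(f"one-way delay: {one_way_s * 1000:.0f} ms")      # ~239 ms
print(f"round trip:    {round_trip_s * 1000:.0f} ms")   # ~477 ms
```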
Autonomous vehicles are a hybrid application
Yet that’s changing, and the best example is self-driving cars. An autonomous vehicle generates 10 GB of data per second today. That number will only increase as we continue to add and improve the sensors on these vehicles.
This data “drives” numerous decisions per second, decisions that can’t wait for distant servers to respond. The car itself must be able to collect, analyze and process the data, and take appropriate action. The self-driving car is a data center on wheels that must keep functioning; it cannot tolerate network delay or connectivity blind spots. Yet uploading that data for centralized analysis remains critical for improvements.
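Bandwidth compounds the latency problem. Using the 10 GB-per-second figure above, even a generous dedicated uplink falls far short; the 1 Gb/s link speed in this sketch is an assumption for illustration:

```python
# Back-of-envelope: could a vehicle ship its raw sensor data to the cloud?
vehicle_output_gbytes_s = 10   # the per-vehicle figure cited above
uplink_gbits_s = 1             # assumed: a generous dedicated 1 Gb/s uplink

uplink_gbytes_s = uplink_gbits_s / 8                   # bits -> bytes
shortfall = vehicle_output_gbytes_s / uplink_gbytes_s

print(f"the uplink is {shortfall:.0f}x too slow")      # 80x, before any latency
```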
Autonomous vehicles, therefore, are a hybrid application. The vehicle is where data are collected and evaluated for immediate action. The data also travel to the cloud for analysis and for training machine learning models. In a hybrid application, the cloud gets redefined from the role of server to the role of teacher.
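In code, the split looks roughly like this. The following is a minimal sketch of the pattern, not any vendor’s API; the sensor readings, threshold and upload step are invented for illustration. The fast path decides on the device, while the slow path batches data toward the cloud, where delay is harmless:

```python
import queue
import threading
import time

# Hybrid pattern: a fast path decides locally; a slow path ships data
# to the cloud "teacher" for offline learning. All values are made up.
cloud_queue: queue.Queue = queue.Queue()

def decide_locally(reading: dict) -> str:
    """Fast path: runs on the device and never waits on the network."""
    return "BRAKE" if reading["obstacle_m"] < 5.0 else "CRUISE"

def cloud_uploader() -> None:
    """Slow path: batches readings for upload; latency here is harmless."""
    batch = []
    while True:
        item = cloud_queue.get()
        if item is None:                      # shutdown sentinel
            break
        batch.append(item)
        if len(batch) >= 10:
            print(f"uploading {len(batch)} readings for model training")
            batch.clear()

uploader = threading.Thread(target=cloud_uploader)
uploader.start()

for step in range(20):                        # stand-in for a sensor loop
    reading = {"obstacle_m": 3.0 + step * 0.25, "ts": time.time()}
    print(step, decide_locally(reading))      # immediate, local decision
    cloud_queue.put(reading)                  # eventual, cloud-bound copy

cloud_queue.put(None)                         # stop the uploader
uploader.join()
```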
This hybrid model is bigger than autonomous cars. It can apply to many applications, including most types of robotics, energy systems, medical equipment and safety devices. A burger-flipping bot in a diner may be the future of hybrid computing.
Amazon understands the forces of pendulums. The one-time pure-cloud vendor is opening physical bookstores. More to the point, Amazon Web Services (AWS) launched Greengrass, which enables code to execute both in the cloud and on distributed IoT devices. Code can be developed and tested in the cloud and then deployed locally.
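The developer-facing shape is familiar: a Lambda-style function that happens to run on the device. Here is a rough sketch of a V1-era Greengrass handler in Python; the topic name, event fields and threshold are invented for illustration:

```python
import json
import greengrasssdk  # SDK available on the Greengrass core device

# Authored and deployed through the cloud, but executed locally on the
# device. The topic, event fields and threshold below are hypothetical.
client = greengrasssdk.client("iot-data")

def function_handler(event, context):
    temp = event.get("temperature_c", 0)
    if temp > 80:                         # local decision, no cloud round trip
        client.publish(
            topic="plant/alerts",
            payload=json.dumps({"alert": "overheat", "temperature_c": temp}),
        )
```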
One vendor doesn’t make a trend, but Amazon isn’t alone. Other cloud providers, such as Microsoft with Azure, have launched similar offerings. Fog computing follows the same idea: process information at the network edge and send the results to the cloud.
New requirements are emerging that demand local processing. The number of devices and the data they generate are growing exponentially. While that may appear to be a good fit for cloud economics, the need to process data locally may just be the next new thing.