Cloud computing and the 'last mile'

While cloud computing has moved applications and data out of the corporate data center, it hasn't fulfilled the promise of a highly distributed infrastructure. Missing from the equation is the 'last mile.'

What does food and beverage production have to do with cloud computing and the last mile? There are several interesting parallels.

Unknown to most people, the food and beverage industry operates on razor-thin margins. As a result, producers look for any advantage they can get through automation, scale and supply chain optimization. Water is bulky and heavy, yet plentiful and cheap. As the water content of a food or beverage increases, producers are incentivized to remove the water during production in a way that lets it be reintroduced at the point of consumption. From soups to sodas, this model keeps transportation costs low while expanding profit margins.

How data acts like water

Today, large corporations invest billions of dollars in big data and analytics. Data-driven methodologies have created a voracious appetite for data. As data sets have exploded over the past few years, leading-edge firms have discovered that data acts a lot like water: it's a cheap ingredient but expensive to transport. Unfortunately, there is no simple, economical way today to move the terabytes of data generated in stores, restaurants, stadiums and branches to data centers for analysis. This puts a significant hurdle in front of real-time analytics, which promises to deliver a tremendous competitive advantage.

Data analytics is a multi-step process as shown below:

  1. Capture data at the point of origination
  2. Move the data to the data center
  3. Transform and load the data
  4. Combine the data with other data to broaden the perspective
  5. Execute analytics on the data
  6. Use the result to generate a recommendation/decision
  7. Communicate the result back to the requester

Mobile and IoT devices have dramatically expanded our step one capabilities. Steps three through seven have been the focus of enterprises for years and are well understood. Our challenge is primarily step two as the data sets grow. Why? We have reached the point where mixing a distributed model for data collection with a centralized model for analysis no longer works.

For the past several years I have advocated an alternate approach: moving the analytics from the centralized data center out to highly distributed analysis points on the network "edge" or "last mile." I still believe the best long-term solution is to move the analytics into the end user's device (what I refer to as serverless computing), but even then at least some analysis will occur outside the device. As the volume of data increases and tolerance for latency decreases, moving analytics ever closer to the last mile will emerge as a key enabler.

Considering how much data is being collected today, from the GPS location in your phone to the RFID tag on your razor blades, we are living in the Data Age. Every connectable device in our IoT world constantly creates new data, adding to the pile that already exists. To deal with the onslaught, companies need to filter what's coming in, remove the noise and then run their normalization routines. Since everything else is nearly free compared to the cost of moving data, there is an economic incentive to move data as short a distance as possible. We live locally, and we're served locally. Why not compute locally?
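A rough sketch of what filter-normalize-then-transmit at the edge might look like (the sensor names, sentinel value and unit rule are all invented for illustration): by dropping noise and normalizing locally, only a fraction of the raw bytes ever needs to cross the network.

```python
# Hypothetical edge-side filtering and normalization before transmission.
# Sensor IDs, the -999.0 failure sentinel, and the Fahrenheit rule for
# "t2" are assumptions for the sake of the example.
import json

raw_readings = [
    {"sensor": "t1", "value": 21.4},
    {"sensor": "t1", "value": 21.4},    # exact duplicate: noise
    {"sensor": "t1", "value": -999.0},  # sentinel for a failed read: noise
    {"sensor": "t2", "value": 70.1},    # Fahrenheit; normalize to Celsius
]

def filter_and_normalize(readings):
    seen, clean = set(), []
    for r in readings:
        if r["value"] == -999.0:            # drop failed reads
            continue
        key = (r["sensor"], r["value"])
        if key in seen:                     # drop duplicates
            continue
        seen.add(key)
        value = r["value"]
        if r["sensor"] == "t2":             # normalize units at the edge
            value = round((value - 32) * 5 / 9, 2)
        clean.append({"sensor": r["sensor"], "value": value})
    return clean

clean = filter_and_normalize(raw_readings)
raw_bytes = len(json.dumps(raw_readings))
clean_bytes = len(json.dumps(clean))
print(f"transmitting {clean_bytes} of {raw_bytes} raw bytes")
```

Even this toy example cuts the payload roughly in half; at terabyte scale, doing this work before step two rather than after it is where the economics change.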

Where would these analysis points be built? Perhaps in those hardened data centers that already sit at the origination point of the last mile. What if these locations, built to house massive telecom computing power for routing voice calls and data packets, were loaded with high-density compute and storage? With the advent of mobile phones and the pending deployment of 5G, these local exchange carriers (LECs) and competitive local exchange carriers (CLECs) could be the enablers that bring cloud out of the super-regional data centers of public cloud providers and down to the local infrastructure.

Cloud at the last mile could invert the value propositions of telecom and cable providers that own the last mile and the large public cloud providers. I wonder if perhaps this is part of the reason Google created Google Fiber. Imagine a legacy infrastructure being resurrected to meet an emerging need.

I'm reminded of an unattributed quote that seems to apply every time a new idea pops up in the world of technology: "Look back to where you have been, for a clue to where you are going."

This article is published as part of the IDG Contributor Network.
