Why edge computing is critical for the IoT

Everyday devices are becoming more powerful, reducing data center loads and complementing—or in some cases leapfrogging—cloud capabilities to drive exciting new IoT applications.

While many of today’s always-connected tech devices take advantage of cloud computing, Internet of Things (IoT) manufacturers and application developers are starting to discover the benefits of doing more compute and analytics on the devices themselves.

This on-device approach helps reduce latency for critical applications, lower dependence on the cloud, and better manage the massive deluge of data being generated by the IoT. An example of this trend is the recently announced Nest Cam IQ indoor security camera, which uses on-device vision processing to watch for motion, distinguish family members, and send alerts only if someone is not recognized or doesn’t fit predefined parameters. By performing computer vision tasks within the camera, Nest reduces the bandwidth, cloud processing, and cloud storage used compared with sending raw video streams over the network. In addition, on-device processing speeds up alerts while reducing the chance of annoying, recurrent false alarms.
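The decision logic behind this kind of camera can be pictured as a simple local filter. The sketch below is purely illustrative — the function, threshold, and face labels are assumptions for the sake of the example, not Nest’s actual implementation:

```python
# Minimal sketch of on-device alert filtering, loosely modeled on the
# behavior described above. Names and thresholds are illustrative
# assumptions, not Nest's actual implementation.

KNOWN_FACES = {"alice", "bob"}  # identities enrolled on the device

def should_alert(detected_faces, motion_score, motion_threshold=0.6):
    """Decide locally whether an event is worth sending to the cloud.

    detected_faces: labels produced by the on-device face recognizer
                    ("unknown" for faces it cannot match).
    motion_score:   0..1 confidence from the on-device motion detector.
    """
    if motion_score < motion_threshold:
        return False  # no significant motion: send nothing upstream
    # Alert only if at least one face is not a recognized family member
    return any(face not in KNOWN_FACES for face in detected_faces)

print(should_alert(["alice"], 0.9))    # recognized family member: False
print(should_alert(["unknown"], 0.9))  # stranger detected: True
```

The camera would upload a short clip only when this check passes, instead of streaming raw video to the cloud around the clock.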

The ability to do advanced on-device processing and analytics is referred to as “edge computing.” Think of the “edge” as the universe of internet-connected devices and gateways in the field — the counterpart to the “cloud.” Edge computing opens new possibilities in IoT applications, particularly those relying on machine learning for tasks such as object detection, face recognition, language processing, and obstacle avoidance.

The rise of edge computing is an iteration of a well-known technology cycle that begins with centralized processing and then evolves into more distributed architectures. The internet itself started with a limited number of connected mainframes in government facilities and universities — it didn’t reach mass scale and affordability until the “dumb” terminals that interfaced with mainframes were replaced by more capable PCs, which could render the graphics-rich pages of the emerging world wide web. Likewise, the mobile revolution accelerated when smartphones replaced feature phones at the edge of the cellular network. Edge computing will have a similar effect on the IoT, fueling strong ecosystem growth as end devices become more powerful and capable of running sophisticated applications.

Edge computing delivers tangible value in both consumer and industrial IoT use cases. It can help reduce connectivity costs by sending only the information that matters instead of raw streams of sensor data, which is particularly valuable on devices that connect via LTE/cellular, such as smart meters or asset trackers. And when dealing with the massive amounts of data produced by sensors in an industrial facility or a mining operation, for instance, the ability to analyze and filter the data before sending it can lead to huge savings in network and computing resources.
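The filtering step can be as simple as collapsing a window of raw samples into a compact summary before anything touches the uplink. A minimal sketch, assuming made-up thresholds and field names:

```python
# Hedged sketch: upload only the readings that matter instead of every
# raw sample. Thresholds and field names are illustrative assumptions.

def summarize_window(samples, low=10.0, high=80.0):
    """Reduce a window of raw sensor samples to a compact summary.

    Returns a small dict suitable for upload, including the list of
    out-of-range readings that warrant immediate attention.
    """
    anomalies = [s for s in samples if s < low or s > high]
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
        "anomalies": anomalies,
    }

# 1,000 raw samples collapse into one small record for the LTE uplink.
window = [25.0] * 998 + [95.5, 5.2]
payload = summarize_window(window)
print(payload["count"], payload["anomalies"])  # 1000 [95.5, 5.2]
```

On a metered cellular link, shipping one summary per window instead of every sample is where the connectivity savings come from.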

Security and privacy can also be improved with edge computing by keeping sensitive data within the device. For example, new retail advertising systems and digital signage are designed to deliver targeted ads and information based on key parameters set on field devices, such as demographic information. Edge computing in these solutions helps protect user privacy by anonymizing, analyzing, and keeping the data at the source rather than sending identifiable information to the cloud.
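Keeping data at the source might look like the following sketch for a digital-signage unit, where per-person observations are aggregated locally and only coarse counts ever leave the device. The field names and tracking ids are hypothetical:

```python
# Illustrative sketch of on-device anonymization for a digital-signage
# unit. Field names ("age_band", "id") are assumptions for the example.

from collections import Counter

def anonymized_report(observations):
    """Collapse per-person observations into aggregate counts.

    observations: dicts like {"id": ..., "age_band": "25-34"}.
    Identifiable fields (tracking ids, face data) are dropped here,
    at the source, rather than being shipped to the cloud.
    """
    return dict(Counter(o["age_band"] for o in observations))

views = [
    {"id": "track-17", "age_band": "25-34"},
    {"id": "track-18", "age_band": "25-34"},
    {"id": "track-19", "age_band": "55+"},
]
print(anonymized_report(views))  # {'25-34': 2, '55+': 1}
```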

Processing at the edge also reduces latency and makes connected applications more responsive and robust. Avoiding device-to-cloud data round trips is critical for applications using computer vision or machine learning — for instance, an enterprise identity verification system or a drone tracking and filming its owner or an object. On-device machine learning can enhance natural language interfaces as well, allowing smart speakers to react more quickly by interpreting voice instructions locally and to run basic commands, such as turning lights on and off or adjusting thermostat settings, even if internet connectivity fails. Moreover, edge computing brings “future proofing” to these systems by allowing over-the-air updates for the device software and the list of local commands it can run.
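A local-first command path for such a speaker can be sketched in a few lines. The phrase table and return values are assumptions for illustration, not any vendor’s API:

```python
# Sketch of a local command fallback for a hypothetical smart speaker:
# a small set of phrases is matched on-device, and everything else is
# deferred to the cloud (or fails gracefully when offline).

LOCAL_COMMANDS = {
    "turn lights on": ("lights", "on"),
    "turn lights off": ("lights", "off"),
    "set thermostat to 70": ("thermostat", 70),
}

def handle_utterance(text, online):
    """Run simple commands locally; use the cloud only when available."""
    action = LOCAL_COMMANDS.get(text.strip().lower())
    if action is not None:
        return ("local", action)   # no round trip: low latency
    if online:
        return ("cloud", text)     # complex queries go upstream
    return ("error", "offline and not a local command")

print(handle_utterance("Turn lights off", online=False))
# → ('local', ('lights', 'off'))
```

Note that the lights still respond even with `online=False` — the over-the-air updates mentioned above would simply grow the `LOCAL_COMMANDS` table over time.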

The proliferation of machine learning in IoT applications is a powerful driver for increased edge compute capabilities. Devices not only need to run complex deep learning networks quickly; they need to do so while consuming very little power, since many IoT devices run on batteries. This is prompting the adoption of heterogeneous compute architectures — integrating diverse engines such as CPUs, GPUs, and DSPs — in IoT devices, so that each workload is assigned to the most efficient compute engine, improving both performance and power efficiency. In fact, DSPs have shown a 25X improvement in energy efficiency and an 8X improvement in performance versus running the same workloads on a CPU.
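The scheduling idea behind heterogeneous compute can be reduced to a toy dispatcher: route each workload class to whichever engine handles it most cheaply. The cost table below is an illustrative assumption (loosely echoing the DSP-versus-CPU figure above), not measured benchmark data:

```python
# Toy dispatcher illustrating heterogeneous compute: each workload is
# routed to the engine that handles it most efficiently. The engine
# names and cost numbers are illustrative assumptions, not benchmarks.

# Relative energy cost per unit of work for each workload class
# (lower is better); a real scheduler would use measured profiles.
ENGINE_COST = {
    "cpu": {"control": 1.0, "vision": 8.0, "inference": 25.0},
    "gpu": {"control": 5.0, "vision": 1.0, "inference": 3.0},
    "dsp": {"control": 4.0, "vision": 2.0, "inference": 1.0},
}

def pick_engine(workload):
    """Choose the engine with the lowest energy cost for this workload."""
    return min(ENGINE_COST, key=lambda eng: ENGINE_COST[eng][workload])

print(pick_engine("inference"))  # → 'dsp'
print(pick_engine("vision"))     # → 'gpu'
print(pick_engine("control"))    # → 'cpu'
```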

With edge computing, the opportunity for system architects is to learn how to harness the distributed computing power available from end to end — tapping into the capabilities of field devices, gateways, and the cloud together. Edge devices are being built with increasingly sophisticated compute capabilities. Couple that with not-so-far-off connectivity technologies such as 5G, which will deliver faster, more robust, and massive connectivity, and it becomes clear that we are about to witness the emergence of a new breed of smart devices and applications. It’s truly a fascinating time to watch and participate in this space.

This article is published as part of the IDG Contributor Network.
