Sometimes you really are on your own. And calling for directions isn’t feasible.
It’s the same with many IoT systems. Centrally processing large volumes of sensor data slows decision making and increases bandwidth demand. Many decisions are better made close to the source.
Which decisions should be made close to the network edge and which centrally? Where are the trade-offs? Which applications are best suited for local decision making? Three fog computing experts share some insights.
- Helder Antunes, senior director of Cisco’s Corporate Strategic Innovations Group and chairman of the OpenFog Consortium
- Rhonda Dirvin, director IoT vertical markets at ARM
- Matt Vasey, who focuses on IoT strategy at Microsoft
Dirvin and Vasey are both board members of the OpenFog Consortium.
What is the OpenFog Consortium?
Microsoft, ARM, Cisco, Dell, Intel and Princeton University founded the OpenFog Consortium. As the volume of sensor data grows, cloud-only and edge-only IoT designs become blockers. Fog computing places compute, storage and communications resources along a continuum between the cloud and the edge, reducing those blockers and accelerating digital transformation.
The OpenFog Consortium defines and extends the application of fog computing. It’s an independent, open membership ecosystem of companies, end users and universities.
“Cloud technology must work seamlessly with fog computing to provide a seamless end-to-end customer experience,” said Antunes. “More than ever, fog computing requires a highly scalable and collaborative approach, with deep expertise in a wide range of industries and technologies. No single company can do it alone.”
Where does fog computing work best?
"The ideal use cases require intelligence near the edge where ultra low latency is critical, run in geographically dispersed areas where connectivity can be irregular, or create terabytes of data that are not practical to stream to the cloud and back," said Vasey. "Fog computing works well in a cloud-based control plane to provide control and broader insight across a large numbers of nodes. These include transportation, agriculture, wind energy, surveillance, smart cities and buildings."
Smart cities and fog computing
Large cities face challenges related to traffic congestion, public safety, high energy use, sanitation and the delivery of municipal services. These challenges can be addressed within a single IoT network by installing a network of fog nodes.
A lack of broadband bandwidth and connectivity is a major issue in establishing smart cities. While most modern cities have one or more cellular networks providing adequate coverage, these networks often have capacity and peak bandwidth limits that barely meet the needs of existing subscribers. This leaves little bandwidth for the advanced municipal services envisioned in a smart city. Deploying a fog computing architecture allows for fog nodes to provide local processing and storage. This optimizes network usage.
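The bandwidth savings described above come from aggregating at the node rather than streaming every reading. The sketch below is illustrative, not any OpenFog API: a hypothetical fog node buffers raw sensor readings locally and forwards only a compact summary upstream, so the cellular link carries one small record instead of dozens of raw samples.

```python
# Hypothetical sketch: a fog node aggregates raw sensor readings locally
# and forwards only compact summaries to the cloud, reducing bandwidth.
# Class and method names are assumptions for illustration.
from statistics import mean

class FogNode:
    def __init__(self, window_size=60):
        self.window_size = window_size  # readings per upstream summary
        self.buffer = []

    def ingest(self, reading):
        """Buffer a raw reading; emit a summary once the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            summary = {
                "count": len(self.buffer),
                "mean": mean(self.buffer),
                "min": min(self.buffer),
                "max": max(self.buffer),
            }
            self.buffer = []
            return summary  # only this small record crosses the network
        return None  # raw data stays local

node = FogNode(window_size=4)
results = [node.ingest(r) for r in [10, 12, 11, 13]]
# Three readings stay buffered locally; the fourth triggers one summary
```

The window size trades freshness for bandwidth: a larger window sends fewer, coarser summaries.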
Smart cities also struggle with safety and security, where time-critical performance requires advanced, real-time analytics. Municipal networks may carry sensitive traffic and citizen data, as well as operate life-critical systems such as emergency response. Fog computing addresses security, data encryption and distributed analytics requirements.
Smart buildings and fog computing
Building automation demonstrates the need for edge intelligence and localized processing. A commercial building may contain thousands of sensors measuring operating parameters such as temperature, keycard access and parking-space occupancy. Data from these sensors must be analyzed to determine whether action is needed, such as triggering a fire alarm when smoke is detected. Fog computing allows for autonomous local operations for optimized control function.
Each floor, wing or even individual room could contain its own fog node that is responsible for performing emergency monitoring and response functions, controlling climate and lighting, and providing a building-resident compute and storage infrastructure to supplement the limited capabilities of local smartphones, tablets and computers.
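The local decision loop described above can be sketched as a simple rule table running on a per-floor fog node. This is a minimal illustration with assumed thresholds and action names, not a real building-automation interface: time-critical responses fire locally, while routine telemetry is merely logged for later cloud upload.

```python
# Illustrative per-floor fog-node control loop. Thresholds and action
# names are assumptions for the sketch, not a real building-automation API.
SMOKE_THRESHOLD = 0.08   # assumed smoke-obscuration trigger level
COOLING_SETPOINT = 26.0  # assumed temperature ceiling in Celsius

def handle_reading(sensor_type, value):
    """Return the local action for one sensor reading, or None."""
    if sensor_type == "smoke" and value >= SMOKE_THRESHOLD:
        return "trigger_fire_alarm"   # life-critical: act locally, no cloud round trip
    if sensor_type == "temperature" and value > COOLING_SETPOINT:
        return "increase_cooling"     # comfort control, also handled locally
    return None                       # routine telemetry, queued for cloud upload

action = handle_reading("smoke", 0.20)
```

Keeping the rule evaluation on the fog node means the alarm path works even if the building's uplink is down.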
Fog computing works with cloud computing, so the long-term history of building operational telemetry and control actions can be aggregated and uploaded to the cloud for large-scale analytics to determine operational aspects of buildings. The stored operational history can then train machine learning models, which can be used to further optimize building operations by executing these cloud-trained machine learning models in the local fog infrastructure.
Visual security and fog computing
Video cameras are now used in parking lots, buildings and other public and private spaces to increase public safety. The sheer bandwidth of visual (and other sensor) data collected over a large-scale network makes it impractical to transport all of the data to the cloud for real-time insights. Imagine a busy airport or city center with many people and objects moving through an area at any given time. Real-time monitoring and detection of anomalies pose strict low-latency requirements on surveillance systems. Timeliness is important for both detection and response.
Privacy concerns must be addressed when a camera is used as a sensor: the collected images must not reveal a person's identity or expose confidential contextual information to unauthorized parties. Fog computing allows for real-time, latency-sensitive distributed surveillance systems that maintain privacy.
Through a fog architecture, video processing is intelligently partitioned between fog nodes co-located with cameras and the cloud. This enables real-time tracking, anomaly detection, and collection of insights from data captured over long intervals of time.
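One way to picture this partitioning is a fog node running cheap frame differencing right next to the camera, forwarding only compact anomaly events (never raw frames) to the cloud tier. The sketch below is a toy illustration under that assumption; real deployments would use proper computer-vision models, but the division of labor is the same.

```python
# Toy sketch of fog/cloud partitioning for video: the fog node scores
# consecutive frames locally and sends only (frame_index, score) events
# upstream. Frames are modeled as flat lists of grayscale pixel values.

def frame_delta(prev, curr):
    """Mean absolute pixel difference between two grayscale frames."""
    diffs = [abs(a - b) for a, b in zip(prev, curr)]
    return sum(diffs) / len(diffs)

def fog_filter(frames, threshold=10.0):
    """Return compact anomaly events for frames that change sharply."""
    events = []
    for i in range(1, len(frames)):
        score = frame_delta(frames[i - 1], frames[i])
        if score > threshold:
            events.append((i, score))  # small event record for the cloud
    return events

# A static scene, then a sudden change at frame 3
frames = [[5, 5, 5], [5, 6, 5], [5, 5, 6], [200, 200, 200]]
events = fog_filter(frames)
```

Raw frames never leave the fog node, which addresses both the bandwidth and the privacy concerns raised above.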
How does fog computing reduce security risks?
Security is a fundamental concern of any deployment that uses IoT, network and cloud technologies. The OpenFog architecture specifies a secure end-to-end compute environment between the cloud and the fog nodes that connect to IoT devices. These devices use a hardware-based immutable root of trust, which can be attested by software agents running throughout the infrastructure.
Interoperability among platforms
“Interoperability among heterogeneous platforms is key to IoT reaching its potential,” Dirvin said. “The OpenFog Consortium is addressing this by creating an interoperable way to take advantage of the computing, storage and networking resources available from the cloud to the edge.”
'Houston, we have a problem'
That famous quote comes from the crew of the Apollo 13 moon mission, reporting a major technical problem back to NASA's Mission Control in Houston. Sometimes central help is absolutely critical. Handling most other issues locally preserves centralized expertise and bandwidth for the decisions that truly need them.
Astronauts work autonomously where possible and get central guidance when needed. Fog computing does the same for mission-critical IoT applications.