The OpenFog Consortium developed IEEE 1934, a standard shaped by ARM, Cisco, Dell, Intel, Microsoft and Princeton University, to handle the massive data generated by IoT, 5G and artificial intelligence. Credit: Robert Couse-Baker

Looking to seriously amplify the use of fog computing, the IEEE has defined a standard that lays the official groundwork to ensure that devices, sensors, monitors and services are interoperable and work together to process the seemingly boundless data streams that will come from IoT, 5G and artificial intelligence (AI) systems.

The standard, known as IEEE 1934, was largely developed over the past two years by the OpenFog Consortium, whose members include ARM, Cisco, Dell, Intel, Microsoft and Princeton University.

Fog computing definition

IEEE 1934 defines fog computing as “a system-level horizontal architecture that distributes resources and services of computing, storage, control and networking anywhere along the cloud-to-things continuum. It supports industry verticals and application domains, enables services and applications to be distributed closer to the data-producing sources, and extends from the things, over the network edges, through the cloud and across multiple protocol layers.”

“We now have an industry-backed and -supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” Helder Antunes, chairman of the OpenFog Consortium and senior director at Cisco, said in a statement.

According to the OpenFog website: “The sheer breadth and scale of IoT, 5G and AI applications requires collaboration at a number of levels, including hardware, software across edge and cloud, as well as the protocols and standards that enable all of our ‘things’ to communicate.

“Existing infrastructures simply can’t keep up with the data volume and velocity created by IoT devices, nor meet the low-latency response times required in certain use cases such as emergency services and autonomous vehicles.

“By extending the cloud closer to the edge of the network, fog enables latency-sensitive computing to be performed in proximity to the data-generating sensors, resulting in more efficient network bandwidth and more functional and efficient IoT solutions. Fog computing also offers greater business agility through deeper and faster insights, increased security and lower operating expenses.”
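To make the cloud-to-things continuum concrete, here is a minimal sketch of the pattern the standard describes: a fog node that handles latency-sensitive readings locally and forwards only compact summaries upstream. This is an illustration under stated assumptions, not an API defined by IEEE 1934; every class, method and threshold below is hypothetical.

```python
# Hypothetical sketch of the fog pattern IEEE 1934 describes: a fog node
# sits between "things" (sensors) and the cloud, reacting locally to
# latency-sensitive readings and shipping only summaries upstream.
# Nothing here is an API from the standard; all names are illustrative.

from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class SensorReading:
    sensor_id: str
    value: float  # e.g., temperature in degrees Celsius


class FogNode:
    """Aggregates raw readings at the network edge before they reach the cloud."""

    def __init__(self, alert_threshold: float, batch_size: int = 100):
        self.alert_threshold = alert_threshold
        self.batch_size = batch_size
        self.buffer: List[SensorReading] = []

    def ingest(self, reading: SensorReading) -> None:
        # Latency-sensitive path: react locally, with no cloud round trip.
        if reading.value > self.alert_threshold:
            self.actuate_locally(reading)
        self.buffer.append(reading)
        # Bandwidth-saving path: forward a summary, not every raw reading.
        if len(self.buffer) >= self.batch_size:
            self.forward_summary_to_cloud()

    def actuate_locally(self, reading: SensorReading) -> None:
        print(f"LOCAL ALERT: {reading.sensor_id} reported {reading.value}")

    def forward_summary_to_cloud(self) -> None:
        summary = {
            "count": len(self.buffer),
            "mean": round(mean(r.value for r in self.buffer), 2),
            "max": max(r.value for r in self.buffer),
        }
        print(f"-> cloud: {summary}")  # stand-in for an upstream API call
        self.buffer.clear()


if __name__ == "__main__":
    node = FogNode(alert_threshold=90.0, batch_size=5)
    for v in [72.0, 75.5, 95.2, 70.1, 68.9]:
        node.ingest(SensorReading(sensor_id="t-1", value=v))
```

The design point is the split: the threshold check never leaves the edge, which addresses the latency argument above, while the cloud sees one summary per batch instead of every raw reading, which addresses the bandwidth argument.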