
How edge networking and IoT will reshape data centers

Jul 24, 2018 | 7 mins
Data Center, Mobile, Networking

With the rise of edge computing to process a surge in data produced by the internet of things, the function of enterprise data centers will shift to handling long-term data aggregation and analysis.


The Internet as we have all known it mirrors the design of old mainframes with dumb terminals: The data path is almost entirely geared toward data coming down the network from a central location. It doesn’t matter if it’s your iPhone or a green text terminal, the fast pipe has always been down, with relatively little data sent up.

The arrival of IoT threatens to turn that on its head. IoT will mean a massive flood of endpoint devices that are not consumers of data, but producers of it, data that must be processed and acted upon. That means sending lots of data back up a narrow pipe to data centers.

For example, an autonomous car may generate 4TB of data per day, mostly from its sensors, but 96% of that data is what is called true but irrelevant, according to Martin Olsen, vice president of global edge and integrated solutions at Vertiv, a data center and cloud computing solutions provider. “It’s that last 4% that’s the relevant piece. That’s the data we want to take somewhere else,” he said.
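As a sanity check, Olsen's figures work out to roughly 160GB of relevant data per car per day. A minimal sketch of that arithmetic (the 4TB/day and 96% numbers are the article's; the binary byte definitions are an assumption):

```python
# Back-of-the-envelope math on Olsen's example. The 4TB/day and 96%
# figures come from the article; the byte arithmetic is illustrative.
TB = 1024**4                 # bytes per terabyte (binary)

daily_raw = 4 * TB           # one autonomous car's daily sensor output
irrelevant_fraction = 0.96   # the "true but irrelevant" share

relevant_bytes = daily_raw * (1 - irrelevant_fraction)
print(f"Relevant data per car per day: {relevant_bytes / 1024**3:.0f} GiB")
```

Even after discarding 96% of what the sensors produce, each car still has well over 100GB a day that needs to go somewhere.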

So does this mean a massive investment in rearchitecting your network for fatter pipes into the data center? Or can the advent of edge computing take the load off central data centers by doing much of the processing work at the edge of the network?

What is edge computing?

Edge computing is decentralized data processing specifically designed to handle data generated by the Internet of Things. In many cases, the compute equipment is housed in a physical container or module about the size of a cargo shipping container, and it sits at the base of a cell tower, because that’s where the data is coming from.

Edge computing has mostly been used to ingest, process, store and send data to cloud systems. It is at the edge where the wheat is separated from the chaff, and only relevant data is sent up the network.

If the 4% Olsen talks about can be processed at the edge of the network rather than in a central data center, it reduces bandwidth needs and allows for a faster response than sending the data to a central server for processing. All of the major cloud providers – AWS, Azure and Google Cloud – offer IoT services and process what is sent to them.

In many cases, the edge can perform that processing and discard the unneeded data. Since cloud providers charge by how much data they process, it is in the customer’s financial interest to reduce the amount they send up for processing.
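That triage step can be sketched as a simple relevance filter. This is a hypothetical illustration, not any particular IoT platform's API: the `Reading` structure, field names, and threshold rule are all assumptions standing in for whatever "relevant" means in a given deployment.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def is_relevant(reading: Reading, threshold: float = 100.0) -> bool:
    # Illustrative rule: only anomalous values are worth forwarding.
    # A real deployment would apply domain-specific logic here.
    return reading.value > threshold

def triage(readings: list[Reading]) -> list[Reading]:
    """Keep only the readings worth sending up the narrow pipe to the cloud."""
    return [r for r in readings if is_relevant(r)]

readings = [Reading("cam-1", 20.0), Reading("cam-2", 150.0), Reading("cam-3", 99.0)]
upload = triage(readings)   # only cam-2 exceeds the threshold
```

The point of the sketch is the shape of the pipeline: everything is inspected locally, and only the small relevant fraction is metered and billed by the cloud provider.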

“We need much more compute out at the edge of the network. This drives profound change, but it’s interesting in that while we’ll see far more data generated out at the edge, a very limited amount of it needs to travel very far,” said Olsen.

“Edge data centers tend to aggregate data, and perform actuation functions to give an answer in low latency,” said Jim Poole, vice president of business development for Equinix. “What most companies are still doing is aggregating metadata from all these edge locations at a central location to do machine learning and analytics.”

Prashanth Shenoy, Cisco’s vice president of marketing for enterprise networking and IoT, agrees that more computing should be pushed out to the edge.

“Compute has gotten cheaper and faster than the network, which suggests that compute should now be at the edge,” he said. “Also, in cases where bandwidth is at a premium or users are in remote locations, like offshore or a mine, and you don’t have connectivity, you need compute and analytics at the edge.”

Artificial intelligence in edge networks

Another important element to reducing the data load will be the use of artificial intelligence in edge networks, said Jeff Loucks, executive director of the Center for Technology, Media and Telecommunications at Deloitte.

“The use of AI in edge networks will reduce data needed in data centers. When you think about all the data collected by an autonomous vehicle, even if you make the pipe bigger, that still leaves a lot of data to be processed. So adding AI will be key to that,” he said. “We’re already seeing machine-learning algorithms in low-cost devices, like a security camera that can tell the difference between a cat and an intruder. We don’t need high-cost devices, just the algorithms on lower-cost and more ubiquitous devices.”
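The camera example amounts to an upload gate: a cheap on-device classifier decides whether a frame ever leaves the device. A minimal sketch, where the classifier is a stand-in stub rather than a real model:

```python
def classify(frame: bytes) -> str:
    # Stand-in for an on-device ML model. A real camera would run a small
    # quantized network on its SoC; this stub just pattern-matches bytes.
    return "intruder" if b"person" in frame else "cat"

def should_upload(frame: bytes) -> bool:
    # Only alert-worthy frames cross the network; everything else is
    # dropped at the edge, never touching the data center.
    return classify(frame) == "intruder"

assert should_upload(b"...person at door...")
assert not should_upload(b"...tabby on fence...")
```

The economics follow directly: the model is cheap to run locally, while every uploaded frame costs bandwidth and cloud processing fees, so the gate pays for itself quickly on ubiquitous devices.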

5G wireless can help edge networks

Another element in making the IoT flood manageable will be the advent of 5G wireless technology. Wi-Fi is useful in some scenarios, such as Industrial IoT, where the gear is in a closed, relatively confined space such as a factory floor, and Wi-Fi access points can handle the traffic. But for many scenarios, Wi-Fi just doesn’t provide enough range or throughput, although it could in the future with new high-speed protocols like 802.11ax.

For outdoor IoT, like autonomous vehicles or remote sites such as industrial work sites or offshore oil rigs, the cellular network is the network of choice for its range and bandwidth. That’s why edge computing containers are placed at the site of a cellular tower.

5G, currently in trials in the U.S., with expected rollout beginning next year, was designed with business use in mind, as opposed to the more consumer focus of 3G and 4G. 5G is 20 times faster than 4G, with a peak download speed of 20Gbps vs. 1Gbps for 4G.
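To see what that factor of 20 means against the earlier autonomous-car example, here is the raw transfer time for one car's daily 4TB at each peak rate. These are the article's best-case peak figures; real-world throughput is far lower, so treat this as an upper bound on what the pipe can do:

```python
# Time to move one car's daily 4TB at the article's peak rates.
# Peak rates (20Gbps for 5G, 1Gbps for 4G) are best-case figures.
daily_bits = 4 * 1024**4 * 8        # 4TB expressed in bits

for name, gbps in [("4G", 1), ("5G", 20)]:
    seconds = daily_bits / (gbps * 1e9)
    print(f"{name}: {seconds / 3600:.1f} hours")
```

Even at 5G's peak rate the full 4TB takes about half an hour of continuous transfer, which underlines the earlier point: filtering at the edge, not just a bigger pipe, is what makes the load manageable.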

“5G will be very helpful in increasing the amount of data that can be sent,” said Loucks. “The pipe can be bigger, so it will increase the amount of data that can flow both ways. 5G also helps because it reduces latency, which will help industrial apps that require a lot of precision and very low latency. Where there have been latency problems, 5G will help correct that.”

“5G is absolutely key to making this architecture work,” said Olsen. “Today a very small part of Internet traffic goes over wireless networks because of bandwidth and latency. We are all far more mobile and would like to have more capacity. 4G is ill-equipped to handle all this traffic and solve for speeds.”

But Equinix’s Poole isn’t fully sold on 5G as a solution. “The industry hasn’t shown the need for ultra-low-latency apps. Very few use cases need latency below 5 milliseconds. Never say never, but there is nothing viable in the market that needs that kind of latency,” he said.

Paying for edge networks

There are several challenges to moving compute to the edge, starting with the cost of edge-computing infrastructure. The edge-network containers that hold all the compute equipment aren’t cheap, so the question is, who will pay for them?

“Right now the business model is not clear,” said Olsen. “They have the eyeballs, but it’s not clear how they make money off it. Maybe Uber or insurance companies can fund it to see how you are driving. But the biggest challenge is how do they monetize that.”

Olsen also thinks the data center will have to grow just to store all the data coming in, even if it’s a sliver of what is generated. “A lot of people say the edge is the end of cloud data centers, but I would be pretty hard-pressed to say that. There would be no reason to believe there wouldn’t be a need for enterprise data centers,” he said.

“There will be net need for more. Even at a single-digit percentage of what is generated out at the edge of the network, when you get to things like security and privacy, all this [extra data] has to be stored somewhere. For long-term storage you go back and look at this data to do analysis,” he added.

Poole said some early adopters of edge computing are repurposing their data centers for long-term computational use. “The IT deployment model has been turned on its head. Now the edge is everywhere and the corporate data center is repurposed for long-cycle analytics. Financial services firms have moved their daily trading work to Equinix and use their own data center for long-cycle analytics, which they still have to do,” he said.

Andy Patrizio is a freelance journalist based in southern California who has covered the computer industry for 20 years and has built every x86 PC he’s ever owned, laptops not included.

The opinions expressed in this blog are those of the author and do not necessarily represent those of ITworld, Network World, its parent, subsidiary or affiliated companies.
