The Xeon D-2100's system-on-chip design and low power are key for constrained environments. Credit: Intel Corp.

Intel has launched a new lineup of Xeon processors designed specifically for edge computing needs, where space, heat, and power are all of greater concern than in a traditional data center design.

The Xeon D-2100 processors are the successor to the Xeon D-1500 series that Intel introduced in 2015. They are high-powered SoCs with anywhere from four to 18 Skylake-generation cores and sport the full range of Skylake features, including VT-x/VT-d for virtualization, RAS features, and the TXT, AVX-512, and TSX instruction sets. The platform supports up to 512GB of memory, up to 32 PCI Express 3.0 lanes, and up to 20 Flexible High Speed I/O lanes. TDP ranges from 60 to 100 watts, slightly lower than a traditional Xeon design. All told, there are six processors in the Xeon D-2100 family, ranging from four to 18 cores and from 2.3GHz to 2.8GHz in clock speed.

Xeon D-2100 processors suited for edge computing environments

In Intel's announcement of the Xeon D-2100 line, the company said the processors bring advanced performance to edge computing environments, as well as to other applications with space and power constraints, such as web-tier compute and storage infrastructure.

Edge computing is an important, if very early-stage, development that seeks to put computing power closer to where data originates, and it is seen as working hand in hand with Internet of Things (IoT) devices. IoT devices, such as smart cars and local sensors, generate tremendous amounts of data. A Hitachi report (PDF) estimated that a smart car would at some point generate 25GB of data every hour. That can't all be sent back to data centers for processing; it would overload both the networks and the data centers. Instead, edge computing processes the data at its origin, so smart-car data generated in New York would be processed in New York rather than sent to a remote data center.

Major data center providers, such as Equinix and CoreSite, offer edge services at their data centers around the country, and startup Vapor IO offers ruggedized mini data centers that can be deployed at the base of cell phone towers. Amazon also has its own edge service, called Greengrass, which lets you build a local edge computing server using Lambda and other Amazon Web Services (AWS) cloud services (a simplified sketch of that pattern appears below). The Xeon D-2100 processors are ideal for this kind of server.

Low power, but high performance

The low-power design is aimed at these potentially power-constrained environments, but it doesn't mean low performance. The new processors are designed specifically for communications service providers that offer multi-access edge computing (MEC), which allows software applications to tap into local content and real-time information about local-access network conditions.

The Xeon D-2100 line is also targeted at communication service providers offering customized enterprise networking services, such as virtual private networks (VPN), firewalls, routing through software-defined WANs (SD-WAN), and workload optimization through network function virtualization (NFV).

Edge computing is still at a very early stage; there isn't even a common platform for it, because every edge network is different. An edge network for wearable or smartphone data would differ from one for smart cars, and every carmaker will have its own network. But for now, at least, it has a pretty decent processor.
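To make the edge-processing idea concrete, here is a minimal sketch of the kind of workload an edge box like this might run: a Greengrass-style Lambda handler that filters and aggregates telemetry locally and forwards only urgent events and compact summaries upstream. The topic names, fields, and thresholds are illustrative assumptions, not anything Intel or Amazon specifies.

```python
# Sketch of a local edge handler in the Greengrass style: raw telemetry is
# processed on the edge device, and only alerts and small aggregates are
# published upstream. Topics, fields, and thresholds are hypothetical.
import json

import greengrasssdk  # AWS IoT Greengrass Core SDK, runs on the local device

iot = greengrasssdk.client("iot-data")

SPEED_ALERT_KPH = 130   # hypothetical threshold for immediate alerts
_window = []            # rolling buffer; persists while the function stays warm


def function_handler(event, context):
    """Invoked locally for each telemetry message from a vehicle sensor."""
    reading = json.loads(event) if isinstance(event, str) else event
    speed = reading.get("speed_kph", 0)
    _window.append(speed)

    # Urgent events go upstream right away; everything else stays local.
    if speed > SPEED_ALERT_KPH:
        iot.publish(topic="fleet/alerts", payload=json.dumps(reading))

    # Every 100 readings, ship a small aggregate instead of the raw stream.
    if len(_window) >= 100:
        summary = {
            "avg_speed_kph": sum(_window) / len(_window),
            "samples": len(_window),
        }
        iot.publish(topic="fleet/summary", payload=json.dumps(summary))
        _window.clear()
```

The point of the pattern is bandwidth: the raw 25GB-per-hour stream stays on the edge server, while the cloud sees only a trickle of alerts and summaries.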