MemVerge bridges two technologies that are already a bridge.

A startup called MemVerge has announced software that combines regular DRAM with Intel's Optane DIMM persistent memory into a single clustered storage pool, without requiring any changes to applications. MemVerge has been working with Intel on this new hardware platform for close to two years. It offers what it calls Memory-Converged Infrastructure (MCI), which lets existing apps use Optane DC persistent memory and is architected to integrate seamlessly with them.

Optane memory is designed to sit between high-speed DRAM and solid-state drives (SSDs), acting as a cache for the SSD, since it has speed comparable to DRAM but the persistence of an SSD. With Intel's new Xeon Scalable processors, this can make up to 4.5TB of memory available to a processor.

Optane runs in one of two modes: Memory Mode and App Direct Mode. In Memory Mode, the Optane memory functions like regular memory and is not persistent. In App Direct Mode, it functions as the SSD cache, but apps don't natively support it; they need to be modified to work properly with Optane memory.

As it was explained to me, apps aren't designed for persistent memory because on powerup the data is already in memory rather than having to be loaded from storage. The app has to know that memory doesn't go away and that it doesn't need to shuffle data back and forth between storage and memory. Out of the box, then, apps don't work with persistent memory. (A rough sketch of the kind of change involved appears at the end of this article.)

Why didn't Intel think of this?

All of which raises a question I can't get answered, at least not immediately: Why didn't Intel think of this when it created Optane in the first place?

MemVerge has what it calls Distributed Memory Objects (DMO) hypervisor technology, which provides a logical convergence layer to run data-intensive workloads at memory speed with guaranteed data consistency across multiple systems. This lets applications process and derive insights from enormous amounts of data held in Optane memory in real time.

That's because MemVerge's technology makes random access as fast as sequential access. Normally, random access is slower than sequential access because of all the jumping around involved, compared with reading one sequential file. But MemVerge can handle many small files as fast as it handles one large file.

MemVerge itself is software, with a single API for both DRAM and Optane. It's also available as a hyperconverged server appliance that comes with two Cascade Lake processors, up to 512GB of DRAM, 6TB of Optane memory, and 360TB of NVMe physical storage capacity.

However, all of this is still vapor. MemVerge doesn't expect to ship a beta product until at least June.
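To give a sense of what modifying an app for App Direct Mode involves, here is a minimal sketch using Intel's open-source Persistent Memory Development Kit (PMDK) and its libpmem library. This is not MemVerge's API, the file path and size are hypothetical, and it assumes the Optane DIMMs are exposed through a DAX-mounted filesystem. The point is the explicit persist step, which DRAM-only applications never have to think about.

```c
/* Illustrative sketch only: the kind of change an app needs to use Optane DC
 * in App Direct Mode via Intel's PMDK (libpmem). Not MemVerge's API.
 * Assumes a DAX-mounted filesystem at /mnt/pmem (hypothetical path). */
#include <stdio.h>
#include <string.h>
#include <libpmem.h>

#define POOL_PATH "/mnt/pmem/example"   /* hypothetical DAX-backed file */
#define POOL_SIZE (4 * 1024 * 1024)     /* 4MB, arbitrary for the sketch */

int main(void)
{
    size_t mapped_len;
    int is_pmem;

    /* Map a file on persistent memory; create it if it does not exist. */
    char *addr = pmem_map_file(POOL_PATH, POOL_SIZE, PMEM_FILE_CREATE,
                               0666, &mapped_len, &is_pmem);
    if (addr == NULL) {
        perror("pmem_map_file");
        return 1;
    }

    /* Write into persistent memory, then explicitly flush it to the media.
     * This flush step is what ordinary, DRAM-only applications never do. */
    const char *msg = "state that survives a power cycle";
    strcpy(addr, msg);
    if (is_pmem)
        pmem_persist(addr, strlen(msg) + 1);      /* real persistent memory */
    else
        pmem_msync(addr, strlen(msg) + 1);        /* fallback: ordinary msync */

    pmem_unmap(addr, mapped_len);
    return 0;
}
```

Built with something like cc example.c -lpmem, this is the sort of persistence-aware bookkeeping MemVerge says its DMO layer handles underneath a conventional memory interface, which is why it claims existing applications can run unchanged.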