Don’t play games with your data center: Shift from Intel CPUs to NVIDIA GPUs

Digital technologies such as deep learning, AI, augmented reality and the IoT are driving the need for a new model of computing beyond the capabilities of CPUs


Central processing units (CPUs) from vendors such as Intel and, to a lesser extent, AMD have been staples in the data center for decades. Both companies have done an outstanding job making CPUs faster and packing in more cores so businesses can run computationally intensive processes on them. 

However, digital technologies such as deep learning, artificial intelligence (AI), virtual reality (VR), augmented reality (AR) and the Internet of Things (IoT) are driving the need for a new model of computing beyond the capabilities of CPUs.


Data centers are entering the era of graphics processing units (GPUs) as the “brains” of digital applications. NVIDIA pioneered the market in the late 1990s with its GeForce 256 processor, which provided co-processing capabilities for graphics-heavy applications such as gaming. Some vendors, such as Intel, have integrated graphics controllers on the CPU, and these work great for applications such as Microsoft Office, Solitaire and Minesweeper. However, any game that is even moderately graphics-intensive requires a separate GPU to deliver a high-quality experience. 

A good way to think of the difference between the two is that a high-end integrated Intel GPU isn’t even equivalent to a low-end discrete NVIDIA GPU. Now, as customers have started adopting VR for an immersive gaming experience, the GPU is flexing its muscle and distancing itself from the CPU and integrated graphics. 

GPUs not just for gaming 

Today, GPUs aren’t just for gaming. NVIDIA’s general-purpose GPUs are designed for a wide variety of data-intensive workloads, not just graphics. These processors really shine when massive amounts of data-crunching power are needed, for workloads such as autonomous cars, high-performance computing (HPC), scientific analysis and search that require real-time machine learning. 

Why are GPUs so good at handling these workloads compared with CPUs? CPUs have 2, 4, 8 or even 16 large cores, whereas GPUs have thousands of smaller cores. It’s true that a single CPU core will blow the doors off an individual GPU core, but think of the GPU as “grid computing” on a chip: all of its cores run in parallel, which optimizes it for tasks that operate on large amounts of data. This is one reason why Tesla’s autonomous car initiative is being powered by NVIDIA and not Intel or even Mobileye, a company focused entirely on self-driving cars. 
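
To make that contrast concrete, here is a minimal sketch of the data-parallel style GPUs are built for. It is my own illustration, not code from NVIDIA or Tesla: a small CUDA program that adds two arrays of a million numbers by handing one element to each of thousands of lightweight GPU threads, where a CPU would typically march through the array a handful of elements at a time.

  // Minimal sketch (illustration only, not vendor code): a CUDA kernel that
  // adds two arrays of a million numbers. Each GPU thread handles exactly one
  // element, so thousands of small cores work side by side on the data.
  #include <cstdio>
  #include <cuda_runtime.h>

  __global__ void add(int n, const float *x, float *y) {
      int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
      if (i < n) y[i] = x[i] + y[i];
  }

  int main() {
      const int n = 1 << 20;                 // about a million elements
      size_t bytes = n * sizeof(float);
      float *x, *y;
      cudaMallocManaged(&x, bytes);          // memory visible to CPU and GPU
      cudaMallocManaged(&y, bytes);
      for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

      int threads = 256;                     // threads per block
      int blocks = (n + threads - 1) / threads;  // enough blocks to cover n
      add<<<blocks, threads>>>(n, x, y);     // launch ~1 million threads
      cudaDeviceSynchronize();               // wait for the GPU to finish

      printf("y[0] = %f\n", y[0]);           // expect 3.0
      cudaFree(x);
      cudaFree(y);
      return 0;
  }

Compiled with NVIDIA’s nvcc toolchain, that one kernel launch spreads the work across however many cores the GPU happens to have, which is exactly the property machine learning workloads exploit.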


It’s important to understand that I’m not saying GPUs will kill off CPUs. You need both in a computer because each is optimized for different purposes. CPUs are needed to boot the computer and are ideal for long, single-threaded, complex tasks such as running Excel or an Oracle database. ARM processors are perfect when low power is a requirement, as in connected thermostats and home automation. However, workloads that run in hyperscale data centers and/or require massive amounts of data to be searched and decisions to be made quickly are much better served by GPUs. 

Below are just a few of the top machine learning-based applications that are GPU-powered: 

  • Manufacturing robots
  • Healthcare analysis
  • Autonomous cars
  • Search
  • Facial recognition
  • Star Trek’s Universal Translator, also known as real-time translation
  • Speech to text
  • Airflow analysis 

Some of you might think your business doesn’t need this kind of processing capability, but that is flat-out incorrect. The fact is machine learning will soon be ubiquitous, powering almost everything. We won’t book a reservation at a restaurant, drive somewhere or travel without some kind of machine learning interface helping us. The reality is that machines can write software and make decisions faster and better than people can, and today’s workloads need that superhuman capability. GPUs can deliver it at a rate once possible only in the video games the technology has supported for so long. 

At the 2016 World Economic Forum in Davos, Switzerland, Salesforce CEO Marc Benioff was quoted as saying, “Speed is the new currency of business.” If you believe that, then the way computing is done must change—and GPUs are the change agent. Resistance is futile.
