Higgs boson researchers consider move to cloud computing

Higgs researchers undertake a two-year pilot to experiment with replacing grid infrastructure with the cloud, but funding and logistical concerns pose challenges

How did European researchers working on the Higgs boson make one of the most revolutionary physics discoveries in decades? From an IT perspective, they relied on a good old-fashioned grid computing infrastructure, though a new cloud-based one may be in the offing.

The European Organization for Nuclear Research's (CERN) decade-old grid computing infrastructure has been used extensively during the past few years for research that culminated in the discovery of the Higgs boson, or so-called "God Particle."


Unlike a public cloud, where data and compute resources are typically housed in one or more centrally managed data centers with users connecting to those resources, CERN's interconnected grid network relies on more than 150 computing sites across the world sharing information with one another.

For the first couple of years after the grid computing infrastructure was created, it handled 15 petabytes to 20 petabytes of data annually. This year, CERN is on track to produce up to 30 PB of data. "There was no way CERN could provide all that on our own," says Ian Bird, CERN's computing grid project leader. Grid computing was once a buzz phrase similar to that of what cloud computing is now. "In a certain sense, we've been here already," he says.

CERN, home to the Large Hadron Collider (LHC) at the heart of the Higgs boson research, is considered Tier 0 within the grid. That's where scientific data is produced by smashing particles together in the 17-mile LHC tunnel. Data from those experiments is then sent out through the grid to 11 Tier 1 sites, which are major laboratories with large-scale data centers that process much of the scientific data. Those sites in turn produce datasets that are distributed to more than 120 academic institutions around the world, where further testing and research is conducted.
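The tiered fan-out described above can be sketched roughly as follows. This is an illustrative model only, not CERN's actual grid software; the site counts and role descriptions come from the article, while the function and dataset names are made up for the example.

```python
# Illustrative sketch (not CERN's actual software): modeling the tiered
# fan-out of LHC data described in the article.
tiers = {
    "Tier 0": {"sites": 1, "role": "raw data from LHC collisions (CERN)"},
    "Tier 1": {"sites": 11, "role": "major labs processing scientific data"},
    "Tier 2": {"sites": 120, "role": "academic institutions running analyses"},
}

def fan_out(dataset: str) -> list[str]:
    """Trace a dataset's path down the tiers, one hop per tier."""
    return [
        f"{dataset} -> {name} ({info['sites']} site(s)): {info['role']}"
        for name, info in tiers.items()
    ]

for hop in fan_out("collision-run-42"):
    print(hop)
```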

The entire grid has a capacity of 200 PB of disk and 300,000 cores, with most of the 150 computing centers connected via 10Gbps links. "The grid is a way of tying it all together to make it look like a single system," Bird says. Each site is mostly standardized on Red Hat Linux distributions, as well as custom-built storage and compute interfaces, which also provide information services describing what data is at each site.
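The scale of those numbers is easy to check with a back-of-the-envelope calculation (our arithmetic, not from the article): moving a year's output of roughly 30 PB over a single 10Gbps link would occupy that link almost continuously for most of a year, which is one reason the data must be spread across many sites and links.

```python
# Back-of-the-envelope: how long would ~30 PB take over one 10 Gbps link?
annual_bytes = 30e15          # 30 PB per year, using decimal petabytes
link_bps = 10e9               # a single 10 Gbps site link
seconds = annual_bytes * 8 / link_bps
days = seconds / 86400
print(f"{days:.0f} days")     # roughly 278 days at full, uninterrupted line rate
```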


Research that contributes to a ground-breaking discovery like the Higgs announcement, though, is not always centrally organized. In fact, Bird says, it's quite a chaotic process, one that makes it difficult to plan for the correct amount of compute resources needed for testing at the various sites. For example, when there is a collision in the LHC, the resulting particles leave traces throughout the detector. A first level of analysis is to reconstruct the collision and track the paths of the various particles, which is mostly done at the Tier 0 (CERN) and Tier 1 sites. Other levels of analysis are broken into smaller datasets and distributed to the partnering academic institutions, where a variety of statistical analyses, histograms and data mining are conducted. If a certain discovery is made, an analysis might be refined and another test may be run. "You really can't predict the workflows," he says.

That's why Bird and CERN are excited about the potential for using some cloud-based services. "We're interested in exactly what it would take to use cloud storage," he says. "But at this point, we're just not sure of the costs and how it would impact our funding structure." CERN receives money from the various academic institutions that get access to the data it produces. Many of those partnering academic groups have compute resources in place and want the CERN data on their own sites, so they can run experiments and make that resource available to their academic communities. "From a technical point of view, it could probably work," he says. "I just don't know how you'd fund it."

CERN has made some initial forays into the cloud. Internally, CERN is running a private cloud based on OpenStack open source code. Many of the partnering organizations have private clouds on their own premises as well.

In March, CERN and two other major European research organizations took steps to create a public cloud resource called Helix Nebula - The Science Cloud. It's a partnership of research organizations, cloud vendors and IT support companies that are powering a community cloud for the scientific and research community. The two-year pilot program CERN recently kicked off will begin by running simulations from the LHC in the Helix Nebula cloud.

Bird is hopeful about the cloud, figuring that within another decade the cloud will be where grid computing is now. "It's just not obvious how we'll get to that point," he says. But even if the cloud has its challenges, Bird is confident that the scientists who made one of the most important scientific discoveries in decades should be able to figure out the cloud.

Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.


Copyright © 2012 IDG Communications, Inc.