
When quantum computers forget: Overcoming decoherence

Jan 03, 2022
Data Center

Quantum computing faces many technology challenges, but some researchers say they’ll be solved soon to usher in the Quantum Decade.

There’s no point in having a quantum computer if it’s not smokin’ fast; otherwise it’s way too much trouble, what with all the subzero temperatures and instability and such. So it’s always newsworthy when somebody sets a new standard for quantum computing processing speeds, even if quantum computers are far from common commercial use.

In this case that somebody is IBM, which recently announced that its newly developed quantum computing processor, called Eagle, has broken the 100-qubit barrier.

IBM boldly (if clumsily) says it views Eagle “as a step in a technological revolution in the history of computation.” (It sounds like an algorithm wrote that sentence! Is this where you’re leading us, Big Blue? A quantum future of incoherent techspeak?)

I’m being too harsh on IBM, which humbly notes that its record-breaking accomplishment wasn’t the product of a brilliant insight or sudden epiphany. It was hard work, baby!

“Constructing a processor that breaks the hundred-qubit barrier wasn’t something we could do overnight,” IBM says. “Constructing one of these devices is an enormous challenge. Qubits can decohere—or forget their quantum information—with even the slightest nudge from the outside world.”

So true, which is why I mentioned the instability issue at the beginning of this post. Decoherence is one of the biggest challenges in quantum computing. As Scientific American explains, use of quantum states “leaves the quantum computer much more vulnerable to errors than a classical computer would be.”

“These errors arise from decoherence, a process in which the environment interacts with the qubits, uncontrollably changing their quantum states and causing information stored by the quantum computer to be lost,” Scientific American writes. “Decoherence could come from many aspects of the environment: changing magnetic and electric fields, radiation from warm objects nearby, or cross talk between qubits.”
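For the curious, the kind of decoherence Scientific American describes—the environment quietly leaking away a qubit’s quantum information—can be sketched in a few lines of Python. This is a toy dephasing model, not how IBM characterizes Eagle, and the per-step error rate of 0.1 is an invented number purely for illustration:

```python
import numpy as np

# A single qubit in an equal superposition, written as a 2x2 density matrix.
# The off-diagonal entries carry the quantum coherence -- exactly the
# information that decoherence destroys.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

def dephase(rho, p):
    """Toy dephasing channel: each brush with the environment shrinks the
    off-diagonal (coherence) terms by (1 - p) while leaving the diagonal
    (classical probabilities) untouched. p is an assumed error rate."""
    k = 1 - p
    return np.array([[rho[0, 0],     k * rho[0, 1]],
                     [k * rho[1, 0], rho[1, 1]]])

state = rho
for _ in range(50):          # 50 small nudges from the outside world
    state = dephase(state, p=0.1)

print(abs(state[0, 1]))      # coherence: 0.5 * 0.9**50, essentially gone
print(state[0, 0].real)      # diagonal survives: still 0.5
```

After 50 “nudges” the coherence has collapsed to roughly 0.0026 while the classical probabilities are unchanged—the qubit has, in the blog’s terms, forgotten its quantum information without losing its classical face.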

In its announcement, IBM says little about the significance of Eagle breaking the 100-qubit barrier beyond generalizations such as “our team is solving challenges across hardware and software to eventually realize a quantum computer capable of solving practical problems in fields from renewable energy to finance and more.”

There is no inherent significance to the 100-qubit barrier, other than as a marker of progress. Indeed, IBM’s Quantum Roadmap calls for a 1,000-qubit chip by the end of 2023, a tenfold increase in qubit count in less than two years. Meanwhile, Google, Microsoft, D-Wave Systems, Intel, Toshiba, Hewlett Packard, and many other companies, along with countries such as China, Germany, Canada, the U.S., India, and Japan, also are developing quantum computing technology.

IBM has declared this the Quantum Decade, in which “enterprises begin to see business value from quantum computing.” Big Blue is describing the early adoption phase. (I’d be more inclined to call the ensuing period of mass adoption the Quantum Decade, but IBM again has neglected to ask me for marketing advice.)

Once quantum computing is widely deployed, expect it to be used, among other things, to:

  • Model complex molecular configurations to accelerate materials discovery and drug development
  • Combine with artificial intelligence to enable even faster AI and possibly result in the development of “thinking” computers
  • Quickly identify the points of failure in complex manufacturing processes
  • Process natural language far faster and more accurately than existing AI algorithms
  • Increase the speed of complex financial calculations
  • Easily crack encrypted data, making a mockery of your puny, classical (or is it Jurassical?) computer-based cyber defenses

A decidedly mixed bag. But that’s the thing with emerging technologies; they can be used and misused. Overall, though, quantum computing will be a huge net plus for businesses, scientists, researchers, and anyone who has to quickly perform what Harvard Business Review calls “combinatorics calculations.” Whenever the Quantum Decade actually begins, it will be exciting.

Christopher Nerney is a freelance technology writer living in upstate New York. Chris began his writing career in newspapers before joining Network World in 1996. He went on to become executive editor of several IT management sites, including Datamation and eSecurity Planet. Chris is a regular blogger at ITworld, where he has written about tech business and now writes about science/tech research. Chris also covers big data and analytics as a freelancer for Data Informed. When he’s not writing, editing or spending time with his wife and three children, Chris performs original music and covers in bars, coffeehouses and on the streets around Saratoga Springs, N.Y.
