New financial-trading algorithms promise quantum-computer performance improvements over classical computers within 5-10 years rather than 10-20.

Financial traders rely heavily on computer simulations to make buying and selling decisions. Specifically, “Monte Carlo” simulations are used to assess risk and simulate prices for a wide range of financial instruments. These simulations also can be used in corporate finance and for portfolio management.

But in a digital world where other industries routinely leverage real-time data, financial traders are working with the digital equivalent of the Pony Express. That’s because Monte Carlo simulations involve such an insanely large number of complex calculations that they consume more time and computational resources than a 14-team, two-quarterback online fantasy football league with a Superflex position. Consequently, financial calculations using Monte Carlo methods typically are made only once a day. While that might be fine in the relatively tranquil bond market, traders trying to navigate more volatile markets are at a disadvantage because they must rely on stale data.

If only there were a way to accelerate Monte Carlo simulations for the benefit of our lamentably laden financial traders! Soon there will be, according to financial-services giant Goldman Sachs and QC Ware, a quantum-as-a-service provider that develops applications to run on near-term quantum-computing hardware. Researchers for the two partners reportedly have designed new quantum algorithms for running Monte Carlo simulations on near-term quantum hardware expected to be available in five to 10 years.

There’s a lot to unpack there. First, what is near-term quantum-computing hardware? Basically, it’s a flawed, error-prone precursor to fully realized quantum computing, highly susceptible to environmental “noise” that contaminates results.
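For context, a bare-bones classical Monte Carlo pricing run looks something like the sketch below. This is a generic textbook illustration assuming geometric Brownian motion, with made-up parameter values; it has no connection to the Goldman Sachs / QC Ware algorithms, but it shows why sample counts make these simulations so expensive.

```python
import math
import random

def mc_call_price(s0, strike, rate, vol, t, n_paths, seed=1):
    """Monte Carlo estimate of a European call option price under
    geometric Brownian motion. Returns (estimate, standard_error).
    Illustrative only -- not the Goldman Sachs / QC Ware method."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol * vol) * t
    diffusion = vol * math.sqrt(t)
    discount = math.exp(-rate * t)
    payoffs = []
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)                      # one random market path
        s_t = s0 * math.exp(drift + diffusion * z)   # simulated final price
        payoffs.append(discount * max(s_t - strike, 0.0))
    mean = sum(payoffs) / n_paths
    var = sum((p - mean) ** 2 for p in payoffs) / (n_paths - 1)
    return mean, math.sqrt(var / n_paths)

price_lo, err_lo = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 1_000)
price_hi, err_hi = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 100_000)
# Sampling error shrinks only as 1/sqrt(N): 100x more paths buys
# roughly 10x more accuracy, which is why production-grade runs
# need enormous numbers of paths.
```

That 1/sqrt(N) error scaling is the crux: squeezing out one more decimal place of accuracy means multiplying the number of simulated paths by 100, which is where the enormous compute bills, and the once-a-day schedule, come from.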
In practical terms, near-term quantum devices have high error rates and begin returning incorrect results after only a few calculation steps. I know, sign me up, right?

Fortunately, there are quantum algorithms capable of reducing errors while enabling quantum computers to perform Monte Carlo simulations 1,000 times faster than classical methods. Unfortunately, the error-corrected quantum hardware required for these algorithms to run simulations at that speed is 10 to 20 years away. Goldman Sachs and QC Ware researchers set about finding a middle ground between speed of implementation and optimal quantum-computing performance.

“By successfully sacrificing some of the speed-up from 1000x to 100x, the team was able to produce Shallow Monte Carlo algorithms that can run on near-term quantum computers expected to be available in five to 10 years,” the two companies said in a press release.

So while there’s no immediate help, financial traders can take comfort in knowing the timeline for faster Monte Carlo simulations has been cut in half. A few short years from now, financial Monte Carlo simulations (and 14-team, two-quarterback online fantasy football leagues with a Superflex position) shall scarcely look the same.

That goes for other endeavors for which quantum computing is expected to be transformative, including healthcare, artificial intelligence, logistics, manufacturing, and national security. The ability of quantum algorithms to dramatically increase computing speeds will allow enterprises to innovate faster, respond more quickly to market disruptions, and operate more efficiently. That adds up to quite a competitive advantage. CIOs ignore quantum computing at their own peril.

Bonus quantum-computing breakthrough news

Meanwhile, proving there is more than one way to skin a qubit, Los Alamos National Laboratory reports it is using machine learning to develop algorithms that make today’s quantum computers less vulnerable to noise.
In a new paper, the scientific research agency demonstrates how a method called “noise-aware circuit learning” can cut error rates by a factor of two to three. Patrick Coles, a quantum physicist at Los Alamos National Laboratory and lead author on the paper, said the machine-learning approach is similar to a person receiving a vaccine that trains the immune system to resist a virus: machine learning allows quantum algorithms, or circuits, to build resistance to a specific quantum machine’s noise processes.

There’s an analogy we all can relate to these days.