Quantum.Tech USA 2024

April 24: Cryptography Spotlight, Westin Downtown | April 25-26: Main Conference, Conrad Hotel

Washington D.C.

The Constancy of Errors: Quantum Benchmark’s Shortcut to Supremacy

By: Richard Wordsworth, Contributing Writer

The ubiquity of errors in the quantum computing community is an open secret. And yet, for those on the outside, the word ‘computing’ can imply reliability. It’s an easy enough mistake for non-experts to make: they, not unreasonably, conflate quantum computing with its everyday, classical forebear, where technology has progressed to the point that hardware-level errors are so rare as to be practically inconsequential. We implicitly trust classical computers to perform calculations faultlessly - but in quantum computing, errors are rife.

“Quantum computing is fundamentally different from classical computing; the only thing they have in common, really, is that they both aim to solve problems,” says Joseph Emerson, CEO of the error detection, suppression and correction company Quantum Benchmark. “The engineering of quantum computing, the physics behind quantum computing… for all those things, just throw what you know about conventional computing out the window. We’re completely reinventing computing from scratch.”

Emerson became part of this reinvention almost two decades ago, when he began researching quantum computing as a postdoc at MIT in 2001. While mainstream technology companies were getting to grips with 3G communications and the techno-cognoscenti were queuing for the first iPod, Emerson was analyzing some of the original experiments demonstrating quantum computing algorithms. He quickly realised that the promise of the entire field was fundamentally limited by the unreliability of quantum devices, whose output would inevitably be compromised by microscopic hardware imperfections.

By 2005, he had begun pioneering approaches to benchmarking the capabilities of quantum gates that would lay the foundations for error assessment and error correction in quantum computing for decades to come. With no way for engineers to remove the uniquely quantum errors from the hardware directly, scalable methods for error detection and suppression are the essential technology behind Quantum Benchmark’s unique role in the ecosystem.
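Those early approaches grew into what is now broadly known as randomized benchmarking: run sequences of random gates of increasing length and watch how quickly the measured ‘survival’ probability decays, which reveals the average error per gate independently of preparation and measurement flaws. The sketch below illustrates the core idea with simulated data; the decay model and every number in it are illustrative assumptions, not Quantum Benchmark’s actual protocol.

```python
# Minimal randomized-benchmarking-style sketch (illustrative only).
# Survival probability after m random gates decays as A * p**m + B;
# the decay constant p yields the average error per gate.
import numpy as np

rng = np.random.default_rng(0)

true_error_per_gate = 1e-3             # assumed average gate infidelity
p_true = 1 - 2 * true_error_per_gate   # single-qubit depolarizing decay parameter
A, B = 0.5, 0.5                        # constants absorbing SPAM errors

lengths = np.array([1, 10, 50, 100, 500, 1000])
# Simulated survival probabilities for random gate sequences, with a
# little measurement noise on top of the A * p**m + B decay model.
survival = A * p_true**lengths + B + rng.normal(0, 0.002, lengths.size)

# Fit the decay (B assumed known here; real protocols fit A and B too):
# log(survival - B) = log(A) + m * log(p)
slope = np.polyfit(lengths, np.log(survival - B), 1)[0]
p_fit = np.exp(slope)
estimated_error = (1 - p_fit) / 2      # back to average error per gate
print(f"estimated error per gate: {estimated_error:.2e}")   # ~1e-3
```

The appeal of the method is that the estimate is insensitive to state-preparation and measurement errors, which is what makes it scalable in practice.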

“What people haven’t really realised on the commercial side is that this is the key technical challenge,” says Emerson. “My co-founder and I developed new, scalable methods to identify and suppress error sources that are extremely efficient and are necessary for quantum processors to scale up to become commercially viable. We decided to form our company around that and provide these solutions professionally to the emerging quantum computing hardware community. Today there is a massive amount of equity invested in the hardware and a massive amount of equity invested in helping users develop algorithms - but for algorithms to work well on the hardware, our tools are absolutely necessary.”

For those coming from a classical computing background, the scale of the error problem is difficult to overstate - even with household-name tech giants publishing their own error logs online.

“IBM publishes its single qubit errors on its website, Google publishes its error metrics… And these error metrics are just the tip of the iceberg,” Emerson says. “When they’re made publicly available, [all you] know about the error iceberg is the floating tip above the water. It’s a complete underestimate. But just based on the error rates associated with the tip of the iceberg, you see that the best gates out there and the best operations produced in the world right now have error rates of anywhere between one part in a hundred to one part in a thousand [operations]. And if you think about commercial applications, you will need 10^6, 10^7 operations - maybe a lot more - then you realise with current quantum computer hardware, it’s nowhere near the level it needs to be to produce commercially useful quantum computing solutions. That’s how severe the error bar is and that’s the problem we help overcome.”
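The arithmetic behind that warning is easy to check. If each operation fails independently with probability p, a computation of N operations finishes error-free with probability roughly (1 − p)^N. A quick sketch using the figures quoted above:

```python
# Back-of-the-envelope check of the numbers in the quote: with a
# per-operation error rate p and N total operations, the chance that
# an entire computation runs error-free is roughly (1 - p)**N.
best_error_rates = [1e-2, 1e-3]   # "one part in a hundred to one part in a thousand"
ops_required = [1e6, 1e7]         # "10^6, 10^7 operations - maybe a lot more"

for p in best_error_rates:
    for n in ops_required:
        print(f"p = {p:.0e}, N = {n:.0e}: P(no error) ~ {(1 - p) ** n:.3g}")
# Every line prints a probability indistinguishable from zero, which is
# why raw hardware alone cannot run commercially useful algorithms.
```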

The gulf between quantum computing’s ambition and the restrictions of its hardware does not admit an easy - or even a realistic - near-term fix. Even a decade from now, Emerson predicts, the physical limitations of chip manufacture will continue to form a barrier to a great leap in computing power - unless companies like Quantum Benchmark can provide a software solution that’s smart enough to spot and discard errors in real time.

“In the long run, if you ask the leading [quantum] groups in the world… In ten years their dream is to have 100,000 to a million qubits with an error rate per operation of one part in a million. And then you have to run one million qubits for, say, 100,000 clock cycles. After one clock cycle, you have an [almost 100%] chance of an error on that chip. That’s their dream scenario; if they’re lucky, they achieve that. 

“So why is there all this enthusiasm about quantum computing if the best case scenario in ten years is that we have a piece of hardware that’s going to fail after one clock cycle? The answer is optimized quantum error correction… As long as you can produce hardware with well-behaved errors with error rates of around one-part-in-a-thousand or one-part-in-ten-thousand, and you can scale that up across 100,000 or a million qubits, then you can do logical error correction by a logical encoding.”
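Both halves of that argument can be sanity-checked with a few lines of arithmetic. The sketch below first reproduces the near-certain failure of a million-qubit chip in a single clock cycle, then applies the standard surface-code scaling heuristic - an assumption of this illustration, not a formula from the interview - to show how a logical encoding suppresses errors exponentially with code distance:

```python
# First: the "dream scenario" - a million physical qubits, each with a
# one-in-a-million error rate per operation, still almost certainly
# suffers an error somewhere on the chip every clock cycle.
n_qubits, p_phys = 1_000_000, 1e-6
print(f"P(>=1 error per cycle) ~ {1 - (1 - p_phys) ** n_qubits:.2f}")   # ~0.63

# Second: why logical encoding rescues this. Under the common
# surface-code heuristic (an assumption here, not a claim from the
# interview), the logical error rate falls exponentially with code
# distance d once hardware is below threshold:
#     p_logical ~ 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2)
p_phys, p_threshold = 1e-3, 1e-2   # "one-part-in-a-thousand" hardware
for d in (3, 11, 25):
    p_logical = 0.1 * (p_phys / p_threshold) ** ((d + 1) // 2)
    print(f"distance {d:2d}: p_logical ~ {p_logical:.0e}")
```

The takeaway matches Emerson’s point: hardware only needs to reach well-behaved error rates around one-in-a-thousand, after which encoding can buy the remaining orders of magnitude.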

However, despite the general consensus that quantum hardware will not catch up to the demands of its users any time soon, Emerson believes that many of today’s researchers are still underestimating the challenges and failing to provide solutions that will scale with quantum computing’s mainstream adoption.

“These are very, very hard technological problems that require deep knowledge of math, deep knowledge of quantum physics, deep knowledge of the experimental apparatuses - ion traps, photonic chips, defects in silicon - whatever the platform people are exploring, you need all of those pieces in order to come up with solutions. Papers come out all the time with people floating things you could do to improve the performance of quantum computers… but many of the papers are completely missing the main aspects of the problem, [like] proving that their method is scalable, [or] proving that it’s robust to various imperfections in the hardware so that you can believe the data that you get out. So there are these fairly deep technological complexities to these problems. And as a result, essentially the only people in the world who know how to do this are people who came out of my research group.”

Emerson is reluctant to make overly far-reaching predictions on the future of quantum computing (“I don’t necessarily want to be in the game of making guesses,” he says, cautiously). But with error detection marshalling the output of quantum hardware, he’s still as optimistic as any quantum startup CEO about the game-changing applications that quantum processing could offer.

“Where it will come into play and have a massive societal impact first is where quantum computing provides a research, engineering and scientific tool,” he says. “Certainly in the first decade of quantum computing becoming commercially viable, it’s going to produce tremendous value in solving society’s biggest problems: in climate change, cures for cancer and other diseases… In the case of climate change [for example] what do we need? We need energy efficient materials. How do we get energy efficient materials? Well, these are quantum mechanical materials; quantum mechanics governs the transmission properties of electrical lines. It controls the absorptivity of solar panels. 

“Today, [these] problems are insurmountably hard because classical computers can’t effectively model the quantum mechanical effects. But it’s easy for a quantum computer to model those effects, because the quantum mechanical effects that are so hard to model with a conventional computer are natively present in the quantum hardware... If you have a few hundred atoms, each with a few electrons, determining some photo-absorption process, to classically simulate that requires a number of bits that’s exponentially larger than the number of electrons. If you have 200 electrons, you need 2^200 classical bits, which is inconceivably large. But a quantum computer would only need 200 qubits to solve that problem. Though they have to be 200 really good qubits; qubits that are so good that they’re beyond the conception of the best possible physical hardware we can get - which is why we need this error correction solution.”
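That exponential blow-up is easy to quantify. Exactly storing the state of n entangled two-level systems on a classical machine takes 2^n complex amplitudes; the sketch below, assuming the usual 16 bytes per double-precision amplitude, shows how quickly that outruns any conceivable memory:

```python
# Quantifying the exponential blow-up from the quote: storing the full
# state of n entangled two-level systems on a classical machine takes
# 2**n complex amplitudes (16 bytes each at double precision).
n = 200
amplitudes = 2 ** n
terabytes = amplitudes * 16 / 1e12
print(f"amplitudes needed: {amplitudes:.2e}")    # ~1.6e60
print(f"memory needed:     {terabytes:.2e} TB")  # ~2.6e49 TB
# Far beyond any conceivable classical memory - while a quantum
# computer represents the same state natively with just 200
# (sufficiently high-quality) qubits.
```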


For more information, email Amit Das directly at amit.das@alphaevents.com

Visit the Quantum Benchmark website: https://quantumbenchmark.com/