2026: The Countdown to CERN’s Quantum Upgrade


By: Richard Wordsworth, Contributing Writer

Around three minutes before I’m scheduled to talk to Dr. Federico Carminati, chief innovation officer at CERN openlab, I run into a problem. Re-reading his profile on the CERN openlab website, I’ve missed a critical detail. Dr. Carminati is not just exploring novel ways to increase CERN’s computing power through quantum computing and machine learning. He is also a psychoanalyst, with a certification in something called ‘pet-assisted therapy’.

I have never heard of pet-assisted therapy. There is also no time for me to Google pet-assisted therapy. If I want to know what pet-assisted therapy is (and I really, really do), I am going to have to ask a senior scientist with a 30-plus year-long career at CERN who is expecting my first question to be about his background in high-energy physics.

I’m too curious. I ask the question. And, to my immense relief, he doesn’t hang up the phone. In fact, the question seems to energize him.

“I have a parallel life!” he says. “I am a psychoanalyst and also [qualified in] pet-assisted therapy. Pet-assisted therapy is when you have a sort of psychological therapy with the assistance of any kind of animal that you could consider a pet. Usually we use dogs, but [therapists] sometimes do this with horses, rabbits and other kinds of animals. The idea is that the presence of an animal can change the dynamics of the relationship between a caregiver - particularly a psychotherapist or a psychologist - and the patient. It’s very effective, for instance, with patients with autistic spectrum disorders, and mentally handicapped patients.”

As a psychoanalyst, Carminati even has analysands - though, due to his work at CERN, they are necessarily very few. He calls it a “sideline” - but it’s a sideline that he’s been running for three years (after seven years studying psychoanalysis and a further year for the certification in pet-assisted therapy, all while still at CERN).

“My day job is here at CERN with CERN openlab, certainly,” he says, genially. “The rest is done on evenings and weekends.”

Carminati’s day job at CERN, which he began in 1985, has changed a lot over the years. The first decade was spent overseeing the CERN program library, the software used for high-energy physics simulation and analysis. And as CERN was then (and still is now) a collaborative project, that meant not only responsibility for the data on-site, but also for distributing software around the world. Which is a tough job, if you’re trying to do it without the assistance of a global internet.

“We distributed this software all around the world,” Carminati recalls. “There was no way to distribute the software [digitally]. So we were physically mailing these big reels of tape - you know: the ones that you see in all the sci-fi movies - shipping them all around the world, sometimes carrying them [ourselves] in big bags. In ‘86 I went to Beijing to install the CERN software library on the Chinese Academy of Science computers, and literally brought a bagful of tapes. Which was very heavy.”

Carminati’s later work was central to the analysis of LHC data. The software he helped develop to run, collect and analyse data from the Large Hadron Collider for the ALICE experiment took around ten years to write, test and rewrite, all while coordinating with some 200 software developers around the world who contributed three (or four - Carminati isn’t sure) million lines of code. Or, for a more helpful sense of the scale of the undertaking, consider that the project also relied on the networking of 140 computer centres around the world, a web of information and communication which Carminati describes as “the workhorse that analysed the LHC data.”

“It was quite an intensive job,” he says.

Today, CERN’s computing power remains a pressing concern for Carminati. The volume of data produced by the LHC is already measured in tens of petabytes, and by the time upgrades to the collider are completed by - all being well - 2026, the resultant data will need exabytes of storage space. But the greater challenge facing CERN’s scientists and partners is processing power, which is why at CERN openlab - a public-private partnership between CERN and the world’s biggest IT companies and research centres - Carminati and the team are exploring every bleeding-edge technological possibility for speed-up: from AI and machine learning to quantum computing.

“There is a projected shortcoming of a factor of between ten and 100 for computing power,” Carminati says of where the LHC will be in 2026. “Our budget for computing will stay at best flat, because funding won’t go up by a factor of ten or 100, so we have to find other ways to handle our data and new forms of computing that are much more effective.”

The upgraded LHC (which will be known as the ‘High-Luminosity Large Hadron Collider’, or HL-LHC) needs that extra computing power to solve a fundamental problem in physics.

“We are in a funny situation in physics right now,” Carminati says. “We have two theories: general relativity and quantum mechanics. All their predictions have been verified - which is good news - but there are a lot of questions to which we do not know the answer, and which these two theories, both very successful in their predictions, cannot answer. And the other thing, which is extremely embarrassing, is that these two theories don’t [work] together. So there’s no way to meld them into a unified theory. And we have a strong belief in physics that one day we will come to the ‘Theory of Everything’, which would be a single formula.

“Usually in physics, you make progress when something goes wrong. But here, nothing goes wrong. We have two theories that are perfectly verified, which is not a comfortable situation to be in. So we have to find some hint that something is wrong with our standard model to allow us to make progress. We have to look for finer effects - which means much more statistics, much more work to discover these very, very subtle effects.”

How sure is Carminati that CERN will have a functioning, large-scale quantum computer by the 2026 deadline? He isn’t. But over the course of such a long career, he has also seen high-energy physics hit seemingly unscalable walls in the past. Which is why the CERN openlab team aren’t relying solely on the assumption that quantum computers will become stable and sufficiently scalable in the next seven years, but are spreading their bets across all promising new computing technologies.

“If in two or three years we have good, stable qubits, then bingo! This is a technology we can use,” he says. “But [if we don’t], we will look for other ways to speed up our computing. For instance, we’re looking into deep-learning, machine learning, artificial intelligence - all sorts of potential speed-ups… I don’t think the solution will come from just one thing; I think it will be a combination of ideas that will allow us to analyse the data. Also: it’s not really a question of being able to do it or not; it’s a question of degree. The more computing power we have, the more we will be able to do.”

In the meantime, just working on software that will (hopefully) one day run on quantum computers is providing what Carminati calls “collateral advantages”. Developing programs for computing systems that don’t yet exist at scale might sound counter-intuitive (or worse: like tempting fate), but in doing so CERN openlab has already made breakthroughs in its algorithm designs that can be used to improve the performance of its current classical computing architecture.

And while today’s quantum computers are too small and unstable for CERN’s work, CERN openlab has, through its partnerships, already had some successes in using small arrays of just a few qubits.

“We already have a collaborative project with the University of Wisconsin where we have developed an analysis program that reproduces, on a quantum computer, some of the results of the classical analysis used to find the Higgs boson,” says Carminati. “We’re not doing better than the classical analysis, because we’re working with five or ten qubits - but we’re demonstrating the principle that we can do it.”

What Carminati and CERN openlab are doing in the field of quantum computing is, essentially, the groundwork. The hardware that CERN needs for the HL-LHC’s operational target of 2026 isn’t here yet. But the proofs-of-concept are. CERN openlab is building a knowledge base so that when the technology is ready, as Carminati puts it: “we will be ready to hit the ground running.”



For questions or feedback on this article, please contact Amit Das: amit.das@alphaevents.com

To find out more, visit CERN's website.
