The Herculean tasks of quantum computing
Forget the hype: quantum computing is still in its experimental infancy. Researchers must overcome five big challenges before real-world applications can emerge.
- Quantum computers could resolve problems that are impossible for classical computers, like modelling the behaviour of atoms.
- One of the main challenges today is to increase the number of qubits. At the moment, more qubits in the system mean less accuracy.
A universal quantum computer could efficiently solve a whole host of problems that are simply impossible for a classical computer. Such a machine could model the behaviour of atoms or chemical reactions, help with optimisation problems like vehicle traffic flow in cities, or search through huge amounts of data to find ‘needle-in-a-haystack’ answers. At the moment, those uses are based on conjecture, and researchers estimate such a machine is 10-15 years away from becoming reality. However, a number of experimental computer architectures are emerging as promising platforms. Some use superconductors to control the states of electrons; other systems use photons or ion traps, combinations of electric and magnetic fields that capture ions in a vacuum system. Each technology has its own set of advantages and a long list of developmental problems, but five main challenges face the industry at large.
1. The hype machine
In 2018, no company is close to offering up a commercially useful universal quantum computer. “If you look at the sales pitches of big corporations in this field, they overemphasise the state of development and its level of usefulness,” says Hendrik Bluhm from RWTH Aachen University in Germany. “Some announce qubit numbers without publishing anything on their performance; others are aggressively marketing the commercial viability of quantum computers, even though the number of proven applications is currently zero.”
For now, the pathways towards quantum computing are either hybrid systems, where quantum processors augment classical digital computers, or else they’re analogue quantum simulators, which exploit quantum weirdness to perform very specific tasks. True universal computers are an order of magnitude more complex than the systems currently under development.
“Small superconducting quantum computers are already operational,” notes Servaas Kokkelmans, researcher at the Center for Quantum Materials and Technology Eindhoven (QT/e). “However, they’re still not in the regime of quantum supremacy – the point at which quantum computers can outperform classical supercomputers.” While funding pours in, the lack of reliable performance or real-world applications may not seem problematic, but these issues need to be addressed soon to ensure the money doesn’t dry up further down the line.
2. More qubits, more problems
At the heart of a quantum computer’s power is the quantum bit or qubit. Where classical computer bits are always encoded as either a 1 or 0, qubits can exist in a superposition of 0 and 1. Single qubit systems were first demonstrated 20 years ago and, since then, their performance has improved dramatically. “One challenge here is to isolate individual, controllable systems well enough that they meet fidelity requirements,” says Bluhm. “I think one can fairly say that this has been solved for single qubits and that is why the field is now receiving so much attention.”
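The idea of a superposition can be made concrete with a few lines of code. The sketch below, a simplification with hypothetical helper names, represents a single qubit as a pair of complex amplitudes for the states 0 and 1 and applies a Hadamard gate, the standard operation that turns a definite 0 into an equal superposition of both outcomes.

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta) for |0> and |1>,
# normalised so that |alpha|^2 + |beta|^2 = 1. This is an illustrative sketch,
# not a real quantum simulator.

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def measure_probabilities(state):
    """Probabilities of reading 0 or 1 when the qubit is measured."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

zero = (1 + 0j, 0 + 0j)      # the classical-like state |0>
superposed = hadamard(zero)  # now both measurement outcomes are equally likely
print(measure_probabilities(superposed))  # both probabilities close to 0.5
```

Unlike a classical bit, which is always 0 or 1, the superposed state holds both amplitudes at once; only measurement collapses it to a single outcome.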
However, to exploit quantum computing’s potential, you need systems with multiple qubits. Those interacting qubits can be used to complete non-trivial logic calculations that can’t be completed by single qubits or classical processors: “This has been demonstrated in principle, but current efforts by Google and IBM to increase the number of qubits have shown that the more qubits you add, the less accurate the system becomes,” Bluhm notes. “These are one to two orders of magnitude away from where one wants to be.”
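What interacting qubits add can be sketched in the same toy-amplitude style: a hedged illustration (the helper names are invented) of two qubits entangled into a Bell state, a joint state that no pair of independent single qubits can represent.

```python
import math

# A two-qubit state is four complex amplitudes over |00>, |01>, |10>, |11>.
# A Hadamard on qubit 0 followed by a CNOT entangles the pair into a Bell
# state. Illustrative sketch only, not a real simulator.

def hadamard_q0(state):
    """Apply a Hadamard gate to qubit 0 of a two-qubit state."""
    a00, a01, a10, a11 = state
    s = 1 / math.sqrt(2)
    return (s * (a00 + a10), s * (a01 + a11), s * (a00 - a10), s * (a01 - a11))

def cnot(state):
    """CNOT with qubit 0 as control: flips qubit 1 only when qubit 0 is 1,
    which amounts to swapping the |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state
    return (a00, a01, a11, a10)

start = (1, 0, 0, 0)               # both qubits definitely 0: the state |00>
bell = cnot(hadamard_q0(start))    # amplitudes only on |00> and |11>
print(bell)
```

Measuring either qubit of the resulting state instantly fixes the other's outcome, the kind of correlated logic that single qubits and classical bits cannot reproduce.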
3. Quantum error correction
Noise is one of the main challenges in building a universal quantum computer. Quantum states are so delicate that the slightest environmental disturbance may cause errors that render entire calculations useless. “The first error correction code was developed at the beginning of the 90s,” explains Ulrik Lund Andersen, an expert in photonics and quantum communication based at the Technical University of Denmark. “It wasn’t very robust, tolerating an error every million operations rather than every 100 operations like we see today. We want to get to a point where we can correct the errors faster than they occur.”
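The core idea behind error correction is redundancy. The sketch below shows the classical analogue of the simplest scheme: encode one logical bit as three physical bits and recover from a single flip by majority vote. Real quantum codes are far subtler, since quantum states cannot simply be copied and errors must be detected indirectly via syndrome measurements, but the redundancy principle is the same.

```python
from collections import Counter

# Classical repetition code: the simplest analogue of quantum error correction.

def encode(bit):
    """Encode one logical bit as three identical physical bits."""
    return [bit, bit, bit]

def flip(codeword, position):
    """Simulate a noise-induced bit flip at one position."""
    corrupted = list(codeword)
    corrupted[position] ^= 1
    return corrupted

def decode(codeword):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return Counter(codeword).most_common(1)[0][0]

noisy = flip(encode(1), 0)  # [0, 1, 1] after noise hits the first bit
print(decode(noisy))        # -> 1, the original logical bit
```

The race Andersen describes is exactly this: making the correction cycle fast and robust enough to repair errors more quickly than the hardware produces them.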
4. Hardware manufacturing
Researchers also have a long way to go before they can run useful quantum software. A lack of readily available advanced hardware is a real obstacle for all architectures. “It’s not like we can go to a company to purchase ion traps; they’re not commercially available,” explains Thomas Monz from the University of Innsbruck.
One day, commercial quantum computers may be prone to the typical questions we associate with comparisons of classical technology platforms: performance, speed and affordability. “At the moment, all systems have to first demonstrate basic feasibility,” Monz cautions.
In the case of Bluhm and his fellow superconductor specialists, that means closer ties with foundries that have experience in high-yield semiconductor fabrication and complementary expertise in semiconductor technology: “There’s a hope this will result in breakthroughs relating to the reliability of qubit fabrication as well as larger, multi-qubit circuits,” he says.
Collaboration is relatively weak compared to similar fields with well-known initiatives like CERN and ITER, the international nuclear fusion research and engineering megaproject. “The whole field is typified by small groups, so everyone works in their own little kingdom,” Bluhm notes. “Open hubs and coordinating centres could act as integrating forces in the community.”
The European Commission’s Quantum Flagship may eventually boost interest in the field. So far, the 10-year, €10 billion project looks set to overlook computing in favour of materials science and cryptography research: just 10 of the 140 grant proposals submitted relate to computing. As quantum computing matures, that situation may change.
Switzerland is one of several European countries taking a leading role in building a research community with a stronger collective vision. École polytechnique fédérale de Lausanne’s Institute of Physics is plunging headlong into the field with two new research openings, a master’s course, and a partnership with IBM which offers access to their cutting-edge quantum-computer platform IBM Quantum Experience (QX).
“This year we will be privileged to be able to calculate with 20 quantum bits as opposed to five last year,” says Marc-André Dupertuis, who runs the master’s course. “The QX community also expects to pass the ‘quantum supremacy’ limit of quantum computing. A quantum computer will have obtained for the first time at least one result that would have been unthinkable to calculate with any existing conventional supercomputer.” These works are still experimental, and yet, slowly but surely, such steps edge Europe’s researchers closer to harnessing the power of quantum weirdness.
Munich, the new hub
Germany’s government will strongly support research in quantum technology over the coming years. It has recently announced it will spend up to €70 million over the next seven years to support the work of the Munich Center for Quantum Science and Technology (MCQST).
The center gathers 40 research groups belonging to three different institutions – the Ludwig Maximilian University, the Max Planck Institute for Quantum Optics and the Technical University of Munich – focusing on increasing scientific understanding of quantum mechanics phenomena, including basic components, materials and concepts for quantum technologies. Its interdisciplinary research extends to quantum chemistry, astronomy and precision metrology.