Scientists have achieved the lowest quantum computing error rate ever recorded — an important step in solving the fundamental challenges on the way to practical, utility-scale quantum computers.
In research published June 12 in the journal Physical Review Letters, the scientists demonstrated a quantum error rate of 0.000015%, which equates to roughly one error per 6.7 million operations.
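The conversion between those two figures is simple arithmetic: 0.000015% corresponds to a fraction of 1.5 × 10⁻⁷ errors per operation, and the reciprocal of that fraction is roughly 6.7 million operations per error. A minimal Python check of the quoted numbers (not part of the study):

```python
# Convert the reported error rate from a percentage to "one error per N operations".
error_rate_percent = 0.000015               # reported single-qubit gate error rate
error_rate = error_rate_percent / 100       # 1.5e-7 errors per operation
operations_per_error = 1 / error_rate       # ~6.7 million operations per error

print(f"{error_rate:.1e} errors per operation")
print(f"about one error every {operations_per_error / 1e6:.1f} million operations")
```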
This achievement represents an improvement of nearly an order of magnitude in both error rate and gate speed over the previous record of approximately one error for every 1 million operations, set by the same team in 2014.
The prevalence of errors, or “noise,” in quantum operations can render a quantum computer’s outputs useless.
This noise comes from a variety of sources, including imperfections in the control methods (essentially, problems with the computer’s architecture and algorithms) and the laws of physics. That’s why considerable efforts have gone into quantum error correction.
While errors related to natural law, such as decoherence (the natural decay of the quantum state) and leakage (the qubit state leaking out of the computational subspace), can be reduced only within those laws, the team’s progress was achieved by reducing the noise generated by the computer’s architecture and control methods to almost zero.
“By drastically reducing the chance of error, this work significantly reduces the infrastructure required for error correction, opening the way for future quantum computers to be smaller, faster, and more efficient,” Molly Smith, a graduate student in physics at the University of Oxford and co-lead author of the study, said in a statement. “Precise control of qubits will also be useful for other quantum technologies such as clocks and quantum sensors.”
Record-low quantum computing error rates
The quantum computer used in the team’s experiment relied on a bespoke platform whose qubits (the quantum equivalent of computer bits) are made of “trapped ions,” rather than the superconducting circuits or photonic qubits found in many other machines.
The study was also conducted at room temperature, which the researchers said simplifies the setup required to integrate this technology into a working quantum computer.
Whereas many quantum systems rely on superconducting circuits or semiconductor “quantum dots,” or use lasers (sometimes called “optical tweezers”) to hold individual neutral atoms in place as qubits, the team used microwave signals to control a series of calcium-43 ions held in an ion trap.
With this approach, the ions are prepared in a hyperfine “atomic clock” state. According to the study, this technique allowed the researchers to perform “quantum gates,” the elementary operations a quantum computer carries out, with greater precision than laser-based methods allow.
Once the ions were prepared in the hyperfine atomic clock state, the researchers calibrated the system via an automated control procedure that regularly corrected for amplitude and frequency drift in the microwave control signals.
In other words, the researchers developed an algorithm to detect and correct the noise produced by the microwaves used to control the ions. By removing this noise, the team could conduct quantum operations with their system at or near the lowest error rate physically possible.
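The paper’s actual calibration routines are more involved, and none of the code below comes from the study; it is only a sketch of the general idea of a drift-correction feedback loop. The controller methods shown (measure_amplitude_error, measure_frequency_error, adjust_amplitude, adjust_frequency) are hypothetical placeholders for real hardware interfaces:

```python
import time

# Hypothetical sketch of a drift-correction loop, NOT the authors' code.
# The idea: periodically measure how far the microwave drive has drifted in
# amplitude and frequency, then nudge the control parameters back on target.

def run_calibration_loop(controller, interval_s=60.0, gain=0.5):
    """Periodically estimate and correct amplitude/frequency drift.

    `controller` is assumed to expose:
      - measure_amplitude_error(): fractional amplitude error of the drive
      - measure_frequency_error(): frequency offset from the qubit transition (Hz)
      - adjust_amplitude(delta), adjust_frequency(delta_hz): apply corrections
    """
    while True:
        amp_err = controller.measure_amplitude_error()
        freq_err = controller.measure_frequency_error()

        # Apply only a fraction of the measured error each cycle so the loop
        # converges on the target without overshooting.
        controller.adjust_amplitude(-gain * amp_err)
        controller.adjust_frequency(-gain * freq_err)

        time.sleep(interval_s)
```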
Using this method, it is now possible to develop quantum computers capable of conducting single-qubit gate operations (those acting on one qubit at a time, as opposed to gates that involve multiple qubits) with nearly zero errors at large scales.
This could lead to more efficient quantum computers in general. Per the study, the work achieves a new state-of-the-art single-qubit gate error and provides a breakdown of all known sources of error, accounting for most of the errors produced in single-qubit gate operations.
This means engineers who build quantum computers with the trapped-ion architecture and developers who create the algorithms that run on them won’t have to dedicate as many qubits to the sole purpose of error correction.
By reducing the error, the new method reduces the number of qubits required and the cost and size of the quantum computer itself, the researchers said in the statement.
This isn’t a panacea for the industry, however, as most quantum algorithms also require two-qubit gates, working alongside the single-qubit gates, to perform computations beyond rudimentary functions. The error rate for two-qubit gates is still roughly 1 in 2,000.
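To see why the two-qubit figure matters so much, consider how per-gate errors compound over a full circuit. Assuming gate errors are independent (a simplification) and using hypothetical gate counts, a rough Python estimate:

```python
# Rough illustration of why two-qubit gates dominate a circuit's error budget.
# Gate counts are hypothetical; only the two error rates come from the article.
single_qubit_error = 1.5e-7   # ~1 in 6.7 million (this study)
two_qubit_error = 1 / 2000    # ~1 in 2,000 (current typical figure)

n_single = 10_000   # hypothetical circuit with 10,000 single-qubit gates
n_two = 1_000       # and 1,000 two-qubit gates

# Probability the whole circuit runs without a single gate error,
# assuming errors occur independently at each gate.
p_success = (1 - single_qubit_error) ** n_single * (1 - two_qubit_error) ** n_two
print(f"chance the whole circuit runs error-free: {p_success:.1%}")
# Nearly all of the failure probability comes from the two-qubit gates.
```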
While this study represents an important step toward practical, utility-scale quantum computing, it doesn’t address all of the “noise” problems inherent in complex, multi-qubit operations.