
Quantum Error Correction

Quantum Error Correction Report

Discover the most comprehensive review of quantum error correction (QEC), the defining challenge for today’s quantum computers. Packed with cutting-edge research, analysis and insights from top industry experts, this three-part report breaks down the complexities of QEC in an accessible way.

Download the Quantum Error Correction Report

Why do quantum computers need error correction?

Today’s quantum computers have high error rates – around one error in every few hundred operations. These errors arise primarily from the fragile nature of qubits, whose quantum states are disturbed by environmental noise and decoherence.

Once we reduce this error rate to one in a million (referred to as the MegaQuOp regime), truly useful applications will start being unlocked, with larger algorithms requiring error rates of one in a billion or even one in a trillion. However, it is unlikely that qubit and/or quantum algorithm improvements alone will be enough to run algorithms with billions of operations reliably. For that, we will need quantum error correction (QEC).
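To see why error rates gate algorithm size, a rough back-of-envelope model (illustrative only, not from the report) helps: the chance that a circuit of N operations runs without a single error is approximately (1 − p)^N, where p is the per-operation error rate.

```python
# Back-of-envelope: probability a circuit of n_ops operations
# completes with no errors, given per-operation error rate p.

def success_probability(p, n_ops):
    return (1.0 - p) ** n_ops

# Today's hardware: roughly one error per few hundred operations.
today = success_probability(1 / 300, 1_000_000)

# The MegaQuOp regime: one error in a million operations.
megaquop = success_probability(1e-6, 1_000_000)  # roughly e^-1, about 37%
```

At today's error rates a million-operation algorithm essentially never finishes error-free, while at MegaQuOp-level rates it succeeds over a third of the time – and QEC is what bridges that gap for billion-operation algorithms.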

QEC is incredibly challenging, but it is also an essential technology that must be developed before the quantum computing revolution can begin.

Let’s dive into why.

Why is quantum error correction such a tough challenge to crack?

Scaling quantum computers to millions of qubits requires classical systems capable of processing vast amounts of data—up to 100 terabytes per second.

Quantum error correction is essential for ensuring reliable quantum operations (QuOps) by identifying and correcting qubit errors. This must scale to millions of quantum operations (MegaQuOps) and ultimately trillions (TeraQuOps) for quantum computers to fulfil their vast potential.

Learn more about the scale of the challenge

We are proud to partner with world-leading hardware partners and academic labs to deeply understand the quantum stack at all levels, from the qubits up.

Frequently asked questions

What is the difference between quantum error correction and quantum error mitigation?

Quantum error correction (QEC) and quantum error mitigation (QEM) are two different schemes to deal with noise in devices, which can cause errors in computation.

Quantum error correction methods use multiple physical qubits to represent a single logical qubit. Data is preserved by distributing the information across multiple qubits. Quantum decoders can then detect and correct any errors that occur during computation. Surface code is a popular QEC code.
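The idea of spreading one logical bit of information across several physical qubits can be sketched with the simplest QEC code of all: the 3-qubit repetition code, simulated classically for bit-flip errors only. (This is an illustrative toy, not the surface code, which generalises the same parity-check idea to a 2D grid of qubits.)

```python
import random

# Toy sketch: a 3-qubit repetition code protecting one logical bit
# against bit-flip errors, decoded by a simple syndrome lookup table.

def encode(logical_bit):
    """Distribute the logical bit across three physical qubits."""
    return [logical_bit] * 3

def apply_noise(qubits, p):
    """Flip each physical qubit independently with probability p."""
    return [q ^ 1 if random.random() < p else q for q in qubits]

def measure_syndrome(qubits):
    """Parity checks between neighbouring qubits; non-zero flags an error."""
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

def decode(qubits, syndrome):
    """Infer the most likely single bit-flip from the syndrome and undo it."""
    corrections = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    qubits = list(qubits)
    if syndrome in corrections:
        qubits[corrections[syndrome]] ^= 1
    return qubits

# One round: encode, corrupt, measure the syndrome, decode.
random.seed(0)
noisy = apply_noise(encode(0), p=0.1)
corrected = decode(noisy, measure_syndrome(noisy))
```

Real quantum codes must also handle phase errors and measure syndromes without disturbing the encoded state, but the decode step – inferring the most likely error from parity-check outcomes – is exactly the job a quantum decoder performs.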

By contrast, quantum error mitigation methods are employed to infer less noisy outcomes of quantum computations, rather than correcting them. This is often done by repeatedly running slightly different circuits and classically post-processing the results. As an example, zero noise extrapolation is a popular error mitigation method.
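Zero noise extrapolation can be sketched in a few lines: run the same circuit at deliberately amplified noise levels, then fit the measured expectation values and extrapolate back to the zero-noise limit. (Here `noisy_expectation` is a stand-in for real hardware runs, using an assumed linear noise model for illustration.)

```python
import numpy as np

# Sketch of zero noise extrapolation (ZNE). On real hardware, the noise
# scale is amplified by techniques such as gate folding; here we use a
# toy model where the expectation value decays linearly with noise.

def noisy_expectation(scale, ideal=1.0, decay=0.05):
    """Stand-in for a hardware run at a given noise amplification."""
    return ideal - decay * scale

# Measure at amplified noise scales 1x, 2x, 3x.
scales = np.array([1.0, 2.0, 3.0])
values = np.array([noisy_expectation(s) for s in scales])

# Fit a line and evaluate it at scale = 0: the zero-noise estimate.
slope, intercept = np.polyfit(scales, values, 1)
zero_noise_estimate = intercept
```

Note that nothing is corrected on the device itself – the mitigation happens entirely in classical post-processing, which is why QEM helps in the NISQ era but cannot replace full error correction.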

QEM methods provide a reduction in noise that can be useful in the NISQ (noisy intermediate-scale quantum) era, as restraints in quantum hardware can make full quantum error correction less feasible. However, for useful computation involving many qubits and deep circuits, full quantum error correction will be necessary.

What is the difference between error corrected and fault tolerant quantum computers?

Quantum error correction and fault-tolerant quantum computing are often used interchangeably in informal settings. To experts, the terms have slightly different meanings.

Quantum error correction is a scheme for protecting information from noise in a device.

Fault-tolerance builds on this. Fault-tolerant quantum computers also prevent errors from spreading during the error correction process or during a computation. It is a richer and broader subject.

To build a useful quantum computer, we really do need fault-tolerance and not just error correction. At Riverlane, we are really solving the fault-tolerance problem, but often just call out “quantum error correction” since this phrase is more commonly known by non-experts.

Is correcting quantum errors just about adding more, better qubits?

While improved quality and quantity of qubits are critical, that's only part of the equation.

Quantum error correction (QEC) is a multi-faceted problem. On the qubit side, we need physical qubit error rates below the QEC threshold with a low ‘QEC overhead’ – that is, the ratio of physical qubits to logical qubits.
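The interplay between threshold and overhead can be made concrete with the standard rough scaling relations often quoted for the surface code (illustrative textbook approximations, not a Riverlane-specific model):

```python
# Illustrative surface-code scaling. Below the threshold error rate,
# the logical error rate falls exponentially in the code distance d,
# while the qubit overhead grows only quadratically.

P_THRESHOLD = 0.01  # ~1% threshold commonly quoted for the surface code

def logical_error_rate(p, d):
    """Rough scaling: p_L ~ 0.1 * (p / p_th)^((d + 1) / 2)."""
    return 0.1 * (p / P_THRESHOLD) ** ((d + 1) // 2)

def physical_qubits_per_logical(d):
    """Surface-code overhead: roughly 2 * d^2 physical qubits."""
    return 2 * d * d

# At p = 0.1% (10x below threshold), distance 25 reaches ~1e-14 logical
# error rates at a cost of ~1250 physical qubits per logical qubit.
p_logical = logical_error_rate(0.001, 25)
overhead = physical_qubits_per_logical(25)
```

This is why being below threshold matters so much: each step further below it buys exponentially more protection for the same quadratic qubit cost.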

Once we reach a qubit fidelity of 99.9%, we can introduce a set of classical QEC technologies to solve this complex, vast, real-time data processing problem.

As quantum computers scale, an additional layer of classical QEC solutions is required to tackle the increasing error numbers and types, addressing three broad challenges:

1.  Developing complex QEC codes and decoders: QEC pairs error-correcting codes with sophisticated inference algorithms (decoders) that determine the most likely error to have occurred, given a specific syndrome value.
2.  Performing at fast speeds: each QEC round must be fast (<1μs) and deterministic (respond promptly at well-determined times) to avoid delays that lead to uncorrectable errors.
3.  Dealing with massive data volumes: these algorithms require an extremely high instruction bandwidth, scaling to 100TB/s – the equivalent of a single quantum computer processing and correcting Netflix’s total global streaming data every second.
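A back-of-envelope calculation (illustrative assumptions only; real figures depend heavily on the architecture) shows where these data rates come from:

```python
# Rough sketch of raw syndrome data rates in a large QEC machine.
# Assumptions are illustrative: one syndrome bit per physical qubit
# per QEC round, with one round every microsecond.

physical_qubits = 1_000_000          # a million-qubit machine
bits_per_qubit_per_round = 1         # raw syndrome bit per qubit per round
round_time_s = 1e-6                  # one QEC round per microsecond

bits_per_second = physical_qubits * bits_per_qubit_per_round / round_time_s
terabytes_per_second = bits_per_second / 8 / 1e12

print(f"{terabytes_per_second:.3f} TB/s of raw syndrome data")
```

Even this minimal model yields ~0.125 TB/s of raw syndrome bits; richer readout information (multiple bits of analog data per measurement) and machines scaling beyond a million qubits multiply this figure, pushing total classical bandwidth demands toward the 100TB/s quoted above.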

So, while high-quality qubits are essential, they’re only part of the solution.

Scaling quantum computers to their full potential demands a seamless integration of quantum and classical technologies.