Quantum decoders detect and correct the flood of data errors unique to quantum computers. They are a fundamental component of the quantum computing stack and vital to scaling up quantum computers.
A new paper, published in the journal Physical Review X (PRX), introduces a set of quantum decoders that outperform all other fast decoders, enabling better results in today's quantum error correction experiments and opening up new areas for theoretical case studies.
Quantum error correction is quantum computing’s defining challenge. It is a set of techniques used to protect the information stored in qubits from errors and decoherence caused by noise.
Quantum decoders rely on quantum error correction codes. These codes combine multiple physical qubits, each with a relatively high error rate, into a single 'logical' qubit, enabling errors to be identified through a decoding process.
This increases the logical fidelity (or accuracy), making the computation more reliable. However, most fast decoders neglect important noise characteristics (noise being the interference that leads to data errors), and ignoring these characteristics reduces the accuracy of the resulting computation.
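The core idea of encoding and decoding can be seen at toy scale. The sketch below is my own illustration using the simplest possible example, a three-qubit bit-flip repetition code, and is not the decoder or codes from the paper: one logical bit is spread across three noisy physical bits, and parity checks (the 'syndrome') let a decoder identify and correct the most likely error.

```python
import random

# Toy 3-qubit bit-flip repetition code (illustrative only; the paper
# concerns surface codes and circuit-level noise, which are far richer).

def encode(logical_bit):
    # One logical bit is stored redundantly across three physical bits.
    return [logical_bit] * 3

def apply_noise(bits, p):
    # Flip each physical bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def syndrome(bits):
    # Parity checks between neighbouring bits reveal errors without
    # reading out (and destroying) the encoded logical value.
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    # Look-up-table decoder: map each syndrome to the most likely
    # single-bit error, correct it, then read out by majority vote.
    flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(bits)]
    corrected = list(bits)
    if flip is not None:
        corrected[flip] ^= 1
    return int(sum(corrected) >= 2)

random.seed(0)
p, trials = 0.05, 10_000
raw_failures = sum(apply_noise([0], p)[0] for _ in range(trials))
logical_failures = sum(decode(apply_noise(encode(0), p)) for _ in range(trials))
print(raw_failures / trials)      # close to p: the bare physical error rate
print(logical_failures / trials)  # much smaller: decoding suppresses errors
```

The encoded bit fails only when two or more physical bits flip, so the logical error rate scales roughly as 3p² rather than p; this is the sense in which decoding makes noisy qubits "look better".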
In the paper, we introduce quantum decoders that are both fast and accurate, and that work with a wide class of quantum error correction codes. This new set of decoders lowers the bar for qubit quality: they make the qubits in a quantum computer look better by making them appear to produce fewer errors.
After the preprint of our paper was posted on arXiv, the Google Quantum AI team used our belief-matching decoder in their landmark paper demonstrating quantum error correction. Belief-matching was an important component in Google's experiment, as it was the only efficient decoder capable of suppressing errors in their device.
This was an amazing project to work on. It began during my time at AWS and finished after I started at Riverlane.
You can read the full paper, 'Improved decoding of circuit noise and fragile boundaries of tailored surface codes', here.