Recently, I read an excellent book about modelling the climate. For me, the powerful message was not that climate change is happening but that we cannot predict how it will affect future generations.
No matter how much classical computational power is available, you cannot make accurate predictions about the future of our planet. So, the most sensible course of action is to focus on reducing emissions to mitigate the effects of climate change.
This links to the main reason why I (together with many others) want to make quantum computing a reality, sooner. Quantum computers have the power to simulate the atomic (quantum) processes that happen in nature and tackle many of the causes associated with climate change.
The knowledge that we can gather from these simulations is key, for example, to build better batteries. Better batteries are the gateway to improved energy storage facilities, better electric cars and the electrification of commercial aircraft. And battery development is just one example. Quantum computers could also be used to support the design of new key technologies, like jet-engines, by augmenting our existing HPC capabilities.
There are many climate-related use cases that quantum computers could unlock. Individually, each of these may help reduce our equivalent CO2 emissions. If you add all the potential use cases together, the reduction will be more noticeable. But we need to move fast because the effects of climate change are happening now, and we still do not understand the implications for our future.
“Move fast” for me means to ensure the first generation of HPC integrated quantum computers are capable of efficiently simulating quantum processes. The word efficient is key: we want to compute faster and with a lower energy footprint than a classical computer.
It is interesting to see how both speed and energy consumption are related to one key aspect of quantum computations: how we perform quantum error correction.
The building blocks of every quantum computer, the qubits, are affected by errors to such an extent that, left to their own devices, they cannot perform any valuable computation.
These errors can be corrected by adding redundancy to the process and by “grouping qubits together” (i.e. mapping many physical qubits into one “logical” qubit).
Logical qubits can be made (almost) error-free and they can be used to perform increasingly complex computations.
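The redundancy idea can be illustrated with a classical analogue. Real quantum error correction is considerably more involved (it must also handle phase errors, and measurements cannot simply copy quantum states), but a three-copy repetition code with majority voting shows why grouping many noisy units into one "logical" unit suppresses errors. All numbers below are illustrative, not taken from the article:

```python
import random

def encode(bit):
    # Redundancy: one "logical" bit is stored as three "physical" copies.
    return [bit, bit, bit]

def apply_noise(copies, p):
    # Each physical copy flips independently with probability p.
    return [b ^ 1 if random.random() < p else b for b in copies]

def decode(copies):
    # Majority vote recovers the logical bit if at most one copy flipped.
    return 1 if sum(copies) >= 2 else 0

# With per-copy error rate p, a logical error needs two or more flips,
# so the logical error rate is roughly 3*p**2 -- much smaller than p.
p = 0.05
trials = 100_000
random.seed(0)
errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate: {p}, logical error rate: {errors / trials:.4f}")
```

The same trade-off appears in the quantum case: error rates drop, but only by spending extra physical qubits (and, as discussed next, extra time).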
But this quantum error correction process, whilst necessary, comes with penalties in terms of computation speed and power consumption – and we must carefully manage this overhead.
The overall computational speed is reduced significantly due to temporal redundancy. Qubits must be measured and manipulated multiple times to achieve a single step of computation.
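The impact of this temporal redundancy on runtime can be sketched with hypothetical numbers (both constants below are assumptions for illustration, not figures from the article):

```python
# Hypothetical illustration of temporal overhead: if each logical step
# requires many physical measure-and-correct rounds, the logical clock
# runs that many times slower than the physical one.
PHYSICAL_ROUND_US = 1.0          # assumed: one measure/correct round, in microseconds
ROUNDS_PER_LOGICAL_STEP = 1000   # assumed: redundancy in time

def logical_runtime_us(logical_steps):
    # Total wall-clock time for a logical computation, in microseconds.
    return logical_steps * ROUNDS_PER_LOGICAL_STEP * PHYSICAL_ROUND_US

print(f"1e6 logical steps take {logical_runtime_us(1_000_000) / 1e6:.0f} s")
```

Under these assumptions, a million logical steps take a thousand seconds of wall-clock time, even though each physical round lasts only a microsecond.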
As a rule of thumb, if we can implement the same (almost) error-free computation with fewer qubits, we are also reducing the total execution time.
This is not easy. Making more out of fewer qubits requires building improved or novel qubit solutions, sufficiently fast error-identification components, an understanding of how to implement each step of the computation efficiently, and knowing how to optimise the process of measuring and manipulating the qubits.
Furthermore, the power consumption “per unit of time” of the system is unavoidably increased by adding the error correction layer. What matters here is how many qubits are involved in the process.
Each qubit requires energy to be operated and manipulated. For a superconducting system (one of the leading types of quantum computer), each qubit adds roughly 10W to the overall power consumption (mostly in the digital-to-analogue and analogue-to-digital radio frequency chains). We predict between 10,000 and 100,000 qubits will be required to perform efficient computations, thus requiring an additional 100kW to 1MW.
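The back-of-envelope arithmetic behind that estimate, using the article's figures of roughly 10W of control electronics per qubit:

```python
# Back-of-envelope power overhead for the control electronics of a
# superconducting quantum computer.
WATTS_PER_QUBIT = 10  # mostly the RF DAC/ADC chains serving each qubit

for qubits in (10_000, 100_000):
    power_kw = qubits * WATTS_PER_QUBIT / 1000
    print(f"{qubits:>7} qubits -> {power_kw:>6.0f} kW ({power_kw / 1000:.1f} MW)")
```

So the control electronics alone span roughly 100 kW at 10,000 qubits up to 1 MW at 100,000 qubits, before counting cryogenics or classical co-processing.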
We can reduce the overhead per qubit by, for example, designing low-power electronics for detecting errors and manipulating qubits. We can also make smart design decisions by measuring the overall performance of the integration.
Thus, we can build quantum computers in a way that can help tackle climate change-related problems without releasing excessive additional CO2 into the atmosphere. Any other strategy would be nonsensical.
However, this is a scenario we have seen played out with other technologies, like AI, which focused purely on computational performance at the beginning and then retrofitted remedies to rein in power consumption.
The quantum computing community must learn from this short-sightedness. We must maintain a focus on both the efficiency and performance of quantum computers from the start.
It’s a savvy, long-term strategy that will clear the runway for quantum computers as they continue to scale, with no nasty power-consumption surprises in the future. This, in turn, will speed up the development of quantum computers, allowing them to overtake classical computers on the performance-per-watt metric and unlock more climate-related use cases, sooner.
This article was first published in embedded.com on 19th July 2024.