It now seems only a matter of time before an experiment demonstrating quantum advantage (a quantum computer outperforming the most powerful classical computer) is officially announced. To verify unequivocally that this feat has been achieved, we need a way to compare classical and quantum machines. Here, we explain some of the challenges, particularly relating to power consumption and cost, that we face when making such a comparison.
In classical high-performance computing, there are established protocols for races between the most powerful machines: a benchmark called LINPACK measures how quickly a machine can solve a large, dense system of linear equations. The fastest computer wins.
However, different computer architectures are designed to solve different problems, and quantum computers are entirely unsuited to the LINPACK challenge. This is where Google and the NASA Quantum Artificial Intelligence Laboratory (QuAIL) step in.
In May 2019, scientists from Google and NASA published a paper in which the world’s most powerful public supercomputer, Summit, completed a mammoth simulation of quantum chaos – with the aim of replicating the result on a quantum computer and comparing the two calculations for an unequivocal verification of quantum advantage.
The paper states: “On Summit, we were able to achieve a sustained performance of 281 Pflop/s (single precision) over the entire supercomputer, simulating [quantum] circuits of 49 and 121 qubits.”
But there’s a problem. To carry out this simulation, Summit required 21.1 MWh of energy. Running the same simulation on NASA’s less power-efficient Electra supercomputer would take 96.8 MWh.
As long as we want to simulate circuits of fewer than 49 qubits, we can use Summit’s computation for a one-to-one comparison. For future comparisons, however, we may be in a bit of a pickle. The memory and computation needed to simulate a quantum circuit roughly double with every qubit added, so proving quantum advantage on a quantum computer with just one additional qubit means doubling the energy supplied: about 42 MWh for Summit and roughly 194 MWh for Electra.
As you add more qubits, the situation gets exponentially worse. Add four qubits and the energy requirements climb to over 330 MWh and roughly 1,550 MWh respectively. Ten extra qubits? That’s over 21,600 MWh and nearly 100,000 MWh.
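To make the scaling concrete, here is a minimal Python sketch of the doubling rule, using only the baseline figures quoted above (the function and variable names are ours, purely for illustration):

```python
# A minimal sketch of the doubling rule: simulation energy doubles with
# every qubit beyond the 49-qubit baselines quoted in this article
# (21.1 MWh for Summit, 96.8 MWh for Electra).

BASELINE_QUBITS = 49
BASELINE_MWH = {"Summit": 21.1, "Electra": 96.8}

def simulation_energy_mwh(machine: str, qubits: int) -> float:
    """Estimated energy (MWh) to simulate a circuit of `qubits` qubits,
    assuming the cost doubles with each qubit beyond the baseline."""
    return BASELINE_MWH[machine] * 2 ** (qubits - BASELINE_QUBITS)

for extra in (1, 4, 10):
    summit = simulation_energy_mwh("Summit", BASELINE_QUBITS + extra)
    electra = simulation_energy_mwh("Electra", BASELINE_QUBITS + extra)
    print(f"+{extra} qubits: Summit {summit:,.0f} MWh, Electra {electra:,.0f} MWh")

# +1 qubits: Summit 42 MWh, Electra 194 MWh
# +4 qubits: Summit 338 MWh, Electra 1,549 MWh
# +10 qubits: Summit 21,606 MWh, Electra 99,123 MWh
```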
For reference, the entire planet uses approximately 2×10^10 MWh of electricity every year.
Now, let’s consider the financial cost. At residential pricing, a kWh costs around 12p, or £120 per MWh. So, the Summit simulation cost around £2,500 to run. Add four qubits, at over 330 MWh, and that rises to around £40,000. I’m sure Google could afford that.
If you were to simulate Google’s proposed 72-qubit device with this method, you’d need to find £21 billion. Even Google may struggle to find that down the back of the couch.
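The same back-of-the-envelope arithmetic, extended with the £120-per-MWh price, reproduces these costs (again, just an illustrative sketch, not a figure from the paper):

```python
# A rough cost sketch layered on the same doubling rule, using the
# residential price of £120 per MWh quoted above.

PRICE_GBP_PER_MWH = 120.0

def summit_simulation_cost_gbp(qubits: int) -> float:
    """Cost in GBP of simulating `qubits` qubits on Summit, assuming the
    21.1 MWh bill at 49 qubits doubles with each additional qubit."""
    return 21.1 * 2 ** (qubits - 49) * PRICE_GBP_PER_MWH

for q in (49, 53, 72):
    print(f"{q} qubits: £{summit_simulation_cost_gbp(q):,.0f}")

# 49 qubits: £2,532
# 53 qubits: £40,512
# 72 qubits: £21,239,955,456  (about £21 billion)
```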
The paper clearly concedes this point and states: “Today’s HPC data centers are usually built within the constraints of available energy supplies, rather than the constraints in hardware costs. For example, the Summit HPC system at Oak Ridge National Laboratory has a total power capacity of 14 MW available to achieve a design specification of 200 Pflop/s double-precision performance. To scale such a system by 10x would require 140 MW of power, which would be prohibitively expensive. Quantum computers, on the other hand, have the opportunity to drastically reduce power consumption.”
This is an important point. If Summit is already at its efficiency limit, then every additional qubit will demand an unfeasible amount of energy to simulate. While it is certainly possible that improvements will be made in classical computing power efficiency, these will not keep pace with quantum computing developments within the confines of the power required to run the simulations. In that context, it is worth noting that quantum computers require orders of magnitude less power than their classical counterparts. The QPU in Google’s paper had an energy cost of just 4.2×10^-4 MWh (0.00042 MWh).
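For a sense of scale, dividing the two figures quoted in this article puts the gap at roughly fifty thousand times:

```python
# Comparing Summit's energy bill with the QPU figure quoted above.

summit_mwh = 21.1     # Summit, 49-qubit simulation
qpu_mwh = 4.2e-4      # Google's QPU, as quoted in the paper

print(f"Summit used roughly {summit_mwh / qpu_mwh:,.0f}x the QPU's energy")
# Summit used roughly 50,238x the QPU's energy
```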
Behind all the fanfare and excitement around quantum advantage, it’s important to remember that future verification will be less straightforward. The comparison that Summit offers is just a snapshot in time. Once quantum computers leap firmly ahead of classical computers, we may quite literally be left in the dark as to how they compare.
This is why at Riverlane we are looking forward to the first demonstration of quantum advantage where a quantum computer is used to investigate a problem of real-world relevance that has resisted all classical attempts at a solution. With many potential demonstrations in chemistry, machine learning and more, the future of quantum computing is bright.