@jschauma And it didn't do a computation. It ran an analog simulation of an analog phenomenon. We already know that this is the (likely only) thing quantum "computing" is good for.
Rich Felker (dalias@hachyderm.io)'s status on Wednesday, 11-Dec-2024 11:48:58 JST
Jan Schaumann (jschauma@mstdn.social)'s status on Wednesday, 11-Dec-2024 11:48:59 JST
What Google did was demonstrate exponential error correction, which is critical due to qubit fragility, requiring multiple physical qubits to be combined into one logical qubit to avoid a collapse into a classical state.
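The suppression described above can be illustrated with a toy repetition code under bit-flip noise. This is a deliberately simplified sketch of the general idea (Willow actually uses a surface code, which is far more involved); the function name and parameters here are my own illustration, not anything from Google's work:

```python
import random

def logical_error_rate(p, d, trials=100_000):
    """Estimate the logical error rate of a distance-d repetition code
    under independent bit-flip noise with physical error rate p.
    Decoding is a simple majority vote over the d physical qubits."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(d))
        if flips > d // 2:  # majority of physical qubits flipped -> logical error
            errors += 1
    return errors / trials

# Below the (toy) threshold, each increase in code distance
# suppresses the logical error rate roughly exponentially:
for d in (1, 3, 5, 7):
    print(d, logical_error_rate(0.1, d))
```

The point the post makes is visible in the output: spending more physical qubits per logical qubit buys exponentially fewer logical errors, provided the physical error rate is below threshold.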
But the calculations they performed are not generic, general purpose calculations. Instead, they ran a specific type of computation, the random circuit sampling (RCS) benchmark:
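As a rough illustration of what RCS involves (my sketch, not Google's benchmark code): the output probabilities of a Haar-random circuit acting on |0...0> follow the Porter-Thomas distribution, and cross-entropy benchmarking checks sampled bitstrings against exactly these heavy-tailed probabilities:

```python
import numpy as np

def random_circuit_probs(n_qubits, seed=0):
    """Apply a Haar-random unitary (QR decomposition of a complex
    Gaussian matrix, with the standard phase fix) to |0...0> and
    return the 2**n output probabilities."""
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    q = q * (np.diagonal(r) / np.abs(np.diagonal(r)))  # make the distribution Haar
    state = q[:, 0]          # first column is U|0...0>
    return np.abs(state) ** 2

probs = random_circuit_probs(10)
# For Porter-Thomas statistics, N*p is approximately Exp(1),
# so the empirical second moment of N*p should be close to 2.
print(np.mean((probs.size * probs) ** 2))
```

Sampling from this heavy-tailed distribution is what's believed to be classically hard at scale; it is a benchmark tailored to the hardware, not a general-purpose computation.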
Jan Schaumann (jschauma@mstdn.social)'s status on Wednesday, 11-Dec-2024 11:49:00 JST
This is a pretty cool step forward in quantum computing. But the reporting I've seen, or perhaps the takeaways people assume after skimming the headlines, is often wildly off the mark.
Part of the problem is in the phrasing, saying Google's chip can solve a problem in 5 minutes that a classical computer can't solve in "a quadrillion times the age of the universe".
This is misleading.
Jan Schaumann (jschauma@mstdn.social)'s status on Wednesday, 11-Dec-2024 11:49:01 JST
Google announced a new #quantum chip, "Willow":
https://blog.google/technology/research/google-willow-quantum-chip/

This chip demonstrates exponential error correction:
https://research.google/blog/making-quantum-error-correction-work/

Their research in Nature:
https://www.nature.com/articles/s41586-024-08449-y