Google's new Willow quantum chip

Google’s new Willow quantum chip:

“lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.”

1 Like

“lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.”

Is this just speculation, or can a strong argument be made for this fancy connection? It seems too incredible from my totally ignorant view of quantum computing, and I don’t understand why such fast computational performance relative to classical computers should hint in that direction.

1 Like

Most scientists say the connection between quantum computing and parallel universes is largely philosophical and interpretative, but it’s apparently something that excites the Google researcher.

3 Likes

As a quantum physicist, I can confirm that the statement “lends credence to the notion that quantum computation occurs in many parallel universes” is BS. There is nothing going on here but standard quantum mechanics. Whether quantum mechanics is best interpreted in the “multiverse” framework, and to what extent that interpretation is purely philosophical or has an empirical component, is an interesting and open question. But this quantum computing setup, like any other experimental setup I’m aware of, has little to nothing to say about which interpretation of quantum mechanics is correct.

22 Likes

According to some distinguished physicists, it is just Bold Science.

Second, Willow performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion (that is, 10^25) years — a number that vastly exceeds the age of the Universe.
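For scale, a quick back-of-the-envelope check of that quoted figure (the ~13.8-billion-year age of the Universe is my own added number, not part of Google’s post):

```python
# Rough scale check on the quoted figure (not an independent benchmark):
# how many times the age of the Universe is 10^25 years?

claimed_runtime_years = 1e25        # "10 septillion years" from the Google post
age_of_universe_years = 13.8e9      # ~13.8 billion years (my added assumption)

ratio = claimed_runtime_years / age_of_universe_years
print(f"{ratio:.1e} times the age of the Universe")   # ~7.2e+14
```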

I am very curious whether the same thing will repeat that happened the last time IBM announced something similar-sounding: in their Nature publication from last year, they stated that the most complicated quantity they measured would require such otherworldly amounts of RAM that they didn’t even think about the potential runtime. Some theorists then took that personally and 12 days later put out this preprint, in which they wrote (this passage was removed from the final published version, so it appears only in their original preprint):

Specifically, for every expectation value obtained by the processor and plotted in Fig. 3 of Ref. [7] we obtain a corresponding value to orders of magnitude better accuracy with a simulation that takes less than 7 minutes to run on a single Intel Skylake CPU and a state that takes up, at most, 0.3GB of memory.
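For context on those numbers: a naive full-statevector simulation at that scale is out of the question, which is what makes the 0.3 GB figure striking; the preprint clearly used some compressed or approximate representation instead. Here is a trivial memory estimate; the 127-qubit device size is my assumption about the processor in IBM’s paper:

```python
# Memory needed to store a full statevector of n qubits, versus the
# "at most, 0.3GB" quoted from the preprint above.

n_qubits = 127                                 # my assumption about IBM's processor
bytes_per_amplitude = 16                       # complex128: two 8-byte floats
statevector_bytes = float(2 ** n_qubits) * bytes_per_amplitude

preprint_state_bytes = 0.3e9                   # "at most, 0.3GB of memory"

print(f"full statevector : {statevector_bytes:.2e} bytes")       # ~2.7e+39 bytes
print(f"preprint's state : {preprint_state_bytes:.2e} bytes")
print(f"ratio            : {statevector_bytes / preprint_state_bytes:.1e}")
```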

5 Likes

I was searching for this ref, thank you

No. It’s great engineering, but it is based only on quantum mechanics, under any interpretation you choose, and that applies to all quantum computers (my understanding; I didn’t look into this one specifically). Or even just on classical physics, since the two have been proven equivalent, if I understood this great interview and his paper correctly (the multiverse point is addressed at my chosen time stamp, but by all means watch the whole interview from the start):

I added the bold below; the italics are his. In his words, this development “boringfies” quantum mechanics (a small numerical sketch of the simplest piece of this stochastic-quantum dictionary follows the quoted excerpts):

This paper introduces an exact correspondence between a general class of stochastic systems and quantum theory. This correspondence provides a new framework for using Hilbert-space methods to formulate highly generic, non-Markovian types of stochastic dynamics, with potential applications throughout the sciences. […] In addition, this reconstruction approach opens up new ways of understanding quantum phenomena like interference, decoherence, entanglement, noncommutative observables, and wave-function collapse.
[…]
In what follows, it will be important to keep in mind the distinction between deterministic hidden-variables theories and stochastic hidden-variables theories.

Bell’s original nonlocality theorem, as formulated and proved in 1964 [71], only addressed the case of a deterministic hidden-variables theory.
[…]
Bell’s 1964 theorem therefore establishes that any purported formulation of quantum theory based on local deterministic hidden variables is ruled out empirically.

At first glance, there might have seemed to be just two available options in response to Bell’s nonlocality theorem. Either one could accept a theory of nonlocal deterministic hidden variables, or one could deny the existence of nonlocal deterministic hidden variables.
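To make the simplest piece of that dictionary concrete, here is a toy numerical check (my own sketch, not the paper’s construction, which covers much more, including non-Markovian dynamics): the squared magnitudes of any unitary matrix form a valid stochastic transition matrix, so unitary evolution can be read as transition probabilities for a classical random process on the same configuration space.

```python
# Toy illustration: every unitary U induces a stochastic ("unistochastic")
# transition matrix via Gamma[i, j] = |U[i, j]|**2.

import numpy as np

rng = np.random.default_rng(0)
n = 4                                        # number of classical configurations

# Random unitary from the QR decomposition of a complex Gaussian matrix.
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
U, _ = np.linalg.qr(A)

Gamma = np.abs(U) ** 2                       # unistochastic matrix

print(np.allclose(Gamma.sum(axis=0), 1.0))   # columns sum to 1: valid transition probabilities
print(np.allclose(Gamma.sum(axis=1), 1.0))   # rows do too (doubly stochastic)

p0 = np.array([1.0, 0.0, 0.0, 0.0])          # start definitely in configuration 0
p1 = Gamma @ p0                              # one step of the induced stochastic process
print(p1, p1.sum())                          # still a probability distribution
```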

I would look into p-bits rather than just qubit-based quantum computers:

Bridging the gap between classical bits and quantum bits
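For anyone curious what a p-bit actually does, here is a minimal toy sketch of the update rule usually given in the probabilistic-computing literature, as I understand it (the couplings and biases are made up for illustration, not taken from the linked article); it amounts to Gibbs-style sampling of an Ising-type model.

```python
# Minimal p-bit network sketch: each bit fluctuates between -1 and +1 with a
# probability biased by the inputs it receives from the other bits.

import numpy as np

rng = np.random.default_rng(1)

n = 3
J = np.array([[0.0, 1.0, -0.5],              # symmetric couplings (made up)
              [1.0, 0.0,  0.8],
              [-0.5, 0.8, 0.0]])
h = np.array([0.1, -0.2, 0.0])               # per-bit biases (made up)
beta = 1.0                                   # inverse "temperature": 0 = fully random

m = rng.choice([-1, 1], size=n)              # random initial states

samples = []
for _ in range(10_000):
    for i in range(n):                       # update one p-bit at a time
        I_i = beta * (J[i] @ m + h[i])       # input from the other p-bits
        p_up = 1.0 / (1.0 + np.exp(-2.0 * I_i))   # P(m_i = +1) given that input
        m[i] = 1 if rng.random() < p_up else -1
    samples.append(m.copy())

print(np.mean(samples, axis=0))              # time-averaged "magnetization" of each p-bit
```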