Recent Progress in Quantum Algorithms

Quantum theory has acquired a reputation as an impenetrable theory accessible only after acquiring a significant theoretical physics background. One of the lessons of quantum computing is that this is not necessarily true: quantum computing can be learned without mastering vast amounts of physics, but instead by learning a few simple differences between quantum and classical information.

A well-written article in the CACM.

There's also slides for a talk by Rob Pike on the same topic.


See also ...

... N. David Mermin's wonderful paper From Cbits to Qbits: Teaching computer scientists quantum mechanics, which has taught me almost all the quantum mechanics I know (for what that's worth).

Actually that reputation is

Actually that reputation is as likely to be deserved as undeserved.

I've never understood this reputation

The most conceptually difficult aspect is elementary linear algebra, not known as an especially hard subject. I encourage people to look into quantum computing and not be put off by expectations that it will be full of weird voodoo and mind-bending philosophy. It really is a quite straightforward subject.

in some ways monads aren't that hard, either

Yet somehow they remain quite often befuddling? There's some lesson to learn about how things are presented, I'm guessing.

Some advanced math is useful...

Quantum mechanics and monads both need some math to understand. Or, to use a snowclone of the famous joke about QM: "anyone who thinks he understands monads doesn't."

Even physics students who don't know the ways a Hilbert space differs from a plain vector space can get it wrong. Most physics students--and I was one of them--initially get it wrong. They read about how one cannot simultaneously measure time and energy, or position and momentum, or polarization along two different axes. They see descriptions of how some "Heisenberg microscope" which tries to "see" the position of an electron must necessarily use light or other particles to probe the electron, thus "changing its momentum." They see QM presented in terms of the Uncertainty Principle, where certain pairs of observables--like position and momentum--have uncertainties whose product is bounded below. Knowing the position to some precision implies the momentum is _unknown_ reciprocally.

This is then, misleadingly, related to the "measurement problem" which is usually phrased in terms of trying to measure the position of something small and the very measurement then "disturbing" it. Even Bohr, Heisenberg, and Einstein argued (fruitfully!) for a long time over this.

Einstein, usually described as being unaccepting of quantum mechanics (ironic, given his seminal work on the photoelectric effect, which proved that light is quantum), actually helped to better clarify the foundations, this via the thought experiments of the Einstein-Podolsky-Rosen paper. (Cf. "the EPR paradox")

Later experiments (Aspect, others) and theoretical work (Andrew Gleason, Kochen and Specker, John Bell, Asher Peres, David Albert, Mermin, others) showed better just how "weird" the quantum world is. I can't do it justice here, but, basically, there appears to be no "local description of reality." This fits very closely with non-standard models of logic, constructivism/intuitionistic logic, and toposes.

And this weirdness gets popularized as "non-locality" and "spooky action at a distance," with incorrect speculation in science fiction and even in the mainstream as allowing faster-than-light, or even instantaneous, communication at great distances. It doesn't.

The double slit experiment really is the key. Feynman once said that all of quantum mechanics, at the core, is present in the double slit experiment. Measuring which _slit_ a photon, or electron, or even a larger particle (a Buckyball, even) goes through means the interference pattern is lost. This is NOT a "disturbing" of the particle, as the decision either to look at the interference pattern (accumulating on a photographic plate or CCD detector, for instance) OR to ask "which slit?" ("which way?") can be made LONG AFTER the particle must've gone through one slit or the other or _both_ slits.

And this decision can be made on individual photons, perhaps sent through the apparatus one at a time, one per second, or one per day, whatever. There is no "group effect" as water waves hitting slits in a jetty would show (the familiar water tank demonstration from high school physics).
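The arithmetic behind "interference is lost" can be sketched in a few lines of plain Python. This is a toy far-field model I'm supplying for illustration (the quadratic-phase approximation, the slit positions, and the wavenumber are all made-up parameters, not from the post): with both slits open you add *amplitudes* and then square, producing fringes; with which-way information you add *probabilities*, and the fringes vanish.

```python
import cmath

# Toy far-field model: each slit contributes a unit-magnitude amplitude
# exp(i * phase), where the phase depends on the path from slit to the
# screen position x. Geometry and units are arbitrary illustrations.
def amplitude(x, slit_pos, k=20.0):
    return cmath.exp(1j * k * (x - slit_pos) ** 2)  # quadratic-phase approx.

def intensity_both_slits(x):
    # No which-way information: amplitudes add FIRST, then we square.
    a = amplitude(x, -0.5) + amplitude(x, +0.5)
    return abs(a) ** 2

def intensity_which_way(x):
    # Which-way measured: probabilities add, and the cross term is gone.
    return abs(amplitude(x, -0.5)) ** 2 + abs(amplitude(x, +0.5)) ** 2

xs = [i / 100.0 for i in range(-200, 201)]
fringed = [intensity_both_slits(x) for x in xs]
flat = [intensity_which_way(x) for x in xs]

print(min(fringed), max(fringed))  # oscillates between ~0 and ~4: fringes
print(min(flat), max(flat))        # ~2.0 everywhere: no fringes
```

The only difference between the two functions is *where* the squaring happens; that one move is the entire double-slit mystery in miniature.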

I need to wrap this up. The math of quantum mechanics, involving Hilbert spaces, (non-cartesian) monoidal categories, commutators, etc., is just "different" from the ordinary math we get even in classes as advanced as linear algebra, i.e., just vector spaces.

There are some very fine introductory books on QM which fully expose the "modern" view. Mermin, David Albert, even the recent "How to Teach Quantum Mechanics to Your Dog," which I've only had the chance to skim.

Some of these present the "weirdness" by using examples of pairs like "black vs. white, heads vs. tails." A coin is sent through a filter which passes only "white" coins. Another such filter will confirm that only white coins are getting through, just as "ordinary logic" would suggest. We would say the coins "are" (really, truly) white coins.

Feed this stream of white coins through a black filter. No coins pass through. Because, as ordinary logic would have it, a coin can only be white or black.

But what happens if we insert a heads vs. tails filter in between these two filters described above?

Suddenly coins start coming through. Or 25% of them, at least. It's as if the second filter "changed reality." The coins that passed through the first filter as white are somehow converted to an even (*) mix of white and black coins; then, upon passing through the third filter, one in four of the originals emerges.

Weird. And this has nothing to do with any "the measurement apparatus disturbs the quantum process" effect, like Heisenberg's famous "microscope" example. No, things are much more fundamental than this. The math, and the logic, of the quantum world is just quite different. (Side note: one thing which has intrigued many researchers is the connection of QM to the Law of the Excluded Middle. Or lack of it. As we cannot say a particle "IS or IS NOT" in some state. We cannot say "Well, the photon must have gone through one slit or another slit, that's just plain A or not-A logic." Nope. There are some obvious connections to constructive/intuitionistic logic, and to topos theory.)

In category theory terms, we might say that QUANT, the category of quantum things, is different from VECT, SET, etc., where the classical, everyday world operates. John Baez, Christopher Isham, and others have written about this category-theoretic point of view.

Monads are often described in terms of "space suits" and "assembly lines" and other everyday kinds of metaphors, but the core mathematics is needed to grok them at the level of diagrams of mappings.

Bill Lawvere and Stephen Schanuel wrote a great book called "Conceptual Mathematics: A First Introduction to Category Theory." This book is accessible to bright high school students, with the right teacher.

I believe this kind of "post-modern" introduction to category theory, a bit of abstract spaces theory (Hilbert and Banach spaces), physics demonstrations of the sort mentioned above, and a parallel course in a modern functional programming language like Haskell would make an outstanding core curriculum.

LtU's own sigfpe (Dan Piponi) gave an enthusiastically-received talk at last summer's FP conference on aspects of this nexus, and his blog site frequently yields amazing work he's done. He also has a long explanation of quantum mechanics.

--Tim May

re: core curriculum idea

that would be a wonderfully interesting experiment.

Getting used to the weirdness...

I thought I'd add a comment on how the "weirdness" of QM came to seem _less_ weird to me over time. This may have a little bit to do with how language paradigms may seem initially weird and awkward, but can then become the "new normal" once one gets past the hump.

I was an avid reader of physics popularizations in the 60s, things like De Broglie's "Physics and Microphysics," Dirac's "Matter and Antimatter," and, especially George Gamow's "One, Two, Three....Infinity!" I didn't fully understand the real nature of QM, the stuff talked about in this thread (esp. in Rob Pike's slides). The "strange world of the quantum" (another book title of the era) just seemed strange, but familiar after all of this reading.

(And I'd been lucky enough as a kid to get a Gilbert Chemistry Set which had a spinthariscope in it, a little cardboard tube with a zinc sulfide screen at one end and a magnifying lens at the other. Oh, a little bit of some radioactive substance. So I could watch in my darkened room the flashes of alpha particles plowing to a stop in the ZnS. This, and a cloud chamber I built, and so on, made quantum physics pretty real.)

The full weirdness of QM only hit years later. This is the stuff that a lot of professors will say is outside the scope of a classroom. The so-called "shut up and calculate" point of view.

But it was the foundational stuff, the follow-ons to Bell's Theorem and the Kochen-Specker Theorem, that really gave us the possibility of quantum computation. Feynman realized this before almost anyone, back around 1959. All of the work on quantum teleportation and quantum bit cloning--including the "no cloning" theorem--has some resonances with our view in FP that replicating state is not a cheap or easy thing to manage. We can almost consider the evaluation of an expression to be like a "measurement" in QM: once made, it never changes. None of the weirdness in QM ever involves "changing history." Once a measurement is made, "all honest observers" will always agree on the outcome. There may be some weirdness in the point that we can no longer think of a box (like the box containing Schrodinger's cat) as having a definite state (aliveness or deadness of the cat) until after we have opened the box (made a measurement = evaluated an expression). But once opened, all observers (perhaps in that universe?) will agree on the measurement. This is much like referential transparency in FP.
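The no-cloning result mentioned above follows from linearity alone, and a tiny sketch makes it concrete. The standard textbook demonstration (my choice of example, not the poster's) uses a CNOT gate: it copies the two basis states perfectly, but applied to a superposition it produces an entangled pair rather than two independent copies.

```python
import math

# Two-qubit states as length-4 amplitude lists, basis |00>, |01>, |10>, |11>.
# CNOT (control = first qubit) swaps the |10> and |11> amplitudes.
def cnot(state):
    return [state[0], state[1], state[3], state[2]]

def tensor(a, b):
    # Kronecker product of two single-qubit states.
    return [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]

zero = [1.0, 0.0]
one = [0.0, 1.0]
s = 1.0 / math.sqrt(2.0)
plus = [s, s]  # the superposition (|0> + |1>)/sqrt(2)

# Copying basis states works: CNOT(|x>|0>) = |x>|x> for x in {0, 1}.
assert cnot(tensor(zero, zero)) == tensor(zero, zero)
assert cnot(tensor(one, zero)) == tensor(one, one)

# But the same linear map fails on the superposition:
cloned = cnot(tensor(plus, zero))  # (|00> + |11>)/sqrt(2): entangled
wanted = tensor(plus, plus)        # (|00> + |01> + |10> + |11>)/2

print(cloned)
print(wanted)
```

Since any cloner would have to be linear, and no linear map can send both basis states *and* their superpositions to product copies, perfect cloning is impossible. That is the precise sense in which "replicating state" is not free in the quantum world.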

Finally, the weirdness in the quantum world is what allows us to exist. The Pauli Exclusion Principle, which says that two particles may not occupy the same quantum state (if they are fermions, as electrons are), is why there's a maximum of two electrons in the inner orbital (e.g., hydrogen and helium), and so on through all of the usual s, p, d, and f shells. If the universe were "classical," and somehow we had particles orbiting nuclei, the orbits wouldn't be stable enough (just as solar systems can and do change over long periods), elements would not have the chemistry they do, crystals would not exist (probably), and the whole shebang would just be a chaotic mess of billiard balls bouncing around.

There are many things about the quantum world, obviously, that make things "work" around us.

So this is the hump I jumped over, way back. It sort of surprises me that Einstein never got comfortable with this world.

I believe there are very, very deep connections with the kind of logic so often discussed in PLT. I've mentioned a few connections. One of the biggest is not just the stuff about laziness and measurement as evaluation, but is about the "postmodern" world where different sets of postulates create different worlds or realities. We now have multiple geometries, to be picked for specific purposes (a bit like loading packages). We have standard and nonstandard analysis. We can choose to work in a world, a topos, where the Axiom of Choice applies....or one where it is not used.

In my speculative opinion, sophisticated PLs are giving us tools to build these worlds, to change the very systems of logic we use (e.g., linear logic, System F), and to use these tools to understand even "weird" reality.

--Tim May


[[ It sort of surprises me that Einstein never got comfortable with this world. ]]

It surprises me that you are comfortable with a world where there are non-local objects, and where, even though it's possible to change the state of those non-local objects instantaneously, those changes cannot be used to send information faster than c.

Bell's Theorem

I'm probably naive...

... but I knew that the "measurement problem" was a red herring, ever since the Aspect experiments in 1982 (see _In Search of Schroedinger's Cat_ for an excellent popularization that doesn't seem to oversimplify). I then thought a simple summary of the quantum world is just "both the past and the future influence the present," and that it's wonderful how there happens to be some understandable math that explains exactly how.


Please pardon the OT, but:

No physical theory in at least 400 years has posited that time flows in only one direction. Not Newtonian mechanics; not Special Relativity; not General Relativity; not Quantum Mechanics; not String Theory (which isn't even science); not Brane Theory (which isn't even science). Schrödinger's equation is a nice generalization of the Hamilton-Jacobi equation, and that's really about it.