Conservative Logic

Edward Fredkin and Tommaso Toffoli from the MIT Laboratory for Computer Science present Conservative Logic...

a comprehensive model of computation which explicitly reflects a number of fundamental principles of physics, such as the reversibility of the dynamical laws and the conservation of certain additive quantities (among which energy plays a distinguished role). Because it more closely mirrors physics than traditional models of computation, conservative logic is in a better position to provide indications concerning the realization of high-performance computing systems, i.e., of systems that make very efficient use of the "computing resources" actually offered by nature. In particular, conservative logic shows that it is ideally possible to build sequential circuits with zero internal power dissipation. After establishing a general framework, we discuss two specific models of computation. The first uses binary variables and is the conservative-logic counterpart of switching theory; this model proves that universal computing capabilities are compatible with the reversibility and conservation constraints. The second model, which is a refinement of the first, constitutes a substantial breakthrough in establishing a correspondence between computation and physics. In fact, this model is based on elastic collisions of identical "balls," and thus is formally identical with the atomic model that underlies the (classical) kinetic theory of perfect gases. Quite literally, the functional behavior of a general-purpose digital computer can be reproduced by a perfect gas placed in a suitably shaped container and given appropriate initial conditions.
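The binary model's primitive is the Fredkin (controlled-swap) gate. A minimal sketch in Python — illustrative code, not from the paper — of the two constraints the abstract names, reversibility and conservation of an additive quantity (here, the number of 1s):

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin gate (controlled swap): if the control c is 1, swap a and b."""
    return (c, b, a) if c == 1 else (c, a, b)

for bits in product((0, 1), repeat=3):
    # Reversible: the gate is its own inverse.
    assert fredkin(*fredkin(*bits)) == bits
    # Conservative: the number of 1s is never created or destroyed.
    assert sum(fredkin(*bits)) == sum(bits)

# Universality in sketch: fixing one input to a constant yields AND
# on the third output line (the other outputs are "garbage").
for x, y in product((0, 1), repeat=2):
    assert fredkin(x, y, 0)[2] == (x & y)
```

Setting constant inputs and discarding garbage outputs is how the paper recovers ordinary (irreversible) switching functions from this reversible, conservative primitive.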

The paper drew a small discussion in a forum thread, mostly saying it should be on the front page.

garbage collector is a refrigerator

Re:

The second model … constitutes a substantial breakthrough in establishing a correspondence between computation and physics.

No mention of Henry Baker, the incorrigible cross-disciplinarian? Surely, his Thermodynamics and Garbage Collection [¹, ², and ³] merits a reference? To lift a quote: “Computer scientists should have a knowledge of abstract statistical thermodynamics.”

Along those lines, Mike Stay

Along those lines, Mike Stay and the inimitable John Baez on Algorithmic Thermodynamics: http://johncarlosbaez.wordpress.com/2010/10/12/algorithmic-thermodynamics/

Glen Winskel has also done a fair amount of work on symmetry in the context of domain theory for concurrency: http://www.cl.cam.ac.uk/~gw104/. He doesn't talk about physics directly, but I think pushing the work in that direction could potentially be quite promising. Domain theory for concurrency <--> Operational transformation <--> commutative patches <--> darcs <--> symmetric quantum operators.

(note, not all arrows are equally commutative and symmetric in the above sketch :-))

Programs are symmetric under translations of where they run, and how fast they run. Concurrent constructs selectively break these symmetries. Question one -- what conserved quantities do these symmetries give rise to? Question two -- what properties can we recover from these broken symmetries?

'Conservative Logic' was

"Conservative Logic" was published in '81, so it is not surprising that it does not cite a paper published in '94.

A plea for Sokoban

I wonder whether these guys have ever played Sokoban instead of billiards and whether their fine conclusions then still hold?

(Don't try the link, it's addictive.)

Reversibility and Sokoban

Sure. Just press 'u'.

Ah! Meta-physics!

Hmmm, IANAP, but... I sincerely doubt that the universe saves state, and I wonder why so many physicists believe it does [do they?]. To me, there doesn't even seem to be evidence for that on a macroscopic scale. If you take a signal and an amplifier, you cannot reverse the output to see whether the original signal was amplified or attenuated (x * y = 6, what are x and y?). Also, if you take two gas tanks and connect them with a tube, there's no way of reconstructing the original pressures ((x + y)/2 = 6, what are x and y?).

I doubt the universe saves state, and I wonder whether there's any reason to believe so on a microscopic scale.

Though your comment makes a lot of sense on many levels, both meta-physical and the level I am playing now. ;)

[ This may, or may not, have anything to do with the fine article, which I am not even half-way through. ]

They think that because the

The basic laws of physics are reversible. For example, if you take Newton's equations of motion and find a solution f(t) that describes your universe, then f(-t) is a solution of Newton's equations of motion too.

The Missing Billiard Ball?

Great, so I will need a particle which doesn't behave like a billiard ball to prove that the universe is irreversible (or rather, doesn't have a unique inverse.)

Guess getting a PhD and building my private high-energy physics collider is a bit too much for one afternoon.

Ah well, back to Sokoban.

I don't know much about the

I don't know much about the subject, but the laws of physics are not symmetric under time reversal (though Newtonian mechanics is); they have a so-called CPT symmetry instead. This means that not only do you have to reverse time, you also have to replace all particles by their respective antiparticles, and you have to look at the resulting universe through a mirror. However, according to Wikipedia there is some experimental evidence that even CPT symmetry is violated, because neutrinos appear to have a different mass than antineutrinos. http://en.wikipedia.org/wiki/CPT_symmetry

No idea. Looks like something different

The implication of CPT symmetry is that a "mirror-image" of our universe — with all objects having their positions reflected by an imaginary plane (corresponding to a parity inversion), all momenta reversed (corresponding to a time inversion) and with all matter replaced by antimatter (corresponding to a charge inversion)— would evolve under exactly our physical laws. The CPT transformation turns our universe into its "mirror image" and vice versa. CPT symmetry is recognized to be a fundamental property of physical laws.

As I read it -- but this is just me trying to extract the meaning from the English as written -- this is something different: invert a large number of 'constants' and time, and the 'mirrored' universe will evolve in a manner similar to the 'original' universe.

You'd still be evolving two mirrored universes in a similar manner, forward in time; there's no mention of whether the universe is irreflexive.

Okay, so, the mirrored universe satisfies the same laws, what I don't get out of the abstract: Does the mirrored universe evolve into the history of the original universe?

[ Confusing, now I got my forwards and backwards twisted. Uh? Fixed some stuff. ]

[ Forget it: it's all here and here and here. ]

What kind of reason are you expecting?

I wonder whether there's any reason to believe so on a microscopic scale.

Physicists have had considerable success modelling microscopic phenomena using physics that is time-reversal invariant (or CPT invariant if the weak force is involved). There are no microscopic phenomena (or possibly just rare, disputable ones) that become explainable if we suppose these invariances fail. What more reason could you want for believing something in physics?

No reason, Just a hunch

What more reason could you want for believing something in physics?

Well, there is no great unified theory of everything, yet, or is there? So, I guess the game is still largely open, certainly when it comes to explaining microscopic phenomena.

And, I am not a physicist; I guess they are now at the point where they have an explanation of some microscopic (particle) behavior 'by analogy to billiard balls'. As naasking states below, when one wonders whether the universe is computational at heart, my CS upbringing leaves me baffled by the idea that all (behavioral) laws should have an inverse, since so many (programmable) functions do not. So that would be a good reason to believe they are just seeing part of the picture.

As a side track, an experiment I concocted while riding my bike, named:

"Would a physicist recognize non-deterministic behavior if it was staring him right in the face?"

Say a physicist is doing an experiment where given some setting he has one switch which will trigger the experiment and he has two observers each with a light on top. Now, no matter what he does, if he throws the switch, one of the lights flashes briefly, one of the lights remains off. But there is no pattern to which one of the lights will flash. He just hits the switch, and one of the two lights flashes.

Will the physicist: A) Guess it's a non-deterministic process. B) Keep on `fixing' stuff until a hastily defined stochastic model fits. C) Walk away in frustration since the experimental setting must be flawed.

My best guess is B or maybe C, but in all likelihood hardly ever A.

So, I doubt we're there yet. (Even if most physicists are likely orders more intelligent than I am.)

Historically speaking, A.

Historically speaking, A. Our universe HAS non deterministic behavior, or at least that's how the physicists explain it (quantum mechanics).

I think you are wildly overestimating your intuition about the universe and wildly underestimating the physicists who have been studying this for their entire life ;)

The theories about elementary particles are not at all like Newton's theory about billiard balls.

Furthermore, given any computable function f, you can construct an invertible version F(x) = (x,f(x)), so you don't "lose" anything by requiring that all functions are invertible.
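The embedding F(x) = (x, f(x)) described here is easy to sketch; a minimal illustration (`make_reversible` is a hypothetical helper name, not from the thread):

```python
def make_reversible(f):
    """Embed an arbitrary function f in an invertible one: F(x) = (x, f(x))."""
    def F(x):
        return (x, f(x))
    def F_inv(pair):
        # Keeping the input x around is what makes F one-to-one;
        # the price is that x must be stored alongside the result.
        x, fx = pair
        return x
    return F, F_inv

# square() is many-to-one on its own, yet its embedding is invertible:
square = lambda x: x * x
F, F_inv = make_reversible(square)
assert F(-2) == (-2, 4) and F(2) == (2, 4)
assert F_inv(F(-2)) == -2 and F_inv(F(2)) == 2
```

The price of invertibility is the retained input -- in physical terms, the "garbage" bits that Bennett-style reversible computing must either store or uncompute.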

Time reversibility isn't something that they put in the laws because they thought that it was nice, because most phenomena we see do not seem time reversible. It just turned out that the laws that fit experiments best happen to be time reversible.

Our universe HAS non

Our universe HAS non deterministic behavior, or at least that's how the physicists explain it (quantum mechanics).

There are many deterministic interpretations of QM, de Broglie-Bohm and Many Worlds among them.

As I said, just a hunch

Historically speaking, A. Our universe HAS non deterministic behavior, or at least that's how the physicists explain it (quantum mechanics).

Hmm, as far as I can gather, most physicists now believe the universe is deterministic in both directions, but aren't sure.

All evidence seems to point in that direction, even quantum mechanics. My point was just that maybe, maybe, maybe, the universe is non-deterministic, and that would be very hard to observe. For anyone.

I think you are wildly overestimating your intuition about the universe and wildly underestimating the physicists who have been studying this for their entire life ;)

Nah, I am not. ;)

The theories about elementary particles are not at all like Newton's theory about billiard balls.

I abstracted from particle physics the notion that the mental model is in terms of particles, which, by analogy, I called billiard balls. But I agree, this is not something I understand; the notion of reversibility just goes against my feeble understanding of the universe.

Furthermore, given any computable function f, you can construct an invertible version F(x) = (x,f(x)), so you don't "lose" anything by requiring that all functions are invertible.

Ah! A good point for discussion. If I take that abstract definition to the concrete, suppose a number of universal laws work that way, and apply it to the universe, it would mean the universe saves state. Which is something I think is unlikely.

I just find it entirely more likely that some behavior of the universe is just not reversible. And maybe that there are non-deterministic laws involved.

Time reversibility isn't something that they put in the laws because they thought that it was nice, because most phenomena we see do not seem time reversible. It just turned out that the laws that fit experiments best happen to be time reversible.

With some elaborate construction. But, yah, this is where I should shut up, since I just don't grok the math involved.

So, I'll shut up. And do what I always do: see if I can get my ole dad to supply me with an answer in layman's terms I can understand.

Related, but not

Pondering physics as an abstract machine is a good exercise in imagination, and certain interpretations of QM throw an interesting wrench in this day dreaming. We know from Bell's inequalities that any QM theory must sacrifice either locality or counterfactual definiteness.

Non-locality is simple to model since this merely involves altering the machine's notion of time, but I haven't figured out how one could model counterfactual definiteness. Any sort of machine one would like to define would seem to necessarily have well-defined state, even when unobserved. Defining a machine in which one cannot meaningfully talk about unobserved states is a failure of my imagination.

The precise nature of the computation taking place in the natural world that we model with physics is also still an open question. It's not clear whether it's Turing-equivalent or a hypercomputer of some sort. If a hypercomputer, can this be exploited somehow as an oracle?

In any case, the connections between computation and physics are only beginning to be explored, so I expect some surprising and delightful discoveries down the road.

Do we?

We know from Bell's inequalities that any QM theory must sacrifice either locality or counterfactual definiteness.

I'm not a physicist, but I found The "Chaotic Ball" model, local realism and the Bell test loopholes interesting and readable:

It has long been known that the "detection loophole", present when detector efficiencies are below a critical figure, could open the way for alternative "local realist" explanations for the violation of Bell tests. It has in recent years become common to assume the loophole can be ignored, regardless of which version of the Bell test is employed. A simple model is presented that illustrates that this may not be justified. Two of the versions -- the standard test of form -2 <= S <= 2 and the currently-popular "visibility" test -- are at grave risk of bias. Statements implying that experimental evidence "refutes local realism" or shows that the quantum world really is "weird" should be reviewed. The detection loophole is on its own unlikely to account for more than one or two test violations, but when taken in conjunction with other loopholes (briefly discussed) it is seen that the experiments refute only a narrow class of "local hidden variable" models, applicable to idealised situations, not to the real world. The full class of local realist models provides straightforward explanations not only for the publicised Bell-test violations but also for some lesser-known "anomalies".

And as long as we're drifting off topic, I also liked Clearing up Mysteries - The Original Goal:

...we must keep in mind that Einstein's thinking is always on the ontological level; the purpose of the EPR argument was to show that the QM state vector cannot be a representation of the "real physical situation" of a system. Bohr had never claimed that it was, although his strange way of expressing himself often led others to think that he was claiming this.

From his reply to EPR, we find that Bohr's position was like this:

"You may decide, of your own free will, which experiment to do. If you do experiment E1 you will get result R1. If you do E2 you will get R2. Since it is fundamentally impossible to do both on the same system, and the present theory correctly predicts the results of either, how can you say that the theory is incomplete? What more can one ask of a theory?"

While it is easy to understand and agree with this on the epistemological level, the answer that I and many others would give is that we expect a physical theory to do more than merely predict experimental results in the manner of an empirical equation; we want to come down to Einstein's ontological level and understand what is happening when an atom emits light, when a spin enters a Stern-Gerlach magnet, etc. The Copenhagen theory, having no answer to any question of the form: "What is really happening when - - - ?", forbids us to ask such questions and tries to persuade us that it is philosophically naive to want to know what is happening. But I do want to know, and I do not think this is naive; and so for me QM is not a physical theory at all, only an empty mathematical shell in which a future theory may, perhaps, be built.

Regarding detector loopholes

Regarding detector loopholes and local realism, I was just stating the generally accepted view. There are indeed many people pursuing loopholes in this tradeoff, and Joy Christian even claims to have demonstrated an unjustified assumption in Bell's proof. Still, until vetted, these claims are not the consensus.

Nice

I enjoyed those papers. The ontological vs. epistemological framing of the debate between Einstein and Bohr is something I think I had seen myself, but here it is put into good words.

Reversible Computing

Also related is Toffoli's Reversible Computing.

Old

I understand they want to minimize dissipation by reducing to reversible computations, and I've seen Toffoli gates appear in quantum computing as well, but did this ever end up anywhere? Seems to me they lack actual physical gates with these properties.
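The Toffoli gate in question is simple to state. A minimal illustrative sketch (my code, not from either paper) showing it is its own inverse and, with one input held constant, computes AND:

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli gate (CCNOT): flip the target c iff both controls a and b are 1."""
    return (a, b, c ^ (a & b))

for bits in product((0, 1), repeat=3):
    # Reversible: applying the gate twice restores the inputs.
    assert toffoli(*toffoli(*bits)) == bits

# With the target fixed to 0 the third output is a AND b,
# so the gate is universal for classical logic.
for a, b in product((0, 1), repeat=2):
    assert toffoli(a, b, 0)[2] == (a & b)
```

Note that, unlike the Fredkin gate, the Toffoli gate is reversible but not conservative: toffoli(1, 1, 0) has more 1s in its output than in its input.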

DNA

Efficient Turing-universal computation with DNA polymers.
Bennett's proposed chemical Turing machine is one of the most important thought experiments in the study of the thermodynamics of computation. Yet the sophistication of molecular engineering required to physically construct Bennett's hypothetical polymer substrate and enzyme has deterred experimental implementations. Here we propose a chemical implementation of stack machines -- a Turing-universal model of computation similar to Turing machines -- using strand displacement cascades as the underlying chemical primitive. More specifically, the mechanism described herein is the addition and removal of monomers from the end of a polymer, controlled by strand displacement logic. We capture the motivating feature of Bennett's scheme -- that physical reversibility corresponds to logically reversible computation, and arbitrarily little energy per computation step is required. Further, as a method of embedding logic control into chemical and biological systems, polymer-based chemical computation is significantly more efficient than geometry-free chemical reaction networks.

Existence of gates

Seems to me they lack actual physical gates with these properties.

A gate may not have a physical reification at the present moment, but that does not mean it cannot be engineered. The big electrical engineering example here is the missing memristor, which was theorized 40 years before it was built.

As one Nobel-prize-winning physicist once said to my friend: electrical engineering is a field with very few pure mathematicians, and that is why there have been almost no fundamental advances in the field in the past 100 years. -- The memristor is a huge dot on an otherwise mostly spotless timeline.

I don't really know anything about Toffoli gates, but (according to Wikipedia) researchers at the University of Innsbruck, Austria synthesized a Toffoli gate in 2009. Apparently this is the same group that was published in Nature six years ago -- front cover! -- for the proof of concept of quantum teleportation of atoms.