## Reversible generators

Python has a language feature that's been brought up here a couple of times before, whereby a function can yield a value but remember its current point of execution, and resume from there the next time a value is requested. Here's a simple example:

```python
def myrange(a, b):
    while a <= b:
        yield a
        a += 1
```

I'm curious whether any work has been done on making generators more arbitrarily manipulable. If I want to write reverse(myrange(1, 10)), Python has to run the entire generation at once and then perform the reversal. This eliminates one of the key advantages of generators: producing only one object at a time, and thus saving memory and potentially latency.

What I'm curious about is whether anyone has worked out what constraints would be necessary on generators (obviously they couldn't contain arbitrary Python code, as they can now) to make a reverse() possible that still generates each value one at a time, just in reverse order. Presumably such a solution would be powerful enough to implement things like step as well (currently, to get the effect of step with a generator, you just have to throw values away).
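For the myrange above, a hand-written reverse is trivial and stays lazy; the open question is whether such an inverse could be derived automatically. The name rmyrange is just for illustration:

```python
def rmyrange(a, b):
    # Hand-written inverse of myrange: same values, opposite order,
    # still produced one at a time.
    while b >= a:
        yield b
        b -= 1
```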

### Very constrained...

I see two possibilities here:

1. Only allow relations for which closed forms can be found automatically. That's not a lot. I'm not even sure it can be done for non-trivial nonlinear recurrence relations. (I'm assuming that the generator would have a recursive definition; otherwise, there's no real point in making it a generator, is there? :)

2. Make sure that every operation is reversible, without the need for external state. A generator would then be seeded with an initial value and a direction. Not sure how useful that would be, and I'd expect programmers to be rather surprised by a "Multiplication by Zero" error.
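A minimal sketch of option 2, assuming the only permitted state update is an invertible one (the name `steps` is hypothetical):

```python
def steps(start, direction=1):
    # Seeded with an initial value and a direction. The only state
    # update (+= direction) is invertible, so running with
    # direction=-1 exactly undoes a run with direction=+1.
    # An update like "value *= 0" would destroy information and
    # could not be inverted -- hence the hypothetical
    # "Multiplication by Zero" error.
    value = start
    while True:
        yield value
        value += direction
```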

I'm hoping to be proven wrong here.

### "Multiplication by Zero" error

(...) and I'd expect programmers to be rather surprised by a "Multiplication by Zero" error.

Or, in any sort of imperative language, even more so by a "Variable Store" error.

### I wonder how the reverse

I wonder how the reverse behaviour of the counter generator should be defined:

```python
def count(start=0):
    i = start
    while True:
        yield i
        i += 1
```


Would it start at positive infinity and decrement to the start value in infinitely many steps?

### I don't believe in a practical automatic solution

It can be done by making custom types which have separately implemented iteration in both directions. If a type doesn't provide reverse iteration explicitly, the default implementation accumulates all elements and at the end yields them back in reverse order. Nothing tricky.
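In Python terms, the two cases might look like this. (Python's built-in reversed() already consults a type's __reversed__ hook when one is defined; the fallback branch below is the accumulate-then-replay default described above.)

```python
def reverse_iter(obj):
    if hasattr(obj, '__reversed__'):
        # The type implements reverse iteration explicitly:
        # stay lazy and delegate to it.
        for item in reversed(obj):
            yield item
    else:
        # Default: accumulate all elements, then yield them back
        # in reverse order (costs O(n) memory).
        for item in reversed(list(obj)):
            yield item
```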

### It can't be done, at least

It can't be done - at least not for the general case. And this is one of those cases where 'general case' really means 'nearly every case you can possibly come up with'. There's no practical way that the language will be able to analyse the function and determine what its reverse would look like.

If you think about it in a different way, the reason becomes obvious. Our formal abilities are incredibly puny; we can't prove many interesting, or deep, properties about arbitrary programs. Without that ability, there's no chance of working out something as monstrously complex as you're asking.

In your case, you'd just have to define an rmyrange (or equivalent). In languages like Icon, Python, and Converge (all of which have generators, although Python's are slightly limited compared to the other two), you'll find that types like lists typically have generators such as iterate and riterate (reverse iterate) defined where they are useful. For those cases where you need reversal, it's really not a significant effort in practice.

As an aside, in your case, range would probably take an optional third argument "step" which would default to 1. If you want rrange, just pass -1 as the third argument. Problem solved.
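Python's built-in range already works exactly this way:

```python
# range(start, stop, step): step defaults to 1; a negative step
# counts down, giving the reverse sequence directly.
forward = list(range(1, 11))        # 1 through 10
backward = list(range(10, 0, -1))   # 10 down through 1
```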

### reversible logic and time travel

I don't understand this field very well, but there is an aspect of physics where, if all computations are reversible, then in principle no energy need be dissipated, because no information is destroyed. Really way-out stuff; here is a Wikipedia link:

http://en.wikipedia.org/wiki/Reversible_computing

However, they don't seem to mention the debugging benefits of being able to reverse computations (essentially a form of time travel). I don't know whether these fields are related; here is a CiteSeer link to a paper on that:

http://citeseer.ist.psu.edu/booth97walk.html

### Trade cycles for memory

You could find value number 10 by iterating the generator 10 times and throwing away the intermediate results. Later you could find number 9 by iterating 9 times and throwing away the intermediate results, and so on.

You get the memory advantage of a generator, compared to generating and retaining the entire structure prior to traversal, but you take an O(n²) hit on speed.

This is half of the idea behind iterative deepening search.
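A sketch of this trade, reusing the myrange from the original post (the helper name reverse_by_recomputation is made up for illustration):

```python
def myrange(a, b):
    # The generator from the original post.
    while a <= b:
        yield a
        a += 1

def reverse_by_recomputation(make_gen, n):
    # make_gen: zero-argument callable returning a fresh generator.
    # Yields the first n values in reverse order while storing only
    # one value at a time: O(1) extra memory, O(n^2) total iteration.
    for k in range(n, 0, -1):
        gen = make_gen()
        for _ in range(k):
            value = next(gen)
        yield value
```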

### Don't forget this only works

Don't forget this only works if, between the very first and very last execution, the generator in question doesn't read or write global state (to be more accurate, it mustn't read a part of the global state which may change). One can make no such guarantees in the general case for any of Python, Icon, or Converge.