Combining lazy and eager evaluation of terms

In an attempt to combine some of the benefits of lazy and eager evaluation, I have implemented a language with an evaluation strategy which is strict with respect to divergence, but performs lazy evaluation on certain intermediate subterms to allow a more expressive use of recursion.

Does anybody have any references to work on similar evaluation schemes?

Tim Sweeney started this interesting Types-list thread. A summary of the responses he received is here.


Y in Haskell

I love the fact that Y in Haskell is just:

y f  = f (y f)

I think if you understand why that works (or why it doesn't work in a strict language, and why it does work in Haskell) you probably understand lazy evaluation pretty well.
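To make that concrete, here is a minimal sketch of the one-line `y` tying the knot for factorial (the names `fact` and `rec` are illustrative, not from the thread):

```haskell
-- The one-line fixpoint combinator quoted above.
y :: (a -> a) -> a
y f = f (y f)

-- Factorial without explicit recursion: the step function receives
-- "itself" as its first argument, and y supplies it.
fact :: Integer -> Integer
fact = y (\rec n -> if n == 0 then 1 else n * rec (n - 1))

main :: IO ()
main = print (fact 5)  -- prints 120
```

This works under lazy evaluation because the inner `y f` is a thunk that is only forced when `rec` is actually applied; under strict evaluation, `y f = f (y f)` would unfold forever before `f` ever ran.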

Is there room in a "lenient" evaluation strategy for this kind of expression? I am being lazy here in not trying to work out the answer for myself, and hoping that somebody else will be lenient in helping me to answer it.

Not exactly Y

Isn't the Y combinator a fixpoint combinator that is defined in a language without (using) recursion?

Even in an eager language, there's usually no trick to defining a recursion combinator if you get to use recursion. As you might expect though, an eta-expansion is needed. In SML:

fun y f = f (fn x => y f x);
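For comparison, the same eta-expanded shape can be written in Haskell as well (restricted to function types); the name `yEta` is just illustrative:

```haskell
-- Eta-expanded fixpoint, the shape a strict language requires:
-- the lambda delays the recursive unfolding until an argument
-- arrives. In Haskell the delay is redundant but harmless.
yEta :: ((a -> b) -> a -> b) -> a -> b
yEta f = f (\x -> yEta f x)

fact :: Integer -> Integer
fact = yEta (\rec n -> if n == 0 then 1 else n * rec (n - 1))

main :: IO ()
main = print (fact 5)  -- prints 120
```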

Y in (Parallel) Haskell

...which is otherwise known as pH and is one of the "lenient" languages mentioned. Y cannot be expressed as it is in "standard" Haskell because pH uses eager function application.

It still supports infinite data structures, however, because all algebraic data types in pH are non-strict, i.e. data structures in pH are "lazy".

It strikes me as the ideal combination, and I'd be interested in learning of any counter arguments. (Are you listening Frank?)

I did bring it up on comp.lang.functional once, but the reply I got said that I failed to appreciate the difference between non-strict data constructors and lazy evaluation (even though my original comment was about precisely that difference), and that Haskell was non-strict (agreed) but not a lazy language (with which I disagreed).

Laziness

Actually, I'd better clarify what I have said. Haskell might better be described as using "call-by-need" rather than pure "call-by-name", in that a given expression is evaluated at most once. (So is Haskell really lazy by the definition of laziness?)

So f x x does not evaluate x twice; rather, the thunk for the expression is overwritten by its value once it has been evaluated.
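A small sketch of this sharing, using `Debug.Trace.trace` to make the single evaluation observable (the message text is arbitrary):

```haskell
import Debug.Trace (trace)

main :: IO ()
main =
  -- Under call-by-name the trace message would fire on every use of x;
  -- under call-by-need the thunk is updated with 5 after the first demand,
  -- so the message appears once (on stderr) and the result is 10.
  let x = trace "x evaluated" (2 + 3 :: Int)
  in print (x + x)
```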

(I feel like I have to be a lawyer in these discussions lest someone jump on any ambiguity in anything I have said).

Y, etc.

Dominic: That approach to Y works under lenient evaluation, and is actually the literal translation I use to evaluate recursive terms.

The only terms that behave differently between lenient and lazy schemes are terms with subterms that would not terminate if evaluated, such as "let x=x in 7". In a normal-order lazy language, such subterms only cause divergence if their evaluation is actually demanded by the program's final result. In a lenient language, all subterms are always evaluated to ground values "eventually", even when their values are not demanded. So this example evaluates to 7 in a lazy language, and diverges in a lenient one.
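A minimal check of the lazy side of that claim, written directly in Haskell:

```haskell
main :: IO ()
main = print (let x = x in 7 :: Int)  -- prints 7: x is never demanded
```

Under lenient (or strict) evaluation, the binding for x would be reduced regardless of demand, so the same program would diverge instead.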

The "eventually" part is what enables lenient evaluation to succeed for more programs than strict evaluation. For example, in "x=Pair(3,First(x))", the First(x) term may begin evaluation before x has been fully reduced to a ground value.
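In Haskell syntax the same knot-tying looks like the sketch below; here lazy and lenient evaluation agree, because First(x) only needs the first component, which is available before x is fully built:

```haskell
main :: IO ()
main =
  -- x's second component refers back to x's first component;
  -- fst x can be read (it is 3) before x is fully constructed.
  let x = (3 :: Int, fst x)
  in print x  -- prints (3,3)
```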

David: I'm not sure if pH (Parallel Haskell) uses a true lenient evaluation scheme like Id90. The reference I have on it ("Implicit Parallel Programming in pH", a book by Nikhil & Arvind) is somewhat unclear on this.

Lenient Evaluation

I think you're probably right about pH and lenient evaluation. My only knowledge of pH also comes from Nikhil and Arvind's book, whose main focus is the ability to parallelise computation, which is of course a different point of emphasis.