archives

tail recursion

Hello, and sorry if this question has already been answered somewhere, though I searched for it.

The question is: is tail recursion something badly needed _only_ in Scheme?

Here's a related question: if it isn't, that is, if it's generally possible to get compilers or interpreters to correctly detect recursion and automatically apply an implicit goto to it, then why should I bother, when it just complicates matters?

I gave it some thought when I saw a recent post here about LLVM and its recent support for tail calls for Scheme, and also after playing with Haskell a bit.

So, to keep things simple, let's look at the examples:

;; in Scheme
(define (fact n)
  (define (product min max)
    (if (= min n)
        max
        (product (+ 1 min)
                 (* min max))))
  (product 1 n))

That's one way to look at a factorial, but not the _natural_ one, I guess.

-- in Haskell
-- and just to add a bit more argument
dec n = n - 1

fact 0 = 1
fact n = fact (dec n) * n

That's the _natural_, let's call it that, high-school definition of factorial, perfectly clear. There's no tail call anywhere to be seen, and yet it doesn't cause a stack overflow with large numbers...
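Just to make the contrast concrete, here is roughly what an explicitly tail-recursive, accumulator-style version would look like in Haskell; the names factAcc, go and acc are mine, purely for illustration:

-- in Haskell, accumulator style (a sketch, not from the code above)
factAcc :: Integer -> Integer
factAcc n = go n 1
  where
    go 0 acc = acc
    go k acc = go (k - 1) (k * acc)  -- the recursive call is the entire result, i.e. a tail call

Here the call to go is the last thing that happens, so an implementation that turns tail calls into jumps can run it in constant stack space (although, in a lazy language, the accumulator can still pile up unevaluated thunks unless it is forced).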

Am I missing something? Or is Scheme being stubborn? R6RS is coming, and perhaps it should address this?

Sorry for being provocative, but it's a sincere question...

Funny characters inference

Aggressive Type Inference
In talking with people at last year's Python Conference (IPC7), I mentioned the possibility of writing a Python compiler... in Python. Not content to stop there, I suggested that the idea could be taken further, to translate Python code into Perl.
[...]
What does this have to do with type inference? The translated Perl code must have the $@%&* type specifiers on all Perl variables.
Can we look at this as a constructive proof that funny characters are (mostly) not needed by the compiler, and are there only to please programmers?

I decided against posting this in the thread that mentioned ugly syntax (not Python's), as it was already pretty long and losing focus.

LLVM 1.5 released with tail call optimization

From the release notes (http://llvm.cs.uiuc.edu/releases/1.5/docs/ReleaseNotes.html):

The release now includes support for proper tail calls, as required to implement languages like Scheme... In LLVM 1.5, the X86 code generator is the only target that has been enhanced to support proper tail calls (other targets will be enhanced in future). Further, because this support was added very close to the release, it is disabled by default. Pass -enable-x86-fastcc to llc to enable it (this will be enabled by default in the next release).
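For anyone who wants to try it out, the invocation is presumably along these lines (the bytecode file name here is just a placeholder):

llc -enable-x86-fastcc factorial.bc -o factorial.s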

Is it time to create a PLT front-end for LLVM?
George

Two Thank Yous for Peter van Roy

For FractaSketch. My kids love it!

For "Logic programming in the context of multiparadigm programming: the Oz experience". I always knew Oz was interesting. But, without a lot of time to sit down and write programs, how does one come to grips with a new paradigm? This paper does the trick.

Thanks, Peter!

The Essence of Programming: Reynoldsfest at MFPS

Having just got back from the Reynoldsfest at MFPS, where part of the emphasis was on encouraging folks to look at Reynolds's work, let me write down some of what was recommended.

My personal favorites:
Definitional Interpreters (this one changed my life when I was a graduate student)
Gedanken
Towards a Theory of Type Structure
Types, Abstraction and Parametric Polymorphism
The Essence of Algol
Syntactic Control of Interference
Three Approaches to Type Structure (tutorial)
The Discoveries of Continuations (history)

Some papers recommended by others:
The Craft of Programming (recommended by Cliff Jones)
Relating direct and continuation semantics (Filinski's favorite -- this one is tough going)
Separation Logic: A Logic for Shared Mutable Data Structures

Also, while I'm at it, let me recommend
You and Your Research, by Richard Hamming. Want to do Nobel-class research? It's not just a matter of luck. Here is a nuts-and-bolts guide, from the third winner of the Turing Award.