## SK Calculus not Considered SeKsy?

note: thanks to Ehud's grace (or lack of good judgement, depending on your viewpoint), I have been granted powers of editor ;-) Please feel free to offer me suggestions, in private or in public, on how I can better address the needs of the community in what I choose to post or write. I hope the contributions of a neophyte can still be appreciated.

- Christopher (cdiggins@gmail.com )

For my first post as editor, I thought I'd bring up an observation of mine from perusing the literature: what appears to be a lack of usage of the typed SK calculus in theoretical papers. In my studies I have been very surprised it hasn't come up more often, since it is elegant and easy to understand.

I first encountered the typed SK calculus in Laufer and Odersky 1993. A web search for the phrase "typed SK calculus" yielded a brief usage on the Haskell Wiki in a discussion on Generalized Data Types.

I first encountered the SK calculus without types when reading Brent Kerby's A Theory of Concatenative Combinators which is an informal examination of combinators in Joy.

As a refresher, the typed SK calculus is made up of only two combinators, S and K:

    K : a -> b -> a
    S : (a -> b -> c) -> (a -> b) -> a -> c
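
To see these two in action, here is a quick sketch in JavaScript (the type comments mirror the signatures above); composing them as S K K yields the identity combinator:

```javascript
// The two combinators as curried functions.
const K = x => y => x;               // K : a -> b -> a
const S = x => y => z => x(z)(y(z)); // S : (a -> b -> c) -> (a -> b) -> a -> c

// The identity combinator is derivable rather than primitive: I = S K K.
const I = S(K)(K);

console.log(I(42)); // 42
```

Everything else (booleans, numerals, recursion) can be built out of these two in the same way, which is what makes the basis so appealingly small.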


So the questions I am pondering are: why is the lambda calculus so much more popular? Is it because Scheme and other similar languages more closely resemble the lambda calculus? If there were a language based on the SK calculus, would it become more popular? I also wonder what applications there are for the SK calculus, and what place it should occupy in my programming language theory arsenal.

## Comment viewing options

### There is a language...

... that makes Intercal look easy.

Unlambda

### The Types

should be

    K : a -> b -> a
    S : (a -> b -> c) -> (a -> b) -> a -> c


as in

    Standard ML of New Jersey v110.59 [built: Mon Aug 14 21:39:53 2006]
    - fun K x y = x ;
    val K = fn : 'a -> 'b -> 'a
    - fun S x y z = x z (y z) ;
    val S = fn : ('a -> 'b -> 'c) -> ('a -> 'b) -> 'a -> 'c


### Whoops!

How sloppy of me! Thank you for correcting that, I've updated the post.

### G Machine

My understanding is it's very popular for graph reduction.

Sorta like all those parens in lisp, all those S's and K's start confusing me after a point -- maybe you just have to be smarter than I am to use it. :-)

### Names

> So the questions I am pondering are: why is the lambda calculus so much more popular?

Because it supports names as a core feature, as well as nested abstractions.

> Is it because Scheme and other similar languages more closely resemble the lambda calculus?

It's the other way around. Lambda calculus makes a good model for a programming language, so many languages are based on it.

> If there were a language based on the SK calculus, would it become more popular?

See the comment about Unlambda. Of course, you could implement an extended SK calculus with names - but if they're lexically scoped, then you'd either have lambda calculus or a subset of it, and the SK part would be redundant.

> I also wonder what applications there are for the SK calculus, and what place it should occupy in my programming language theory arsenal.

Combinator reduction is easier than lambda reduction, so there are applications in the implementation and analysis of functional languages.
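
To make that concrete, here is a toy sketch (my own, not taken from any particular paper) of leftmost-outermost SK reduction in JavaScript. Note that the two rewrite rules need no notion of variables or substitution, which is exactly what makes combinator reduction attractive for graph-reduction implementations:

```javascript
// Terms: the strings "S" and "K", free variables as other strings,
// and applications as two-element arrays [function, argument].

function spine(t) {
  // Unwind the application spine: returns [head, list of arguments].
  const args = [];
  while (Array.isArray(t)) { args.unshift(t[1]); t = t[0]; }
  return [t, args];
}

function rebuild(head, args) {
  // Reapply leftover arguments to a term.
  return args.reduce((f, x) => [f, x], head);
}

function reduce(t) {
  for (;;) {
    const [h, args] = spine(t);
    if (h === "K" && args.length >= 2) {
      t = rebuild(args[0], args.slice(2));          // K x y -> x
    } else if (h === "S" && args.length >= 3) {
      const [x, y, z] = args;
      t = rebuild([[x, z], [y, z]], args.slice(3)); // S x y z -> x z (y z)
    } else {
      return t; // head-normal form; arguments are not reduced further
    }
  }
}

// S K K v reduces to v, confirming that S K K acts as the identity:
console.log(reduce([[["S", "K"], "K"], "v"])); // "v"
```

The whole machine is two rewrite rules and some plumbing; a lambda reducer, by contrast, has to implement capture-avoiding substitution.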

### Lambda calculus is more

Lambda calculus is more popular because the SK calculus is pointless. Er, point-free. I like names, names are good...

Cool to see you posting this though; highlighting corners of theory to poke around in is always fun.

### I'm not showing your post on

I'm not seeing your post on the LtU home page. It probably needs to be promoted by one of the "senior" editors. The "create content" selection is what I think you intended?

As for the topic, the lambda and SK calculi are expressible in each other (though that can be about as meaningful as saying they are both Turing complete; still, they are closely related). As I mentioned in another thread, I put together a list of the common combinator birds with a calculator and a stripped-down Unlambda in JavaScript (one of those projects that'll probably never be finished).

From a PL perspective, it was David Turner's work with the B C K W combinators that marked a big breakthrough for lazy evaluation PLs.

### Should it be on the

Should it be on the homepage? My understanding was that the homepage was for posting papers or major story items, or posts from guest bloggers. This item seems more appropriate on the forum.

### It has links to papers

Being a contributing editor implies that one is writing stories to the front page, based on whatever they judge to have merit. Ehud places his trust in the editors to decide what they think is appropriate.

Personally I think we run the risk of holding the home page hostage to standards of perfection - and not enough stories get posted.

### Nevermind

I just ran across his proposal to be a contributing editor, and you're right, the post is properly grounded, so "nevermind". My apologies to Chris Diggins.

### Intentionally in the forums.

I should probably have added a note, but this was a posting which I felt was more appropriate for the forum due to its informal and inquisitive nature. I intended it as a thought-provoking discussion piece, and a greeting.

If there was a "letters from the editor" section I would have put it there :-)

### Welcome

> I should probably have added a note, but this was a posting which I felt was more appropriate for the forum due to its informal and inquisitive nature.

IMO, that's the right decision.

> If there was a "letters from the editor" section I would have put it there :-)

Hmm, an editorials page...

### Unwieldy

This SASL compiler compiles to SK-style combinators, but the problems immediately become apparent: it takes many combinators to do anything. The compiler introduces a bunch of new primitive combinators to deal with the issue. Although they simplify things a little, simple pieces of SASL code can explode into very large combinations of combinators, which in turn are very slow to evaluate.

It's not just practical considerations. When you need to write so many S's and K's, theoretical work becomes hard to read. I know that when implementing SASL for myself it was hard work to read and understand combinator expressions. I won't mention debugging...

### Supercombinators

I seem to remember something by Anton Ertl where he benchmarks the efficiency of supercombinators for Forth, used to reduce stack shuffling and threading overhead. The supercombinators were created and compiled on the fly during compilation, to avoid the problem of combinatorial explosion. But then you're almost compiling straight to native code.

### Bird book

There is a fun book on combinatory logic:
To Mock a Mockingbird (Paperback)
by Raymond M. Smullyan
http://www.amazon.com/Mock-Mockingbird-Raymond-M-Smullyan/dp/0192801422/sr=8-1/qid=1164990574/ref=pd_bbs_sr_1/102-2040050-9082551?ie=UTF8&s=books

### Related to that book

I came across this a couple of days ago. It gives an interesting way to look at the SK-combinator calculus.

### Chris' Combinator Calculator

I'll save Chris Rathman the trouble of pointing out that he has a page about the Combinator Birds. Near the bottom of that page is a Combinator Calculator and a Partial Unlambda Interpreter.

### Beat you to it

^^^ in a post above ^^^.

The calculator is one that I borrowed from Chris Barker. About the only value I added was to try a naive mapping that allows you to use the other named birds. The problem I have with using SK combinators is that the calculus doesn't have a notion of naming the abstractions you might build with it. That is, you can't name things like Zero and Next (or GoTo). As someone else said in this thread, you can give names to certain SK combinations that are useful, but then you just run into the same problem with them.

Although Scheme and ML are firmly entrenched in the lambda calculus, it is the handful of extensions to the LC that makes them interesting. I guess I'd ask the same thing of SK-based languages - what extensions beyond the base manipulation of combinators can we add that would make them useful as a basis for programming languages?

### what extensions beyond the base

Can SK combinators be made to play well with types?

### Funny that you should ask.

Funny that you should ask, because I am working on a technical report for the Cat language type system. Cat is based on Joy, which is closely related to the SK calculus.

I'll share a draft of the paper this weekend.

### Types

There's a standard (and efficient) translation of data types (a la Haskell) to the lambda calculus, so that will work with SK too.
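
For instance, the usual Church-style encoding of pairs needs nothing beyond function abstraction and application, so anything expressible this way survives abstraction elimination into S and K. A sketch in JavaScript, using the standard encodings rather than any particular paper's:

```javascript
// Church encoding of pairs: a pair is a function awaiting a selector.
const pair = a => b => f => f(a)(b);
const fst  = p => p(a => b => a); // the selector here is just K
const snd  = p => p(a => b => b); // ... and this one is K I

console.log(fst(pair(1)(2))); // 1
console.log(snd(pair(1)(2))); // 2
```

Sums, numerals, and recursive types admit analogous encodings, which is why a data-type translation targeting the lambda calculus carries over to SK unchanged.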

### As someone else said in

> As someone else said in this thread, you can give names to certain SK combinations that are useful, but then you just run into the same problem with them.

Which problem do you feel is the most relevant? There were four points brought up I believe:

1) Too many S and K's
2) Hard to understand
3) Debugging
4) Combinatorial explosion

Of these issues, only #4 strikes me as a strong argument against the SK calculus, if it is indeed true. #1 can be countered by defining new combinators; #2 seems subjective, and not really a strong argument (the lambda calculus is hard as well, but one has to bite the bullet); #3 seems like a tool problem.

Is #4 really a core issue? Is there a combinatorial explosion? I don't know the answer. If it is true, it is a big problem. However, my experience with the Joy language suggests that it isn't. I believe that using sophisticated primitive combinators can allow one to manage complexity. For example, consider this QuickSort implementation in Joy:

    DEFINE qsort ==
      [small]
      []
      [uncons [>] split]
      [[swap] dip cons concat]
      binrec .


This is not a proof of anything, simply an argument that combinators (like binrec) can simplify things and prevent combinatorial explosion.
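
For readers unfamiliar with Joy, here is a rough JavaScript rendering of binrec - simplified, in that this version passes the original value to the combining step where Joy keeps the pivot on the stack - showing how one higher-order combinator captures the whole recursion pattern:

```javascript
// binrec(small, base, split, combine): binary recursion as a combinator.
const binrec = (small, base, split, combine) =>
  function rec(x) {
    if (small(x)) return base(x);
    const [a, b] = split(x);
    return combine(x, rec(a), rec(b));
  };

// Quicksort in the same shape as the Joy definition above.
const qsort = binrec(
  xs => xs.length <= 1,                        // [small]
  xs => xs,                                    // []
  ([p, ...rest]) => [rest.filter(v => v <= p), // [uncons [>] split]
                     rest.filter(v => v > p)],
  ([p], lo, hi) => [...lo, p, ...hi]           // [[swap] dip cons concat]
);

console.log(qsort([3, 1, 4, 1, 5, 9, 2, 6])); // [1, 1, 2, 3, 4, 5, 6, 9]
```

The point is that the recursion scheme is named once and reused, instead of being re-expanded into raw S's and K's at every use site.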

> ... what extensions beyond the base manipulation of combinators can we add that would make them useful as a basis for programming languages?

The Joy language I believe may have clues as to the answer to this question. See http://www.latrobe.edu.au/philosophy/phimvt/joy/j03atm.html for a list of more combinators.

### I think #4 is true

I implemented my own version of a SASL compiler and was taken aback by just how quickly the size of the compiled code grew. At first I thought I was compiling incorrectly, but I found a number of papers pointing out that the combinatorial complexity is hard to avoid, and that this was one of the reasons for the introduction of supercombinators in Hughes' paper "Super-combinators: a new implementation method for applicative languages". That paper reports exponential growth in the size of the combinator expressions.

### Building languages on top

> 1) Too many S and K's
> 2) Hard to understand

My problems were more with SK combinators as a language unto itself. The lambda calculus is a little easier, but it too is too hard to work with by hand. In both cases, it's the extensions to the calculus that make them useful as PLs (be it Scheme/ML or Joy/Cat). The trick is to extend the language in such a way that you don't lose the simplicity of the underlying calculus but do provide a means of abstraction. Or, to put it in terms of SICP, "Every powerful language has three mechanisms for accomplishing this: 1) primitive expressions; 2) means of combination; and 3) means of abstraction". Scheme itself is basically the lambda calculus with a handful of special operations. Forth is similar in that it can be built entirely from a handful or two of primitives.

If you provide a means to build a vocabulary and the means of abstraction, then #1 and #2 don't make that much difference from the end user's standpoint. I don't know that I personally like languages that follow Forth as a basis of abstraction, but it is certainly a powerful language. At any rate, Joy is certainly an interesting PL - one worthy of exploration in PL research.

> 3) Debugging

This is something that's probably shared with laziness in functional PLs. But debugging is really more a problem of relating a snapshot of state at some point in time to a specific point in the source code.
> 4) Combinatorial explosion

That's beyond me, other than to say that Turner's insight into using combinators (SASL and Miranda) was a key innovation in making lazy functional PLs feasible. Since combinators were used to make LC languages efficient, it would be ironic if Joy and Cat had to use the LC in their implementation strategies. :-)

### Size explosion

As far as I know, the size of SK expressions can explode during abstraction elimination (i.e., the process of converting lambda expressions to SK expressions). Hence it is just not a very practical way to program, I think.

But the SK calculus has always fascinated me as being the ultimate "library" (i.e., nothing but reuse/composition of existing parts).

### I wonder

So I wonder: does there exist a better combinator basis than SK, one that does not explode during abstraction elimination?

### Abstract Machines

Sure there is: pick any of the popular abstract machine models. Abstract machines like CAM or ZINC can be regarded as combinator languages in a straightforward way. They tend to be more complex than the SK reduction machine but a lot more usable in practice.

### programming in SK-combinators

I've found programming in SK-combinators directly to be nearly impossible. I compiled \x.\y.if (< x y) y x into SK-combinators by hand, and the result took me 45 minutes to debug.

But the translation is relatively straightforward. In my DHTML lambda-calculus interpreter, the translation from lambdas into S, K, I, B, and C is about 40 lines of JavaScript, which isn't too horrible. (Except, you know, that it's JavaScript, which isn't very popular, and it's considerably longer than it would be in a language with pattern-matching.) So far I haven't written any significant programs in the lambda-calculus in it, so I don't know to what extent I've avoided the exponential blowup. (In the graph structure. I don't care about exponential blowup in the linearized expression representation of the graph structure, without sharing.)
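
For comparison, a bare-bones version of such a translation - S, K, and I only, without the B and C optimizations, and with helper names of my own invention - fits in a few dozen lines:

```javascript
// Lambda terms: {v: name} | {lam: name, body: t} | {app: [f, a]} | {c: "S"|"K"|"I"}

function occursFree(x, t) {
  if (t.v)   return t.v === x;
  if (t.lam) return t.lam !== x && occursFree(x, t.body);
  if (t.app) return occursFree(x, t.app[0]) || occursFree(x, t.app[1]);
  return false; // a combinator constant
}

// Bracket abstraction [x]t: eliminate variable x from term t.
function abstract(x, t) {
  if (!occursFree(x, t)) return {app: [{c: "K"}, t]};     // [x]t = K t   (x not free)
  if (t.v === x)         return {c: "I"};                 // [x]x = I
  if (t.lam) return abstract(x, abstract(t.lam, t.body)); // eliminate inner lambda first
  return {app: [{app: [{c: "S"}, abstract(x, t.app[0])]}, // [x](f a) = S ([x]f) ([x]a)
                abstract(x, t.app[1])]};
}

// Interpret a closed combinator term as a JavaScript function.
function run(t) {
  if (t.c === "S") return x => y => z => x(z)(y(z));
  if (t.c === "K") return x => y => x;
  if (t.c === "I") return x => x;
  return run(t.app[0])(run(t.app[1]));
}

// \x.\y.x compiles to S (K K) I, which behaves like K:
const k2 = abstract("x", abstract("y", {v: "x"}));
console.log(run(k2)(1)(2)); // 1
```

The S-rule duplicating `[x]` into both halves of every application is where the blowup comes from: each bound variable can roughly double the term, which is what the B and C combinators (and later supercombinators) exist to tame.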

One of the more complicated examples implements cons, car, cdr, and nil in the pure lambda-calculus, then does this:

map (+ 2) (cons 1 (cons 2 (cons 3 nil)))


or in context:

(\cons.\nil.(\car.\cdr.\map.
map (+ 2) (cons 1 (cons 2 (cons 3 nil))))
(\cons.cons (\car.\cdr.car) x)
(\cons.cons (\car.\cdr.cdr) x)
(Y (\map.\f.\list.list (\car.\cdr.cons (f car)
(map f cdr)) (\ifcons.\ifnil.ifnil))))
(\car.\cdr.\ifcons.\ifnil.ifcons car cdr)
(\ifcons.\ifnil.ifnil)


which translates to this:


(S (B C (C (B C (C (B C (B (B K)
(B (B K)
(B (B (C (C I (+ 2))))
(S (B B (C I 1))
(S (B B (C I 2))
(C (B B (C I 3)) I)))))))
(C (C I (B K I)) x)))
(C (C I (K I)) x)))
(B Y (C (B C (B (B C)
(B (B (B C))
(B (B (B (C I)))
(C (B B (B S (B (B C) (B (B (B B))
(C (B B (B B I))
(C (B B I) I))))))
(C (B C (B (B B) (C (B B I) I))) I))))))
(K I)))
(B (B (B K)) (C (B B (B C (B (C I) I))) I))
(K I))


I know I'm not using B*, C', and S', which Peyton Jones's 1987 book explains are necessary to avoid quadratic blowups on deeply nested variables --- thus the many more levels of nesting in the output than in the input. But I don't know if there's more hidden danger awaiting. What other kinds of expressions cause big blowups?

I'm thinking that Abadi and Cardelli's sigma-calculus might be a nicer basic abstract system to program in directly; it has most of Wouter van Oortmerssen's abstractionless programming already present, and it's even less point-free than the lambda-calculus. It's more complicated than the lambda-calculus by about the same amount that the lambda-calculus is more complicated than SK; there are four expression types.