Systematic search for lambda expressions

Systematic search for lambda expressions by Susumu Katayama

This paper presents a system for finding desired small functional programs by simply generating a sequence of type-correct programs in a systematic and exhaustive manner and evaluating them. The main goal of this line of research is to ease functional programming; a subgoal is to provide a baseline for evaluating heuristic approaches to program synthesis, such as genetic programming, by establishing the best performance achievable through exhaustive search. While our previous approach to that goal used combinatory expressions in order to simplify the synthesis process, which led to redundant combinator expressions with complex types, this time we use de Bruijn lambda expressions and obtain improved results.

Although the algorithm is slow and only works on small expressions, it looks helpful. Perhaps there could be some synthesis with Hoogλe, or whatever the name is after Google makes them change it.
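To make the generate-and-test idea concrete, here is a minimal sketch in Haskell. It is emphatically not the paper's algorithm (which enumerates only type-correct programs); it just enumerates untyped de Bruijn terms by size, normalises each under a step budget, and keeps the first one matching a specification. All the names here (Term, termsOfSize, normalise) are my own inventions for illustration.

```haskell
data Term = Var Int        -- de Bruijn index
          | Lam Term       -- abstraction
          | App Term Term  -- application
  deriving (Eq, Show)

-- All terms of exactly size n with `depth` enclosing binders
-- (size: each constructor counts 1).
termsOfSize :: Int -> Int -> [Term]
termsOfSize depth n
  | n <= 0    = []
  | n == 1    = [Var i | i <- [0 .. depth - 1]]
  | otherwise =
      [Lam b | b <- termsOfSize (depth + 1) (n - 1)]
   ++ [App f x | k <- [1 .. n - 2]
               , f <- termsOfSize depth k
               , x <- termsOfSize depth (n - 1 - k)]

-- Shift free indices >= c by d (needed when substituting under binders).
shift :: Int -> Int -> Term -> Term
shift d c (Var i)   = Var (if i >= c then i + d else i)
shift d c (Lam b)   = Lam (shift d (c + 1) b)
shift d c (App f x) = App (shift d c f) (shift d c x)

-- Substitute s for index j.
subst :: Int -> Term -> Term -> Term
subst j s (Var i)
  | i == j    = s
  | i > j     = Var (i - 1)          -- one binder disappeared
  | otherwise = Var i
subst j s (Lam b)   = Lam (subst (j + 1) (shift 1 0 s) b)
subst j s (App f x) = App (subst j s f) (subst j s x)

-- One normal-order reduction step, if a redex exists.
step :: Term -> Maybe Term
step (App (Lam b) x) = Just (subst 0 x b)
step (App f x)       = case step f of
                         Just f' -> Just (App f' x)
                         Nothing -> App f <$> step x
step (Lam b)         = Lam <$> step b
step (Var _)         = Nothing

-- Normalise under a step budget; Nothing = possibly divergent.
normalise :: Int -> Term -> Maybe Term
normalise 0    _ = Nothing
normalise fuel t = maybe (Just t) (normalise (fuel - 1)) (step t)

-- Demo: the smallest closed term t with  t a b  reducing to  b.
main :: IO ()
main = print (head [ t | n <- [1 ..]
                       , t <- termsOfSize 0 n
                       , normalise 100 (App (App t a) b) == Just b ])
  where
    a = Lam (Lam (Var 1))   -- two distinct closed normal forms as markers
    b = Lam (Lam (Var 0))
```

The demo finds Lam (Lam (Var 0)), i.e. \x y -> y. The fuel bound is what keeps divergent untyped terms from hanging the search; generating only type-correct terms, as the paper does, avoids much of that wasted work in the first place.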

Aliens...

Reading your description, I was reminded of this paper by Marvin Minsky. Here's the relevant quote:

I once set out to explore the behaviors of all possible processes–that is, of all possible computers and their programs. There is an easy way to do that: one just writes down, one by one, all finite sets of rules in the form which Alan Turing described in 1936. Today, these are called "Turing machines." Naturally, I didn't get very far, because the variety of such processes grows exponentially with the number of rules in each set. What I found, with the help of my student Daniel Bobrow, was that the first few thousand such machines showed just a few distinct kinds of behaviors. Some of them just stopped. Many just erased their input data. Most quickly got trapped in circles, repeating the same steps over again. And every one of the remaining few that did anything interesting at all did the same thing. Each of them performed the same sort of "counting" operation: to increase by one the length of a string of symbols–and to keep repeating that.

Strikes me as being similar to...

Strikes me as being similar to Wolfram's observations in "A New Kind of Science". Cells or functions? Doesn't seem like there's too much of a difference.

This reminds me...

of the various 'superoptimizers' used by compiler writers and madmen. You would basically link the optimizer with a sample function, and generate and test all branch-free machine code streams of length 1, 2, 3, ...

The resulting function would usually be shorter and faster, but might only work due to some quirk of the instruction set that you didn't realize beforehand. In some cases even the processor designers were surprised to hear of the results.
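For flavour, here is a toy sketch of that generate-and-test loop, written in Haskell rather than machine code and only in the spirit of (not a reconstruction of) Massalin's 1987 superoptimizer. The instruction set, the names, and the test harness are all made up for illustration: a one-accumulator machine is searched by increasing length for the shortest program agreeing with a reference function on a handful of inputs.

```haskell
import Data.Bits (complement)

-- A four-instruction, one-accumulator toy machine.
data Instr = NotI | IncI | DecI | ShlI
  deriving (Show, Eq, Enum, Bounded)

execI :: Instr -> Int -> Int
execI NotI = complement   -- bitwise not
execI IncI = (+ 1)
execI DecI = subtract 1
execI ShlI = (* 2)        -- shift left by one

run :: [Instr] -> Int -> Int
run prog x = foldl (flip execI) x prog

-- All instruction sequences of exactly length n.
progsOfLen :: Int -> [[Instr]]
progsOfLen 0 = [[]]
progsOfLen n = [i : p | i <- [minBound .. maxBound], p <- progsOfLen (n - 1)]

-- Shortest sequence agreeing with the spec on every test input.
-- As noted above, agreement on a finite test set is no proof of
-- correctness: the winner may only work thanks to a quirk.
superopt :: (Int -> Int) -> [Int] -> [Instr]
superopt spec tests =
  head [ p | n <- [0 ..], p <- progsOfLen n
           , all (\x -> run p x == spec x) tests ]

main :: IO ()
main = print (superopt negate [-3 .. 3])
-- prints [NotI,IncI]: negation rediscovered as two's complement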

Original reference

ADATE

A friend of mine named J. R. Olsson has been doing research in this area for some time; see http://www-ia.hiof.no/~rolando/. He has some very interesting results. Basically he uses the terminology of Genetic Programming, but grows his population in a much more systematic (read: deterministic) way based on neutral evolution. His measure of neutrality rests on a set of proven-neutral transforms and a problem-specific fitness test suite.

I'd like to see a less hand-tuned version of ADATE -- one that explores "program space" through ad hoc proof-based transforms (instead of pre-determined, test-suite-based ones).
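For illustration, here is a schematic Haskell sketch of the neutral-walk idea as I read it from the description above. It is not ADATE's actual algorithm: neutralWalk, the transforms, and the integer fitness are all stand-ins. The one essential point it encodes is that a child is accepted whenever it is at least as fit as the incumbent, so equally fit (neutral) variants can replace it and the search can drift across fitness plateaus.

```haskell
-- Apply every transform to the current program and keep a child
-- whenever it scores at least as well; ties go to the child, which is
-- exactly what makes the walk "neutral".
neutralStep :: (p -> Int) -> [p -> p] -> p -> p
neutralStep fitness transforms prog =
  foldl keep prog (map ($ prog) transforms)
  where keep best child = if fitness child >= fitness best then child else best

neutralWalk :: Int -> (p -> Int) -> [p -> p] -> p -> p
neutralWalk 0     _       _          prog = prog
neutralWalk steps fitness transforms prog =
  neutralWalk (steps - 1) fitness transforms (neutralStep fitness transforms prog)

-- Toy demo: "programs" are plain Ints, transforms nudge them up or
-- down, and fitness rewards closeness to 42.
main :: IO ()
main = print (neutralWalk 100 (negate . abs . subtract 42) [(+ 1), subtract 1] 0)
-- prints 42
```

In ADATE's setting, as described above, the transforms would be the proven-neutral program transformations and the fitness would come from the problem-specific test suite.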