Implementing Higher Order Messages

My colleague Nat Pryce has recently posted a challenge which, given the recent discussion about 'expressions vs statements', I thought might interest the LtU crowd.

In this post he talks about what he calls Higher Order Messages. This is basically a way to use higher-order functions to abstract away complex traversal code in favour of a series of chained method calls.

So instead of selecting all the elements that match a predicate and then calling a method on each of them, you send the message to the whole collection in one chained expression.

Ordinarily this wouldn't be relevant to LtU, but his post on implementing this idea in Ruby has led to alternative implementations in Java and Scala.


I always thought...

...Dylan had the nicest implementation of the "first-class messages" idea that I've seen.

Since Dylan is generic-function-based, methods are true first-class objects, and so you don't need to pass selectors around -- generics are bound to lexically-scoped names exactly like everything else in the language, and you can manipulate the actual method rather than a handle to it.

map p f l

I think most programming languages can already express the Higher Order Message pattern. For example, a Haskell function can be written that uses the Prelude's 'map' function to apply another function only where a predicate is true (pardon my mistakes, I am not yet proficient with Haskell):

mapIf p f l = map (\h -> if p h then [f h] else []) l

The above really says the following: for each element h of list l, apply predicate p, and if it holds, then apply function f to h.
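Making that attempt syntactically valid shows the catch with a map-only definition: map produces exactly one output per input, so hits come out as singleton lists and misses as empty lists (the name mapIfLists is mine, chosen to avoid shadowing the Prelude's map):

```haskell
-- Applies f where p holds, but keeps map's one-output-per-input shape:
-- hits become singleton lists, misses become empty lists.
mapIfLists :: (a -> Bool) -> (a -> b) -> [a] -> [[b]]
mapIfLists p f = map (\h -> if p h then [f h] else [])

main :: IO ()
main = print (mapIfLists even (*10) [1,2,3,4])
-- prints [[],[20],[],[40]]; a final concat would flatten it
```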

It can also be done in C++, using functors:

for_each(list1.begin(), list1.end(), MyPred());

where MyPred can be anything that processes an element of list1 and returns true if a specific condition applies (C++ requires a lot more coding than Haskell to achieve the same result, but my point is that higher order messages are not truly something new, not that C++ is the equal of Haskell or anything).

EDIT: I had forgotten the cons operator...


Idiomatic Haskell:

mapIf p f l = [f x | x <- l, p x]

For all x in l, the set of (f x) where (p x) is true.

You can't define it in terms of map alone, or else you'll get extra empty lists in the output. A solution that composes functions is:

mapIf' p f = (map f) . (filter p)

But it's less efficient, because it traverses the list in two passes. It's my favorite, though, for clarity. You could do it with a fold:

mapIf'' p f = foldr (\x a -> if p x then (f x):a else a) []

But this is much less clear. Some people even need to work folds out with pen and paper!
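As a quick sanity check, the three definitions from this thread can be run side by side; on a finite list they all agree (a small self-contained sketch):

```haskell
-- Comprehension, map-after-filter, and fold, as given in the thread.
mapIf, mapIf', mapIf'' :: (a -> Bool) -> (a -> b) -> [a] -> [b]
mapIf   p f l = [f x | x <- l, p x]
mapIf'  p f   = map f . filter p
mapIf'' p f   = foldr (\x a -> if p x then f x : a else a) []

main :: IO ()
main = print (mapIf even (*2) [1..6], mapIf' even (*2) [1..6], mapIf'' even (*2) [1..6])
-- prints ([4,8,12],[4,8,12],[4,8,12])
```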

And just for completeness, we can do it in the List monad:

mapIf''' p f l = do x <- l
                    if p x then return (f x) else []

My understanding is that the list comprehension notation is often translated into the List monad.
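Indeed: once (>>=) is inlined at the list type, the do-block is just concatMap, which is the same shape the comprehension desugars to (a rough sketch; the names mapIfM and mapIfCM are mine, and GHC's actual desugaring output differs in detail):

```haskell
-- do-notation in the list monad...
mapIfM :: (a -> Bool) -> (a -> b) -> [a] -> [b]
mapIfM p f l = do
  x <- l
  if p x then return (f x) else []

-- ...becomes concatMap once (>>=) for lists (flip concatMap) is substituted:
mapIfCM :: (a -> Bool) -> (a -> b) -> [a] -> [b]
mapIfCM p f l = concatMap (\x -> if p x then [f x] else []) l

main :: IO ()
main = print (mapIfM even (*2) [1..6] == mapIfCM even (*2) [1..6])
-- prints True
```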

Side note: they should really have an on-page Haskell/Scheme/APL/COBOL interpreter so we can check code.

Lazy passes

(map f) . (filter p) will evaluate in something that looks much like a single pass under lazy evaluation - map will repeatedly demand the next item (or the end of the list) from filter, which will run along the list looking for it.
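That behaviour is easy to observe: under lazy evaluation the composed version even handles an infinite list, because filter only runs far enough to satisfy map's demand (a minimal sketch, defining mapIf' locally so the snippet is self-contained):

```haskell
-- The composed definition from the thread: filter, then map.
mapIf' :: (a -> Bool) -> (a -> b) -> [a] -> [b]
mapIf' p f = map f . filter p

main :: IO ()
main = print (take 3 (mapIf' even (*2) [0..]))
-- terminates despite the infinite list; prints [0,4,8]
```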

Single pass and cons cell reuse

I know that lazy evaluation should compute the elements from filter on demand, but I didn't expect cons cell reuse. The following results are very encouraging -- Haskell likes the elegant way!

Results from my unscientific test of putStr $ concatMap show (mapIf (>10000) succ [0..20000]), with GHC 6.2.2, compiling and running on an old x86 laptop:

Method               Avg. time (s)   Memory used (KB)

Five trials, unoptimized compilation
List comprehension   2.060           3,236
List monad           2.080           5,191

Five trials, optimized compilation
List comprehension   1.720           3,002
List monad           1.900           4,063

Wow, that's a lot of memory for the monad method. I guess list comprehension doesn't compile down to the List monad, or does so with some added efficiency. Funny that the first three use the exact same amount of memory once optimized (to the byte: 3,002,324), but have different average running times. This goes to show that my laptop is a poor testbed.

Re: list comprehension

Wonder if it would be any stupendously different in Clean, since I think they claim to be all cool and fast and stuff? C.f. misc comprehension demos in tutorial code.


I very much suspect what you are seeing is the result of deforestation. The poor behavior on the monad form is likely because GHC doesn't recognize it as a good producer/consumer. Deforestation is fairly interesting (and is another reason why functional programmers like folds); you may want to check it out.

Higher Order Messaging is not Higher Order Functions

The idea is for client code not to deal with individual elements. The collection forwards messages to all its elements and collates the results, allowing client code to apply operations to the elements of a collection in bulk. After all, object-oriented languages already have higher-order and anonymous functions. Well, the good ones do.

It's Marcel Weiher's idea and name, not mine

Credit where credit's due. I was just exploring the idea.


This sort of thing comes for free in a nondeterministic language like Icon (..speaking of hard-to-Google programming language names).

receive_benefit(retired(!claimants), 5)

The ! operator generates the values of a collection. retired is a predicate function that returns a result for each claimant that is retired (i.e. the filter function comes for free). And so the receive_benefit function is executed for every retired claimant (the map function comes for free).

Also easy if you have a language with list comprehensions:

[receive_benefit(x, 5) | x <- claimants, retired(x)]

And in Smalltalk and other languages you have 'perform', which can send a message given its selector.