Freer Monads, More Extensible Effects

Freer Monads, More Extensible Effects, by Oleg Kiselyov and Hiromi Ishii:

We present a rational reconstruction of extensible effects, the recently proposed alternative to monad transformers, as the confluence of efforts to make effectful computations compose. Free monads and then extensible effects emerge from the straightforward term representation of an effectful computation, as more and more boilerplate is abstracted away. The generalization process further leads to freer monads, constructed without the Functor constraint.

The continuation exposed in freer monads can then be represented as an efficient type-aligned data structure. The end result is the algorithmically efficient extensible effects library, which is not only more comprehensible but also faster than earlier implementations. As an illustration of the new library, we show three surprisingly simple applications: non-determinism with committed choice (LogicT), catching IO exceptions in the presence of other effects, and the semi-automatic management of file handles and other resources through monadic regions.

We extensively use and promote the new sort of ‘laziness’, which underlies the left Kan extension: instead of performing an operation, keep its operands and pretend it is done.

This looks very promising, and the paper includes benchmarks comparing the heavily optimized, special-cased monad transformers against this new formulation of extensible effects using freer monads.
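To give a flavor of the construction, here is a minimal sketch of the freer monad, close to (though not verbatim from) the paper's FFree. The point is that f is an arbitrary type constructor: no Functor instance is ever demanded, because the continuation is stored as an ordinary function.

    {-# LANGUAGE GADTs #-}

    -- The freer monad: the effect signature f needs no Functor
    -- instance, because the continuation is kept as a plain function.
    data FFree f a where
      Pure   :: a -> FFree f a
      Impure :: f x -> (x -> FFree f a) -> FFree f a

    instance Functor (FFree f) where
      fmap f (Pure a)     = Pure (f a)
      fmap f (Impure g k) = Impure g (fmap f . k)

    instance Applicative (FFree f) where
      pure = Pure
      Pure f     <*> x = fmap f x
      Impure g k <*> x = Impure g ((<*> x) . k)

    instance Monad (FFree f) where
      Pure a     >>= h = h a
      Impure g k >>= h = Impure g ((>>= h) . k)

    -- The 'laziness' the abstract mentions: to perform an effect,
    -- store the request together with an (initially trivial) continuation.
    etaF :: f a -> FFree f a
    etaF fa = Impure fa Pure

The paper's further step is to replace that stored function with a type-aligned sequence of functions, which is where the asymptotic improvement comes from.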

See also the reddit discussion.


Thanks in advance

I won't be able to read this for a few days, but I'm excited. Thanks Sandro.

Doesn't fix fundamental problem with monads: concurrency

The article is mathematically elegant but using freer monads doesn't fix the fundamental problem with monads, namely concurrency.

Concurrent programs can be exponentially faster than the parallel lambda calculus (the foundation of monads), which matters for practical systems that cannot afford to give up large amounts of performance.

PS. Of course, monads were originally invented by Scott and Strachey to simulate concurrency.

Still off-topic

Note: I'm only attempting a reply to you because of Ehud's moderation.

I consider even this comment of yours to be off-topic here. Everyone who knows anything about monads knows that half the value proposition is in sequencing. Despite this anti-parallelism property of monads (no, largely because of it), they are useful for reasoning about an extremely large number of situations, e.g. those related to effects, the topic of the paper.

Your comment distracts from the important implications of a successful design for typing and working with compositions of effects in a practical language.

Unless you'd like to discuss challenges and opportunities of encoding concurrency effects in the context of the Eff monad, I suggest that you refrain from replying in this thread.

Thanks, Brandon, that's my

Thanks, Brandon, that's my impression too. I advise ending this sub-thread on this note and moving on.

Sequencing: Lists (i.e. [...]) versus ⬤ (i.e. ";")

Sequencing is ambiguous between two notions:
* Lists (i.e. [...]), which are largely independent of effects
* Afterward (i.e. ⬤, usually denoted by ";"), which is used in concurrent programs to manage effects.
However, having just one global monad for all of the concurrent uses of ⬤ in a system does not allow a concurrent implementation, which can reduce performance exponentially.

Get rid of cons!

I'm a big fan of this line of research. In particular the Eff language and Idris' ! bang notation are absolutely fantastic at overcoming my biggest complaints about programming with monads.

This paper cites the related Reflection without remorse work, which discusses the same linear-to-quadratic blowup arising from the choice of data structure.
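The most familiar instance of that blowup is left-associated list append; the left-associated (>>=) chains that Reflection without remorse tackles are the type-aligned analogue. A toy illustration (mine, not code from either paper):

    -- Building an n-element list with left-associated (++) retraverses
    -- the ever-growing prefix, so the total cost is O(n^2):
    slowEnum :: Int -> [Int]
    slowEnum n = foldl (\acc i -> acc ++ [i]) [] [1 .. n]

    -- Deferring the appends as a composition of functions (a difference
    -- list) restores O(n); 'reflection' back to a real list happens once,
    -- at the end, by applying the composition to []:
    fastEnum :: Int -> [Int]
    fastEnum n = foldl (\acc i -> acc . (i :)) id [1 .. n] []

The freer-effects library plays roughly the same trick at the level of continuations, with a type-aligned sequence in place of the difference list.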

Similarly, Guy Steele's Thinking Parallel Programming talk calls for us to get rid of cons! It's interesting that the same data-structure improvements Steele derives from his discussion of parallelism also yield gains in sequential programs. It's also interesting how "cons" takes many forms, from literal cons-cell data structures, to implicit linked lists via function composition, and beyond.

It's a shame that linked lists and pairwise function composition get first-class treatment in languages and their grammars. It would be better to guarantee the logarithmic performance of finger trees or RRB-vectors in a language's foundational data structures and operations. For example, primitive function composition could be defined internally in terms of such data structures. In fact, a language would do well to encourage any monoid to build trees instead of lists.
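A rough sketch of what tree-shaped composition could look like (hypothetical names, not from any existing library): appending is O(1), and all the composing work is deferred to application time. A production version would additionally keep access to the left edge cheap, which is what the paper's FTCQueue arranges.

    {-# LANGUAGE GADTs #-}

    -- Function composition as a type-aligned tree: appending two
    -- pipelines is O(1); the work happens only at application.
    data Pipe a b where
      Leaf :: (a -> b) -> Pipe a b
      Node :: Pipe a c -> Pipe c b -> Pipe a b

    (|>) :: Pipe a b -> Pipe b c -> Pipe a c
    (|>) = Node

    apply :: Pipe a b -> a -> b
    apply (Leaf f)   x = f x
    apply (Node f g) x = apply g (apply f x)

    -- e.g. apply (Leaf (+1) |> Leaf show |> Leaf length) 9  ==  2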

A list (i.e. [...]) is an abstraction: many ways to implement

I agree with getting rid of cons!

A list (i.e. [...]) is an abstraction that can be implemented in many ways. New programming languages are starting to use [...] directly for construction and pattern matching, with no mention of cons, so that the underlying implementation might be a finger tree or an RRB-vector. This is an area of active research.
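GHC already gestures in this direction: with the OverloadedLists extension, the [...] syntax in both expressions and patterns goes through the IsList class, so the same literal can build or match, say, Data.Sequence's finger trees. A small sketch:

    {-# LANGUAGE OverloadedLists #-}

    import Data.Sequence (Seq)

    -- The literal builds a finger tree, not a cons list:
    digits :: Seq Int
    digits = [3, 1, 4]

    -- In patterns, the [...] notation matches via IsList's toList:
    isPair :: Seq a -> Bool
    isPair [_, _] = True
    isPair _      = False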

cons and expression

In designing Kernel, without even bringing parallelism considerations into it, and with cons/car/cdr built into the primitives of the design, I found that in order to raise cyclic lists to first-class status I created whole-list operations that mostly obviated cdring down lists (since you can't rely on cdring to terminate unless you're certain the list is acyclic). It occurred to me at the time that building lists out of cons cells was looking more and more like legacy. I've since had occasion to build a specialized Lisp interpreter (not for altogether-general-purpose programming, admittedly) on top of Lua, in which lists are actually implemented as Lua arrays, and it's striking how easily the cons cells dropped out of the design.

All that, again, is from a practical-expression perspective (which, to me, is non-negotiable; I'd be at least as dissatisfied sacrificing clean expression for efficiency as sacrificing efficiency for clean expression).

Yes, conses are nice because

Yes, conses are nice because they allow us to treat lists as molecules instead of atoms. But in return they also allow the creation of many things that are not lists. Once you have whole-list operations, you don't really need to treat lists as conses anymore.

General Iterators

I agree the special treatment lists get is bad. What is needed is something that treats all containers equally, even user-defined ones. Stepanov's treatment of iterators (although that wasn't his first choice of word, it is used for compatibility with existing C++ concepts) provides this. Iterators abstract over the data access patterns of algorithms, so you are probably familiar with forward iterators, bidirectional iterators, indexed iterators, and random-access iterators, but there are also iterators for trees (binary trees have bifurcating iterators) and for multi-dimensional collections (coordinate structures).

When algorithms are abstracted over iterators, they can be used generically on any collection type that provides the required iterator. Iterators also come with performance guarantees: for example, a collection should only provide an indexed iterator if its lookup operation is efficient. So a singly linked list would provide only a forward iterator, a doubly linked list both forward and bidirectional iterators, and so on.

If this is combined with overloading that can specialise based on the iterator type, you can provide a set of algorithms for each kind of iterator and have the compiler simply choose the efficient implementation: for example, a version of reverse using only a forward iterator for a singly linked list, and a bidirectional iterator for a doubly linked list.
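A rough Haskell rendering of that idea (the class and function names here are mine, not EoP's): each iterator category is a capability class, and an algorithm states the weakest category it needs. A compiler with iterator-based overloading could then pick the backward-walking version whenever predecessor is available.

    -- Each iterator category is a capability class.
    class ForwardIterator i where
      source    :: i a -> a             -- read the current element
      successor :: i a -> Maybe (i a)   -- step forward; Nothing at the end

    class ForwardIterator i => BidirectionalIterator i where
      predecessor :: i a -> Maybe (i a) -- step backward

    -- reverse with forward steps only: one pass, O(n) extra space.
    reverseFwd :: ForwardIterator i => i a -> [a]
    reverseFwd = go []
      where
        go acc i =
          let acc' = source i : acc
          in maybe acc' (go acc') (successor i)

    -- reverse by walking backward from the last position: no
    -- accumulator needed once the container supports predecessor.
    reverseBwd :: BidirectionalIterator i => i a -> [a]
    reverseBwd i = source i : maybe [] reverseBwd (predecessor i)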

It's basically library support

Lists get special treatment in Lisps because they're old and have been around since the beginning. One of the things we did in R7RS-small was to beef up Scheme's vector procedures so that they essentially match the list procedures (with the obvious exceptions due to vectors being fixed-length in Scheme), so that you don't have to say "Well, I should use vectors, but there aren't enough built-in procedures, so I'll fall back on good old lists so as not to waste my time." Common Lisp does the same thing by providing sequence procedures, but that would violate Scheme's relentless monomorphism. R7RS-large is intended (among other things) to riff on the same theme across many data structures.

Added:

It's possible to integrate vectors with mutable and immutable lists thus: mutable lists are made of mutable cons cells, immutable lists of immutable cons cells, and vectors are (at least conceptually) made of cons cells with mutable cars and immutable cdrs. I've pointed this out privately but not, I think, posted it publicly.
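A sketch of that conceptual model (the names are invented for illustration): the car is a mutable reference, the cdr is fixed at construction, so elements can be updated in place while the spine, and hence the length, cannot change.

    import Data.IORef (IORef, newIORef, writeIORef)

    -- A cell with a mutable car and an immutable cdr.
    data VCell a = VCell (IORef a) (Maybe (VCell a))

    fromListV :: [a] -> IO (Maybe (VCell a))
    fromListV []       = pure Nothing
    fromListV (x : xs) = do
      rest <- fromListV xs
      ref  <- newIORef x
      pure (Just (VCell ref rest))

    -- Mutation touches only the car:
    setCar :: VCell a -> a -> IO ()
    setCar (VCell ref _) = writeIORef ref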

Once there was a language in which the CHAR function, when applied to a string, returned its first character. There was also a function which, when applied to a string, returned all except the first character: the CHDR function.

Read "Elements of Programming"?

Have you read Stepanov's "Elements of Programming"? Chapter 6 "Iterators" is relevant, but also 7 "Coordinate Structures", 8 "Coordinates with Mutable Successors", and as much of the preceding material you need to get it.

An algorithm should not be based on whether the collection is a list or a vector, but on the access pattern. I really wish more people would read Stepanov's book before designing a collection library (even if only to say "this is not what I want"); it would facilitate discussion, and it's hard to get the significance without reading the whole story. It's only a short book, too. It may need language support if you don't have the right tools.

As for relevance to effects: each container should have its own monadic region, composable using effects. Oleg has provided a thorough treatment of exactly what I needed in my own language development. That is probably coincidental, but we have discussed similar things in the past (I'm not suggesting I had any input to this paper, just that we still seem to be interested in similar stuff, following on from the HList paper).

Edit: the treatment of iterators in EoP is purely functional, so to get the next iterator you would evaluate "j = successor i", and to access the collection element referenced by a read-only iterator, "v = source i". Of course, for collections other than the singly linked list you have to allow mutability, so you can write through an iterator with "sink i = v" and get a mutable reference with "deref".

Edit 2: Lisp's special treatment of lists comes from the persistent nature of singly linked lists: elements can be pushed onto the head or removed without altering the existing memory structure, hence they allow for immutable collections. Very few structures have this property (finger trees? trees with up-links or down-links only, but not both?). Singly linked lists have the right properties and were easy to implement.

thank you but

I am not Brandon Bloom.

EDIT: Actual Brandon Bloom here, some kind of account impersonation problem. See below thread.

Brandon Bloom

?

It's a bit funny

I can't log in with my own account; password recovery spills server-side entrails; the admin apparently does not feel like replying to emails asking for account recovery; and now I see that I am logged in as Brandon Bloom (without my doing anything whatsoever).

Ehud, please fix this longstanding issue. This is a great site, and I want to contribute but not under Mr. Bloom's handle.

Thank you.

So, who are you?

And what have you done with Brandon?

Actual Brandon Bloom Here

I was off-the-grid for a few days and came back to find this tweet: https://twitter.com/SunOf27/status/642008422456492032

Not sure what happened, but I still have access to my own account. Seems like Joubin Houshyar doesn't intend to cause me any trouble, hopefully that's true! :-)

That's what you get with a

That's what you get with a web platform built with PHP instead of Coq.