Aha moment

I've been staring at a lot of Python code over the last couple of days. It suddenly dawned on me that Python is a vindication of Icon's generators. Python's expressivity, as attested by its powerful libraries, rests to a large extent on metaprogramming and generators.
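As a toy illustration (my own, not from the thread) of the Icon-style on-demand production that Python adopted:

```python
import itertools

def fibs():
    """An infinite, Icon-style generator: values are produced on demand."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Consumers take only as much as they need from the infinite sequence.
first_ten = list(itertools.islice(fibs(), 10))
```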

Added: Once upon a time we called this sort of thing linguistic abstraction, and people used to have strong arguments against it. I think this is one battle that was decisively won, as usual in favor of the Lisp mindset. That doesn't mean we shouldn't strive to make it as safe as possible, but disallowing it is no longer a reasonable approach.

A lesson that has not been

A lesson that has not been lost on Javascript and node.js: "What this means in practice is that we can get rid of the callback hell that has plagued node applications, and write code in a synchronous style, while it’s executed asynchronously behind the scenes."
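The node libraries the quote refers to drive a generator with promises; here is a rough Python sketch of the same mechanism. The names (run, fake_read, task) are invented for illustration, and real async I/O is faked with immediate callbacks:

```python
def run(gen):
    """Drive a generator that yields 'async' operations.

    Each yielded value is a function taking a callback; when the callback
    fires, the generator is resumed with the result.  A toy analogue of
    what promise/generator libraries do on node.js.
    """
    def step(value):
        try:
            op = gen.send(value)
        except StopIteration:
            return
        op(step)  # the "async" op calls step(result) when done

    step(None)

def fake_read(name):
    # Stand-in for an async I/O call: invokes the callback immediately.
    return lambda cb: cb(f"contents of {name}")

results = []

def task():
    # The reads look synchronous, but each one suspends the generator.
    a = yield fake_read("a.txt")
    b = yield fake_read("b.txt")
    results.append((a, b))

run(task())
```

The generator body reads top-to-bottom like blocking code, while control actually bounces through the driver on every yield — which is also why the callbacks are "hidden but still there" when debugging.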

How do generators escape

How do generators escape callback hell? In my experience, they worsen it because now the callbacks are hidden but still there. Debugging is still a pain. Pretty syntax though.


I don't know, but a syntax for JavaScript that seems to get rid of callback hell extremely nicely is PogoScript: http://pogoscript.org/2012/12/05/async.html

Callback hell is as much

Callback hell is as much about debugging code as it is about reading it. Pretty syntax is not really the answer; we also have to rationalize indirect calls in our execution models (e.g. via equational reasoning, if you want to go functional).

In practice, though, are

In practice, though, are people using these libraries having a particularly hard time debugging?

Second question: I vaguely recall Icon had some unique tools for debugging. Anything worth coming back to?

Good question. I avoid using

Good question. I avoid using Linq and yield in C# for any kind of heavy lifting, which keeps me sane. That is just my experience though.

Debugging Asynchronous JavaScript with Chrome DevTools

A powerful feature that makes JavaScript unique is its ability to work asynchronously via callback functions. [ahem]

Luckily, now in Chrome Canary DevTools, you can view the full call stack of asynchronous JavaScript callbacks!

A double-edged sword

Interesting. Those async stack traces are a feature I've sometimes dreamed about when I'm working in Chrome DevTools, but I hope they have their limits.

I do ad hoc threading by setting up long-running asynchronous loops. If these loops are going to start accumulating async stack information just for debugging purposes, I'll have to do a trampoline that runs on setInterval or requestAnimationFrame to avoid memory leaks. (Fortunately for me, my code's almost prepared for this already.) To phrase things in terms of this async stack concept, my code has been using async tail calls for its looping purposes, and I'll have to stop relying on async TCO.

Incidentally, one of the most painful things when I'm debugging is how long Chrome DevTools takes to render thousands of stack trace entries to its interactive interface. Once I hit a breakpoint, I have to wait for 1 to 3 minutes sometimes, and my code isn't that slow. So I sometimes do a meaningless async call or use my trampoline just to keep the stack very shallow so that the debugging process can be less nightmarish. :)

are people using these

are people using these libraries having a particularly hard time debugging?

I've written decent-sized programs in Dart, which uses Futures/Promises for asynchrony. Debugging them is a complete and total nightmare. I've spent a ton of time trying to engineer around the pain of working with "unwind-to-the-event-loop"-style concurrency and even then it's still awful.

Direct-style heaven

What language is your experience with?

The post I linked to shows the use of generators in the context of Javascript and node.js, where the use of promises to manage synchronization is already common, and successful at reducing callback hell. In this environment, the introduction of generators is a useful improvement.

The "pretty syntax" is backed up by some effective semantics — not only in the generator design itself, but in the promise abstraction and the libraries that use promises. The examples in the post I linked rely on that, so what's actually happening may not be apparent just from skimming the syntax.

As for debugging, what are we comparing to? Debugging asynchronous and/or concurrent code will always be harder than debugging single-threaded synchronous code, for example.

Personally, I'm still a fan of threaded approaches to concurrency, over event-driven — done right, they can perform at least as well, and they don't require the programmer to worry about concurrency or asynchronicity except when something actually needs to be synchronized. But if you're working in an environment that's inherently event-driven, having (something like) generators does help.

[Apples, Oranges].yield

I was talking about the purely synchronous case, in which generators help library designers hide/encapsulate control flow.
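A minimal sketch of what that encapsulation looks like in the synchronous case: the traversal logic of a data structure stays inside the library's generator, while client code just loops. The tree shape and names here are invented for illustration:

```python
def inorder(node):
    """In-order traversal as a generator: the traversal's control flow
    (recursion/stack management) is encapsulated behind plain iteration."""
    if node is None:
        return
    left, value, right = node
    yield from inorder(left)
    yield value
    yield from inorder(right)

# Nodes are (left, value, right) triples.
tree = ((None, 1, None), 2, ((None, 3, None), 4, None))
values = list(inorder(tree))
```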

I have been burned by

I have been burned by generators many times in C#, and it's not really about side effects but about complexity. Promises are equally crazy in the deferred execution department. Perhaps we should coin some new hells....

Lambda hell - like callback hell, but with pure higher order functions.

Indirect hell - it looks much better than callback hell, but cause and effect are not very evident, while time just seems to slow down for no apparent reason.

One way to get rid of callbacks in concurrent/reactive use cases is through polling, so I found immediate-mode UIs to be quite refreshing: callback hell is gone, while control flow is quite direct! The callbacks aren't necessary; we use them for efficiency, and there are other ways to get that efficiency back (e.g. memoizing so that replaying the polling code goes more quickly).
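A toy sketch of the immediate-mode idea (the render function and state shape are invented here): each frame polls the current state and rebuilds the output from scratch, with no callbacks registered anywhere.

```python
def render(state):
    """Immediate-mode style: every frame rebuilds the 'UI' from the
    current state -- no event handlers, just direct control flow."""
    lines = [f"count = {state['count']}"]
    if state['count'] > 2:
        lines.append("warning: count is high")
    return lines

frames = []
state = {'count': 0}
for _ in range(4):          # the polling loop
    frames.append(render(state))
    state['count'] += 1
```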

Of course, lambda is all about specifying what you want without worrying about how it's done, or something like that. That control flow is buried shouldn't be a big deal, and we can reason about our programs using math to make up for the lack of debugging capabilities. Except I can't see how this works out in practice; it seems to me that functional styles increase the cognitive load on programmers rather than reduce it.

And people say

And people say non-mainstream languages are not influential...


There's a really obvious parallel in music, where there are non-mainstream people who are almost unknown to consumers but who, conversely, are well known by almost all expert practitioners. (The most obvious examples are in jazz - e.g. Allan Holdsworth - but I think it's true in other areas of music as well. For example, classical composers are often influenced by Alban Berg, and rock musicians by um well ok that's a harder one.)

It would be nice if this were an extremely generalisable phenomenon, and maybe it is. Does it apply more widely in science, for example? Maybe it does - e.g., I did my undergraduate degree in Stephen Hawking's department, but the really intellectually influential people seemed to be those less well known to the public, notably John Conway and maybe also Béla Bollobás.

Examples from elsewhere? Fields where this doesn't apply at all?

From 2002

From 2002 (Python Mailing List)

> If you haven't played with the Icon language, you should!

[Andrew Koenig]
> I have.

Cool! Then build on that: Python's generators are the same as Icon's generators, except that exposing the .next() method allows Python's flavor to be used for *some* things that require full-blown coexpressions in Icon.
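A small illustration of that point about exposing next() (the generators here are invented examples): two independent generators can be resumed explicitly and interleaved, the kind of control Icon reserves for co-expressions.

```python
def naturals():
    n = 0
    while True:
        yield n
        n += 1

def squares():
    n = 0
    while True:
        yield n * n
        n += 1

# Resume each generator explicitly, in an order the consumer chooses --
# not possible if generators could only drive a for-loop.
a, b = naturals(), squares()
interleaved = [next(g) for _ in range(3) for g in (a, b)]
```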

Seems there is some awareness of this relationship to Icon.

Cool! Thanks for digging

Cool! Thanks for digging this up.

Come on

Ah, come on, this is LtU. Generators are merely a poor man's workaround for the lack of proper codatatypes. :)

Integrate them nicely in a

Integrate them nicely in a mainstream language and have multiple libraries use them to provide expressive linguistic abstractions (akin to DSLs), and I am all yours.

More to the point, in terms of historical influence and the specific language feature involved, I think generators are more relevant. I don't think Python's yield was influenced by work on codata.


My comment was tongue-in-cheek, of course. Regarding codata in mainstream languages, I'd already be glad if there was one that even had plain datatypes -- but I'm afraid the mainstream has to overcome OO ideology first.

Yeah... I resisted writing

Yeah... I resisted writing that "Python was not influenced by future work on codata".

Generators in Icon and CLU, and codata

Generators in Python are actually the vindication of generators in CLU (called iterators in that language). CLU iterators were inspired by generators in the language Alphard, developed in Mary Shaw's group at CMU with the explicit goal of formal verification. Iterators in CLU implemented all iterative constructs -- realizing Shaw et al.'s motivation: "generators abstract over control as functions abstract over operations."

I strongly recommend reading the Alphard papers and the CLU history paper linked from the web page above. That page also shows the connection with Icon and earlier AI languages such as PLANNER.

The web page also shows why generators are not just 'codata', by encoding Icon examples with lazy lists and demonstrating the awkwardness. Generators add more than just a convenient abstraction. Simple generators such as those in CLU are not first-class, in contrast to streams and other such codata. They are less expressive, but they are also much, much simpler to implement, and they do not suffer from the memory leaks endemic to codata.
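That trade-off is visible even within Python (a toy contrast of my own, not the web page's examples): a generator is a one-shot process, not a first-class stream value, while sharing a stream forces buffering of exactly the kind that retains memory in codata encodings.

```python
import itertools

gen = (n * n for n in range(5))
first_pass = list(gen)
second_pass = list(gen)   # exhausted: a generator is not a reusable value

# To share a stream between two consumers, itertools.tee must buffer
# everything one consumer has seen and the other has not -- the kind of
# retention that becomes a memory leak when consumers get out of step.
s1, s2 = itertools.tee(n * n for n in range(5))
shared_a = list(s1)
shared_b = list(s2)
```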

I should have said something

I should have said something about CLU, but I really didn't want to. Thanks for filling the gap. The main thing to note is whether iterators are invoked explicitly, as opposed to implicitly by looping over iterable abstractions. Second, as far as I remember, goal-directed evaluation was not part of CLU iterators and is unique to Icon.
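Icon's goal-directed evaluation can be roughly emulated with Python generators (a well-known correspondence; the to helper and the sum-to-5 problem are invented for this sketch): a failed test simply resumes the enclosing generators, giving backtracking search.

```python
def to(a, b):
    """Icon-style 'a to b': a generator of candidate values."""
    yield from range(a, b + 1)

def solutions():
    # Emulating Icon's goal-directed search: when the comparison fails,
    # control 'backtracks' by resuming the enclosing generators.
    for x in to(1, 4):
        for y in to(1, 4):
            if x + y == 5:
                yield (x, y)

found = list(solutions())
```

What Icon provides beyond this sketch is that the backtracking is implicit in every expression, not spelled out with nested loops.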