What is Intuitive?

[$0.02 in the category of 'perhaps we can get past the silly holy wars and into interesting ones'] I've seen it come up a few times in discussions: the claim that something is more intuitive than something else. I have a passing interest in usability, UI, and visualization, so the term 'intuitive' is something I come across a lot. I think Jef Raskin had it right: it is not an absolute, it heavily depends on individual experience.

Thus, this is a request for folks to avoid getting sucked into 'is too'/'is not' arguments. You have to say 'to whom', 'when', and 'under what circumstances'. For PL evangelists, I think this is good to keep in mind.


Doesn't mean anything

Easier to learn, easier to guess, easier to perform, more pleasant to use... it could mean anything. It's an extremely vague word. One of the (few) useful lessons I remember from the HCI course was not to use it in a usability report.

When writing, find a more specific term to use instead (decide what you really mean). When reading, note occurrences of the word as an indication that the author probably doesn't know what he's talking about.

Is meaningful

But perhaps not to the extent that it is used. Intuitive simply means that a given group of people can be expected to understand it without any special training. If I show you a device that magically opens jars, and it has an icon of a jar on the front and a conspicuous button that says "Open", and you try to eat the device, then it doesn't have an intuitive interface. But if you point the device at a jar without any instruction, and you succeed in opening it, then I would argue that you find the interface to be intuitive.

So yes, "intuitive" is relative to a given population. And yes, that's because it's relative to a given set of experiences. But some experiences are more or less universal to your expected user population, which is why the unqualified word has *some* meaning. If I pull out a person-shaped jump suit with a zipper on the front, I dare say any human in the world with an IQ above 60 will have some idea that it's something to be worn. On the other hand, if I put a button on a window that has a glyph of an underscore, I wouldn't expect everyone to know that the button iconifies the window.

When it comes to recursion-vs-iteration, it's even trickier. Many humans who are likely to encounter a programming language have experience performing tasks that involve counting (whether that's following a recipe, working on a production line, or whatnot). I dare say counting tasks are more similar to iteration than recursion, so most new programmers probably understand iteration "intuitively". On the other hand, I can't think of any real-world scenarios where the most natural thing to do is recurse.

Probably the most likely process is sorting. Given a large number of, say, letters, is the average person more likely to do something like an insertion sort or a merge sort or a radix sort? Of course, insertion/selection sort and radix sort are more iterative in nature, whereas merge/quicksort are more recursive in nature. I personally do a merge sort when I must sort something by hand, but that's simply because I know about it and happen to think that it's fairly efficient. However, most people I have observed are more likely to do something like a selection or insertion sort.

Probably an example that many people can relate to is sorting a deck of cards. In this instance, I actually resort to selection sort, even though merge sort is probably as fast or faster. Most people I know also resort to either selection or insertion sort. I have never seen someone perform a quicksort or merge sort on a deck of cards (though it would be cool to witness!). Perhaps that has as much to do with the size of the problem as anything else, but I have a strong suspicion that most people would never even think to do something like a merge sort, even on a large stack of papers.
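For concreteness, here is a minimal OCaml sketch of the two styles on lists of integers (my own illustration, not tied to any particular card-sorting experiment): insertion sort folds each element into an already-sorted prefix, which is roughly the "slot each card where it belongs" procedure, while merge sort is the divide-and-conquer approach that almost nobody uses by hand.

    (* Insertion sort: iterative in spirit; grow a sorted prefix one element at a time. *)
    let insertion_sort lst =
      let rec insert x = function
        | [] -> [x]
        | y :: ys -> if x <= y then x :: y :: ys else y :: insert x ys
      in
      List.fold_left (fun sorted x -> insert x sorted) [] lst

    (* Merge sort: recursive in spirit; split, sort each half, merge the results. *)
    let rec merge_sort lst =
      let rec merge xs ys =
        match xs, ys with
        | [], l | l, [] -> l
        | x :: xs', y :: ys' ->
            if x <= y then x :: merge xs' ys else y :: merge xs ys'
      in
      let rec split i l =
        match i, l with
        | 0, rest -> ([], rest)
        | _, [] -> ([], [])
        | n, x :: rest -> let (a, b) = split (n - 1) rest in (x :: a, b)
      in
      match lst with
      | [] | [_] -> lst
      | _ ->
          let (left, right) = split (List.length lst / 2) lst in
          merge (merge_sort left) (merge_sort right)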

So why do some people insist that recursion is "more natural"? Well, my guess is that those people have built up an intuition about recursion! Namely, they have come to think about problems in a recursive fashion, most likely because of exposure to mathematics. Because recursion can lead to elegantly parsimonious definitions, it becomes natural for mathematically-inclined folks to look for recursive solutions to problems. So when they encounter recursion in programming, it really is more intuitive for them.

An interesting way to test this hypothesis would be to interview new programmers with different backgrounds and histories. I conjecture that programmers who are self-taught before they gain a strong mathematical basis are more likely to find iteration "more intuitive", whilst programmers with a strong mathematical background prior to learning a programming language are more likely to find recursion more intuitive. Of course, the opinions of experienced programmers don't mean much, because they are most likely to find the technique they use the most to be the most intuitive.

Natural recursion

On the other hand, I can't think of any real-world scenarios where the most natural thing to do is recurse.

So, Brezhnev is coming to power, and Khrushchev gives him two sealed letters. He says "Open them if you get in a tight spot." Everything is going well for a while, but Brezhnev eventually gets into a jam, so he opens the first letter. "Blame everything on your predecessor!" it says. He does, and he escapes his comeuppance by the skin of his teeth. His political gambits get more risky, however, and he soon finds himself in trouble again. "No sweat," he thinks. "Letter #2 will save me." So he eagerly opens up the second letter. It reads "Write two letters . . . "

:-)

Recursion feels quite intuitive

...when dealing with data structures and sequences that are naturally defined recursively.

So things like trees - yes, although I guess less so for straightforward 'flat' structures like lists. Yes for things like the Fibonacci sequence, which is usually defined recursively, but less so for a straightforward loop counter.

Of course, recursive definitions take some getting used to as well, but once you've seen them, the recursive approach to programming with them feels quite natural, I think.
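A minimal OCaml sketch of what I mean, assuming nothing beyond a toy binary tree type: the recursively defined structure practically dictates a recursive traversal, while a flat counter reads just as naturally as a loop.

    (* A recursively defined structure: recursion follows its shape. *)
    type 'a tree =
      | Leaf
      | Node of 'a tree * 'a * 'a tree

    let rec size = function
      | Leaf -> 0
      | Node (l, _, r) -> size l + 1 + size r

    (* The usual recursive definition of the Fibonacci sequence, transcribed directly. *)
    let rec fib n = if n < 2 then n else fib (n - 1) + fib (n - 2)

    (* A straightforward loop counter, which reads just as naturally as iteration. *)
    let count_to n =
      for i = 1 to n do
        Printf.printf "%d\n" i
      done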

David, you give the example:

David, you give the example:
"If I show you a device that magically opens jars, and it has an icon of a jar on the front and a conspicuous button that says "Open", and you try to eat the device, then it doesn't have an intuitive interface. But if you point the device at a jar without any instruction, and you succeed in opening it, then I would argue that you find the interface to be intuitive."

There is a term for this: "perceived affordance". The perceived affordances of a thing are what people think they can do with it, which are often not what the designer of the thing intended. Screwdrivers are a great example. A steel shaft that's flattened at one end and has a handle at the other affords all kinds of uses variously well, and some of those at least as well as the "turning screws" affordance that the designer intended. Whereas a more complex tool, say a mitre box, has fewer affordances that aren't designed in.

How are your choices for sorting a deck of cards affected by the affordances of your visual system in examining a deck of cards, and of your motor system in manipulating them?

Meanwhile, the languages that self-taught programmers tend to pick up have affordances built in for iteration (say, arrays as the premier collection, the for loop as the premier repetition construct), so no wonder that they become trained in iteration. But iterative programs do not afford analytical thinking at all well.

To think deeply and well about iterative, imperative programming you need to be a Dijkstra, and even then the results are very ugly. Whereas other programming paradigms do afford that kind of thinking better.

But I'm rambling. Seems to me that a designer of a tool should consider their tool intuitive if their intended affordance is the perceived affordance that most users come up with most often. For the user, the tool is intuitive if what seems to them the most obvious perceived affordance is in fact supported well by the tool.

What that perceived affordance is will be conditioned by the users' prior experience, but only up to a point. Donald Norman's book explains in some detail how the (physical) form of a tool induces perceived affordances in all users. It may be that something similar happens based on the "psychological form" of programming language constructs.

Keith

Just wondering

"Donald Norman's book explain in some detail how the (physical) form of a tool induces percieved affordances in all users."

Exactly what book do you mean? He has written several.

I suspect it's "The Design of

I suspect it's "The Design of Everyday Things".

I didn't find the book as illuminating as I had expected, possibly due to the dry and laborious style, but at least I now have a name for those damn doors with identical pull-handles on both sides that still only open in one direction: Norman Doors.

Yes, I got an answer further

Yes, I got an answer further down. And it was "The Psychology of Everyday Things", which is the original title of "The Design of Everyday Things".

OT aside

I imagine many people use a gapped insertion sort (combined with some other sorts, e.g. a first pass of radix sort) for things they can manage in their hands, extrapolating from my personal experience. Gapped insertion sort is probabilistically O(n log n). Google "Insertion sort is O(n log n)!"

Natural recursion? Recursion in nature!

So why do some people insist that recursion is "more natural"?

To take you quite literally: perhaps because self-similarity is ubiquitous in nature? :) Think of trees, coastlines, micro vs. macro structure, population growth...

Of course, the self-similar description of these things is extrinsic, not intrinsic, to them. But the point is that we hear about these things from other people via media, and they often employ self-similarity as an explanatory strategy. So you can argue that, if recursion isn't intuitive to you, you don't get out enough. :)

BTW, I would like to add that I think it is a truism that although some people find recursion "more natural", that does not mean that they find "iteration" (a bad term — iteration is usually defined in mathematics as a form of recursion) less natural than those people who don't get recursion at all. I have also found this to be true more generally of functional programmers: there are lots of procedural and OO programmers who don't (claim to) "get" FP, but FPers almost all have a facility with procedural style and OO.

Ahh...

a bad term — iteration is usually defined in mathematics as a form of recursion

But is that because iteration == recursion, or because mathematicians tend to think functionally? ;> However, iteration is a perfectly natural mechanism in Computer Science, as you can find hardware support for it in virtually every processor architecture (i.e. decrement-and-jump-if-not-zero type instructions). On the other hand, recursion is usually not as naturally supported (I mean, many architectures have some kind of CALL/RET instructions, but that either assumes your data fits in your register set, or requires you to set up the call by pushing your arguments onto the stack or some such busywork, which is no longer assembly-level atomic).

There was a time when Computer Science was a branch of Mathematics, but I think it's legitimately its own field now, with the right to coin its own definitions (that don't necessarily coincide with the usage of words in other fields).

My neighbors suck.

But is that because iteration == recursion, or because mathematicians tend to think functionally? ;>

Iteration, in the mathematical sense, is not the same as recursion. It's a particular species of recursion, and there are other forms of recursion which are not iterative.

So you will see now that I didn't intend to suggest that iteration, in the programmer's usual sense, "is just" (or "==", as you wrote) recursion.
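For what it's worth, the mathematical notion I mean fits in a few lines (a sketch of the standard definition, not anything tied to a particular language): iterating a function n times is itself defined by recursion on n.

    (* Iterating f n times, defined by recursion on n. *)
    let rec iterate f n x =
      if n = 0 then x
      else f (iterate f (n - 1) x)

    (* e.g. iterate (fun x -> x + 1) 5 0 evaluates to 5 *)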

However, iteration is a perfectly natural mechanism in Computer Science, as you can find hardware support for it in virtually every processor architecture

First, I might argue that if people really found computer architecture "natural", they would all be programming in machine language. To many people, the point of a (high-level) programming language is precisely to abstract away from machine details, which suggests that at least some of those details are "unnatural".

Whether iteration in your sense is among those details or not might be questioned. If you believe that language popularity correlates with naturalness, then I suppose the answer is no, and that iteration is more "natural". But personally I think that popularity is like fashion, more social and extrinsic than biological and intrinsic, and when you say "natural" I think of the latter.

What cannot be questioned, I think, is that languages need to support (*) both recursion and iteration. If you are given a spec which is expressed via recursion, like say the factorial function, then it is easiest to use recursion in your program; if you are given a spec which is expressed via iteration (more generally, imperatively), as I have noticed is often the case in specs written by other programmers, then it is easiest to express it via iteration.

That said, it is also useful to translate one into the other. When you express a recursive spec as an iterative program, you often gain insight into algorithmic questions such as how to produce output incrementally, and how it should behave on infinite inputs. Conversely, when you express an iterative spec as a recursive program, you gain insight into how the algorithm depends on the structure of the input, and then sometimes you can abstract that out as a generic function (like, say, fold).
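To make the translation concrete, here is a small sketch (my own example, using factorial since it was mentioned above): the same function written straight from the recursive spec, as an explicit loop, and with the structural dependence abstracted out as a fold.

    (* Straight from the recursive spec. *)
    let rec fact n = if n = 0 then 1 else n * fact (n - 1)

    (* The iterative translation: an accumulator updated in place. *)
    let fact_iter n =
      let acc = ref 1 in
      for i = 1 to n do
        acc := !acc * i
      done;
      !acc

    (* The dependence on input structure abstracted out as a fold over 1..n. *)
    let fact_fold n =
      List.fold_left ( * ) 1 (List.init n (fun i -> i + 1))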

Second, what have the idiosyncrasies of computer hardware got to do with programming? Hardly anyone writes programs on top of bare metal. In practice, most people write to the spec of some kind of software platform, and an implementation of that platform could just as well be running on hardware with support for recursion and no support for iteration. Once you have that extra layer in between your program and the hardware, it becomes very difficult to support arguments about naturalness by pointing to the hardware, because there isn't any hardware! It's an abstraction, factored out by the spec. At least if you are writing portable applications.

I am thinking of platforms like Win32 or POSIX, but there are many programs written for the web which essentially run on "the HTML platform". HTML has a really obvious recursive structure. If you argued that iteration is more natural because processors support instructions for iteration, you would have to argue that recursion is more natural for HTML applications. Actually, it's amazing to me to look at programs which produce HTML by appending strings. But intellectually I know that it's not "unnatural", inasmuch as an HTML document is ultimately a string, which happens to have an additional tree (and hence recursive) structure. In academic circles it's convenient to ignore the text aspect entirely and just consider it a tree, and then you have to employ recursion (because by definition a tree is processed by recursion), but that's a simplification, though a relatively minor one.

There was a time when Computer Science was a branch of Mathematics, but I think it's legitimately its own field now, with the right to coin its own definitions (that don't necessarily coincide with the usage of words in other fields).

As far as I'm concerned, computer science is still a branch of mathematics. Software engineering might be something else, since it has to do with human processes and organizations. I think it is best not to confuse computer science with software engineering.

And I think it is best not to coin new words if suitable ones already exist. Half the arguments on LtU occur because people are not using the same words for the same things. On the other hand, I recognize that people coin words first and it might be years or decades before they realize that they are renaming an old concept. What to do then is a problem. I often prefer to use the older words in the hope that people will make the connection. And there is also the fact that programming jargon is fuzzier than mathematical jargon, and I think fuzziness is detrimental to clear thinking.

(*) But "support" does not necessarily mean explicit syntactic support. Scheme minus derived forms like do still "supports" iteration in this sense.

Gedankenexperiment

BTW, here is a thought experiment for you.

Imagine an architecture A, such as x86, with support for iteration but not recursion.

Now imagine an architecture B, which is exactly like A except that it has an additional instruction (say, FIX) supporting recursion. So, in particular, any program written for A is portable to B.

You argued that iteration in programs is more natural than recursion because most extant architectures are like A, and do not support recursion directly. But every A-program is a B-program, and B supports recursion directly. So I think really your argument reduces to "it is more natural ipso facto because most programs are written for A."

Of course, it is true that there are hardly any computers of sort B. (Perhaps the Lisp Machine had something comparable, but...) But, first, there could be, so your notion of "natural" is at least partly linked to historical issues and not only universal ones. Second, offhand I think it would be pretty easy to write a macro assembler which compiles something like FIX down to A-instructions. Then the question comes down to whether you decide to write code for your usual A-assembler, or for this B-assembler which happens to cross-compile to A. And whether you use one or the other is a choice, not a natural circumstance.

Iteration vs. recursion?

Now imagine an architecture B, which is exactly like A except that it has an additional instruction (say, FIX) supporting recursion. So, in particular, any program written for A is portable to B.

Well, I wrote a long paragraph about how impractical such a FIX instruction would be, but after thinking about it for a while, I decided that it would be doable. After briefly reviewing the x86 instruction set, I decided that INT is the closest such instruction (pushes registers onto the stack and performs a CALL). Even so, I would argue that iteration is more fundamental from a *computational* point of view. In particular, I argue that it is smaller in both space and time.

Clearly, iteration derives its performance the same way every engine does...by increasing entropy. In this case, it does so by destroying information. This allows it to avoid the one step that makes recursion expensive: copying. Recursion is elegant at least partly because it's pure. And being pure, it is automagically reversible. However, that purity comes at the cost of copying the local context on every recursive call [that cannot be optimized by TCO]. This is why recursion is more expensive in both time and space. It takes time to copy the state, and you need a place to copy it to.

It seems clear to me that the potentially unbounded cost of recursive calls is why most architectures are of the A type, and few, if any, are of the B type. Since the A type is computationally cheaper, it makes sense that early programmers, needing to conserve resources, would adopt a style that makes the most use of the A type architecture. So I agree that iteration is more popular because of history. I disagree that it was an *accident* of history. It is inevitable that non-TCO recursion will always be more expensive than iteration on the machine level (unless QM somehow gives us a device where copying is free). If we order computational mechanisms by how many micro-ops they require, given any reasonable hardware design, then I think it is demonstrated that iteration is more fundamental.

Now, this is not at all to say that iteration is *better*. Merely that abstraction has a cost, and that cost ought to be recognized and given due consideration (and it is, which is why we have things like TCO, and why FPers consciously try to enable it).
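A minimal sketch of the cost being discussed (my own example): the naive recursive sum keeps a stack frame per call because work remains after the call returns, while the accumulator version is a tail call that TCO turns into a plain loop.

    (* Not a tail call: the addition happens after the recursive call returns,
       so each call keeps a frame alive. *)
    let rec sum_naive n =
      if n = 0 then 0
      else n + sum_naive (n - 1)

    (* Tail call: all the state lives in the accumulator, so with TCO this
       runs in constant space, just like the equivalent loop. *)
    let sum_tail n =
      let rec go n acc =
        if n = 0 then acc
        else go (n - 1) (acc + n)
      in
      go n 0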

An example from daily life

Cooking and knitting are two human activities which might be considered to be somewhat distant from computer programming. I don't have any knitting patterns handy, but a quick look through one of my favourite cookbooks provided the following example, which I've edited slightly:

...
3:
   Stir the batter well.
   Heat the remaining oil
      and add 2 tablespoons of batter to the pan,
      swirling it to form a small round pancake.
   Cook the pancake for 30 seconds
   Place 2 pieces of pork,
            1 tablespoon of prawn meat,
            1 tablespoon spring onion
            and 1 tablespoon bean sprouts
     in the center of the pancake.
   Cover the pan and cook for 2 minutes
   Place the pancake on a platter
   Repeat with the remaining ingredients.

4:
   Place each cooked pancake 
      inside a lettuce leaf and top with 2 mint leaves.
      Fold the lettuce to form a parcel
      Serve with the Dipping Sauce.

5:
   To make the Dipping Sauce:
      Combine the fish sauce, lime juice, chili and sugar in a bowl
      Whisk until well-blended.

Step 3 seems to me to be clearly a description of an "iterative" algorithm making use of a recursive tail call:

   (Repeat (remaining ingredients))

while step 4 is clearly using a higher-order function. It's interesting as well that step 5 needs to be executed prior to the completion of step 4.

Since this is apparently a functional cookbook, the map function in step 4 (which constructs each served-object from a pancake and a lettuce leaf) is a purely functional map, not dependent on any ordering of pancakes in the platter. Evidently, the procedure could be deforested if a second chef were available to perform step 4 in parallel with step 3.
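For fun, here is a rough OCaml transcription of that reading (the types and names are my own invention, obviously not from the cookbook): step 3 as a tail-recursive loop over the remaining batter, and step 4 as a map over the platter.

    type pancake = Pancake of int          (* the nth pancake off the pan *)
    type parcel = Parcel of pancake        (* a pancake wrapped in lettuce *)

    let cook portion = Pancake portion
    let wrap_in_lettuce p = Parcel p

    (* Step 3: "repeat with the remaining ingredients" as a tail call. *)
    let rec cook_all platter = function
      | [] -> platter                      (* base case: the batter is used up *)
      | portion :: remaining -> cook_all (cook portion :: platter) remaining

    (* Step 4: a map over the platter, independent of the order of the pancakes. *)
    let serve platter = List.map wrap_in_lettuce platter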

:)

What about two cakes?

I'd say that in the case of making two cakes, the cake function calls itself.

I learned how to knit a few weeks back, it sure looks recursive to me. You need the result of the last step as one of the inputs for this step.

Hm, upon further thought, life in general seems more like continuation passing style.

Does CPS count as iterative or recursive? Or neither?
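For reference, here is a tiny sketch of what CPS looks like (my own example, using factorial): every call is a tail call, so it runs like iteration, but the continuation carries all the pending work, which feels recursive.

    (* Factorial in continuation-passing style. *)
    let rec fact_cps n k =
      if n = 0 then k 1
      else fact_cps (n - 1) (fun r -> k (n * r))

    (* fact_cps 5 (fun r -> r) evaluates to 120 *)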

cookbook algorithm style

Step 3 seems to me to be clearly a description of an "iterative" algorithm making use of a recursive tail call:

But with no base case... Alternatively, you could read it as a "foreach" loop where the loop header comes at the end.

    {
        cook the pancake;
        assemble the pancake;
        put the pancake on the platter;
    } foreach (pancake in batter)

Several of the papers from the Natural Programming Project at CMU talk about how non-programmers tend to express loops. They're often "distributive" (eg foreach- or map-style) rather than explicitly counted out, and people often put the loop body first, and then explain the looping rules at the end. ("Lather, rinse, repeat", not "Do this twice: Lather, rinse".)

'Intuitive' is a stick...

...that you use to beat people. If you don't like someone's approach to a problem you say something like "I don't find that approach intuitive". It's considered more polite than "your way is crap" but it still carries a similar kind of force.

...and in the absence of supporting evidence...

arguments based on "intuitiveness" also carry a similar amount of useful content as "your way is crap".

In the context of UI design, usability, etc., the term gets used to refer to the ease with which a new user can figure out how to use a device; but that notion of intuitiveness is generally supported by user trials and other objective evidence.

In most other contexts, "it's not intuitive" means little more than "I don't like it".

Intuitive is a predicate with

Intuitive is a predicate with an unknown implementation.

Iteration and Recursion

I recently started learning OCaml. My primary background is from Java, PHP and Python. I wouldn't say that I have a strong mathematical background, just the standard mathematical stuff you pick up while doing a computer science degree, i.e. Discrete mathematics, basic calculus and linear algebra.

When I read about iteration versus recursion here at LtU, I used to agree with the "Recursion is not intuitive" stance. After about a month of OCaml, this has changed.

To learn OCaml I implemented a parser which reads text with a reStructuredText flavour to it. This text is turned into a simple AST which is then converted to HTML. What I experienced while doing this was that I never needed iteration in the form of while or for loops. In fact, I first tried to use them (because I knew them well), but found that they complicated matters.

What I realized was that there are basically two types of iteration. There is the foreach/forall style of iteration which just iterates over a collection of things, and then there is the sort of iteration that has many conditions that can affect the next step in the iteration. This latter form of iteration fits very nicely with the use of pattern matching and recursion. As I started getting used to this way of thinking (which happened over the course of two evenings), I found that the code became surprisingly easy to write. The divide-and-conquer method that it naturally leads to is very powerful.

Of course, I still use foreach/forall, but I use them all in the form of List.map, List.filter, List.iter etc. which are more powerful IMO because they take some of the drudgery out of it.
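To illustrate the distinction with a small sketch (the token type below is hypothetical, not my actual parser): the foreach/forall style is a uniform pass expressed with List.map, while the condition-driven style uses pattern matching to decide what the next step is.

    type token = Heading of string | Text of string | Blank

    (* foreach/forall style: the same operation applied uniformly to every element. *)
    let shout_headings tokens =
      List.map
        (function Heading h -> Heading (String.uppercase_ascii h) | t -> t)
        tokens

    (* Condition-driven style: pattern matching decides how the step proceeds,
       e.g. consecutive Text lines are merged into one paragraph. *)
    let rec paragraphs = function
      | [] -> []
      | Blank :: rest -> paragraphs rest
      | Heading h :: rest -> ("<h1>" ^ h ^ "</h1>") :: paragraphs rest
      | Text t :: Text u :: rest -> paragraphs (Text (t ^ " " ^ u) :: rest)
      | Text t :: rest -> ("<p>" ^ t ^ "</p>") :: paragraphs rest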

Of course, I am not in any way new to programming. I've been writing imperative programs ever since I first started playing with Python some six years ago. I already had a good understanding of recursion when I started with OCaml. But IMO recursion is a basic technique that all programmers must understand. Once you understand recursion, replacing iteration with recursion isn't that big a step.

Recursion was certainly a lot more intuitive for the task I tried to solve, despite my lack of experience with this style of programming. As a programmer, learning more techniques gives you more tools to choose from, which means that you can choose the approach which is most intuitive given the problem. Whether it is easy or hard to understand the technique the first time you encounter it really doesn't matter.

Of course, there are things that will never be intuitive. I don't think concurrent programming in Java is ever going to be intuitive for example.

There is only one intuitive interface

and that's the nipple. Everything else is learned. Generally "intuitive" in programming language discussions really means "similar to something I already know". Which is how, for example, functional programming can be both intuitive and nonintuitive: if you're coming from a math background and a functional language is the first language you learn, then it's intuitive, being similar to mathematical notation. But if you're coming at it from a background of already knowing one or more of the more popular languages, it's not intuitive, because it's not similar to the languages you already know. If you're unfamiliar with both mathematics and programming, then just about any programming language (and pretty much all of mathematics) will be non-intuitive (just ask your average English major).

From where I'm sitting, intui

From where I'm sitting, intuitive means something that I have intuitions about - that I can work with without having to consciously reason my way through every last step. That does tend to come from being yet another instance of something I already know though, otherwise it's time for a lot of rote-learning-by-doing...

Actually, a few more things

Actually, a few more things are intuitive. The most relevant to programming languages is probably natural language. Most neurolinguists these days believe that we are all born with some knowledge about grammar. For example, parse trees for natural language tend to slope from left to right. And it would also seem that infix notation is quite intuitive in some contexts.

Tangent

As a new father, and a programmer who's oh-so-familiar with the canard about the nipple being the only intuitive interface, I have to say that (from my observations) the nipple is actually *not* particularly intuitive. My baby does 'naturally' try to suck on my wife's breast, of course. But he also tries to suck on my shoulder, stomach, arm, and face.

So, with breasts as with programming languages, my personal impression is that the notion of "intuition" says more about the analyzer than the object of analysis (just like you said yourself). Babies are programmed to suck, VB programmers are programmed to prefer iteration. Maybe intuition is a cultural property rather than an individual (human) property.

Intuition sucks?

Couldn't resist.

Cheers, Brandon J. Van Every
(cruise (director (of SeaFunc) '(Seattle Functional Programmers)))

Instinctive and Intuitive

You are mixing up two different concepts: intuitive and instinctive.

A baby's instinct is to suck.

Because of the instinct, the nipple is somewhat intuitive - when placed at the nipple, the baby will get the right result (food). However, the nipple is not intuitive enough that the baby will find it without assistance. Your wife has to place the baby's mouth at the nipple for it to work.

So the nipple is not instinctive at all, and only partially intuitive.

surely the meaning of intuiti

surely the meaning of intuitive is intuitive? :o)

(so it's either true or undecidable?)

A simple and cynical definition...

"Intuitive" can safely be taken to mean "unlikely to confuse your dimmest coworker". If you're lucky enough that your co-workers are bright code warriors, the range of "intuitive" constructs may be very large, encompassing recursion, higher-order functions, polymorphic and dependent types, combinators, monads, continuations, and all those other fine toys so popular on LtU. If you're stuck in the trenches of average enterprise software development, intuitive goes about as far a imperative constructs, simple inheritance, declarative query languages, simple lifecycle-management constructs, design-by-contract, and most of the popular design patterns. Maybe some simple threading, if you're lucky. Go much beyond that and you're asking for heartache. Programming is a team sport, and it doesn't much matter if you know all of the cool constructs, if no-one else on working on your codebase does.

"This is a for-loop"

I remember having to explain the meaning of a for-loop to a colleague (who fancied himself my superior) who disputed the correctness of my code based on his inability to understand the basic meaning of the construct.

The solution is not to give up the for-loop, but to introduce new abstractions, to document code very well so that others have no justifiable complaint that clarity and elegance are "cryptic", and to explain the code, and its benefits, to those who have to work with it but don't understand it. If you keep doing this, you will come to a situation where their headaches can be cured by the use of an "advanced" construct that you have introduced, and you can show them this.

Yes, but with qualifications

This is true, unless you get so far ahead of the curve that recruiting becomes an issue. In my most recent project, I used Java dynamic proxies for the first time in a production application. I took the risk, because my current team is good enough, qualified Java programmers are easily available in our area, and the dynamic proxy construct only requires about one "aha!" step for a qualified Java programmer to understand. If it were just me and my current team, considerably more difficult constructs would have been valuable, but it would have been unprofessional to use them given the current state of the recruiting pool.

I am sure

We could come up with a range of fanciful user experiences that we would all agree were non-intuitive; however, we would argue over which actual user experiences were intuitive. My explanation for this is that some people lack a strong sense of intuition.

Good catch. I meant specific

Good catch.
I meant specifically "The Psychology of Everyday Things"

Papers

Although I can't read them myself, I've just seen the titles of two papers on CiteULike, and they seem highly relevant to your question:

The greedy trap and learning from mistakes, and

Viewpoint: Intuitive equals familiar

Re: Papers

Perhaps Google can find you readable copies, somewhere?

Ah! Attempts at actual research, with Raskin an author, to boot! I'll read them, thank you.

I always think research is good (e.g. I like that there are people smart enough to be pursuing PLT), although I never have the fortitude to do any myself, and anything touching on the realm of psychology seems to me to require large doses of salt alongside. I don't mean that to be denigrating; I just mean that I feel like more is subjective than we think.

What is Intuitive?

There are two jokes about computer novices, illustrating what intuition is.

An old person is told to "move the mouse to that button in that window". The old person picks up the mouse, takes it up to the screen, and places it over the screen area where the button is displayed. This interpretation must be completely intuitive to the old person's sense of "move to" - it has always meant that to the old person for 70 years.

An office clerk receives a new computer at work. It has a new front panel he has never seen before: a button which, when pushed, causes a retractable tray to come out; the tray has a circular depression and a smaller circular hole. The clerk intuits it to be a cup holder; after all, the clerk has never seen a CD player, but has always seen retractable trays with circular depressions and holes in cars, and these have invariably been cup holders.

According to my observation, intuition is expectation on the grounds of an analogy with past experience and education. When a direct analogy is not found, an indirect one, usually very weird, is drawn; the neural network is very creative and strange in drawing analogies, sound or flawed. Wherever intuition is possible, paradox is also possible.

Intuition is derived from educational background. Suppose a programming language is to be designed for expressing the act of performing tasks R, S, T in that sequence, possibly with dataflow between consecutive stages. A European or American may suggest

R | S | T

as an intuitive way; after all, his/her mother tongue is written from left to right. An Arab may suggest

T . S . R

as an intuitive way; after all, his/her mother tongue is written from right to left. This competition between two equally intuitive intuitions can easily escalate to a war of languages, if not a war of religions.
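Both orderings exist in real languages; here is a tiny OCaml sketch (my own illustration) of the same three-stage dataflow written in each direction.

    let r x = x + 1
    let s x = x * 2
    let t x = x - 3

    (* "R | S | T": the data flows left to right through the pipeline operator. *)
    let left_to_right x = x |> r |> s |> t

    (* "T . S . R": the same computation written right to left, composition style. *)
    let right_to_left x = t (s (r x))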

When a concept is asserted to be intuitive, it is equivalently confessed to be nothing new. This is a two-edged sword. If a concept or problem is transformable to old ones, it saves us effort of learning or solving. But if all new concepts are confined to the known old ones, how is advance possible? "New" is by definition hitherto unknown and unlike the old, and must be at odds with intuition. Seen in this light, intuitiveness is probably undesirable for original research, and is probably a sign of stagnation for industrial practice if overemphasized.

Many new, alternative concepts are rejected because they are "counter-intuitive". It may be an understandable, benign desire to save and reuse past knowledge. But thinking deeper, if education encourages you to reject those alternatives you have never been taught, has education become brainwashing? What is the difference, and where do you draw the line?

Is intuition your master, or are you the master of your intuition? Do you let intuition make all decisions for you, or do you keep modifying your intuition to accommodate new learning?

From future to past

Intuition is about what we know by doing, and not what we know by knowing. But what we know by doing is often guided by well-known preconceptions. Responding to the discussion about how intuitive iteration is vs. recursion: I think the underlying preconception is the idea of a flow of time from past to future, and that an algorithm has to start with a setting and transform it in tiny little steps, advancing in time to reach a solution. Recursion is more magic/declarative in that it reorganizes time as if it flows from future to past, starting with the form of a solution and running through a self-referential involution to reach its origin. While the process of computation advances in time in the same preconceptual setting, recursion seems to advance in the opposite direction.

Nevertheless, we are often able to drop our preconceptions if we move into an operational mode. Then it becomes hard to explain what is particularly hard about the use of car and cdr.

Kay

Mathematical background & recursion

I have a fairly mathematical background (information theory), and only got into programming much after my pencil-and-paper days. So in a sense, I should have been primed for recursion. But it didn't turn out that way.

I prefer to avoid recursion unless the non-recursive solution is far more painful (or nonexistent!). The "far more" is a judgement issue of course. This is not for performance reasons. It is just a discomfort with too much self-reference, I suppose. And I personally grok the Turing model of computation far better than the lambda calculus model.

Based on my years of experience interacting with a lot of programmers, most folks just aren't very comfortable with recursion.

Unproven Hunch: If 100 totally fresh, very smart kids not exposed to either recursion or iteration were trained first in recursion and then taught iteration, I believe 70-80 of them would still prefer iteration.

Partly this could be a recursive(!) issue itself: since they know most people would favor iteration over recursion, they adjust their style to favor iteration themselves.

Whatever the reason, in real world commercial software, recursion makes only a rare appearance. Yes, a good functional programmer would make recursion (and tail recursion) appear everywhere, but I don't consider that a win in terms of comprehensibility, especially if my hunch above is correct.

Mathematics and recursion

I have a fairly mathematical background (I am a mathematician), though for me programming and programming languages are just a vague side interest that I pay casual attention to. With regard to recursion as a concept independent of programming: I don't think you can say anything meaningful about how intuitive it is without some reference to the problem at hand. Whether the recursive approach is the intuitive one very much depends on what you want to do with it. For some uses it is easiest to think of something in terms of its recursive structure, while at other times thinking of the same thing expressed non-recursively is the most natural.

In what programming I do things generally work the same way. Whether I write something recursively or iteratively is really all about how I want to use it. Neither approach seems any more intuitive without some understanding of exactly how whatever it is is going to fit in with everything else.

It's all to easy to generaliz

It's all too easy to generalize from your own experience. Just to offer a counterexample: I don't like math and avoid it as much as I can. I learned programming in a BASIC that didn't even allow recursion. But today I always use recursion (or fold). It's been years since I used iteration for anything.

Recursion is my flavor.

I agree with Li.

I'm a self-taught programmer, but I started using recursion and pure functions in Python, since that felt simpler than iteration and mutation.

Luckily for me, someone saw my Python code and suggested that I learn Haskell.

Here's a possible definition

Something is 'intuitively' designed if the orthogonality between what is necessary to perform similar actions is the minimum amount of orthogonality necessary to recognize the distinction between the two actions.

In other words: knowing how to perform one action automatically clues me in to how to perform similar actions. This is why the 'what you are familiar with' definition sounds so sexy. The fact that how orthogonal two operations are in your mind might not correlate with how orthogonal they are in another person's mind might lead one to think 'intuitive' is totally subjective.

To break it down again -- that just means know your audience ;) In the case of HCI and usability studies, we can make certain observations about the general population and use them to determine what would be 'intuitive.' Obviously a GUI for a music player that requires algebra knowledge wouldn't be very usable, because the average computer user has no mental connection between those two areas. But designing a programming language to have similar syntax to C might be a good idea if you're targeting C programmers (notice that in that case what is intuitive depends on the audience; for the general population, C syntax may not be a great idea).

PLT needs a Bob Geldof

know your audience

Jawohl. For folks who scratch their heads and wonder, "Gee, we have all this good stuff (Erlang, Haskell, O'Caml, etc.), why aren't more people using it?" part of the answer is that marketing is needed: a) being able to communicate how the new thing will help solve people's actual problems, and b) being good at teaching. Java certainly seemed to do that for C/++ folks.

The Answer

What is Intuitive?

That should be intuitively obvious.

And if it isn't, then you are obviously from another planet, clearly a moron, apparently a heathen, or evidently of weak moral character.

Just look deep inside yourself, and you will find the Truth.

Re: The Truth is Inside

Hm, I'm not sure what "All I really want is to quit my day job and go home and drink hot chocolate and play Half Life 2 all day" has to do with PLT. I'll meditate on it further...

Studying the Language and Structure in Non-Programmers’ Solution

An old study, but I found it interesting nonetheless: Studying the Language and Structure in Non-Programmers’ Solutions to Programming Problems.

Despite its age, this seemed like the most relevant thread since it's discussing intuitiveness. This paper was mentioned on LtU in a quite old thread.

As for results, it's interesting that most people naturally gravitated towards event-based programming styles, and an imperative style was relatively uncommon by comparison. Many other interesting characteristics showed up, including how people reason temporally without state variables, set construction and data structure operations.

yay cmu

I worked for Dr. Myers while an undergrad. I sucked at it. He's kept up the good faith all along. Yay! Wish I'd done a better job back then.

Also see Crista's