Computational Thinking

Computational thinking is a fundamental skill for everyone, not just for computer scientists. To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability. Just as the printing press facilitated the spread of the three Rs, what is appropriately incestuous about this vision is that computing and computers facilitate the spread of computational thinking.
Computational thinking involves solving problems, designing systems, and understanding human behavior, by drawing on the concepts fundamental to computer science. Computational thinking includes a range of mental tools that reflect the breadth of the field of computer science.

from Jeannette M. Wing's Computational Thinking Manifesto

The Center for Computational Thinking at CMU has more information about the subject.

We talked briefly about Computational Thinking back in 2006. Recently I listened to Jon Udell's interview with Jeannette Wing and realized that it's time to bring this subject up again for discussion.

For the record....

Ages ago (around 20 years ago...) I started writing a collection of extended puzzle stories under the title of "algorithmic thinking". The puzzles revolved around everyday situations (selling theater tickets, balancing boats, planning bus routes), and guided the reader to relevant algorithmic issues (finding robust distributed solutions, designing protocols, etc.). Based on what I have read so far, without reading the work on computational thinking in depth, it seems to me that what people are now approaching is somewhat similar to what I did back then.

I never completed the book (I was barely a teenager; I had other things to do), but I may be able to find copies of the chapters I did write and share them (as far as I recall, all of them are in Hebrew, though...)

Do language barriers matter in the age of Web 2.0?

It would be great even with the language barrier. We can always set up a Wiki and find people willing to translate the chapters. I can do some of the technical work and set things up, but Hebrew is a complete mystery to me.

We can try...

Well, if someone volunteers to do the translation, I am willing to try to dig up what I wrote. Some of the stuff was written on very obscure Hebrew word processors, so just handling the file formats is going to be an issue. As far as I remember some chapters were complete, whereas others are only stubs, so maybe people here will want to improve them...

Programming vs. Computational Thinking

In the interview and the manifesto I found it interesting to see the distinction between programming and computational thinking, and the aim of teaching the latter without using programming-specific concepts. In essence I agree with their arguments, as computational thinking is more general than programming, but I feel that programming is the tool that best conveys computational thinking, and it will be hard to find a good substitute.

Obviously we need a good PL to teach computational thinking, one that lets you keep the abstractions separate and go down (or up) only when you want. The usual suspects (e.g. PLT Scheme, Squeak) are the best tools we have today, but in which ways could we do better? What interests me is how we can mix such ease of use with static (or slider ;) typing. It seems that the approach from theorem-proving tools (e.g. Agda, Coq) or Epigram could be simplified (e.g. reduce technical jargon) and be very useful for this endeavor. Does anybody know of work in this area (i.e. type-based teaching tools for people unfamiliar with set theory, logic, and such)?
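
To make the "slider" idea concrete, here is a minimal sketch of my own, using Python's optional annotations as a stand-in for a real gradually typed teaching language (the function names are made up for illustration):

    # A beginner starts with no annotations at all...
    def area(w, h):
        return w * h

    # ...and later "slides" toward static checking by annotating,
    # one function at a time, without rewriting the program.
    def area_typed(w: float, h: float) -> float:
        return w * h

    print(area(3, 4))            # 12, checked only at run time
    print(area_typed(3.0, 4.0))  # 12.0, and a checker like mypy can now verify callers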

Understanding algorithms and processes

Understanding algorithms and processes is indeed something that any knowledgeable person should be able to do. However, I think that Dr. Wing is conflating the technology with what needs to be taught. It's not clear to me that this kind of knowledge is necessarily better transferred via computers, as many of the activities that we carry out each day are procedural without being placed in a computational setting. But then, I am an old fuddy-duddy. I even decry the disappearance of one of the first algorithmic activities from the classroom - long division from fourth grade studies - something that the more modern deem unnecessary.

Long division

I agree completely. Removing long division from the classroom misses its real educational value.
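
For what it's worth, here is the schoolbook procedure written out as code (a sketch of my own, nothing canonical), which I think shows exactly why long division is a first algorithm: a loop, running state (the remainder), and digit-by-digit output.

    def long_division(dividend, divisor):
        # Schoolbook long division: bring down one digit at a time.
        quotient_digits = []
        remainder = 0
        for digit in str(dividend):
            remainder = remainder * 10 + int(digit)  # "bring down" the next digit
            quotient_digits.append(remainder // divisor)
            remainder = remainder % divisor          # carried into the next step
        quotient = int("".join(map(str, quotient_digits)))
        return quotient, remainder

    print(long_division(7319, 4))  # (1829, 3), since 7319 = 4 * 1829 + 3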

The brain does not compute, it matches

I don't think the brain has mechanisms for computation. All it does is pattern matching. That's why we can easily compute 1000000 + 1000000 but not 123456789 + 987654321, for example.

Understood, but not agreed

(a) What you wrote is interesting and makes me think that we have a layer cake of computation and pattern matching. One learns to recognize the characters 0..9 via pattern matching. Then one learns to compute that 3 oranges + 2 oranges = 5 oranges. Then one learns to pattern match, e.g. by memorizing multiplication tables. So then one can compute something more advanced, but not pattern match it until much practice has happened. Etc.

(b) There can be different approaches to computation. A calculator might, from an external perspective, easily calculate either of the sums you wrote, but a human given enough time can as well; it just isn't as fast in each case. That doesn't prove there is no computation. In fact, to me it proves computation, because we are failing to pattern match the latter in some sense.

(c) The fact that, given enough time, we can compute the answer to 123456789 + 987654321 means that we can do computation.

?

That's why we can easily

That's why we can easily compute 1000000 + 1000000 but not 123456789 + 987654321, for example.

I think that's just an artifact of the base 10 number system we use. Given another base, 123456789 + 987654321 might be just as easy for us to add as 1000000 + 1000000 in base 10.

The references escape me,

The references escape me, but there has been a good deal of experimental and modeling research in the cognitive science community into this concrete question of adding big, hairy numbers. The essential point probably does not veer far from what you do in practice: pick fence posts (big number, really big number, etc.) and put those together. I think James Anderson has written a bit about it as part of his Ersatz project.

123456789 + 987654321 is not

123456789 + 987654321 is not much harder than 1000000 + 1000000. Even if it were, that wouldn't mean that the brain does not compute. It would just mean that the brain has different primitive operations.

Not quite your point...

... but we can compute 123456789 + 987654321 quite easily. All you have to do is notice that each pair of digits sums to 10, so the answer has to be 10 * 111111111 = 1111111110.

But it is not obvious whether that supports or contradicts your point. If you take sequences of pattern matching, where we can choose what to do next based on the previous pattern, then don't we have a simple model of computation?
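
A quick check of the digit-pair observation, in Python for concreteness:

    # Each column of 123456789 + 987654321 sums to 10, so the total is
    # 10 in every one of the nine columns: 10 * 111111111.
    assert 123456789 + 987654321 == 10 * 111111111 == 1111111110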

The ICFP Contest got there first...

...though I doubt Endo's going to spring fully-formed from our brains any time soon!

Repeatedly matching and rewriting is a form of computation.
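
To make that concrete, here is a toy string rewriter of my own (in the spirit of Markov algorithms, not Endo's actual DNA language) that adds unary numbers purely by matching and rewriting:

    def rewrite(s, rules):
        # Apply the first rule whose pattern occurs, at its leftmost
        # match, and restart; stop when no rule applies.
        changed = True
        while changed:
            changed = False
            for pattern, replacement in rules:
                if pattern in s:
                    s = s.replace(pattern, replacement, 1)
                    changed = True
                    break
        return s

    rules = [("1+", "+1"),  # shuttle the '+' leftward past each mark
             ("+", "")]     # nothing left of '+': erase it
    print(rewrite("111+11", rules))  # "11111", i.e. 3 + 2 = 5

The machine knows nothing about arithmetic; addition falls out of repeated matching.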

Unifying theory

Does computational thinking have a theory? What about Dynamic Programming? This is an old subject that seems to make more and more sense all the time. And what about a dynamic programming language?
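
For readers who haven't met it, the core of dynamic programming fits in a few lines (a generic textbook sketch, nothing specific to this thread): solve each overlapping subproblem once and reuse the answer.

    from functools import lru_cache

    @lru_cache(maxsize=None)  # memoization: each n is computed once
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    print(fib(100))  # instant; without the cache this recursion is exponential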

CAS, Swarm, Artificial Life

What about complex adaptive systems? Swarm intelligence? Biological computation? These approaches are a huge part of computational "thinking" in the natural world. They should be studied by anyone interested in computation, especially beginners.

An example I first heard from Chris Langton: ask some young children to make a circle on the playground around a point. How do they "compute" the outline of the circle? Perhaps by holding hands and stepping back until their arms are fully stretched. That's a pretty good solution compared to using graphs, trig, or complex roots.
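
Here is that playground rule as a toy simulation (my own simplification of the idea, not Langton's: each child just steps along the line to the point until standing one arm's length away):

    import math, random

    ARM_SPAN = 1.0  # the only "global" each child knows, besides the point
    children = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(12)]

    def step_back(x, y):
        # Move toward or away from the point (the origin) until the
        # distance equals one arm's length.
        d = math.hypot(x, y) or 1e-9
        return (x / d * ARM_SPAN, y / d * ARM_SPAN)

    ring = [step_back(x, y) for (x, y) in children]
    # Every child now sits on the circle of radius ARM_SPAN:
    print(all(abs(math.hypot(x, y) - ARM_SPAN) < 1e-9 for (x, y) in ring))

No child computes the circle; it emerges from a local rule.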

With the coming "problem" of too many CPU cores and no way to program them, these alternative computational strategies will be even more important.

By all means!

By a unifying theory I was thinking of something that works for computations in any form, whether analog or digital, biological or mechanical. Dynamic programming actually has a history of doing that. In its early days it was applied mainly to control systems, and nowadays, judging from Wikipedia, it is used in the more familiar setting of computing with numbers.

In my thinking, at least, it is a fundamental law that helps derive various models of physicality, and thereby constrain computation to familiar semantics.

Edit, Clarification: Programming is often not so much about the program as about the algorithm. Algorithm design is an overlooked part of the programming puzzle, one that has been studied extensively.