LtU Forum

According to Allen Holub, programming is now a liberal art. When did this happen?

Allen Holub has written an article explaining quite clearly why neither programming nor computer science is a science. He also does an excellent job of explaining why software engineering is an oxymoron. So far he's preaching to the choir.



However, he concludes that if it isn't a science or engineering, then it must be a 'liberal art.' Apparently the work we do has more in common with creative writing than with physics, and consequently we should eliminate most of the mathematical aspects of undergraduate education in favour of English composition.



I find myself agreeing with many of his initial premises but violently disagreeing with his conclusions, especially the notion that: "programming has changed from the study and implementation of algorithms to the study and creation of complex documents."



In my niche, highly complex custom applications for large companies, programming is often as much about solving problems for and with large numbers of people as it is about sitting in a corner typing away. This leads me to believe that an undergraduate education should be focussed on:

  • teamwork
  • learning the sort of basic software development techniques outlined in books like Robert Glass's Facts & Fallacies of Software Engineering
  • discrete mathematics
  • algorithms
  • data-structures
  • building complex software
  • learning to comprehend and change large codebases built by other people with no documentation other than the code

In what ways would members of LtU change undergraduate curricula in what is now called computer science, if they had the power?

Reflection in Java: how are they getting on with it?

We seem to be discussing pretty specific issues lately, so I feel I can bring this one up.

For a PL in which dynamism is one of the selling points, Java has pretty bad support for reflection - the reification/introspection/intercession trio.

Reification is crippled by the fact that not all concepts have a representation accessible to programs - the representation of bytecode (literally an array of bytes) is a joke. Libraries can help here, though, and indeed do.

Introspection is limited for bytecode as well: a program cannot, in general, obtain a representation of the bytecode for the object at hand. That's a more serious limitation, although there are ways around it if you are willing to make some assumptions.

Intercession has another twist. On the positive side, it's possible to obtain classes from bytes (though not the other way around - funny, eh?); on the negative side, while introspection has structured representations for some concepts (like fields), intercession knows only byte arrays. Ugh.
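For contrast, here is a hedged Python sketch of what the same trio looks like in a language that reifies more of itself. Python is just a convenient stand-in here, and nothing in the sketch maps one-to-one onto Java's API:

    import dis

    def square(x):
        return x * x

    # Introspection: the bytecode of a live function is directly accessible,
    # both as raw bytes and as a structured, human-readable disassembly.
    raw = square.__code__.co_code   # a bytes object
    dis.dis(square)                 # structured view of the same bytecode

    # Intercession, in both directions: source text compiles to a code
    # object, and that code object can be executed - no detour through
    # opaque byte arrays.
    code = compile("y * 2", "<dynamic>", "eval")
    print(eval(code, {"y": 21}))    # prints 42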

I will skip my complaints about the representation of invocations, call stacks, and threads, as this post is already quite negative.

To drive this towards PL design: how did Java get there? Was it because in the early years there was little use for reflection? Why wasn't it fixed? Is it a scientific problem, an engineering one, or maybe a cultural one? How can newly designed PLs avoid such a fate?

PS: For some users, reflection and MOPs may seem very esoteric and impractical, but look at Java "enterprise" open source projects: most of them grew out of one attempt or another to overcome these deficiencies of the language. Unfortunately, there is a limit to what libraries can do for a PL.

Halting program density?

We know the density of primes tends towards 1/log(n) as n gets larger. Do we know anything about the density of programs which halt?

For example, take your favorite universal Turing machine. Generate all programs of size 1. Record the percentage of programs that halt in less than 1 millisecond (or 10,000 head movements, or some other metric). Run the remaining programs for 10 milliseconds each and record the percentage which halt. Continue for 100, 1000, etc. milliseconds. Then repeat the procedure for all programs of length 2, 3, 4, ..., N.

Do we know anything about the surface just described? Along the time axis, at least, it must be monotonically non-decreasing. Can we say anything more? Can we describe any aspect of it with numbers? This seems like it might be an easier task to undertake than the corresponding question of "What's the density of provable truths in arithmetic?" Thoughts?
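One cheap way to build intuition, short of a genuine universal Turing machine, is to run the experiment on a toy machine. The sketch below is exactly that kind of hedge: the instruction set and all names are invented for illustration, and a one-counter machine is nowhere near universal, so this shows the bookkeeping rather than the real surface. It enumerates every program of each size and tabulates the fraction that halt within increasing step budgets:

    import itertools

    OPS = "+-jh"   # increment, decrement, jump-to-start-if-nonzero, halt

    def runs_to_halt(prog, budget):
        """Return True if prog halts within `budget` steps."""
        counter, pc, steps = 0, 0, 0
        while steps < budget:
            if pc >= len(prog):          # falling off the end counts as halting
                return True
            op = prog[pc]
            if op == "+":
                counter += 1
            elif op == "-":
                counter = max(0, counter - 1)
            elif op == "j":
                pc = -1 if counter else pc   # loop back to start while nonzero
            elif op == "h":
                return True
            pc += 1
            steps += 1
        return False                     # did not halt within the budget

    for size in range(1, 6):
        for budget in (10, 100, 1000):
            programs = itertools.product(OPS, repeat=size)
            halted = sum(runs_to_halt(p, budget) for p in programs)
            total = len(OPS) ** size
            print(f"size={size} budget={budget}: {halted}/{total} halted")

For fixed size, the halting fraction can only grow as the budget grows, which is the monotonicity along the time axis mentioned above.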

The fate of reduce() in Python 3000

Link

Guido van Rossum comments on the elimination of map(), filter(), reduce() and lambda from Python 3000:

About 12 years ago, Python acquired lambda, reduce(), filter() and map(), courtesy of (I believe) a Lisp hacker who missed them and submitted working patches. But, despite the PR value, I think these features should be cut from Python 3000.

He believes map and filter are unnecessary because there are list comprehensions. But what about reduce()? It seems Guido doesn't believe in folds:

So now reduce(). This is actually the one I've always hated most, because, apart from a few examples involving + or *, almost every time I see a reduce() call with a non-trivial function argument, I need to grab pen and paper to diagram what's actually being fed into that function before I understand what the reduce() is supposed to do. So in my mind, the applicability of reduce() is pretty much limited to associative operators, and in all other cases it's better to write out the accumulation loop explicitly.

And then, with these functions gone, there is no use for lambdas, he argues.

I find it positive, in general, when designers eliminate features from a programming language if they are rarely used, but maybe he is being too harsh on reduce(); it may be my predilection for functional programming getting in the way, though.
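To make his complaint concrete, here is the kind of contrast Guido is drawing, as a hedged sketch with invented data:

    from functools import reduce   # reduce() moved to functools in Python 3
    import operator

    data = [3, 1, 4, 1, 5]

    # The benign case: an associative operator, easy to read at a glance.
    total = reduce(operator.add, data, 0)    # 14

    # A non-trivial function argument - the pen-and-paper case he describes.
    # This folds each element into a running (minimum, maximum) pair.
    lo, hi = reduce(lambda acc, x: (min(acc[0], x), max(acc[1], x)),
                    data, (data[0], data[0]))

    # The explicit accumulation loop he recommends writing instead:
    lo, hi = data[0], data[0]
    for x in data:
        lo, hi = min(lo, x), max(hi, x)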

via PLNews

2nd CfP: Structures and Deductions

The domain is proof theory, the theme is "Eliminate Bureaucracy", the LtU angle is that if we can eliminate bureaucracy from proof theory, then we open the floodgates to applications of proof theory to computer science. That's the theory anyway...

Check out:

  1. The workshop homepage;
  2. The SD05 wiki page at Greg Restall's wiki;
  3. Fancy going to Lisbon in July? Slashdot journal thread...
  4. LtU node #551 wherein Greg and I chatted about the workshop...

Grady Booch's keynote on software complexity at AOSD

This quote:

overall flexibility increases by having fewer components but greater composability.

made me think of HOFs.

A couple more teaser quotes:

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work.

...

It's better to be simple than to handle every possible case.

Embedding one language in another

Assembly code can be embedded in C code. Are there examples of a higher-level programming language that can embed another high-level programming language? HTML and JavaScript do not count ;-)
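One established reading of "embedding" is the embedded-DSL style familiar from Lisp macros and Haskell combinator libraries: programs in the guest language are ordinary values of the host language, built with the host's own syntax. A minimal Python sketch of the idea (all names here are invented for illustration):

    class Expr:
        """A tiny guest expression language embedded in Python."""
        def __init__(self, node):
            self.node = node
        def __add__(self, other):
            return Expr(("add", self.node, lift(other).node))
        def __mul__(self, other):
            return Expr(("mul", self.node, lift(other).node))

    def lift(v):
        return v if isinstance(v, Expr) else Expr(("const", v))

    def var(name):
        return Expr(("var", name))

    def evaluate(node, env):
        tag = node[0]
        if tag == "const":
            return node[1]
        if tag == "var":
            return env[node[1]]
        lhs, rhs = evaluate(node[1], env), evaluate(node[2], env)
        return lhs + rhs if tag == "add" else lhs * rhs

    # A guest-language program, written entirely in host-language syntax:
    e = var("x") * var("x") + 1
    print(evaluate(e.node, {"x": 3}))   # prints 10

The guest language gets a genuine abstract syntax tree that can be interpreted, compiled, or analyzed, while borrowing the host's parser wholesale - which is what makes this style of embedding cheap.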

Xactium -- "lightweight language engineering" ??!?

Xactium

There's a lot of marketing wrapped around their technical ideas, but this appears to be a model-driven architecture (MDA) tool with domain-specific language capabilities. Not sure what they mean by "lightweight language engineering" -- macros are too heavy? -- and I'm not sure I want to install their preview release.

MDA and DSLs have always seemed like competing approaches, although you might have a specialized language for defining the models themselves. Both share the notion of defining the "problem" at a relatively high level and then magically creating the end solution, but I suppose that description is vague enough to apply to every tool since FORTRAN was first implemented.

Exploiting parser ambiguity

First I'll state my (admittedly fledgling) understanding of how parser ambiguity is avoided in normal languages, and then I'll ask how it might be exploited instead of avoided in some "abnormal" languages.

My understanding is that it is normally important for the concrete syntax of a language to parse unambiguously; e.g., precedence must be specified so that an expression like

a + b * c

will be parsed to give the same parse tree as either

a + (b * c)

or

(a + b) * c

(For this example, probably the former, but that is beside the point.)

In giving an abstract syntax, it seems that you can usually assume the concrete syntax is designed to parse unambiguously, i.e. the abstract syntax defines the syntax of parse trees rather than the syntax of the lexeme sequences that are parsed into those trees.

So, my question is, what if you want to design a language in which

a op1 b op2 c

means

(a op1 b) op3 (b op2 c)

i.e. the ambiguity is resolved by inserting an implicit operator (op3) and repeating the symbol in common between both sub-expressions (b). E.g. the expression

a < b < c

might (as it normally does in math) mean

a < b && b < c

So my question is, how does one present syntaxes (concrete and abstract) and semantics for such a language? It seems like traditional methods can't accommodate this way of resolving parsing ambiguity. Or maybe they can and I'm missing something; that would be great.
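For what it's worth, at least one mainstream grammar does exactly this: Python parses a < b < c as a single comparison chain and defines it to mean a < b and b < c, with b evaluated only once. The usual presentation keeps the concrete grammar unambiguous by parsing the whole chain as one node, then introduces the implicit operator in a desugaring step from concrete to abstract syntax. A hedged sketch of that step, with invented node shapes:

    # Desugar a parsed chain like "a < b <= c" into a conjunction by
    # pairing adjacent operands. A real implementation would also bind
    # each shared operand to a temporary so it is evaluated only once.

    def desugar_chain(operands, operators):
        """('a','b','c'), ('<','<=')  ->  ('and', ('<','a','b'), ('<=','b','c'))"""
        pairs = [(op, left, right)
                 for op, left, right in zip(operators, operands, operands[1:])]
        tree = pairs[0]
        for comparison in pairs[1:]:
            tree = ("and", tree, comparison)
        return tree

    print(desugar_chain(("a", "b", "c"), ("<", "<=")))
    # ('and', ('<', 'a', 'b'), ('<=', 'b', 'c'))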

I hope this post's topic is appropriate to this site. I also hope my question isn't too vague or malformed: I admit I haven't thought about it much yet, but I thought some early responses might point me in the right direction, so I don't go reinventing the wheel.

Phil Wadler's blog

Phil Wadler has a blog. Whose blog could be more interesting for LtU readers? Read it at Wadler's blog.
