For some inexplicable reason, COBOL doesn't get much love from LtU. But COBOL turns 50 sometime this year, and we owe a tip of the hat to this venerable language behind so many large institutions.

The Guardian understands.

According to Michael Coughlan, a lecturer at the University of Limerick, one of Cobol's perceived drawbacks is its verbosity. But he reckons that's also one of its strengths. "It's not just a write-only language," he says. "You can come back years later and understand the code."

This opinion is shared by Mike Gilpin, of Forrester, who is an ex-Cobol programmer. "Cobol is one of the few languages written in the last 50 years that's readable and understandable," he says. And he's scathing about the readability of more fashionable languages, such as PHP and Java: "Modern programming languages are ridiculously hard to understand."

There you have it! More readable than PHP and Java. A ringing endorsement for the next half century.

A Computer-Generated Proof that P=NP

Doron Zeilberger announced yesterday that he has proven that P=NP.

Using 3000 hours of CPU time on a CRAY machine, we settle the notorious P vs. NP problem in the affirmative, by presenting a “polynomial” time algorithm for the NP-complete subset sum problem.

The paper is available here and his 98th Opinion is offered as commentary.
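The quotation marks around "polynomial" are the tell. For comparison (this is the textbook algorithm, not necessarily the one in the paper), the classic dynamic-programming solution to subset sum is only pseudo-polynomial: its O(n·T) running time is polynomial in the numeric value of the target T but exponential in T's bit length, which is why it says nothing about P vs. NP.

```python
def subset_sum(nums, target):
    """Pseudo-polynomial DP for subset sum: O(len(nums) * target) time.

    Polynomial in the *value* of target, but exponential in the number
    of bits needed to write target down.
    """
    reachable = {0}  # sums attainable using some subset of the numbers seen so far
    for n in nums:
        reachable |= {s + n for s in reachable if s + n <= target}
    return target in reachable
```

For example, `subset_sum([3, 34, 4, 12, 5, 2], 9)` is true (3 + 4 + 2), while `subset_sum([1, 2, 5], 4)` is false.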

A Tiny Computer

A Tiny Computer. An unpublished memo by Chuck Thacker, Microsoft Research, 3 September 2007. Posted with permission.

Alan Kay recently posed the following problem:

"I'd like to show JHS and HS kids 'the simplest non-tricky architecture' in which simple gates and flip-flops manifest a programmable computer."

Alan posed a couple of other desiderata, primarily that the computer needs to demonstrate fundamental principles, but should be capable of running real programs produced by a compiler. This introduces some tension into the design, since simplicity and performance sometimes are in conflict.

This sounded like an interesting challenge, and I have a proposed design.

The memo presents the design of a complete CPU in under two pages of FPGA-ready Verilog. The TC3 is a Harvard architecture 32-bit RISC with 1KB of instruction memory, 1KB of data memory, and 128 general-purpose registers. This design is an ancestor of the DDR2 DRAM Controller used in the BEE3.

To help us into the brave new world of Hardware/Software co-design!
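To make the flavor of such a machine concrete, here is a toy Python sketch (emphatically not Thacker's Verilog; the opcodes and memory sizes are invented for illustration) of the defining Harvard-architecture traits: separate instruction and data memories, plus a bank of general-purpose registers.

```python
class TinyHarvardCPU:
    """Toy Harvard-architecture machine: separate instruction and data
    memories and 128 registers. Opcodes are illustrative, not the TC3's."""

    def __init__(self, program):
        self.imem = list(program)  # instruction memory, distinct from data memory
        self.dmem = [0] * 256      # data memory, addressed separately
        self.regs = [0] * 128      # general-purpose registers
        self.pc = 0

    def step(self):
        op, a, b, c = self.imem[self.pc]
        self.pc += 1
        if op == "li":    self.regs[a] = b                             # load immediate
        elif op == "add": self.regs[a] = self.regs[b] + self.regs[c]   # register add
        elif op == "ld":  self.regs[a] = self.dmem[self.regs[b]]       # load from data memory
        elif op == "st":  self.dmem[self.regs[b]] = self.regs[a]       # store to data memory
        elif op == "halt":
            return False
        return True

    def run(self):
        while self.step():
            pass

prog = [
    ("li", 1, 5, 0),     # r1 <- 5
    ("li", 2, 7, 0),     # r2 <- 7
    ("add", 3, 1, 2),    # r3 <- r1 + r2
    ("li", 4, 0, 0),     # r4 <- 0 (data address)
    ("st", 3, 4, 0),     # dmem[r4] <- r3
    ("halt", 0, 0, 0),
]
cpu = TinyHarvardCPU(prog)
cpu.run()
```

After running, `cpu.dmem[0]` holds 12. The point of the exercise is how little machinery a complete fetch-decode-execute loop actually needs.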

Project Euler

Ran across a short weblog entry on Leonhard Euler, the father of functions and initiator of much in the way of number theory. The mention of Project Euler caught my eye, as I rather like projects that involve multiple PLs attacking the same sets of problems.

Project Euler is a series of challenging mathematical/computer programming problems that will require more than just mathematical insights to solve. Although mathematics will help you arrive at elegant and efficient methods, the use of a computer and programming skills will be required to solve most problems. The motivation for starting Project Euler, and its continuation, is to provide a platform for the inquiring mind to delve into unfamiliar areas and learn new concepts in a fun and recreational context.
Project Euler has been around for almost as long as LtU, but this is the first I'd heard of it. I find the questions and the competitive gaming aspect interesting, though I have a long way to go (level 2 out of 5). I'm not sure there is a direct tie-in to PLs, but since I'm using it to learn more math and investigate the breaking points and elegance of PLs, and since anything involving multiple PLs in competition and mathematics can't be too far removed, and since I haven't posted a story to LtU in a while and this happens to be my current interest, well, that will have to suffice. I'll admit, though, that many of the solutions I looked at are either too rushed (brute force), too cumbersome (indexes flying everywhere), or too terse. But that's true of most code that I run across (including my own). I'm still hoping for a PL that has that just-right aspect, though I'm leaning toward Oz these days.

(For those of us who are looking for walk-through/cheat guides, the Haskell wiki has code for the first 200 problems, and I've put my Oz code for the first 50 up on the CTM wiki.)
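For a taste of the format, Problem 1 (find the sum of all the multiples of 3 or 5 below 1000) is the canonical warm-up, and in most languages it is a one-liner:

```python
# Project Euler, Problem 1: sum of the natural numbers below 1000
# that are multiples of 3 or 5.
total = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
print(total)  # 233168
```

The early problems yield to brute force like this; the later ones are where the mathematical insight (and the language's elegance, or lack of it) starts to matter.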

If Programming Languages were <T>

With the recent popularity of the comparison between PLs and Religions (reddit, slashdot), I thought it'd be mildly amusing to see what other comparisons were out there on the intarweb.

Here's the list I collected for the If Programming Languages were ... meme:

Probably others that I missed. (Note: There's probably material in here to offend all). (Personally, I think the obvious missing comparison is If Programming Languages Were Tools. I nominate Assembler as the Stick, being the most primitive).

Guy Steele & Richard Gabriel: 50 in 50

For those who like their PL history presented as avant-garde beat poetry, a video of Steele & Gabriel's 50 in 50 speech at JAOO is made to order. Or as the link says:

A fun, artistic and enlightening presentation full of interesting facts - and who better to do it than Richard P. Gabriel and Guy L. Steele (the great Quux). Nothing more to say than the rallying cry: More cowbell!

Setting aside the Steven Wright comic delivery of the two speakers, there are a lot of interesting thoughts, though very few are dwelled on. I think the most interesting thing is the languages they chose to illustrate the major ideas they covered. Here are the ones I picked out (though I ended up with only 49):

Do Loops: Fortran (Pascal, APL)
Guarded Commands: Algol-68
Array Origin: C, Fortran, Pascal, APL
Extensible Language: PPL
Domain Specific Language: APT
Structured Programming: BLISS, INTERCAL
Text vs. Environment: Algol-60, Lisp, Smalltalk
Language as Educational Tool: Logo
Stack Machines: Befunge (SECD Machine, Forth)
Formal Dynamic Semantics: SECD
Data Parallelism: APL
Enumerated Types: Pascal
Coercion: PL/I (Fortran-V)
Backtracking and Theorem Proving: Conniver (Prolog)
Hierarchical Records: COBOL
Argument Handling: Common Lisp, Ada, Python (VB, C#, Suneido, PL/pgSQL)
Pointers & Lists: IPL-V
Coding in Natural Language: Perligata (COBOL, Hypercard)
Parsing: Yacc (LR1, Recursive Descent)
Computational Drama: Shakespeare
Linked Records: AED
Reasoning: Prolog
Mathematical Syntax: MADCAP, MIRFAC, Klerer-May System
Type Declarators: C
Line Numbers: Basic (Focal, APL)
Data Abstraction: CLU, Alphard
Visual Languages: Piet
Dynamic vs. Lexical Scoping: Scheme
Pattern Matching & Replacement: COMIT, SNOBOL
Knowledge Representation: KRL (Conniver, Microplanner)
Branding: Ada (COMIT, SNOBOL, TRAC)
Stream Processing: Lucid
Dynamic Languages: AMBIT/L
Generic Functions: Common Lisp
Program as Data: Lisp
Reflection: 3-Lisp
Macro Processor: TRAC, ML/I, Limp, M4
Metacircular Interpreters: Lisp
Call By Name vs. Call By Value: C, Algol-60
Functional Programming: KRC
Dangling Else: Algol-60
Control Parallelism: Occam
Formal Static Semantics: Algol-68
Domain Specific Languages: HQ9+, MUMBLE
Algebraic Formula Manipulation: Formac (Macsyma, Mathematica)
Build Languages: Make, Ant, Rake (JCL)
Message Passing: Smalltalk (C++, C#, Java, Flavors, Common Loops, CLOS, Scheme, Dylan, Simula, Self)
Scripting: Perl
Objects: Simula (Smalltalk, C++, Java)

The Transactional Memory / Garbage Collection Analogy

Courtesy of my shiny new commute, I have been listening to various podcasts, including Software Engineering Radio. A while back, they had an interview with Dan Grossman about his OOPSLA 2007 paper, which I have not seen discussed here.

The Transactional Memory / Garbage Collection Analogy is an essay comparing transactional memory with garbage collection based on the analogy:

Transactional memory (TM) is to shared-memory concurrency
as garbage collection (GC) is to memory management.

Grossman presents the analogy as a word-for-word transliteration of a discussion of each of the technologies. (Hence the "fun" category.)

(As an aside, Grossman does not address message-passing, but says, "Assuming that shared memory is one model we will continue to use for the foreseeable future, it is worth improving," which is probably a correct assumption.)

One point that he does make is that

[This essay] will lead us to the balanced and obvious-once-you-say-it conclusion that transactions make it easy to define critical sections (which is a huge help in writing and maintaining shared-memory programs) but provide no help in identifying where a critical section should begin or end (which remains an enormous challenge).

The one serious weakness of the analogy, to me, is that GC does not require (much) programmer input to work, while TM does.

Although some parts of the analogy are strained, there are some interesting correspondences.
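Grossman's point about scope can be made concrete even without an STM. In this Python sketch (my own toy example, with an ordinary lock standing in for an atomic block), writing the critical section is trivial; the hard part TM doesn't solve is deciding that both account updates must live inside one critical section.

```python
import threading

lock = threading.Lock()  # stand-in for an atomic/transactional block
accounts = {"alice": 100, "bob": 50}

def transfer(src, dst, amount):
    # Defining the critical section is easy, with a lock or with TM.
    # The design decision that BOTH updates belong in ONE critical
    # section is still the programmer's: wrapping each update in its
    # own section would be "atomic" per operation, and still wrong,
    # because another thread could observe money missing in between.
    with lock:
        accounts[src] -= amount
        accounts[dst] += amount

transfer("alice", "bob", 30)
```

After the transfer, alice holds 70 and bob holds 80, and no interleaving can observe a state where the total is not 150. Choosing those section boundaries is exactly the "enormous challenge" the essay says remains.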

Differentiating regions

As a follow up to the previous post, check out how Chung-chieh Shan applied regions to a seemingly unrelated problem. His post begins by explaining how automatic (numerical) partial differentiation can be implemented, and goes on to show how to use regions to avoid mixing-up the variables being differentiated.
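The bug Shan's regions rule out is "perturbation confusion": when differentiating with respect to two variables at once, naive forward-mode AD can mix up their infinitesimals. His post works in Haskell (with rank-2 types in the style of runST); the following dual-number sketch in Python is my own minimal illustration of the machinery whose perturbation variables need keeping apart.

```python
class Dual:
    """Forward-mode AD value a + b*eps, with eps^2 = 0."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def _lift(self, x):
        return x if isinstance(x, Dual) else Dual(x)

    def __add__(self, other):
        other = self._lift(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # product rule: (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + b1*a2)*eps
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the infinitesimal part with 1, read the tangent back off.
    return f(Dual(x, 1.0)).b

# d/dx (x*x + 3x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))
```

With a single untagged eps, nesting `derivative` inside itself can conflate the inner and outer infinitesimals; tagging them, and statically preventing tags from escaping their scope, is where the region machinery earns its keep.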

ICFP contest starts tomorrow

Just a quick reminder -- the 2008 ICFP Programming Contest starts tomorrow.

Language geek at the Maker Faire

Maker Faire was fun, but you can read all about it on numerous web sites and blogs. While I enjoyed the Coke + Mentos demonstration like everyone else, some things caught my eye in particular and may also amuse LtU readers.

Talking to the guys demoing the CNC machines, I discovered G-code, which turns out to be the main machine language used to control CNC machines. It was cool to meet people who actually wrote their own software to emit or consume G-codes (the styrofoam CNC machine was way cool).
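In that spirit, emitting G-code is mostly string formatting. G0 is a rapid positioning move and G1 a linear move at a feed rate, which are standard G-code words; the generator below, a hypothetical example of my own, traces a square toolpath.

```python
def square_path(side, feed=100):
    """Emit G-code tracing a square of the given side length.
    G0 = rapid positioning move, G1 = linear move at feed rate F."""
    corners = [(side, 0), (side, side), (0, side), (0, 0)]
    lines = ["G0 X0 Y0"]  # rapid to the starting corner
    lines += [f"G1 X{x} Y{y} F{feed}" for x, y in corners]
    return "\n".join(lines)

print(square_path(10))
```

Real controllers add modal state, units (G20/G21), and spindle/coolant words on top, but programs homemade tools emit look much like this.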

I also enjoyed The Art of Motion Control sculpture, since the text said "Path designs using custom LISP routines running within AutoCAD." I think that was the only one explicitly mentioning Lisp. At least, that was the only one I saw...

Any cool language references I missed?
