quite "old" books discussion

Hi,

Many of you probably do not know me (and vice versa). I'm quite "new" to the field of computer programming (and to LtU), and basically self-taught.
I live in South America (Argentina) and, of course, I speak Spanish. From my view of the computer science "arena" here, we are pretty "late". I think the reason is mostly the lack of Spanish translations (considering the overwhelming number of computer science publications in English).
To give a slightly absurd example, our universities are still using the first edition of the Dragon Book (it isn't a REALLY good example, but it makes the point). Anyway, I'm a big fan of "old" books; I don't know, they have a certain appeal for me. In the "new" publications I sense more of the "industry", or "what the industry wants", than a real concern for "true knowledge".
To make the story short (maybe it's pretty late for that!), in my search I have come across certain "oldies" which I bought but have not read yet. I'll list them here, and I'll appreciate any comments from people who have read them or know them, about whether they are useless or whether good things can come from them.

Harlan D. Mills - Principles of computer programming: A mathematical approach (1986)

PJ Brown - Software portability, an advanced course (1979)

Perlis, Biggerstaff - Software reusability, Vol. I concepts and models (1989)

and currently reading, Richard Bornat - Programming from first principles (1987)

They are pretty old, yes, and you may think I'm a little crazy, which is true too (lol), but, I don't know, they caught my eye!

Best regards,
Sebastian.

PS: I'm sorry for any mistakes you might encounter in my English!

Old books are good

They give you a feel for the way people's thinking about the subject has changed. Further, the quality of exposition is often higher with old books (really, books from before 1980), because publishers cared more about quality then, and authors' sense of what was required of them in terms of content was somewhat higher. The Scott-Strachey Approach to Programming Language Theory (Stoy 1977) is less suitable for classroom teaching, say, and is much "lower tech", but its quality of exposition, nature of engagement with applications, and thoroughness are simply higher than those of Semantics of Programming Languages (Gunter 1992), and I imagine I will say the same about Practical Foundations for Programming Languages (Harper, forthcoming).

This is not really meant as criticism of these later works (they are excellent); it is a point about the changing role of books in the culture of computer science. By contrast, it seems to me that graduate-level classroom teaching and informal scientific communication are much better now than they were then, with the same people involved.

Exposition quality is higher

Exposition quality is higher with old books that have remained popular... simply because they have withstood the test of time. I'm sure a bit of the content written today will be highly regarded in the future. The mega-bookstore trend, with lots of shelf space, has encouraged the proliferation of crap books, but I bet the amount of quality content is the same (it's just a smaller piece of a larger pie).

The same has basically happened to papers: a proliferation of conferences with paper slots to fill has attracted more papers of mediocre merit. There should still be some masterpieces in there, but the majority won't stand the test of time. Same with grad students: more grad-student slots will invariably reduce quality, but there are still some brilliant ones.

We should all be striving for masterpieces although most of us are not capable, nor does the system often have enough patience to wait for one (how can you if you must publish 20 papers a year?).

Twenty years from now we will look at today with rose-tinted glasses, since only the masterpieces will have survived, probably with a different area of focus from the other eras. IMHO PL hasn't done too well since... the early 90s, but perhaps that is just because people have focused their attention elsewhere. We are due for a revival, though.

Nostalgia just ain't what it

Nostalgia just ain't what it used to be.

Posterity

I don't think the problem is a nostalgia-induced illusion. I think you hit one of the problem's nails on the head with "nor does the system often have enough patience to wait for one (how can you if you must publish 20 papers a year?)". This, and the preference for conference publications over journal publications that emerged around 25 years ago, have meant that fewer PL writers have even thought about how to write for an audience 20 years or more into the future. Dana Scott, I understand, saw his writing in the tradition of immortal figures like Tarski and Prior (both teachers of his), and the ethos of Strachey and Scott made the world in which Stoy wrote his classic.

Then there's a whole different sort of problem with changes in the academic publishing industry.

About twenty years ago, my

About twenty years ago, my Master's adviser told me textbooks lag at least seven years behind the cutting edge because their authors can't take the time to write them until they've got tenure. The alternative to waiting that long would involve lowering the bar for the textbooks themselves.

Most old books are just old

I agree that it's useful to see how perspectives evolve and change.

That said, there are plenty of cruddy books written both before and after 1990. Gunter's text is simply a weak text. The green dragon book, in my opinion, wasn't an improvement on the red one.

There are also good ones. Benjamin Pierce's books are tough to beat in any decade, with the notable exception that his book on category theory is too dense for mortals. That may reflect more on category theory than on Benjamin. :-)

In architecture, Hennessy and Patterson is strong, as long as you don't expect John's math to be right. It was a running joke in his lectures that he can't add on his feet.

The quality of the book is a function of the quality of the author; that's all.

absolutely true (at least for me)

Mr. Stewart, that is exactly the way I feel about "oldies", even if I haven't read any of them yet. It's like, just by reading the preface, "you know it": you know that the book has a strong chance of being good. That's what happened to me in my search (at Amazon) for books to buy. I admit that I also looked for them in electronic format (afterwards); they are not that expensive, but the shipping kills me, and I obviously wanted to "check" the quality. I am not saying that I am a master of quality checking, since I am new and cannot tell whether a book is presenting wrong information, but if I can understand it (despite my nature as a "foreign reader"), that helps me grasp the principles and then tell whether it is right or wrong, or at least seems reasonable.

"low tech" seems to be a big plus since there is less effort about tech specific stuff and more "abstract thinking". Of course, this is one of the reasons why I asked if some of my boughts were "useless" because some topics cant be separated from tech because the topic is actually TECH (and as a newbie I cant tell 100% which topics cant, you see in my list portability, reusability, programming, etc).

From what I can understand, you are saying that The Scott-Strachey Approach is better (at least to you) than the later publications mentioned, is that right? In that case I'll look for that book, since I do not have any book on my shelf about PL theory; it always seemed hard to me, and it may be very helpful.

I also have:

A.K Dewdney - The new turing omnibus, sixty-six excursions in computer science (1993)
Hoffman, Weiss - Software fundamentals, collected papers by David L. Parnas (2001, I think)

Thank you very much for your comment and advice.

Warmest regards,
Sebastian.

As Landin would say ...

Mr. McDirmid,

To make an "analogy":

"....today... 1,700,000 computer science books to 'communicate' about 700,000 application areas"

would that be correct?

Your comment is very much appreciated.

Warmest regards,
Sebastian.

One "old" book that I like

Object Oriented Software Construction by Bertrand Meyer

Not a bad book, but I'm not

Not a bad book, but I'm not sure I would call it a classic. I think the Smalltalk 80 color (purple, orange, green, blue) books are much more inspiring if you are into OO programming.

DbC

Having not read the Smalltalk 80 books, I cannot make a comparison.
OOSC also introduces "Design by Contract", which is interesting.

a wonderful blend

of terrific insights that are still worthwhile, and horribly stunning gaffes-in-hindsight, such as the suggestion that we use inheritance for any reason, any time, anywhere: go forth and multiply (and damn the maintenance programmer who takes over).

Some old books are great. We

Some old books are great. We can all also agree that not all old books are great. Which brings us to the books mentioned by the OP... any opinion on those?
FWIW, it is rather strange to revisit a list of recommended books for SE students that I put up around 2000.

Some old books are unread by me

From your list, 1-4,8 seem sound, though I guess design patterns have lost a little lustre in the intervening years. I haven't read the others.

From Sebastian's list, I like the dragon book. I've leafed through Bornat's book, but I couldn't really be said to have read it. I haven't looked at the others, but the Mills book sounds interesting.

I add some others to the discussion

Hello everybody, it's nice to see so many replies, all of them interesting.
I'll add some other books to my previous list. I didn't buy these; they are just waiting on my "original list". Like the ones I already bought, they seemed interesting, so here they are ...

(I'll note next to each book the purpose for which I chose it, because
maybe the purpose is wrong but the book is good anyway)

Leon S. Levy - Fundamental concepts of computer science,
mathematical foundations of programming (1988)
- I don't have a sound idea of what can be learnt
from this one -

Larry L. Constantine - Software for use (1999) -for making interfaces
more user-friendly-

Greg Michaelson - An introduction to functional programming through
lambda calculus (1986, 2011 3rd ed)
- obvious reasons; it actually seems to be very good,
and I like it as a starting point because it is
not very language-centered -

Dromey - How to solve it by computer (1982) -Algorithms-

Dijkstra, Dahl, Hoare - Structured programming (1972)
- because it's a must? lol; mostly wanting to read
the section by Dahl -

Aho, Ullman - Theory of parsing, translation and compiling (1972)
- it's said in the preface that they wanted to focus
on concepts which would pass the test of time; I don't
know if they did, but that's what they said -

Aho, Ullman - the Dragon Book

Bornat - Understanding and writing compilers, a do-it-yourself guide
(1979)

Levine - Linkers & loaders (1999)

Brinch Hansen - Operating system principles (1973) - and some others
by Hansen about concurrent programming-

Tharp - File organization and processing (1988)

Well, those are some of the books that I plan to read someday; of course there are others, but they do not fall into the "old book" category.
So I'll appreciate comments either way: whether you think they are useful, or whether they are pretty out of date and useless.

Warmest regards,
Sebastian.

The Psychology Of Computer

The Psychology Of Computer Programming by Gerald Weinberg (1971)

This one is a rather famous

This one is a rather famous book that is always just below the radar. But when I did finally look at it, I remember I wasn't as impressed as I thought I might be. It was a while ago, so I don't remember any details, just that I was underwhelmed.

The main problem with this

The main problem with this book is that it doesn't provide immediate gratification: the wisdom in it requires actually reading it and pondering it in a feedback loop (read, ponder, re-read, ...). It's definitely not a casual book, but it is worth it if you are dedicated to learning the area.

I would love to see some updated work in this area in book form, something more structured than just reading through PPIG papers. In particular, I think there is a lot of promise in bricolage/experimental programming (rather than the think-carefully-before-writing-code approach) given the advancement of computing technology.

Bad Textbooks can be Better

But sometimes a bad book can be better than a good book!

The long-standing dominator among operating systems texts - Silberschatz and Galvin's Operating System Concepts - includes an amazing number of definitively stated practical conclusions based on theoretical criteria. These provide a textbook example (forgive me) of logic used as a means to go wrong with confidence. Which can be very useful in a classroom!

For example, it insisted for six editions that the "buddy system" should be the preferred kernel memory allocator on the grounds that it minimizes fragmentation when presented with a random allocation request stream. Which is correct.

Red face test: name a production OS that uses the buddy system. There aren't any, because kernels don't generate random allocation request streams! In fact, the allocation streams of a modern kernel are very regular and predictable. Allocations can lead to live-lock, and the buddy allocator isn't very well suited to addressing that.
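
For anyone who hasn't met the buddy system, here is a minimal sketch of the mechanism, assuming only the power-of-two rounding matters for the argument (hypothetical C, not taken from any real kernel; free lists and coalescing are omitted):

    /* Hypothetical sketch: the size rounding at the core of a buddy
       allocator.  Every request is rounded up to the next power of two,
       so any block can be split into two equal "buddies" and a freed
       block can later be merged back with its buddy. */
    #include <stddef.h>
    #include <stdio.h>

    static size_t buddy_round_up(size_t n)
    {
        size_t block = 1;
        while (block < n)
            block <<= 1;            /* smallest power of two >= n */
        return block;
    }

    int main(void)
    {
        /* How much space the rounding wastes depends on the shape of the
           request stream; a real kernel's requests cluster around a few
           fixed sizes rather than being random, which is the point above. */
        size_t requests[] = { 13, 100, 4096, 5000 };
        for (size_t i = 0; i < sizeof requests / sizeof requests[0]; i++)
            printf("request %4zu -> block %5zu\n",
                   requests[i], buddy_round_up(requests[i]));
        return 0;
    }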

There are comparably fundamental errors in every chapter of every edition up to the 6th (I haven't checked later editions). I've seen Avi make this same mistake in person. He is not under-burdened by self-assurance, and his listeners are rarely willing to call him on this sort of thing.

In spite of this, I would say that Avi's text, even with its bad conclusions, is far better than any other text I have found. Its organization is strong, it connects outcomes to fundamentals, and it teaches students to do the same. Given those three things, students can derive correct understandings when led to the right questions. Ironically, this made the book stronger in the classroom, because each week my students knew they would get asked which assumption Avi bungled that week, what was actually true, what conclusion should have been drawn based on the facts, and why. A well-organized, well-connected, well-written, and consistently wrong text coupled to a sequence of "red face tests" in this way is a wonderful learning tool. It forced the students into reading critically.

Unfortunately, there aren't many teachers in academia who know enough about the practicum to provide that sort of reality check. There are many more than the outside world believes, but far fewer than we need.

Textbooks become dominant because (a) the author is a strong writer, (b) they know enough to appear credible to the buying audience, few of whom are expert, (c) the author has time to write a textbook, which is an incredibly demanding and laborious process, and (d) they are good enough to pass the "put up or shut up" behavior filters of potentially better authors.

But I still have a really hard time bringing myself to call it a good book...

Notes

Unfortunately, there aren't many teachers in academia who know enough about the practicum to provide that sort of reality check.

If you teach it again next year, why not try to collect those "red face facts" into a releasable document? It may interest other teachers and, if written tactfully, might not even come across as too adversarial (... different situation, different assumptions ...).

Well some good books are "bad" too

I think that "programming pearls" is considered to be a good book, but when I read 'middle := (low+high)/2' in the book (in an example about proving programs!!!), I think that it took me more than 10 minutes to stop laughing and being able to continue reading.