Patrick Logan on patterns

Patrick, who used to be one of us, posted some thoughts on the subject.

Those of us who like the notion of mining patterns for language features should keep in mind the point Patrick is making.

Patterns...

...are those repetitive "thingies" (to paraphrase Logan's masthead) which are difficult to abstract away in your programming language of choice.

Many design patterns that were observed to occur in C++ (or even Smalltalk), and that cannot easily be turned into a class, function, template, or macro there, can be so abstracted in other languages (or are otherwise unnecessary). Guys like Norvig and Graham have repeatedly made this observation, often as a criticism of the design patterns community (which is, or has been in the past, insufficiently rigorous for some) and of languages like C++.

But patterns exist in higher-level languages as well; though they generally occupy a higher conceptual level than the dirty implementation details handled by things such as "visitor", which essentially tries to solve the expression problem in single-dispatch languages (and doesn't do a very good job of it, IMO). MVC is one example of a higher-level pattern from Smalltalk.

At some point diminishing returns set in, and it no longer makes sense to distill a pattern into a language feature or library component, even if your language has the most powerful abstraction facilities you can think of.

My view is different

But patterns exist in higher-level languages as well; though they generally occupy a higher conceptual level than the dirty implementation details handled by things such as "visitor", which essentially tries to solve the expression problem in single-dispatch languages (and doesn't do a very good job of it, IMO). MVC is one example of a higher-level pattern from Smalltalk.

I disagree. MVC is not a higher-level pattern, regardless of whether you are from a Smalltalk-80 background or a Cocoa background or client-server projection UI (web browser) background. Why do you think MVC is a pattern at a "higher conceptual level"?

Constructing an example without details is disappointing. I realize I am asking you a tough question, but that is how it's done: I ask tough questions, and I wait for the answers. It strikes me that you consider MVC to be a higher-level concept than Visitor. How does MVC stand above or apart from Visitor?

In my view, patterns are a reply to the following conundrum I faced early in my programming career (ages 11 through about 20). The conundrum can be summed up as, "I don't have a solution, but I admire the problem." It is a relief to have a "way out", especially when you are not a natural problem solver. Put another way, I wish I had read Alan J. Perlis's advice on programming when I was a teenager:

10. Get into a rut early: Do the same process the same way. Accumulate idioms. Standardize. The only difference(!) between Shakespeare and you was the size of his idiom list - not the size of his vocabulary.

I'd add that Shakespeare was relentless in differentiating the power of his idioms. The difference between my teenage self and Shakespeare is that Shakespeare had the instincts, after discovering Visitor, to then discover multi-methods. Adding the visitor pattern to your idiom list is only useful if you understand its internal and external complexity. In my experience, visitor is a function object that profoundly affects maintenance, because extending the tree of elements it operates on with new types requires touching code in more than one place. It is seductive, because we are taught early in school a very lazy way to solve problems: expansive tree modeling.

The visitor pattern is really just a throwback to how we teach students about complicated subjects. We teach brute force over elegance by requiring rote memorization of a sheet of physics formulas. In solving physics problems, we're taught to proceed by searching for "the right formula". We're also taught to "just try something": by traveling down a solution path we might get lucky. Modeling systems "by probable cause" is a terrible way to simulate nature, and therefore a terrible way to do mathematics. What this effectively teaches you is that once you've found "the answer", the tree model collapses into a single branch. You are also expected to show this path on a test, because your result alone is not good enough. Teachers want to test your ability to collapse trees into branches, and they do not want you to show generalizations.

Perlis has several good epigrams on patterns, especially the following one:

15. Everything should be built top-down, except the first time.

One of my favorites, because it applies so well to the "expansive tree modeling first" paradigm.

Also, some design patterns are really enduring design principles, and the "pattern" label is a misnomer for them. For example, when designing a system, I concentrate a lot on "the facade story". If I get it wrong, the API will be wrong and my software factory will not facilitate rapid application development. Instead, it will resemble a dinosaur in a tar pit. Eventually, we'll pronounce it a fossil and "re-write it" or create a "legacy" wrapper for it. Effectively, what we're doing is building a new factory where part of the factory is reserved for a "Code Museum" to exhibit all our past, failed, fossilized factories.

For what it's worth, the facade story does exist as a principle, although people tend to be ignorant of it:

The structure of a system tends to mirror the structure of the group producing it.
-- Conway's Law (Mel Conway, April, 1968 Datamation)

In other words, if the structure of a system does not match its users, the roles of its users will have to adapt.

What do you mean by

What do you mean by "higher-level" languages? Or do you mean C++ is inferior? Please stop saying things like this.

[Admin]

I remind members to read the LtU policy on user names and try to comply with it, if possible.

Going to a higher level

I suspect that Scott is using "higher-level" to mean that the language provides "higher-level" abstractions. C is a higher-level language than assembler, because it provides abstractions for dealing with things like loops, procedures, and typed or structured data. C++ is a higher-level language than C because it provides abstractions for dealing with things like dynamic dispatch and inheritance relationships. And yes, some languages are "higher-level" than C++, because they provide (within the language) abstractions for dealing with things like first-class functions, generic programming, and the like. See the relevant article at Ward's Wiki for more.

Correct

I was not "flaming" C++, merely noting that some of the patterns in the GoF book--visitor being the example I gave--are handled "natively" in other languages like Common Lisp, or anything that supports true multiple dispatch. If you have multi-methods, you don't need visitor.

I use C++ quite a bit professionally, so I'm quite aware of its strengths and limitations. I actually like it more than many folks around here, but there are things it doesn't do well. The lack of algebraic sum types and pattern matching, for instance, makes it a pain to write a PL with; you can use inheritance as a substitute, but it isn't a very good one.
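
To make that concrete, here is the sort of thing I mean, sketched in Scala rather than C++ (the toy language and all the names are invented purely for illustration): a sum type plus a single pattern-matching function, where C++ would want a class hierarchy with virtual methods or a visitor.

// A hypothetical toy expression language as an algebraic sum type.
sealed trait Expr
case class Lit(value: Int)           extends Expr
case class Add(lhs: Expr, rhs: Expr) extends Expr
case class Neg(operand: Expr)        extends Expr

object Eval {
  // One recursive function, one match: the whole interpreter for the sum type.
  def eval(e: Expr): Int = e match {
    case Lit(v)    => v
    case Add(l, r) => eval(l) + eval(r)
    case Neg(x)    => -eval(x)
  }

  def main(args: Array[String]): Unit =
    println(eval(Add(Lit(1), Neg(Lit(2)))))  // prints -1
}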

Visitor vs generic function vs ADT

Visitor isn't really a substitute for CL-style generic functions. Visitor allows you to add operations arbitrarily, but it seals the types it can handle in the form of the acceptFoo/acceptBar/acceptBaz interface. Generic functions, on the other hand, are extensible in both the operations you can perform (add a new generic function) and the types you can dispatch on (add new methods to a generic function).

Visitor is more nearly a substitute for closed ADTs and pattern matching. In a very real sense it takes all the upsides and downsides of the ADT/pattern match side of the expression problem and moves them into an OO context - it makes it easy to add operations and hard to add data constructors.

To bring this all back on the original topic, the big downside to visitor vs ADTs is the metric ton of boilerplate that has to be written every time you want to use it. That's why pattern matching and sealed hierarchies can still be a big win in an OO language.
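
To make the boilerplate concrete, here is roughly what the visitor machinery looks like for even a two-constructor term type (a sketch in Scala; the names are invented for illustration):

// The visitor interface fixes the set of shapes you can handle up front...
trait TermVisitor[R] {
  def visitIntLit(i: Int): R
  def visitLambda(param: String, body: Term): R
}

// ...and every concrete class must repeat the same accept plumbing.
trait Term { def accept[R](v: TermVisitor[R]): R }

final class IntLit(val i: Int) extends Term {
  def accept[R](v: TermVisitor[R]): R = v.visitIntLit(i)
}
final class Lambda(val param: String, val body: Term) extends Term {
  def accept[R](v: TermVisitor[R]): R = v.visitLambda(param, body)
}

// Each operation is yet another class implementing the whole interface.
object Show extends TermVisitor[String] {
  def visitIntLit(i: Int): String = i.toString
  def visitLambda(p: String, b: Term): String = "lambda " + p + ". " + b.accept(Show)
}

Adding an operation means another class like Show; adding a new kind of Term means changing TermVisitor and every visitor already written. The sealed-hierarchy-plus-pattern-matching version of the same thing is a single function per operation.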

I followed half of what you said

I followed half of what you said, and found the half I didn't understand worth trying to comprehend.

In particular,

the big downside to visitor vs ADTs is the metric ton of boilerplate that has to be written every time you want to use it. That's why pattern matching and sealed hierarchies can still be a big win in an OO language.

I don't see the big win you are referring to. Are you saying it is a best practice to use OO to model static is-a hierarchies (using Layered Supertypes, etc.)? Are you implying pattern matching (the use of a combinator DSL like CSS Selectors) is necessary to deal with composable types embedded within the elements in the hierarchy? If so, I would say be careful going the route suggested by the former; the latter just allows you to obey the Open-Closed Principle and section changes off into separate places for orthogonal concerns, e.g. use one CSS Selector for layout, another for brushes (text color, background, foreground, etc.), and so forth.

Boilerplate.

Compare (but yes, the code is trivial): Chapter 4 of A Little Java, A Few Patterns in Oz to Chapter 3 of The Little MLer in Oz (both by LtU's own Chris Rathman).

Another "visitors vs. ADTs" example that is a bit less trivial is the classic symbolic differentiation example (e.g. this midterm project from a course by Peter Van Roy).

hmm... boilerplate is not

hmm... boilerplate is not how I would describe the downside of visitors. Are you saying acceptFoo/acceptBar/acceptBaz is boilerplate? If so, then I think you've missed the point the Nice programming language designers as well as the subject-oriented programming folks at IBM were arguing over. (In my scholastic research, these are the first two known projects denouncing the visitor pattern.) Has Peter Van Roy or Chris Rathman written anything critical about the visitor you could point me to that isn't written in French?

Digging for an hour didn't turn up anything by them, though I did find a couple papers I don't have scholarly access to. In particular,

Bruno C. d. S. Oliveira, Meng Wang, and Jeremy Gibbons. The VISITOR Pattern as a Reusable, Generic, Type-Safe Component. (This paper deftly criticizes the other papers I've read for their authors' use of reflection or introspection schemes, and appears to be the best paper of the lot.)

I've never read A Little Java, A Few Patterns. It is not clear to me what the value is in transliterating Java to Oz, which, without reading the book, appears to be what Rathman has done. Hence your statement about boilerplate. With regards to boilerplate, I think what you are really trying to say here is, "isn't it cool that discriminated unions can have member functions?" I'd say it's important to point out that you can define a catamorphism on your discriminated union, because it encapsulates all the logic for walking any data structure that can be described by a discriminated union. Letting a catamorphism take care of recursion, traversal, and pattern matching is just a nicer way to represent your problem. I'm not sure I'd call this a "BIG WIN", though, and certainly not a "BIG WIN" for OO (over, say, FP). If anything, this is a win for generic programming. My point to Scott Johnson way up above remains: don't just generalize your "fold" operators, generalize your problem.
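
A sketch of what I mean (in Scala rather than F#, and the type and names are made up): the catamorphism does the matching and the recursion once, and callers only say what to do at each shape.

sealed trait Tree
case class Leaf(value: Int)              extends Tree
case class Node(left: Tree, right: Tree) extends Tree

object Tree {
  // The one place that knows how to walk a Tree: recursion, traversal,
  // and pattern matching all live here.
  def cata[R](leaf: Int => R, node: (R, R) => R)(t: Tree): R = t match {
    case Leaf(v)    => leaf(v)
    case Node(l, r) => node(cata(leaf, node)(l), cata(leaf, node)(r))
  }
}

// Callers never match or recurse themselves:
//   Tree.cata[Int](identity, _ + _)(t)   // sum of the leaves
//   Tree.cata[Int](_ => 1, _ + _)(t)     // count of the leaves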

My guess is most programmers don't see the power in generalizing the problem. This reflects my real-world experience talking to people who've struggled with poor problem models. To reach for a literary example (Beowulf this time, not Shakespeare): when Grendel sings his song of sorrow after Beowulf tears off his arm, the audience tends to mistake it for a wail.

Not my point

certainly not a "BIG WIN" for OO (over, say, FP)

It was not my point that pattern matching makes OO a big win over FP. My point was that OO with pattern matching is a big win over OO with visitor pattern. That thinking might change when visitor becomes a reusable component rather than a pattern.

Also, I didn't capitalize "BIG WIN."

Visitor: pattern matching or fold?

My understanding of the Visitor pattern is that it is more akin to a fold than to pattern matching, in that it (like a fold) encapsulates a particular traversal strategy and to some extent insulates client code from changes in the underlying data structure representation. Pattern matching, on the other hand, exposes the representation of the data structure (at least, as implemented in most languages). Indeed, it seems that you could implement the visitor pattern using pattern matching (pseudo-Java), very much like a fold:

public class Term {
    // Pseudo-code: "match" is a hypothetical pattern-matching construct.
    // The dispatch lives here, in one place, rather than in accept/visit
    // overrides spread across the subclasses.
    public final void visit(TermVisitor v) {
        match (this) {
            case IntLit(int i): v.visitInteger(i); break;
            case Lambda(Var param, Term body): v.visitLambda(param, body); break;
            ....
        }
    }
}

The idea is that instead of the visit method being abstract, with the concrete implementations spread amongst sub-classes, you concentrate it in the super-class (using pattern matching). The type signature remains the same. So, while pattern matching is a very useful language feature, I don't think it replaces the visitor pattern, any more than it replaces a fold.

I don't get it

Pattern matching, on the other hand, exposes the representation of the data structure (at least, as implemented in most languages)

Both Scala and F# address that problem. Scala uses "extractors", F# uses "active patterns" - I don't know enough about F# to describe the difference, if any. The Scala solution is basically to call a function that the compiler defines in a standard way, but which the user can custom write if there are representation details that need hiding.

In fact, Wadler proposed a solution for Haskell-like languages: Views: a way for pattern matching to cohabit with data abstraction. So I'd say that the current lack of representation independence in most pattern matching languages isn't inherent to the concept.
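
For what it's worth, a minimal sketch of the Scala flavour (my own toy example, not from any of the papers): pattern matching goes through a user-written unapply, so clients match against a view without ever seeing the stored representation.

// A class that stores polar coordinates internally...
class Complex private (val modulus: Double, val argument: Double)

object Complex {
  // Construct from Cartesian coordinates, but store polar form.
  def apply(re: Double, im: Double): Complex =
    new Complex(math.hypot(re, im), math.atan2(im, re))

  // ...but exposes a Cartesian *pattern*: the compiler calls this unapply
  // whenever someone writes `case Complex(re, im)`.
  def unapply(c: Complex): Option[(Double, Double)] =
    Some((c.modulus * math.cos(c.argument), c.modulus * math.sin(c.argument)))
}

object Demo extends App {
  Complex(3.0, 4.0) match {
    case Complex(re, im) => println(s"re = $re, im = $im")  // via unapply
  }
}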

As for your code example of the utility of visitor even with pattern matching, it's not clear what the extra level of indirection is buying here. Term has a visit method which pattern matches and calls TermVisitor visitInteger or visitLambda methods. Why wouldn't you just write a class (call it TermVisitor) with a method that takes a Term and pattern matches on it directly? Internally it could have "handleInteger" and "handleLambda" methods if that was the clearest way to handle the work. If you want the indirection via Term visit for some reason, you could still have that method just turn around and invoke TermVisitor visit.

Encapsulation

Both Scala and F# address that problem.[...]

Which is why I qualified the statement with "at least, as implemented in most languages." I was aware of Wadler's paper, but wasn't aware something similar was actually implemented in Scala and F#.

As for your code example of the utility of visitor even with pattern matching, it's not clear what the extra level of indirection is buying here. Term has a visit method which pattern matches and calls TermVisitor visitInteger or visitLambda methods. Why wouldn't you just write a class (call it TermVisitor) with a method that takes a Term and pattern matches on it directly?

Precisely for encapsulation. Why would you write a fold instead of just pattern-matching on terms? My point is that Visitor is just the OO way of writing a fold. If each TermVisitor directly pattern matches on the Term, then you need to expose the representation of that term to every such client. Encapsulating the traversal in a fold, or a Visitor pattern, or by virtualising the pattern matching operation, solves this problem and so the clients only depend on a particular interface rather than some concrete representation. My instinct is that it is nicer to use the standard interface mechanisms of the language and a fold/visitor, rather than to virtualise pattern matching, but possibly the latter approach results in more succinct code.

Ah!

Okay, now I see. So let me amend slightly: ADTs and pattern matching replace visitor if the language has some way to separate pattern matching from representation details.

Why not do both?

Why would you write a fold instead of just pattern-matching on terms?

Why not do both?

You can write a catamorphism (read: generic folding operator) that accepts arguments that specialize the function. The whole point of dynamic dispatch is to divide an algorithm into a set of special run-time scenarios. Use pattern matching to label those scenarios as match cases. At run-time, the generic function delegates the message to the best case. In a statically typed language, pattern matching is the cleanest way to select the best case. By pattern matching against the arguments, you've effectively statically implemented dynamic overload resolution (multi-methods). In fact, in a language that doesn't have built-in semantics for either pattern matching or multi-methods, you could implement both using a finite state machine.

My instinct is that it is nicer to use the standard interface mechanisms of the language and a fold/visitor, rather than to virtualise pattern matching, but possibly the latter approach results in more succinct code.

Instincts need not apply. Just read Perlis :)

1. One man's constant is another man's variable.

With pattern matching, every named constant is a potential parameter.

2. Functions delay binding; data structures induce binding. Moral: Structure data late in the programming process.

Pushing the pattern matching into the implementation details and out of the view of the client programmer allows you to manage complexity without exposing it to the client.

6. Symmetry is a complexity-reducing concept (co-routines include subroutines); seek it everywhere.

The specific folding strategy hides behind a generic interface. Each match case implements the fold with a symmetric Command interface.

22. A good system can't have a weak command language.

Faking pattern matching with ad-hoc polymorphism complicates the public API, and makes it harder to reason about execution side-effects and software maintenance. The primary use case for a Visitor is in conjunction with a Command pattern (an interpreter, pretty printer, etc.). The Visitor's public interface should emphasize that it's a helper for the Command to do its job.

Why?

You can write a catamorphism (read: generic folding operator) that accepts arguments that specialize the function.

Could you elaborate on what you mean here (perhaps with an example)? Do you mean (e.g. for lists) specialising some generic fold into either foldl or foldr, or something else? I'm not sure why you would want a function passed to a fold to be able to pattern match on the structure the fold is supposed to encapsulate. Doesn't that defeat the purpose?

Why would you write a fold

Why would you write a fold instead of just pattern-matching on terms?

Often, I won't. A good many traversals aren't folds, or at least aren't pure folds with no need for a projection afterwards. Also, the typical notation for a fold actually obscures connections to the problem domain - especially with larger ADTs! Finally, if we're looking for abstraction then it's not abstract enough either - I'd probably want to work with strategies.

That said, if I've got an operation that makes sense as a fold then I write it as one even if it means having to write the fold first. Views are something of another issue, and I have to admit to some relief that GHC's getting view patterns in 6.10.

Transliteration

transliterating Java to Oz, which, without reading the book, appears to be what Rathman has done

Since a similar discussion on reddit cropped up, I'd note that the Oz translations are not transliterations. Transliteration is the act of spelling a word in one language using the characters and symbols of another language. Given that the translated Oz code has the exact same semantic meaning in both the source and target languages (unless I've made some error), I'd say that attempting to apply the negative connotations of transliteration here is not very meaningful. In any translation, there are the competing concerns of fidelity vs. transparency. So the criticism being leveled is that the translation errs on the side of fidelity to the source language instead of being transparent (idiomatic) in the target language.

The question in the particular case of the Visitor Pattern in Oz is whether this truly represents a departure from idiomatic OOP in Oz. I'd say that the Visitor Pattern will be rarely applied in Oz, given that it has pattern matching facilities. But that would probably also be true in a language like Scala.

Indeed

I'd say that the Visitor Pattern will be rarely applied in Oz ...[but] that would probably also be true in a language like Scala.

In a rebuttal comment to a blog post attacking Scala's use of pattern matching as not being sufficiently OO, Martin Odersky wrote:

Let me relate my experience to you: As you know I wrote the javac compiler with visitors and various ad-hoc solutions and I wrote most parts of the scalac compiler with case classes [ed.: Scala's mechanism for ADTs] everywhere. The difference needs to be experienced to be believed. I would never, ever go back to a language that did not have case classes (or some equivalent) and pattern matching. Not if I wanted to write a compiler, or any other software that analyzes and transforms symbolic information.

See also In Defense of Pattern Matching.

Well, that blogger posts without thinking

With all due respect to Beust, who has helped revolutionize software testing with TestNG... he tends to post flames without thinking, expecting others to provide a cogent argument for him. (He also flamed Erlang over its claim of being more fault tolerant, and couldn't accept that some execution models are more fault tolerant than others.) This isn't a bad strategy, provided you learn from it, but it upsets others.

What Martin Odersky did with the design of Scala is the same thing most language authors do when writing the compiler for their language in incremental versions of the language itself. Niklaus Wirth did it for Pascal. Wirth may have originated this bootstrapping, according to his story of how he determined Pascal's feature set by implementing a Pascal compiler in Pascal, putting into the language features that he felt were simultaneously easy for compiler writers to implement and useful for compiler engineering.

In highly composable systems, pattern matching is essential to separate orthogonal concerns. Pattern matching, and especially the example of using combinators to analyze strings, is a very OO way to analyze streams of information. It allows you to conform to the Open-Closed Principle and avoid a very sinister form of object coupling that makes maintenance unnecessarily difficult: what I call The Bread Crumbs Invocation Code Smell, e.g. C# (really, Cω (C-Omega)) property accessing such as Obj1.Obj2.Obj3. This is the second worst form of coupling in OO systems.

See: Steve Freeman, Tim Mackinnon, Nat Pryce, Joe Walnes. Mock Roles, Not Objects.

[Mocks are] a technique for identifying types in a system based on the roles that objects play … In particular, we now understand that the most important benefit of Mock Objects is what we originally called interface discovery.

See also their QCon 2007 London presentation, which further elaborates on their original position by recommending Mock Roles, Not Object States.

Martin's reply to Beust was simply, "Don't knock it until you try it", and the replies from a bunch of Ph.D.s on Beust's blog went right over his head. All most people care to have explained is the consequences of their actions. The layman doesn't care about category theory and all that jazz. He wants practical arguments about OO system design.

Martin's points In Defense of Pattern Matching are deft, but he misses the argument I always provide: (premature) encapsulation is not always a good thing, and it is okay to break encapsulation so long as you can preserve invariants and have an artifact, such as source code, ensuring the model is stable and maintains its integrity (e.g., a tail call reuses the caller's stack frame and return address in order to perform the call optimization). What you really want is a way for your static type system to enforce the Open-Closed Principle, to allow for event scripting instead of monolithic event handling. That's the argument Martin and most others miss, and it's a biggie.

The notion that encapsulation in OO languages may sometimes be premature is discussed in P.M.D. Gray's position paper in Computing Tomorrow, the 1996 volume in which UK researchers looked ahead at our field. See Chapter 7, page 110: Large Databases and Knowledge Re-use. This isn't your ordinary discussion of how to store artifacts; this is the point of view of a well-regarded database researcher with respected contributions to knowledge engineering and constraint processing.

By the way, lighten up on the "BIG WIN". I wasn't mocking you, or anything. Realize you are reading text and not my tone of voice. I am typing with one hand.

If so, then I think you've

If so, then I think you've missed the point the Nice programming language designers as well as the subject-oriented programming folks at IBM were arguing over.

I must admit I don't know about Nice and subject-oriented programming. So I don't know what point their creators were arguing over. Can you elaborate?

I think what you are really trying to say here is, "isn't it cool that discriminated unions can have member functions?"

I don't know if I was trying to say that, because I don't know what that sentence means.

I'm a rather concrete person. When I read A Little Java, A Few Patterns after having read The Little MLer, I thought "wow, Pizzas and Shish Kebabs again, but with a lot of repetitive code". I pointed to Chris's translations because they seem to illustrate that observation well (I don't have The Little Books here, but from memory, I think his translations are pretty faithful).

I only linked to PvR's course material because it describes a classic but not totally trivial problem that can be solved in a (to me) nice way using pattern matching.

Sorry.

Sorry. I guess people here are more familiar with Peter Norvig's criticisms and Paul Graham's criticisms.

For Nice's argument, see: Visitor Pattern Versus Multimethods

For subject-oriented programming's argument, it appears you have to use The Wayback Machine: Subject-oriented programming and the visitor pattern. I am unaware of the authors ever "polishing" their thoughts on the matter.

I think what you are really trying to say here is, "isn't it cool that discriminated unions can have member functions?"

I don't know if I was trying to say that, because I don't know what that sentence means.

A discriminated union can have member functions (in F#), which makes it possible to implement different strategies for your catamorphism. Your catamorphism can use strategies: instead of writing one visitor for each strategy, you write one catamorphism which applies a supplied strategy and pattern matches against the cases of the discriminated union. You have to supply a specific strategy to the catamorphism, and possibly also complicate your discriminated union by adding or overriding member functions on its type. It's a trade-off, and you have to be careful to avoid extending your types with doIt() member functions; otherwise, your types will have more knowledge and responsibility than your problem conveys. Also, you probably want to prefer F#'s "type augmentation" to re-editing the type definition (think of type augmentation as C# extension methods), and package your augmentation in the same assembly as your catamorphism's specific folding strategies. However, as Odersky points out in In Defense of Pattern Matching, the thing you really don't want to do is expose internal data.
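
A rough Scala analogue of the F# arrangement I have in mind (all names are illustrative, not from any library): the catamorphism is a member on the union, and a strategy is just a bundle of per-case functions handed to it.

sealed trait Doc {
  // The "member function" on the union: it owns the recursion and the
  // matching, and applies whatever strategy it is handed.
  def cata[R](s: DocStrategy[R]): R = this match {
    case Text(body)      => s.onText(body)
    case Group(children) => s.onGroup(children.map(_.cata(s)))
  }
}
case class Text(body: String)         extends Doc
case class Group(children: List[Doc]) extends Doc

// A strategy is a bundle of per-case functions...
trait DocStrategy[R] {
  def onText(body: String): R
  def onGroup(children: List[R]): R
}

// ...so one catamorphism serves many operations.
object WordCount extends DocStrategy[Int] {
  def onText(body: String): Int = body.split("\\s+").count(_.nonEmpty)
  def onGroup(children: List[Int]): Int = children.sum
}

object Render extends DocStrategy[String] {
  def onText(body: String): String = body
  def onGroup(children: List[String]): String = children.mkString(" ")
}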

Again, seeing the big picture, what you want to be wary of is expansive tree modeling.

Subject-oriented programming

I can't seem to find what

I can't seem to find what data structure they are using to represent artifacts; I just see buzzwords. They express their ideas using highfalutin vocabulary specific to their research, such as "hyperspaces", "concern spaces", etc.

Are they referring to coordinate hyperspaces? If so, why the statement that a hyperspace is-a concern space? This conflicts with my mathematical understanding of the term, since I don't map "concern space" to any mathematical knowledge I have acquired. What does the hierarchical state diagram of a hyperspace design look like?

I don't particularly care for these apparently unnecessary, arbitrary, premature abstractions. Forgive me, I mimic Dijkstra: show me an elegant argument and I will know it is correct; otherwise, consider me characteristically unimpressed.

Has LtU covered Hyperspaces before?

Similar reaction

The information on the website looked like mumbo-jumbo. After skimming the paper Multi-Dimensional Separation of Concerns using Hyperspaces linked from the site, I was more skeptical. Separation of concerns isn't something that just requires a new syntax. Concerns in general are inseparable and techniques for teasing them apart are domain specific. It might be possible to encode a particular separation of concerns into this framework, but I don't think the framework does much to address the harder problem of finding a workable separation. It also doesn't seem to provide much in the way of checking the validity of your decomposition. I saw an explanation of "declarative completeness" checking (did you declare all of your symbols in the right scope), but nothing for deeper semantics. But it's possible I missed the point...

ML/Haskell

As Alexander Richer points out, I was talking about the closed algebraic data types and pattern matching offered by languages of the ML/Haskell ilk.

CSS pattern matching, regular expression pattern matching, XSLT pattern matching, etc are all related in the abstract but would probably take us too far afield from the discussion of visitor.

CSS combinators astray from pattern matching + visitor convo?

Forgive me if my viewpoint is naive, but I took the view a while back that catamorphisms on discriminated unions provide you with side-effect free, mutually exclusive combinators for free.

In CSS, the right-hand values supply DOM elements with new/overridden values for the left-hand pattern matching expression. These new values are essentially provided by Command patterns operating on a projection of DOM tree elements.

CSS doesn't guarantee mutual exclusion, even within a single property sheet. It simply guarantees order of evaluation. Now, this does lead to an interesting topic, which Wadler's Views paper forces you to deliberate upon: hygienic ways to express side-effects when using combinators. Jonathan Edwards' Subtext project has been working on stuff like this, and Firebug does a decent job, too, such that it's been copied by Microsoft Expression Web.

The only murky thing about CSS combinators is the awful W3C specification for them! However, it is still better than the specification it replaced (in conjunction with XSL and XPath): DSSSL, which expected the layman to write Lisp inside baroque SGML.

Pattern Matching with Objects

Here's a paper relevant to this discussion: Pattern Matching with Objects by Burak Emir. It talks about the expression problem, single-dispatch virtual methods, runtime type testing in various forms, the visitor pattern, pattern matching, and representation independence.

That paper has a very good set of criteria

That paper has a very good set of criteria, IMHO. JMHO, but Criteria 5, 6, 8 and 9 can be summarized as: how well does the approach fare in adhering to the Open-Closed Principle? (Although I like the fact that it is broken down so thoroughly.)

Good find, thanks.

A pattern of forgetfulness

I think that Patrick is on to something here. If you go back and look at the origins of the patterns movement, which lie in Christopher Alexander's work on architectural patterns, you'll find that one of the most important aspects of patterns was the fact that they formed a "language" of interlocking patterns at different levels of abstraction. Indeed, Alexander's original patterns book was called "A Pattern Language". The intent was that by using the language to compose a design, designers could generate a coherent architecture in which the pieces at different levels of abstraction all "fit" together. The early software patterns work by Ward Cunningham and Kent Beck also focused on developing pattern languages, not just individual patterns. Somewhere along the line, the software community seems to have lost sight of that (even the original GoF book is more a collection of patterns than a coherent language), which has perhaps compromised much of the value of patterns.

some authors don't forget!

Some authors don't forget!

The first patterns book I enjoyed reading was Holub on Patterns. To say the least, it is a unique book. It is four chapters long, and the last chapter is almost 300 pages! This wasn't sloppiness on the author's part, either. It's worth pointing out that the tech book publishing industry used to set 50-page limits on chapters. Here's a review snippet from TechBookReport:

Unlike the majority of books on design patterns in software, this one is not structured around a series of stand-alone patterns that exist in splendid isolation. Where most patterns books are catalogs of varying degrees of complexity, this one is heavily code-centric and seeks to show patterns as they exist in the wild. This makes for a much more complex and demanding read but it's necessary because the real world is much messier than the relatively simplistic world that emerges from the relatively simple examples in many design patterns texts. There's no doubt that there's a big gap between an intellectual understanding of patterns and being able to recognize when and where to use them in your own code. This is the gap that Allen Holub aims to bridge with this book.