I'm going to let the dust settle on my recent posting on patterns, and then do a follow-up—some interesting stuff has come out of it. For that, I think, I need a bit of supporting material, some of which is here.
Recently this little gem resurfaced on reddit, which prompted a certain line of investigation. Normally I spread a lot of links around my posts[*], partly as aides-mémoire for myself, partly because my preferred influencing style is association, and partly because I believe that links are content. But I really want you to go and look at this cartoon, right now. Off you go.
Back again? Good. This "learning curve" metaphor is an interesting one. Before getting into this IT lark I was in training to be a physicist, so curves on charts are strongly suggestive to me. I want them to represent a relationship between quantities that reveals something about an underlying process. I want the derivatives at points and areas under segments to mean something. What might these meanings be in the case of the learning curve?
Most every-day uses of the phrase "learning curve" appeal to a notion of how hard it is to acquire some knowledge or skill. We speak of something difficult having a "steep" learning curve, or a "high" one, or a "long" one. We find the cartoon funny (to the extent that we do—and you might find that it's in the only-funny-once category) because our experience of learning vi was indeed a bit like running into a brick wall. Learning emacs did indeed feel a little bit like going round in circles, and it did indeed seem as if learning Visual Studio was relatively easy but ultimately fruitless.
But where did this idea of a learning curve come from? A little bit of digging reveals that there's only one (family of) learning curve(s), and what it represents is the relationship between how well one can perform a task and how much practice one has had. It is a concept derived from the worlds of the military and manufacturing, so "how well" has a quite specific meaning: it means how consistently. And how consistently we can perform an action is only of interest if we have to perform the action many, many times. Which is what people who work in manufacturing (and, at the time the original studies were done, the military) do.
And it turns out, in pretty much all the cases that anyone has looked at, that the range of improvement that is possible is huge (thus all learning curves are high), and that the vast majority of the improvement comes from the early minority of repetitions (thus all learning curves are steep). Even at very high repetition counts, tens or hundreds of thousands, further repetitions can produce a marginal improvement in consistency (thus all learning curves are long). This is of great interest to people who plan manufacturing or other similar operations, because they can do a little bit of experimentation to see how many repetitions a worker needs to do to obtain a couple of given levels of consistency. They can then fit a power-law curve through that data and predict how many repetitions will be needed to obtain another, higher, required level of consistency.
Actual learning curves are usually drawn showing some measure of error, or variation, starting at some high value and then dropping, very quickly at first, as the number of repetitions increases.
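To make that arithmetic concrete, here is a minimal sketch in Common Lisp, with invented numbers and function names of my own: fit a curve of the form e(n) = a * n^(-b) through two measured (repetitions, error-rate) points, then invert it to predict how many repetitions are needed for a target error rate.

    ;; Fit e(n) = a * n^(-b) through two measured points.
    ;; Returns A and B as multiple values.
    (defun fit-power-law (n1 e1 n2 e2)
      (let* ((b (/ (log (/ e1 e2)) (log (/ n2 n1))))
             (a (* e1 (expt n1 b))))
        (values a b)))

    ;; Invert e = a * n^(-b) to find how many repetitions reach TARGET-ERROR.
    (defun repetitions-for-error (a b target-error)
      (expt (/ a target-error) (/ 1 b)))

    ;; Suppose a worker's error rate is 10% after 100 repetitions and
    ;; 5% after 1000. How many repetitions until it falls to 2%?
    (multiple-value-bind (a b) (fit-power-law 100 0.10 1000 0.05)
      (repetitions-for-error a b 0.02))
    ;; => roughly 21000

The characteristic shape falls out of the fit: steep early on, nearly flat, but still falling, at high repetition counts.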
Which is great if large numbers of uniform repetitions are how you add value.
But, if we as programmers believe in automating all repetition, what then for the learning curve?
[*] Note: for those of you who don't like this trait, recall that you don't have to follow the links.
The Problem with Problems with Patterns
There seems to be a new venom in the informal blogosphere-wide movement against patterns. I'm not the only one to notice it.
Commentators are making statements like this:
In certain programming cultures, people consider Design Patterns to be a core set of practices that must be used to build software. It isn't a case of when you need to solve a problem best addressed by a design pattern, then use the pattern and refer to it by name. It's a case of always use design patterns. If your solution doesn't naturally conform to patterns, refactor it to patterns.

What Reg originally said was "Most people consider Design Patterns to be a core set of practices that must be used to build software." Most? Really most? I challenged this remarkable claim, and Reg subsequently re-worded it as you see above. Quite who's in this certain culture I don't know. And apparently neither does Reg:

We could argue whether this applies to most, or most Java, or most BigCo, or most overall but not most who are Agile, or most who have heard the phrase "Design Patterns" but have never read the book, or most who have read the book but cannot name any pattern not included in the book, or... But it seems off topic. So I have modified the phrase to be neutral.

But if that "certain culture" doesn't exist, then what's the problem? And if it does exist, then why is it so hard to pin down?
There are anecdotes:
When I was [...] TA for an intro programming class [...] I was asked to whip up a kind of web browser "shell" in Java. [...] Now, the first language that I learned was Smalltalk, which has historically been relatively design-pattern-free due to blocks and whatnot, and I had only learned Java as an afterthought while in college, so I coded in a slightly unorthodox way, making use of anonymous inner classes (i.e., shitty lambdas), reflection, and the like. I ended up with an extremely loosely coupled design that was extremely easy to extend; just unorthodox. When I gave it to the prof, his first reaction, on reading the code, was... utter bafflement. He and another TA actually went over what I wrote in an attempt to find and catalog the GoF patterns that I'd used when coding the application. Their conclusion after a fairly thorough review was that my main pattern was, "Code well."

So, maybe there are some teachers of programming (an emphatically different breed from industry practitioners) who over-emphasize patterns. That I can believe. And it even makes sense, because patterns are a tool for knowledge management, and academics are in the business of knowing stuff.
It's where Patterns came from
But what's this? "Smalltalk [...] has historically been relatively design-pattern-free due to blocks and whatnot." That's simply not true. In fact, the idea of using Alexander's pattern concept to capture recurrent software design ideas comes from a group that has a large intersection with the Smalltalk community. There's a Smalltalk-originated pattern called MVC; you may have heard of it being applied here and there. Smalltalk is pattern-free? Where would such confusion come from? I think that the mention of blocks is the key.

That Smalltalk has blocks (and the careful choice of the objects in the image) gives it a certain property: that application programmers can create their own control-flow structures. Smalltalk doesn't have if and while and what have you baked in. Lisps have this same property (although they do tend to have cond and similar baked in) through the mechanism of macros.
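As a sketch of what that looks like, here is an until loop (the name is mine) that any Common Lisp application programmer could define, and which is then indistinguishable in use from a built-in control structure:

    ;; A user-defined control structure: UNTIL, built on the
    ;; primitive DO by an ordinary application programmer.
    (defmacro until (test &body body)
      `(do ()
           (,test)
         ,@body))

    ;; Used exactly as if the language had always had it:
    (let ((n 0))
      (until (>= n 3)
        (print n)
        (incf n)))   ; prints 0, 1, 2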
Now, this is meant to mean that patterns disappear in such languages because design patterns are understood to be missing language features, and in Lisp and Smalltalk you can add those features in. So there shouldn't be any missing. The ur-text for this sort of thinking is Norvig's Design Patterns in Dynamic Programming.
The trouble is that he only shows that 16 of the 23 GoF patterns are "invisible or simpler" in Lisp because of various (default) Lisp language features. This has somehow been inflated in people's minds to be the equivalent of "there are no patterns in Lisp programs", where "Lisp" is understood these days as a placeholder for "my favourite dynamic language that's not nearly as flexible as Lisp".
But there are
And yet, there are patterns in Lisp. Dick Gabriel explains just how bizarre the world would have to be otherwise:

If there are no such things as patterns in the Lisp world, then there are no statements a master programmer can give as advice for junior programmers, and hence there is no difference at all between a beginning programmer and a master programmer in the Lisp world.

Because that's what patterns are about: they are stories about recurrent judgments, as made by the masters in some domain.
Just in passing, here's a little gem from Lisp. Macros are great, but Common Lisp macros have this property called variable capture, which can cause problems. There are a bunch of things you can do to work around this; On Lisp gives five. Wait, what? Surely this isn't a bunch of recurrent solutions to a problem in a context? No, that couldn't be. As it happens, it is possible to make it so that macros don't have this problem at all. What you get then is Scheme's hygienic macros, which the Common Lisp community doesn't seem too keen to adopt. Instead, they define variable capture to be a feature.
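For the unfamiliar, here is a minimal sketch of the problem, and of one of the stock workarounds (gensyms being among those On Lisp covers); the macro names are invented for illustration:

    ;; A naive macro that binds VAL so as not to evaluate A twice:
    (defmacro my-or (a b)
      `(let ((val ,a))
         (if val val ,b)))

    ;; It misbehaves whenever the caller happens to use the name VAL too:
    (let ((val 42))
      (my-or nil val))   ; => NIL, not 42: the macro's VAL captured ours

    ;; The workaround: bind a fresh, uncapturable symbol
    ;; manufactured by GENSYM.
    (defmacro my-or-safe (a b)
      (let ((v (gensym)))
        `(let ((,v ,a))
           (if ,v ,v ,b))))

    (let ((val 42))
      (my-or-safe nil val))  ; => 42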
Now, it's tempting to take these observations about the Lisp world and conclude that the whole "no patterns (are bunk because) in my language" thread is arrant nonsense and proceed on our way. But I think that this would be to make the mistake that the anti-patterns folk are making, which is to treat patterns as a primarily technical and not social phenomenon.
Actually, a little bit too social for my taste. I've been to a EuroPLoP and work-shopped (organizational) patterns of my own co-discovery, and it'll be a good while before I go to another one; rather too much of this sort of thing for my liking. But that's my problem; it seems to work for the regulars, and more power to them.
Contempt
Look again at Reg's non-definition of the "certain programming cultures":

[...] most, or most Java, or most BigCo, or most overall but not most who are Agile, or most who have heard the phrase "Design Patterns" but have never read the book, or most who have read the book but cannot name any pattern not included in the book, [...]

Doesn't it seem a little bit as if these pattern-deluded folks are mainly "programmers I don't respect"?
This is made pretty explicit in the writing of another high-profile pattern basher:
The other seminal industry book in software design was Design Patterns, which left a mark the width of a two-by-four on the faces of every programmer in the world, assuming the world contains only Java and C++ programmers, which they often do. [...] The only people who routinely get excited about Design Patterns are programmers, and only programmers who use certain languages. Perl programmers were, by and large, not very impressed with Design Patterns. However, Java programmers misattributed this; they concluded that Perl programmers must be slovenly [characters in a strained metaphor].

Just in case you're in any doubt, he adds:
I'll bet that by now you're just as glad as I am that we're not talking to Java programmers right now! Now that I've demonstrated one way (of many) in which they're utterly irrational, it should be pretty clear that their response isn't likely to be a rational one.

No, of course not.
A guy who works for Google once asked me why I thought it should be that, of all the Googlers who blog, Steve has the highest profile. It's probably got something to do with him being the only one who regularly calls a large segment of the industry idiots to their face.
And now don't we get down to it? If you use patterns in your work, then you must be an idiot. Not only are you working in one of those weak languages, but you don't even know enough to know that GoF is only a bunch of work-arounds for C++'s failings, right? I think Dick Gabriel's comment goes to the heart of it. If patterns are the master programmer's advice to the beginning programmer, codified, then to use patterns is to admit to being a beginner. And who'd want to do that?