Seduced by the drama?

Have you ever watched those shows on the Discovery channel (or similar) where the huge construction project goes a bit wrong? If it were more mainstream programming the story would pretty quickly stop being about the stuff and start being about the people, but since it is intended for 14-year-old boys of every age these programmes don't quite go down that route. Big yellow machines, yum! But they do always have that drama to them. Drama comes from conflict, and in these shows the conflict is between what the plan says should happen and the conditions in the world. A smooth and straightforward project would be less than gripping.

There are lots of those, but they wouldn't make great television. I find that many of these shows still don't make especially great television anyway because what actually happens is that, for example, a smooth technocrat (often German, the best sort) arrives, points out a few alternatives, gets everything back on track and off we go to a successful delivery. That's how it mostly is in the grown-up world of proper stuff. Risks materialise as issues (as it is expected they will, from time to time) and are dealt with in a calm and orderly way. There's the occasional stoppage, the odd bout of overtime, the best crane driver in the world has to be lured out of retirement for this one last job or whatever it may be. But there's no food fight. Food fight? Bear with me.

Scott Berkun has an essay out called "Ugly Teams Win", a part of the forthcoming Beautiful Teams. He presents an...interesting model:
[...] when things get tough, it's the ugly teams that win. People from ugly teams expect things to go wrong and show up anyway.
Well. He passes through some interesting observations about the, what shall we say? challenging personalities of several stellar individuals in several fields. You might be very happy to have a Picasso in your house, but to have had Picasso himself would be a different matter. Fine. Of course, a gang of stellar performers is a very different thing from a well-performing team.

However, Berkun's point is well made that the best people to have on your project for the good of the project might not appear to be the best people generally, in all sorts of ways. Building effective teams is a tricky business. He goes further, though:
The only use of beauty applied to teams that makes sense is the Japanese concept of wabi-sabi. Roughly, wabi-sabi means there is a special beauty found in things that have been used.
I'm not sure that it does, although things that have been used often end up wabi-sabi. Here's a description of wabi-sabi as I've come to understand it:
Wabi refers to that which is humble, simple, normal, and healthy, while sabi refers to elegant detachment and the rustic maturity that comes to something as it grows old. It is seen in the quiet loneliness of a garden in which the stones have become covered with moss or an old twig fence that seems to grow naturally from the ground. In the tearoom it is seen in the rusty tea kettle (sabi literally means rusty). The total effect of wabi and sabi is not gloominess or shabbiness, however, but one of peace and tranquillity
In summary "wabi-sabi refers to the delicate balance between the pleasure we get from things and the pleasure we get from freedom from things"

One interesting aspect of this is that things made from natural materials (wood, stone, leather) may acquire charm with wear, but things made from synthetic materials seem not to. Objects can become wabi-sabi through use, wear, or the simple passage of time and the natural processes that they take part in. Berkun wants that to apply to teams:
In this sense, the ugly teams I described at the beginning of this chapter, the underdog, the misfit, represent the wabi-sabi teams
That seems like a huge leap to me, especially when we find out what he means for a team to be used. Here's how he describes the early days of a project, the "Channels" functionality of IE 4:
The deals we made forced legal contracts into the hands of the development team: the use of data [...] had many restrictions and we had to follow them, despite the fact that few doing the design work had seen them before they were signed. Like the day the Titanic set sail with thousands of defective rivets, our fate was sealed well before the screaming began. Despite months of work, the [...] team failed to deliver. The demos were embarrassing. The answers to basic questions were worse.
"Our fate was sealed well before the screaming began" Oooh-kay. And this is the steady-state:
somewhere in our fourth reorg, under our third general manager and with our fifth project manager for Channels, the gallows humor began. It is here that the seeds of team wabi-sabi are sown. Pushed so far beyond what any of us expected, our sense of humor shifted into black-death Beckett mode. It began when we were facing yet another ridiculous, idiotic, self-destructive decision where all options were comically bad. "Feel the love," someone would say.
At least one of Berkun and I have completely and utterly failed to understand what wabi-sabi means.

But that's the least of the issues I have with this. It looks to me as if this team is not being used in the way that an old shoe was used (and so gained its comfort, charm, personality and identity). This team is being abused. Here's how they ended up:
Late in the project, I became the sixth, and last, program manager for Channels. My job was to get something out quickly for the final beta release, and do what damage control I could before it went out the door in the final release. When we pulled it off and found a mostly positive response from the world, we had the craziest ship party I'd ever seen. It wasn't the champagne, or the venue, or even how many people showed up. It was how little of the many tables of food was eaten: in just a few minutes, most of it had been lovingly thrown at teammates and managers.
And there's the food-fight. Don't be distracted by the hysterical high-jinks (as bad a sign as they are). Note that they've had six programme managers by this point. The end-point of the project is described in terms of damage control. Does that sound like the wabi-sabi of the moss-covered stone lantern in the quiet garden? Does it in fact sound like an in any way attractive or desirable outcome? Berkun certainly seems to want to call this a success. He says "The few who remained to work on Internet Explorer 5.0 had a special bond". No doubt they did, no doubt they did.

Here's where this story starts to turn my stomach a little. You see, after going through this dreadful experience "the few that remained" went on to make "Internet Explorer 5.0 [...] the best project team I'd ever work on, and one of the best software releases in Microsoft's history". Maybe so. But at what cost? The majority that didn't remain (it's hard not to imagine them being considered washouts, dropouts, failures), how did they feel about being placed in this outrageous position on IE4? And what are we to infer about teams toughing it out through pre-doomed projects?

One of the few cogent points to have emerged from the recent spasm of interest in so-called Software Craftsmanship is the idea that a "craftsman" has a line that they will not cross, things that they will not do. I tend to agree. I think that the industry would be in globally better shape if more people were prepared to locally say "no" to destructive madness of, well, of exactly the kind reported here for IE4 Channels. I don't consider the members of that team heroic for having made it through and bonded and all that stuff. Well, certainly not "the few". I'm inferring that some folks gave up on this project, walked away from the screaming and got on with something less harmful to themselves. Those are the folks I want to celebrate. And at every scale.

I'm very disturbed that a story such as this one is going to make it into a book about "beautiful teams", even if the point of it is supposed to be the subsequent success of the IE 5 team after their traumatic bonding experience.

This story celebrates failure. And it celebrates a particularly seductive kind of failure, one with which the IT industry is riddled. It celebrates a macho bullshit kind of failure that looks like success to stupid, evil people. It celebrates a kind of failure that too many programmers have come to (secretly) enjoy, and that too many businesses have come to expect that their programmers will (secretly) enjoy and therefore put up with.

In the grown-up world of proper stuff stupid, doomed, destructive projects get cancelled. And that is a successful outcome. We should do more of that.




Life in APL, programming in a live environment

This gorgeous screencast of an APL session shows an implementation of Conway's Game of Life being derived in APL.

It's a very delicious demonstration of what can be achieved with 1) a "live" computational environment (rather than a mere language 'n' platform), 2) a language that works by evaluating expressions (rather than merely marshalling operations) and 3) a system that already knows how to show you what complicated values look like (because it understands its work to be symbolic, not operational).
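
For anyone who hasn't watched it, the rule being derived is just the standard Life update: a cell is alive in the next generation if it has exactly three live neighbours, or two if it is already alive. A minimal sketch of that same rule, in Haskell rather than the APL of the screencast (the names and the bounded list-of-lists grid are my own choices, nothing to do with the Dyalog code):

    -- One Life generation on a bounded grid of Booleans. Illustrative only;
    -- the screencast derives the equivalent as a short APL expression.
    type Grid = [[Bool]]

    step :: Grid -> Grid
    step g =
      [ [ rule (g !! y !! x) (liveNeighbours x y) | x <- [0 .. width - 1] ]
      | y <- [0 .. height - 1] ]
      where
        height = length g
        width  = if null g then 0 else length (head g)
        alive x y =
          x >= 0 && x < width && y >= 0 && y < height && g !! y !! x
        liveNeighbours x y =
          length [ () | dx <- [-1, 0, 1], dy <- [-1, 0, 1]
                      , (dx, dy) /= (0, 0), alive (x + dx) (y + dy) ]
        rule wasAlive n = n == 3 || (wasAlive && n == 2)

The contrast in bulk between that and what appears on screen in the video is, more or less, the point.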

Ponder what the Dyalog folks (no affiliation) are showing you there, ponder just how far towards that same goal you'd get in whatever programming system you use at work in 7 minutes 47 seconds (even if you'd done as much rehearsal as I'm sure they did) and then ponder the state of our industry.

Then have a stiff drink.

Agile Coaches Gathering

I'm going to the Agile Coaches Gathering at (and partly in aid of) Bletchley Park in May. I commend the event to you.

The Lives of Others, the Lives of Ourselves

I've kept to a rule with this blog, that the posts on it should be of a broadly technical and mostly professional nature. I'm going to break that rule now.

Some time ago I watched the film Das Leben der Anderen, a bitter-sweet offering from Florian Henckel von Donnersmarck. It's a love story, of several kinds, and a thriller and a little bit of (and a little bit of a response to) an Ostalgie comedy. In one scene Hauptmann Wiesler, a Stasi interrogator and surveillance officer, teaches a class of prospective secret policemen. He puts great emphasis on the importance of the handling of the seat cover after an interview. This was a real thing: during interrogation a cloth would be used to absorb the personal odours of the subject and then sealed in a jar. When, not if, the Stasi needed to find you the jar would be unsealed and used to give tracker dogs your scent.

I recall hearing about this in a John Peel piece many years ago. He went to interview some punks in the former DDR and they spoke of this and many other horrors of living in an authoritarian surveillance society. And this sort of thing was considered a genuine horror. A fine example of why it was worth enduring the grinding fear of the Cold War: to avoid living in that sort of society.

Between the time of watching that film and writing this piece I got into a conversation with a Daily Mail type of fellow, who was complaining that "the UK is becoming a complete police state. Like[sic], on the level of Nazi Germany". This is well known to be a bad way to argue. My response was that it wasn't really, not even on the level of Communist Germany; go watch Das Leben der Anderen to see what a real police state looked like.

Today I'm thinking that I might owe that guy a little bit of an apology. This report [pdf] from the Rowntree Reform Trust makes dismaying reading:
  • A quarter of the public-sector databases reviewed are almost certainly illegal under human rights or data protection law
  • Fewer than 15% of the public databases assessed in this report are effective, proportionate and necessary, with a proper legal basis for any privacy intrusions
  • The benefits claimed for data sharing are often illusory.
Well, joy.

Top of the list of problematical databases is the National DNA Database. A little background: England and Wales (Scotland and Northern Ireland have their own arrangements) are Common Law jurisdictions and, like others of that kind, used to have a distinction between summary offences, misdemeanours and felonies. We don't any more (much as we don't have Grand Juries, either); we only have summary vs indictable offences. But that only applies if you are brought to trial.

We used also to have a distinction between arrestable offences (effectively, felonies) and non-arrestable offences. Over the years so many more offences have been created that this distinction was seen as unhelpful and at first it was blurred and eventually, in the Serious Organised Crime and Police Act 2005, abolished.

Under section 110 of that act a constable may arrest, without warrant, anyone he or she "has reasonable grounds for suspecting to be about to commit an offence" (any offence at all, remember, however minor) if this is "necessary". For example "to enable the name of the person in question to be ascertained" (and for various other reasons listed in the Act). What this amounts to is a summary power of arrest, of anyone, at any time. This was spectacularly under-reported at the time. And here's the punchline: everyone who is arrested is obliged to give a DNA sample and that goes on the National DNA Database.

And it stays there. It stays there if you are released without charge. It stays there even if you are charged, go to trial and are acquitted. This was already known to be in breach of the European Convention on Human Rights. In that unanimous ruling the European Court of Human Rights made a number of interesting statements:
The Court was struck by the blanket and indiscriminate nature of the power of retention in England and Wales. In particular, the data in question could be retained irrespective of the nature or gravity of the offence with which the individual was originally suspected or of the age of the suspected offender; the retention was not time-limited; and there existed only limited possibilities for an acquitted individual to have the data removed from the nationwide database or to have the materials destroyed.
And indeed it turns out to be non-trivial to get the DNA of innocent persons removed from the database.

Also,
The Court noted that England, Wales and Northern Ireland appeared to be the only jurisdictions within the Council of Europe to allow the indefinite retention of fingerprint and DNA material of any person of any age suspected of any recordable offence.
and
The Court expressed a particular concern at the risk of stigmatisation, stemming from the fact that persons in the position of the applicants, who had not been convicted of any offence and were entitled to the presumption of innocence, were treated in the same way as convicted persons.
So we are, here in the UK, faced with a police system that in this regard at least makes no distinction between the innocent and the guilty. And on a grand scale. The full version[pdf] of the RRT report states that
Over half a million innocent people (people not convicted, reprimanded, given a final warning or cautioned, and with no proceedings pending against them) – including over 39,000 children – are now on the database
and to no good effect
doubling the number of people on the database from about 2m to about 4m has not increased the proportion of crimes solved using DNA, which remains steady at about 1 in 300. Indeed, in 2007 the number actually fell slightly
I'm no longer sure that I know how this is much different from the Stasi bottling seat covers.

Except maybe that we do still have a chance to do something about it. A shame that we must appeal to the European institutions for protection from our own executive. Time to join Liberty. Hope I'm not too late.

Slipping through our fingers?

It was close. So close.

What's the really exciting thing about the Agile development movement? Is it getting to ship working software? That's pretty good. Is it not having to lie about status all the time? That's great too! Is it being able (and allowed, and required) to take true responsibility for our work? Wonderful stuff! But I don't think these are the best part.

I think that the best part is that we were starting to normalise the idea that programmers are valuable knowledge workers who collaborate with other valuable workers in the business as peers. Big chunks of the industry had to (re)learn how to do all that shipping and telling the truth and stuff in order to get to that point, but that's only the means. The end is to be dealt with by our paymasters as if we make a genuine contribution to our businesses. And even that is only a starting point for the really interesting stuff.

And now along come folks shouting about Lean. A lot of them are Agile folks looking (quite properly) for what to do next that's better. And the message seems to be: programmers are operators of a technical workflow, and nothing else. They pull to-do items from a queue (which someone else manages) and work them through various states of a purely technical nature until they can be handed off into production (where someone else creates value using them). Again and again and again forever. And it is claimed that if they are organised this way then they will be more efficient at shipping those to-do items than if organised in other ways. This is almost certainly true.

In which case it is going to be hard to avoid being shoe-horned into that mode once the development managers wake up to it. Back down into the engine room. Back to being, rather more literally than before, a cog in a machine.

The Agile approach to developing software (or, something a lot like it) is gaining more and more interest because, along with the stuff about getting to look up at the stars, it is actually a more productive way to develop than a lot of the incumbent approaches. If Lean is more productive again but doesn't even have that social stuff as a hidden agenda, I'm not sure that we as inhabitants of this corner of the industry will turn out to have done ourselves much of a favour.

Fauxtrovert

It took me a long time to overcome my distaste of blogs. I'm still not a huge fan but writing peripatetic axiom does seem to be better than useless. After a certain amount of prodding I've started to dabble with twitter.

I'm not finding it easy. One problem is that a lot of the time I'm doing things that I'm not supposed to tell anyone else about (because they are commercially sensitive) and a lot of the rest of the time I'm doing things that I just can't believe anyone would be interested to know about. That second part would seem to be a part of the introvert type.

As with blogs in the early days ("I like beer", "isn't my cat cute" etc.) the art of twittering all the miscellaneous stuff of one's life seems fairly pointless in a way I've found difficult to explain. But now redditor shenpen has expressed it very well for me. The twitter stance would appear often to be not introversion, and not extroversion, but fauxtroversion.

Lumpy Kanban

The coalescing of bits and bobs in this posting to the kanbandev Y! group has brought me to a realisation of why Kanban-for-software seems to make me cringe so much. I thought that I'd replied in the group, but apparently not, so I'll do it here. Reply now here.

Versus what I'm used to seeing (expect to see, want to see) in an Agile development organisation, the Kanban that I've seen explained in numerous posts, conference sessions and so forth has far too large a batch size. There's far too much work in progress. The flow of value is far too uneven. I mean, really, the Kanbanistas want us to organise our work around something as big and lumpy and lengthy as an entire feature!?

Consulting Engineers

For the last couple of years I've lived at the bottom of Sydenham Hill (on the Kent side). At the top is the place to which the Crystal Palace was moved after the Great Exhibition. The Palace and the park built around it had many water features and for this and other reasons water towers were erected to give sufficient head of pressure, something otherwise hard to achieve at the top of the biggest hill for miles around. Big park, lots of features, huge towers.

The engineer who built these towers proved to be mentally unsound and the towers structurally unsound, so Brunel was brought in to rebuild them (which might happen again). The story goes that he was at first most reluctant to pass judgement on the work of another engineer, however bonkers. Gentlemen didn't do that to one another. 

This was in 1855, when engineering as a profession was young and about twenty years before the Tay Bridge disaster firmly fixed in people's heads the idea that one engineer won't do. Much as even accountancy firms themselves have to get an accountant in to audit their books, on a big enough job engineering firms get one another in to check they've got their sums right. This is the work of Consulting Engineers.

In the IT world we bandy the word "consultant" around with a certain amount of levity. In the US a consultant doesn't really seem much different from what in the UK we call a contractor: in everything but name a temporary employee. In the UK there is a slight difference, which a contractor once expressed to me (a consultant) in this way: contractors don't have to write reports.

More generally, contractors build the system the way they are told to; consultants have an opinion about how the system should be built. This distinction is not lost on the tax authorities.

Anyway, what with all this talk of "craftsmanship" there's been about the place recently, thoughts naturally turn towards the more mature disciplines. A significant aspect of those, in many cases, is that it's awfully hard to get certified to practise.

That's in part so that individuals can bear some liability if the job goes tits-up, because the certifying bodies certainly will. This is as imperfect as any human institution, but at least they try. Another aspect of this is genuine consultancy. If you, you personally, are going to bear liability for it all going pear-shaped then maybe you should spread some of the blame load by getting someone else in.

Notice that in the medical field, if you aren't sure of a diagnosis then you can ask for a second opinion. If your doctor isn't sure, they'll go and get one by themselves.

This seems not to happen much in the IT field. Perhaps another symptom of our immaturity? When was the last time you worked on a project where the prime contractor brought in a competitor to check that they'd "got their sums right"?

Have we had our Tay Bridge yet? Maybe. In some domains, avionics, for instance, you do hear of multiple independent implementations. That's not quite what I mean, though. 

Why don't IT companies running projects (of a certain size or complexity) routinely get the competition in to express an opinion? Why don't clients demand this as part of their risk mitigation strategy? What is it going to take for folks to bring a genuinely professional standard of conduct to IT?

Update: it's two days later and no-one has thrown down the gauntlet. I was expecting some bright individual to come back at me with a "you work for a consultancy, so you tell us why this doesn't happen". But no-one seems to be biting....

The Rush to Lean Makes Me Nervous

And I'm not the only one.

Now, I am not an expert on manufacturing, but I have seen some. Software development is not manufacturing. I don't think that it's even very much like manufacturing. Manufacturing is about achieving uniformity across huge numbers of multiples and making some trade-off between the highest and the most economic rate of production of those multiples. 

Software development is about making one.

I believe that software development is more like product development than manufacturing.

I'm not an expert in product development either, but I work for a company that, amongst other things, does develop products. Product development is very different from manufacturing (which we don't do), although the two are very closely related. A big part of product development is to arrive at a design that can be effectively manufactured. 

Unfortunately a lot of our clients don't want you to know that we designed their products, so I can't brag too much. Which is a shame as a lot of them are waaaay cool. However, this flier[pdf] is public and describes one episode of product development. The flier mentions PEP, our process for developing products. As a deliberate policy we try to keep the processes that we use to develop products and the processes we use to execute software development projects as closely aligned as possible. 

Both are iterative. Both involve exploration. Both are adapted to circumstances while maintaining certain principles (such as actively managing risk). Both have delivered a very high success rate over many, many engagements. So I feel on fairly firm ground in claiming this similarity between software development and product development.

As Steve Freeman points out, by a strict Lean interpretation of the manufacturing school product development looks wasteful. And it is. And that's OK because it isn't manufacturing. The economics and the goals are different.

It's worth noting that the highly-publicised Lean successes seem to be concerned largely with operational activities: never-ending, on-going care and feeding of an existing system. To the extent that your activities are like that, a more Lean approach is likely to work well for you, I think. I've yet to hear of a success story of strict Lean (no iterations, no retrospectives, all the rest of it) run in a project setting to produce a new product.

I don't say it can't be done, but I've not heard of it. If you have, I'd love to hear about it.


Let over Lambda

Let over Lambda is a new self–published book on lisp by Doug Hoyte. I'm not quite sure what to make of it, overall.

It's great to see a book full of straight-ahead programming these days rather than mere system composition. It's especially great to see an extended work dealing with programming in the small. It's great fun to see someone who really likes programming as an activity in its own right exhibit their enjoyment. It's a great pleasure to see assembly language appear in a programming book of the 21st century. I find the curious combination of lisp being at once both a very high level language suited to symbolic programming and being very close to the metal most stimulating. That's a pair of properties that the programming language design mainstream seems to have abandoned. Java, for instance, doesn't feel particularly close even to the JVM.

It's especially great to see a lisp aficionado standing up for vi.

Assembler arises in a couple of spots where the impact of macros, the parsimony of lisp's internal representations and the intelligence of the lisp compiler combine to collapse quite sophisticated source code down to startlingly few opcodes. Which is all very fascinating. So much so that I was inspired to resurrect the SBCL installation on my Mac and go refresh my memory of how cute the disassemble function is. However, it feels to me as if an opportunity has been lost to take that just a bit further and come up with some Programming Pearls–like startling observations about performance.

The book builds up to a very interesting exercise in implementing Forth. It's very nicely done and a great illustration of how easy it is to implement one interesting language given another. Lisp/Prolog used to be the canonical pair, I think. This illustration makes a good case for lisp/forth being roughly as illuminating.

Along the way there are several not–quite–consistent claims about what the book is for and the big build up to the alleged principle of “duality of syntax” is a very long run for a very short slide. Again, it feels as if an opportunity to do something really startling has been lost. There's a sort of plonking “here it is” presentation of this and other material. It's often good and interesting material, but needs a little bit more development in places.

It's perhaps not so great to have the, what shall I call it? unfettered enthusiasm of the author for lisp, macros and all that they imply coming at you un–moderated. I don't think that a commercial editor would have allowed quite so much polemic to make it onto the page. There's a bit too much direct quotation of Paul Graham material (“Blub”, “secret weapon”, you know the sort of stuff) that makes it quite clear that there are on the one hand people who get it, and on the other dullards. This is made very explicit on the back cover blurb:
Only the top percentile of programmers use lisp and if you can understand this book you are in the top percentile of lisp programmers.
Hmm. I have a strong feeling that I understand most of what's in the book and also that I'm not in the top of the top. Whatever that means. I'm not even a “lisp programmer” in any very serious sense of the term. Faced with a little light programming to do then in the absence of any other constraints I'm likely to reach for Scheme—and that brings me to another item that a commercial editor probably wouldn't have let through.

You might imagine that the differing trade–offs made in Scheme and Common Lisp are something that reasonable people could agree to disagree about. Hoyte wants his reader to understand very clearly that this is not so: the choices made in Scheme are wrong (emphasis in the original) and those made in CL are right (emphasis also in the original). The first one of these assertions was amusing enough. The second, not so much. And they just keep on coming. Hoyte is far too young to be settling scores from some X3J13 punch–up, which would be embarrassing enough, so it all ends up looking a bit adolescent to me.

One last thing...at least in the print–on–demand form I've got from Lightning Source UK the book looks absolutely ghastly. “Made with lisp” says the front matter. Lisp with a lot of help from TeX and that's really not good enough for 2009, not without a lot more tuning than has gone into this. And Lightning Source (or whoever did the camera-ready copy) have originated the work at too low a resolution. That and the lazy choice of CMR combined with the glossy toner makes the actual print a less than comfortable read. Self–publishing has a long way to go yet.

Genius at work

Over the years there has been a lot of propaganda claiming that Google both creates the only working environment in which double-super-genius types can function at peak effectiveness and then goes out of its way to hire only double-super-geniuses to work in it. How, then, does this happen:
We periodically receive updates to that list [of malevolent sites] and received one such update to release on the site this morning. Unfortunately (and here's the human error), the URL of '/' was mistakenly checked in as a value to the file and '/' expands to all URLs.
This sort of thing happens because of bad processes. It doesn't matter how smart folks are, if they work in a bad way, poor results will ensue.

Of course, I don't know exactly what occurred, but I'll bet you a pound that one lone double-super-genius was left to make changes to this file by themselves.
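
To make the point concrete: even a trivial automated gate on that file, run before anything ships, would have screamed. A hypothetical sketch in Haskell (the entry format and every name here are my invention; I have no idea what the real file looks like):

    -- Refuse to release a blocklist update containing catch-all patterns.
    checkBlocklist :: [String] -> Either String [String]
    checkBlocklist entries =
      case filter catchAll entries of
        []  -> Right entries
        bad -> Left ("refusing catch-all entries: " ++ show bad)
      where
        catchAll pat = pat `elem` ["", "/", "*"]

The check itself is almost nothing; the process question is whether anyone, or anything, is obliged to run it before a change goes out the door.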

Update: Jonathan Wolter raises some questions about testing such changes.

Self: the movie

My recent posting on Self provoked some interest on reddit and one keen redditor found the old Self movie. I first learned about Self when one of the giant pulsing brains from Manchester university came down to a BCS meeting in London (When did the BCS stop doing things that interesting? Why?) and showed us this movie. Great days.

Dynamic vs static: once more with feeling

In what feels to me like a voyage through a time-warp to the beginnings of my programming career in the mid–90's, Jason Gorman has revived the old static vs dynamic typing debate. 

Oh, how it all comes flooding back: "strong[sic] typing is for weak minds", "static type systems catch the kind of bug that managers understand" etc. etc. etc.

Jason's concerns seem to have been raised by something to do with this sort of thing (see the part on the var keyword) although he seems to muddle up a language having a dynamic type system with it being dynamic. These aspects are closely related, but are not quite the same (as that post explains reasonably well). It's picking nits of this variety that keeps us all in work.

Anyway, I very much agree with Jason that this resurgence of interest in dynamicish, scripty languages is driven largely by fashion, and that claims made about it should be closely examined. I'm not so sure about the rest of his argument. Not least because I'm not sure that what he complains about:
Proponents of such languages cite the relative flexibility of dynamic typing compared to statically-typed languages like Java and C++. Type safety, they argue, can be achieved through unit testing.
is actually being said by anyone. Type safety through unit testing? Really? Maybe someone is, and I haven't seen it. I'd be interested to see it if anyone has a link.

Personally, I do tend towards dynamic languages and away from manifest static type systems. My preference away from manifest static typing is that (in the words of, IIRC, Kent Beck) they make me say things that I don't know are true. On the other hand I've been dabbling a bit with Haskell, which has a very strong, very expressive static type system and offers the promise (through type inferencing) to not require all those pesky declarations. That has been both educational and fun. Unfortunately, as Nat pointed out in another context, that lovely promise might not be delivered upon:
[in Haskell] If you don't write explicit type constraints you can end up with a type inference error somewhere in your code. Where? The only way to find out is to incrementally add explicit type constraints (using binary chop, for example) to narrow down where the error is. It's not much different, and no easier, than using printf for debugging C code. 
If this is the best case of static typing, then we have a problem.
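
A contrived example of the effect Nat describes, mine not his, with the compiler's wording paraphrased. Leave everything to inference and a mistake in one definition surfaces as a complaint about a different one:

    -- The actual bug: threshold was meant to be a number.
    threshold = "0.5"

    isHot t   = t > threshold      -- quietly inferred as String -> Bool
    report ts = filter isHot ts    -- quietly inferred as [String] -> [String]

    summary   = report [0.3, 0.9]  -- rejected here, with a complaint about
                                   -- there being no Fractional instance for [Char]

    -- Adding an explicit constraint, e.g. isHot :: Double -> Bool, moves the
    -- complaint next to the real mistake; repeat as needed, as per the quote.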

Meanwhile, let's consider the distinction between systems programming and application programming. Try googling around the various attempts to nail down that distinction. I don't find any of them terribly satisfactory. For me, the crucial distinction is that the system programmer must allow for any possible use of their code, whereas an application programmer does not.

This means that in systems land a function declared like f :: int -> int must, unless very carefully specified otherwise, be known to do the right thing at every element of the whole of the Cartesian product of the int type with itself. But in application land we might, for example, know that those ints are really the numbers of the days in the week, so we only really need the function to do the right thing over {0,1,2,3,4,5,6}². Demonstrating those two different kinds of correctness requires different techniques, I think.

Of course, using int when you mean {0,1,2,3,4,5,6} is a smell. And here is the deodorant.
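
Whatever the linked deodorant actually is, in Haskell-ish terms it looks something like this (my sketch, not the linked one): give the application-level set its own type, so that "every element of the type" and "every value we care about" become the same thing.

    data Day = Mon | Tue | Wed | Thu | Fri | Sat | Sun
      deriving (Show, Eq, Ord, Enum, Bounded)

    -- A function over Day has only 7 inputs (49 for two arguments) to be
    -- right about, and they can even be checked exhaustively.
    nextWorkingDay :: Day -> Day
    nextWorkingDay Fri = Mon
    nextWorkingDay Sat = Mon
    nextWorkingDay Sun = Mon
    nextWorkingDay d   = succ d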

I want to join these two things up. I want to make a connection between the kind of correctness that systems code needs to have and the way that static typing might help us with that, versus the kind of correctness that application code needs to have and the way that unit testing might help us with that. But I'm not there yet. Watch this space.

Self

So, Self continues to be developed and has a very nice new site, here. Self is like Smalltalk "only more so". 

These days Self runs like a dream (and blindingly fast) on the Mac. Back in the day Self would only run on Sparc hardware, so I bought a Sun workstation for the express purpose of being able to play with Self. A wise investment. If a point of view is worth 80 IQ points then learning Self is a much better bet than all that brain training and smart drinks. 

If you want to understand how powerful (and fun and (partly, therefore) enticing) a system can be if you trust your users to manipulate objects directly, take a look at Self. If you want to understand just how far the notion of an Integrated Development Environment can be pushed (and how much fun that turns out to be) take a look at Self. If you want to understand why some folks consider Javascript to be such a huge missed opportunity, take a look at Self.


If you aren't part of the solution...

At XtC the other night Mark McKergow presented the Solutions Focus approach to...stuff. 

"Solution" here seems to mean those actions observed to take you in the direction of a more pleasing state of the world and is contrasted with the more traditional "problem" which would be the reasons for the mis–match between what you have now and what you want. The suggestion is to focus on the former and overlook the latter. While we could identify some pretty nasty failure modes for this approach it also has a lot going for it. In particular, this line of thinking reenforces the too–little practised technique of recording what went well during your retrospectives and taking an action to do more of that.

Meanwhile two items from the discussion caught my attention.

Gary Player said (as reported by David Anderson) that when a good player makes a great stroke he swings again to fix the memory of what the good stroke felt like, whereas a poor player swings again after a bad stroke to try to identify what went wrong. Given what the learning curve tells us about repetition and consistency this suddenly made Solutions Focus make a lot more sense to me. 

It was observed that a lot of the examples given of the effectiveness of Solutions Focus were concerned with its application to psychotherapy. Asked why this should be Mark opined that therapy is easier than management consulting. Nice.

Examples, Exemplars, Gauges, Tests

At XpDay London 2008 I gave a lightning talk that turned into an open space session about examples and why they work so well. Some folks asked for some more information and some is available in this presentation [pdf] from Agile 2008, and also this old post.

There are some gaps in the presentation where we did exercises. The first is to ask why Jaffa Cakes are cakes and not biscuits. The others encouraged attendees to explore firstly some of the limitations of definitions and then some of the power of examples.

Enjoy.

(And sorry it took me so long to get around to posting this)

An unreasonably high bar?

Some time ago Adam Goucher posted this response to my 5 questions interview with Michael Hunter. There are a few points in there that I want to come back to, but right now the one that's at the front of my mind is this:
Count tests to get a useless number; I can write a million tests that provide useless information but still shows 7 figures in the count.
Well yes, you could. But why would you? We seem to have a hankering in the industry for techniques that would give good results even when badly applied by malicious idiots. That seems unreasonable. And also pointless: I don't believe that the industry is populated by malicious idiots. On the other hand, the kind of answer one gets depends a lot on how a question is asked. 

There is (I read somewhere recently) a principle in economics that one cannot use one number as both a measure of and a target for the same thing and expect anything sensible to happen. [Allan tells me that this is Goodhart's Law --kb] In our world this is the route to the gaming of metrics. I also don't believe that gaming works by folks consciously sitting down and conspiring to fabricate results. I do believe that if we measure, say, test coverage at every check-in and publish it on our whizzy CI server dashboard thingy and have a trend line of coverage over time and we talk a lot about higher coverage being better, or even that test coverage has something to do with "quality" [that would be the "surrogate measure" part of Goodhart's Law --kb] then it is in fact the response of a smart and well intentioned team member to write more tests to get the number up. Even if those tests turn out not to be much use for anything else.

I think (certainly I hope so) that my recommendation to measure scope by counting tests doesn't fall into that trap. Don't write the tests so that you can measure scope. But observe that you can if you write the tests the right way. Of which I shall have a bit more to say later.

Soft-Craft

So, on Tuesday night the programme for Software Craftsmanship was worked out. Looks pretty good, I think: there's plenty of the doing sessions that Jason wanted with a reasonable counterbalance of more reflective activities.

I was quite amused by the metaphor collision that's taking place, though. It seems that some people find that the best way to move forward the craft of software development is to head down the dojo for some sparring. I'm far from convinced that either of those metaphors is much help by itself. To see them embedded in one another makes my head spin.

I look forward to the day when we feel secure enough as an industry to have a programming conference where people's ability to do programming is improved. 

Skill vs Mastery

Since it's the holidays I get to tourist around London a bit in a way that I tend not to much through the rest of the year. Today I took in the Rothko show at Tate Modern. It's the first time in a long time (maybe ever, or at least since Rothko gave them away) that so many of the colour field paintings have been in the same place at the same time.

Although these paintings are (at first sight) relatively featureless the viewer is intended to get right up close to them, to feel as if they are in the painting, to be overwhelmed. Although they are not figurative or representational they do carry a lot of content, a lot of emotional impact. There were people sitting in the room where the Seagram murals are, completely transfixed by the paintings. People have been known to burst into tears at the sight of these works.

This sort of modern art is perhaps one of the easiest for the "my four-year-old could do better" school of art critic to avoid thinking about. Of course, their four-year-old couldn't. One room of the exhibition goes some way towards explaining why not. These paintings are on (very) close examination revealed to be extremely complicated objects. Rothko put a lot of technique into these paintings: a lot of careful choice of materials, a lot of careful surface treatments, a lot of careful brush work.  In the language of the curators most of them aren't a "painting" at all. They are mixed media on canvas, so complicated is the structure.

But none of this shows up when one stands before the works, only the emotional effect comes through. That's mastery. As with painting, as Steve says over here with reference to music (and code), it's all about the expressiveness.