Designing in stupidity

About this time last year I posted a very short entry, “On being less stupid”, quoting from Brecht's Galileo:
“Truth is the child of time, not authority. Our ignorance is infinite, let's whittle away just one cubic millimetre. Why should we want to be so clever when at long last we have a chance of being a little less stupid?”

I was reminded of this reading a piece in the FT a few weeks ago by Carne Ross about the processes leading up to the Iraq war (subscription only, I'm afraid, though there are accessible versions floating around on the web). Carne Ross was, as he says, “… from 1998 to 2002, the British expert on Iraq for the UK delegation to the UN Security Council, responsible for policy on both weapons inspections and sanctions against Iraq.” He goes on to say, “My experience in those years and what happened subsequently is in part why I recently resigned from the Foreign Office.”
What concerned him about the work he was doing and what he observed in others was the way that:
“Evidence is selected from the available mass, contradictions are excised, and the selected data are repeated, rephrased, polished (spun, if you prefer), until it seems neat, coherent and convincing, to the extent that those presenting it may believe it fully themselves.”
He gives as an example the argument at the UN between the opponents of sanctions and those who supported a more aggressive stance against the Iraqi regime, which:
“… illustrates how governments and their officials can compose convincing versions of the truth, filled with more or less verifiable facts, and yet be entirely wrong. I did not make up lies about Hussein's smuggling or obstruction of the UN's humanitarian programme. The speeches I drafted for the Security Council and my telegrams back to London were composed of facts filtered from the stacks of reports and intelligence that daily hit my desk. As I read these reports, facts and judgements that contradicted our version of events would almost literally fade into nothingness. Facts that reinforced our narrative would stand out to me almost as if highlighted, to be later deployed by me, my ambassador and my ministers like hand grenades in the diplomatic trench warfare. Details in otherwise complex reports would be extracted to be telegraphed back to London, where they would be inserted into ministerial briefings or press articles. A complicated picture was reduced to a selection of facts that became factoids, such as the suggestion that Hussein imported huge quantities of whisky or built a dozen palaces, validated by constant repetition: true, but not the whole truth.”
Contrast this with a quote by Natalie Angier from an earlier post of mine:
“…’One of the first things you learn in science’, one Caltech biologist told me, ‘is that how you want it to be doesn't make any difference’. This is a powerful principle, and a very good thing, even a beautiful thing. This is something we should embrace as the best part of ourselves, our willingness to see the world as it is, not as we're told it is, nor as our confectionary fantasies might wish it to be.”

Of course, even those engaged in the scientific enterprise are as prone as the rest of us to filter out unwelcome news, but at least the scientific tradition shows some awareness of this tendency and builds in steps to counter it.
What I fear is that in many other human organisations and enterprises we have failed to build in such steps and are, in effect, designing in stupidity. By stupidity I mean, in the words of Dudley Lynch and Paul L. Kordis, “the inability of the brain or any other part of nature to accept useful information, learn from it, and act intelligently on it.”
In other words, what I am suggesting is that in many of our organisations, both public and private, we have created situations which make intelligent action more difficult. This is not because the people within them lack ability, but because the system they are operating within pulls against appropriate action.
Now, of course, you could argue against this and point out that on the whole things seem to work, and you would be right. We can do much that is stupid and still seem to be all right.
But thinking back to my earlier post on Jared Diamond's “Collapse”, where he describes earlier civilisations in which everything seemed to be all right until they went into fatal collapse, I wonder whether it might not be smarter to try to design intelligence into our organisations and institutions, rather than hoping everything will be OK and continuing to design stupidity in.