Archive for the ‘Iraq’ Category
Lizzy Siddal reviews Don’t Get Fooled Again
We all know that we shouldn’t believe everything we read in the press or hear on the news, don’t we? What’s the definition of lying? Inventing stories, misappropriating the truth, lies of omission, spin?
Richard Wilson’s Don’t Get Fooled Again is an illuminating compilation of methods and examples from both the 20th and 21st centuries in which governments and the general public have been duped by flawed thinking:
a) Pseudo-science – 30 million deaths in China as a result of adopting Lysenko’s agricultural reforms (already the cause of millions of deaths in Stalin’s Russia!).
b) Relativism – the uncounted deaths in Africa resulting from the success of those who deny the existence of HIV and AIDS.
c) The power of vested interests and commercial journalism – the decades-long controversy over whether smoking was bad for your health.
d) Groupthink – spiralling terrorism leading inevitably to the second Iraq war and the excesses at Abu Ghraib.
Wilson doesn’t just detail the facts in his examples. He explains the underlying psychologies. It’s only by understanding these that we, as individuals, can choose not to get fooled again. He offers the following toolkit for spotting manipulation:
1) The antidotes to delusion are logic and evidence, preferably from multiple sources.
2) Remember – it’s not all relative!
3) Spot the false sceptic – remarkably credulous about facts that support their viewpoint, yet always demanding more evidence for those that do not.
4) Beware of groupthink.
I haven’t read a newspaper in years because of the underlying – and manipulative – bias of the writing. I think I might just revisit that policy. Armed with the above, it will be an interesting exercise.
Sceptic of the week: Muntazer al-Zaidi
From The Times:
“This is a goodbye kiss from the Iraqi people, dog. This is from the widows, the orphans and those who were killed in Iraq,” he shouted before being overpowered by security guards and bundled out of the room.
Groupthink, Self-serving bias, Space Cadets and the Stanford Prison Experiment – “Don’t Get Fooled Again” at the Beyond Words book festival
Earlier this week I had a very enjoyable afternoon discussing “Don’t Get Fooled Again” at University College School’s “Beyond Words” book festival, and looking at some of the more eye-catching bits of psychological research that I came across while writing the book. The audience had some challenging and wide-ranging questions, and I thought it might be interesting to post some more background links here.
I was about halfway through my research when I realised that a great many of the delusions I was looking at seemed to come down, at least in part, to what might once have been called “excessive vanity”: the conspiracy theorists who think that they alone are part of some special group with special knowledge about AIDS, the Illuminati, 9/11 or the “great asbestos scam”, while everyone else muddles along in their brainwashed ignorance. Or the crank who’s convinced that, just by using his uncommon “common sense”, he’s managed to find a fatal flaw in a well-established scientific theory that generations of world-renowned biologists have somehow contrived to miss.
But what I hadn’t known was the degree to which almost all of us seem to over-rate ourselves and our own abilities, at least to some degree. The average driver thinks that they’re a better driver than the average driver – and reason dictates that we can’t all be above average. Most people also seem to rate themselves at least slightly better than others in intelligence, and significantly better in warmth, generosity and – my personal favourite – likelihood of leaving the largest slice of pizza for someone else when there are only two slices left. The research link for that particular claim can be found here – and for a more general overview I’d recommend Cordelia Fine’s excellent book “A Mind of Its Own: How your brain distorts and deceives”.
Somewhat less academic but still very interesting was the case of a reality TV show called “Space Cadets” where a Channel Four production company managed to convince a group of contestants that they were being sent into space and put into orbit around the earth. In fact they were sitting in a Hollywood space shuttle simulator at a disused military airbase in Suffolk.
The programme-makers had set out explicitly to recreate a psychological phenomenon known as “groupthink”, in which members of a close-knit social group become so focussed on a particular group belief or objective that they lose their ability to think critically. But what hadn’t been predicted was the effect that the experience would have on the actors who were in on the hoax, and whose job it was to pose as ordinary members of the group.
“My poor brain is a scramble of half-truths, astronomical lies and unbridled lunacy”, wrote Charlie Skelton, one of the undercover actors, in the Guardian shortly after the hoax was finally revealed.
“I’ve just scribbled a list of what I know for sure: I’ve been a mole on a fake reality show called Space Cadets; I have a Russian doll in my hand luggage; I’ve just spent the past five days in a flight simulator in a hangar on the Suffolk coast; and – last but by no means least – I’ve just spent the past five days in space.
My default brain position aboard Earth Orbiter One was that we were 200 kilometres up, travelling at about seven kilometres per second. Too many things were telling me that for me to think otherwise.”
The psychological manipulation had been so strong that even though Skelton knew, rationally, that the whole thing was a hoax, he found himself believing it anyway.
The third case study I looked at was the notorious Stanford Prison Experiment, which took place in the early 1970s. Researcher Philip Zimbardo constructed a model prison in the basement of the Stanford Psychology Department, and got a group of students to play the roles of prison guards and prisoners. Within days, a significant proportion of the guards, who only days earlier had been seemingly normal college students, had been transformed into brutal sadists who relished the power they had been given and the opportunities for abuse that came with it. In the end, the experiment had to be terminated early. Full details about the experiment and its wider implications can be found at Zimbardo’s excellent Stanford Prison Experiment website.
Interestingly, when the soldiers implicated in the horrific Abu Ghraib prison abuse scandal were put on trial, Philip Zimbardo was one of the key witnesses for the defence, arguing that the “situational pressures” on the guards, stemming from the way the prison had been mismanaged, made such abuses entirely predictable.
In “Don’t Get Fooled Again” I argue that human beings are rather more vulnerable to delusion and manipulation than we are usually prepared to admit – but that confronting these vulnerabilities, and doing our best to understand them, is crucial in reducing our risk of being fooled in future.
Stephen Colbert on “truthiness”…
“That’s where the truth lies – right down here in the gut. Do you know you have more nerve endings in your gut than you have in your head? You can look it up.
Now, I know some of you are going to say, ‘I did look it up, and that’s not true.’ That’s ’cause you looked it up in a book. Next time, look it up in your gut. I did. My gut tells me that’s how our nervous system works.”
– Stephen Colbert, White House Correspondents Association dinner, April 2006
British MPs sceptical of UK government denials over Iraq torture
The Observer reports that a committee of MPs has cast doubt on UK government denials over the use of torture in Iraq. Evidence heard during the trial of soldiers implicated in the killing of an Iraqi prisoner, Baha Musa, suggested that the troops had been ordered to use coercive interrogation techniques, including hooding and ‘stress positions’. Now the Parliamentary select committee on human rights has accused the Ministry of Defence of blocking their efforts to trace responsibility further up the command chain. The committee also suggests that public assurances given by former armed forces minister Adam Ingram, and Lieutenant General Robin Brims, have been contradicted by evidence that UK troops had been using banned interrogation techniques following legal advice from their superiors in Iraq.
Wide-ranging freedom of information laws in the United States have helped to ensure intense public scrutiny of the conduct of American forces in Iraq. A series of legal rulings compelling the release of previously classified government documents have helped to illuminate the role played by senior figures in making situations such as Abu Ghraib possible. In Don’t Get Fooled Again I was able to draw on many of these primary sources in seeking to understand Abu Ghraib and other related cases.
But here in the UK, the picture is still far murkier. So far, our senior officials have largely escaped any implication that they ordered or condoned the use of torture or other abusive treatment in Iraq. Cases such as the killing of Baha Musa have largely been seen – as Abu Ghraib once was – as the work of ‘bad apples’ rather than the result of systematic, officially sanctioned abuse. Britons have so far been able to console themselves over the various fiascos relating to Iraq with the assurance that at least ‘our boys’ would never engage in the kind of systematic depravity pursued by US forces at Abu Ghraib. But in the absence, here in Britain, of the kind of judicially enforced transparency made possible in the US by robust freedom of information laws, it’s tempting to wonder whether the UK chain of command may simply have been in a better position to cover its tracks.
US Major General who led Abu Ghraib investigation accuses authorities of “war crimes”
In 2004, it was Major General Antonio Taguba’s damning report – then still a classified document – that triggered the prosecution of a number of the soldiers who had committed abuses at the Abu Ghraib prison, in Iraq. Now, in the preface to a detailed study by Physicians for Human Rights, Taguba states that “there is no longer any doubt as to whether the current administration has committed war crimes. The only question that remains to be answered is whether those who ordered the use of torture will be held to account.”
In “Don’t Get Fooled Again” I look into the deceptions and delusions around the use of torture in Iraq, at the evidence which suggests that Abu Ghraib was anything but an isolated case, and at the striking parallels between the Abu Ghraib abuses and the notorious “Stanford Prison Experiment”.