Richard Wilson's blog

richardcameronwilson AT yahoo dot co dot UK


Groupthink, Self-serving bias, Space Cadets and the Stanford Prison Experiment – “Don’t Get Fooled Again” at the Beyond Words book festival


Earlier this week I had a very enjoyable afternoon discussing “Don’t Get Fooled Again” at University College School’s “Beyond Words” book festival, and looking at some of the more eye-catching bits of psychological research that I came across while writing the book. The audience had some challenging and wide-ranging questions, and I thought it might be interesting to post some more background links here.

I was about halfway through my research when I realised that a great many of the delusions I was looking at seemed to come down, at least in part, to what might once have been called “excessive vanity”: the conspiracy theorists who believe that they alone belong to some special group with special knowledge about AIDS, the Illuminati, 9/11 or the “great asbestos scam”, while everyone else muddles along in brainwashed ignorance; or the crank who’s convinced that, just by using his uncommon “common sense”, he’s managed to find a fatal flaw in a well-established scientific theory that generations of world-renowned biologists have somehow contrived to miss.

But what I hadn’t known was the degree to which almost all of us seem to over-rate ourselves and our own abilities, at least to some degree. The average driver thinks that they’re a better driver than the average driver – and reason dictates that we can’t all be above average. Most people also seem to rate themselves at least slightly better than others in intelligence, and significantly better in warmth, generosity and – my personal favourite – likelihood of leaving the largest slice of pizza for someone else when there are only two slices left. The research link for that particular claim can be found here – and for a more general overview I’d recommend Cordelia Fine’s excellent book “A Mind of Its Own: How your brain distorts and deceives”.
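For anyone who likes to see the logic spelled out, here’s a quick toy sketch (my own illustration, with made-up numbers, rather than anything from the book or the studies above): by definition, at most half of any group can be strictly better than its median member, so the flattering self-ratings can’t all be true at once.

```python
# A toy illustration (made-up numbers, not from any study):
# "better than the typical driver" can't be true of most of us at once.
import random

random.seed(42)

# Simulate the actual driving skill of 1,000 drivers.
skills = [random.gauss(50, 10) for _ in range(1000)]
median_skill = sorted(skills)[len(skills) // 2]

# Suppose, as the survey research suggests, 90% of drivers
# believe they are better than the typical (median) driver.
claim_to_be_above = int(0.9 * len(skills))

# By definition, at most half can actually be above the median.
actually_above = sum(skill > median_skill for skill in skills)

print(f"Claim to be better than the median driver: {claim_to_be_above}")
print(f"Actually better than the median driver:    {actually_above}")
```

(The arithmetic is slightly subtler if “average” means the mean rather than the median, since a few truly terrible drivers can drag the mean down and let a modest majority sit above it; but the self-flattery these studies report goes well beyond that loophole.)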

Somewhat less academic, but still very interesting, was the case of a reality TV show called “Space Cadets”, in which a Channel Four production company managed to convince a group of contestants that they were being sent into space and put into orbit around the Earth. In fact they never left the ground: they were sitting in a Hollywood space shuttle simulator at a disused military airbase in Suffolk.

The programme-makers had set out explicitly to recreate a psychological phenomenon known as “groupthink”, in which members of a close-knit social group become so focussed on a particular group belief or objective that they lose their ability to think critically. But what hadn’t been predicted was the effect that the experience would have on the actors who were in on the hoax, and whose job it was to pose as ordinary members of the group.

“My poor brain is a scramble of half-truths, astronomical lies and unbridled lunacy”, one of those actors, Charlie Skelton, wrote in the Guardian shortly after the hoax was finally revealed.

“I’ve just scribbled a list of what I know for sure: I’ve been a mole on a fake reality show called Space Cadets; I have a Russian doll in my hand luggage; I’ve just spent the past five days in a flight simulator in a hangar on the Suffolk coast; and – last but by no means least – I’ve just spent the past five days in space.

My default brain position aboard Earth Orbiter One was that we were 200 kilometres up, travelling at about seven kilometres per second. Too many things were telling me that for me to think otherwise.”

The psychological manipulation had been so strong that even though Skelton knew, rationally, that the whole thing was a hoax, he found himself believing it anyway.

The third case study I looked at was the notorious Stanford Prison Experiment, which took place in 1971. The psychologist Philip Zimbardo constructed a mock prison in the basement of the Stanford Psychology Department, and recruited a group of students to play the roles of prison guards and prisoners. Within days, a significant proportion of the guards, who just days earlier had been seemingly normal college students, had been transformed into brutal sadists who relished the power they’d been given and the opportunities for abuse that it offered. In the end, the experiment had to be terminated early, after just six days of a planned two weeks. Full details about the experiment and its wider implications can be found at Zimbardo’s excellent Stanford Prison Experiment website.

Interestingly, when the soldiers implicated in the horrific Abu Ghraib prisoner abuse scandal were put on trial, Zimbardo appeared as one of the key witnesses for the defence, arguing that the “situational pressures” on the guards, stemming from the way the prison had been mismanaged, made such abuses entirely predictable.

In “Don’t Get Fooled Again” I argue that human beings are rather more vulnerable to delusion and manipulation than we are usually prepared to admit – but that confronting these vulnerabilities, and doing our best to understand them, is crucial to reducing our risk of being fooled in future.