“Will evil prevail?” is the topic of the first afternoon session. It’s hard to think of a better person to put the question to than Dr. Philip Zimbardo of Stanford University. Thirty-seven years ago, he ran the famous Stanford Prison Experiment, which demonstrated that people in stressful situations can become incredibly brutal to one another. He begins his talk by referring to his childhood in the South Bronx. “I knew that, unlike many people think, the line between good and evil is moveable and permeable.”
He shows an M.C. Escher print. Focus on the white figures and it’s a world full of angels. Focus on the black, and it’s full of devils. It’s a reminder of the relationship between God and the devil – Lucifer was an angel who disobeyed God and was kicked out. God created hell as a place to store evil… but evil has a way of getting out into the world.
Zimbardo offers a complex definition of evil, focused on the idea that it is the exercise of power to harm others. While the definition is complex, we use the word very simply. Search for “evil” and “George Bush” on Google and you’ll get 2 million hits, most oversimplifying what evil is.
We tend to argue that evil is a personal characteristic. In Abu Ghraib, the Bush administration argued that the soldiers were good, but there were a few bad apples. Zimbardo’s hypothesis was that the apples were good and the barrel was bad. He’s got a unique perspective – he was an expert witness in support of Sgt. Chip Frederick, accused of abuse in Abu Ghraib. He shows us painful photos of what happened on Tier 1-A on the night shift, the part of the prison where interrogators tried to break prisoners.
Zimbardo reminds us that the soldiers guarding this prison were reservists, completely unprepared for the circumstances they were dealing with. He shows us a chilling set of slides, some of the thousand photos taken by these soldiers guarding Abu Ghraib. The final image he shows us is of a prisoner covered in feces and dirt. He was mentally ill and covered himself in his own feces every day – the guards rolled him in dirt to cloak the smell. “What the hell was he doing in Abu Ghraib?”
One explanation for Abu Ghraib is dispositional – the bad people were bad apples. Another is situational – the barrel was bad; external factors moved people to evil. The third explanation is systemic – it blames the power in the system, the barrel-makers. These three explanations together form the “Lucifer Effect” – a way of understanding human character transformations when ordinary people become perpetrators of evil.
Would you electrocute a stranger if Hitler asked you to? That’s the question Stanley Milgram – who was a high school classmate of Zimbardo’s – wanted to answer. He wondered if the Holocaust could occur in America. In 1963, he brought a thousand people into labs in Connecticut. He offered them $4 for their time, and used men between 20 and 50. He looked for ordinary working people, not for students. The group was split into two – teachers and students. The student was an actor – the teachers were the subjects, and they were told to press the shock button when students got the answers wrong.
“All evil starts with 15 volts,” Zimbardo tells us. But the problem is that the apparatus went to 450 volts. Most people do complain, but when they’re told the authority figure will be responsible, most people will go all the way to the end. Ask a psychiatrist how many people will inflict 450 volts and they’ll tell you 1%, the fraction of people who are clinically sadistic. But in the experiments, more than two thirds of people went all the way. People who saw another subject go all the way were 90% likely to go to 450 volts – those who saw someone resist resisted in 90% of cases.
In Zimbardo’s Stanford Prison Experiment, he tried to test the power of institutions, not just the power of individuals. He built a realistic scenario, using healthy students and having Palo Alto police arrest them, fingerprint them, then put them into a dehumanizing prison environment run by fellow students. He knew he had only good apples, and no difference between prisoners and guards. But he ended up with a situation that went entirely out of control within 5 days.
Guards humiliated the prisoners, putting them into positions of simulated sodomy and taunting them. It took weeks in Abu Ghraib – “my guards did it in five days”. Kids began having nervous breakdowns as early as 36 hours in, and five kids in all broke down.
Anonymity was a big piece of the experiment – prisoners were reduced to numbers, and guards were anonymous, with sunglasses protecting their identity. It turns out that warriors, given the cloak of anonymity, are far more likely to commit violent acts – one study shows warriors with visible identities killing, torturing and mutilating prisoners in only one of 13 cases, while in anonymous situations, it happened in 12 of 13 cases.
Evil in systems comes from:
– mindlessly taking the first step
– dehumanization of others
– diffusion of responsibility
– blind obedience
Understanding evil is not excusing it. We want to understand why people engage in evil so we can avoid designing systems that cause it. And we want to build systems that make it easier for people to become heroes. Joe Darby, who exposed Abu Ghraib, acted heroically. So did a woman who begged Zimbardo to stop the Stanford Prison study – he calls her his heroine, and he married her a year later. He asks us to inspire heroic imaginations, to help create heroes-in-waiting, like Wesley Autrey, who saved a man he didn’t know from being hit by a subway train: “I did what anyone could do, and what everyone ought to do.”