This post is part of a series from the TED 2009 conference held in Long Beach, California from February 4-8th. You can read other posts in the series here, and the TED site will release video from the talk in the coming weeks or months. Because I’m putting these posts together very quickly, I will get things wrong, will misspell names and bungle details. Please feel free to use the comments thread on this post to offer corrections. You may also want to follow the conference via Twitter or through other blogs tagged as TED2009 on Technorati.
Dan Ariely suffered burns over 70% of his body in a magnesium flare explosion as a teenage military trainee. As a result, he spent a long time in a hospital burn ward and had ample opportunity to think about the decisions nurses made. Every day, the nurses removed his bandages, ripping them off quickly to “minimize” pain. He wondered – aloud – whether this was the right way to treat his pain. Might removing them slowly and gently have hurt less?
The nurses insisted they had the right model for minimizing his pain and made it clear that his input wasn’t wanted. But the experience – understandably – haunted him. As a psychology student, he began looking for ways to hurt people and ask them about their reactions – he’d crush fingers in vises, put people in pain suits, give them electric shocks and subject them to loud noises.
What he learned: his nurses were wrong. They would have hurt him less if they’d moved more slowly, because it turns out we remember intensity, not duration. They should have started with the painful bandages on his face and moved to the less painful ones on his legs, because a progression from more to less pain feels better in retrospect. And they should have given him breaks throughout the process.
If the nurses could be so wrong about treating his pain, what else do we get wrong? Ariely got interested in cheating in the wake of the Enron scandal. He began studying cheating in the lab, inviting students to solve a sheet of 20 math problems in five minutes, offering a dollar for each problem solved. When students showed their results to the experimenter, they averaged four solved problems. When students self-reported their scores and shredded their answer sheets, they reported solving seven.
It wasn’t that a few people cheated radically – nearly everyone cheated a little. Ariely tried changing the incentives, adding more money to the equation – it turns out that people are largely insensitive to the size of the reward. It’s possible that we cheat just a little bit so we can still feel good about ourselves – we don’t cheat more, even when it’s lucrative, because we’d feel bad about ourselves.
Psychological factors matter a great deal. When Ariely ran an experiment asking people to recall either the ten commandments or ten books from childhood before solving the problems, he found no cheating in the ten commandments group… despite the fact that no one could recall all ten commandments.
We cheat differently with different commodities. In an unscientific experiment, Ariely stocked fridges at MIT with cans of Coke and tracked their disappearance… and also put in plates holding six one-dollar bills. The half-life of the Coke was very short; that of the bills was very long.
On the other hand, introducing tokens into the equation instead of actual currency – even when the tokens can be exchanged for cash just seconds later – increases cheating.
In a set of experiments designed to test peer effects, Ariely used a confederate who announced, after just a few seconds, that he’d solved all twenty math problems. This person was obviously cheating – would this encourage or discourage cheating in the others? It turns out that it mattered what sweatshirt he was wearing – if Carnegie Mellon students thought the confederate was a fellow student, they’d cheat more. If they thought he was from the University of Pittsburgh, they’d cheat less.
It’s possible the Enron situation happened through a combination of peer effects and abstraction, where derivatives acted like tokens, distancing people from actual currency.
Ariely closes by telling us about conversations with his favorite nurse. She pointed out that nurses might have been minimizing their own pain, as they certainly didn’t enjoy torturing their patient. Beyond that, she explained that she didn’t believe his intuition about pain was right, and she was unwilling to engage in such a potentially painful experiment. This is a situation many of us find ourselves in – we may need to change our intuitions, but it’s painful to undertake the experiments to see if we’re right or wrong.