Mathematical biologist Martin Nowak talks to us about the evolution of cooperation. Cooperation is a puzzle for biologists because it doesn’t make obvious evolutionary sense. In cooperation, the donor pays a cost and the recipient gets a benefit, as measured in terms of reproductive success. That reproduction can be either cultural or biological, and either way the challenge of explaining cooperation remains.
It may be simplest to consider this in mathematical terms. In game theory, the prisoner’s dilemma makes the problem clear. Given a set of outcomes where each of us is individually better off defecting, it’s incredibly hard to understand how we get to a cooperative state, where we both benefit more. Biologists see the same problem, even with rationality removed from the equation. If you let cooperators and defectors compete, the defectors win out and eventually drive the cooperators extinct. Again, it’s hard to understand why people cooperate.
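To make the dilemma concrete, here’s a minimal sketch using the classic illustrative payoff values from Axelrod’s tournaments (my numbers, not figures from the talk): whatever the other player does, defecting pays more, yet mutual defection leaves both players worse off than mutual cooperation.

```python
# Illustrative prisoner's dilemma payoffs (Axelrod's classic values, not from the talk).
# Each entry maps (my move, your move) to my payoff; "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, you defect (the sucker's payoff)
    ("D", "C"): 5,  # I defect, you cooperate (temptation)
    ("D", "D"): 1,  # mutual defection
}

for your_move in ("C", "D"):
    print(f"if you play {your_move}: I get {PAYOFF[('C', your_move)]} by cooperating, "
          f"{PAYOFF[('D', your_move)]} by defecting")
# Defection dominates in a single round, yet (D, D) pays 1 each versus 3 each for (C, C).
```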
There are five major mechanisms that biologists have proposed to explain the evolution of cooperation:
– kin selection
– direct reciprocity
– indirect reciprocity
– spatial selection
– group selection
Nowak works us through the middle three in some detail.
In direct reciprocity, I help you and you help me. This is what we see in the repeated prisoner’s dilemma. It’s no longer best to defect. As originally discovered by Robert Axelrod in a computerized tournament, the three-line program “Tit for Tat” wins:
At first, cooperate.
If you cooperate, continue to cooperate.
If you defect, defect.
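For concreteness, here’s a minimal sketch of that three-line strategy (my rendering in code, not something shown in the talk):

```python
def tit_for_tat(their_history):
    """Cooperate on the first move; afterwards, copy the other player's last move."""
    if not their_history:      # no previous rounds yet
        return "C"
    return their_history[-1]   # echo their previous move, "C" or "D"
```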
While it’s a powerful strategy, it’s very unforgiving: if there’s a mistake, there’s an endless cycle of retaliation. Nowak wondered what would happen if natural selection designed a strategy. He created an environment to allow this, permitting random errors to make the environment harder. If the other party plays randomly, the best strategy is to defect every time. When tit for tat is introduced, it doesn’t last for long, but it does set off rapid evolution. You’ll see “generous tit for tat” – if you cooperate, I will cooperate. If you defect, I will still cooperate with a certain probability. Nowak suggests that this is a good strategy for remaining married, and a step towards the evolution of forgiveness.
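A rough simulation sketch of this point (my own, with illustrative payoffs, error rate, and forgiveness probability): under noise, a single mistaken defection locks plain tit for tat into cycles of retaliation, while the generous variant forgives occasionally and recovers a higher average payoff.

```python
import random

# Illustrative payoffs: my payoff for (my move, your move).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tft(their_history, forgiveness=0.0):
    """Tit for tat; with probability `forgiveness`, cooperate even after a defection."""
    if not their_history:
        return "C"
    if their_history[-1] == "D" and random.random() < forgiveness:
        return "C"
    return their_history[-1]

def average_payoff(forgiveness, rounds=1000, error=0.05):
    """Two copies of the strategy play each other; each move is flipped with prob `error`."""
    h1, h2, total = [], [], 0
    for _ in range(rounds):
        m1, m2 = tft(h2, forgiveness), tft(h1, forgiveness)
        if random.random() < error:
            m1 = "D" if m1 == "C" else "C"
        if random.random() < error:
            m2 = "D" if m2 == "C" else "C"
        h1.append(m1)
        h2.append(m2)
        total += PAYOFF[(m1, m2)]
    return total / rounds

print("tit for tat vs. itself, with mistakes:         ", average_payoff(forgiveness=0.0))
print("generous tit for tat vs. itself, with mistakes:", average_payoff(forgiveness=1/3))
```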
In a natural selection system, you’ll eventually reach a state where everyone cooperates, always. A biological trait needs to be under competition to remain – we can lose our ability to defect and become extremely susceptible to invasion by an always-defect strategy. Cooperation is never stable, he tells us – it’s about how long you can hold onto it and how quickly you can rebuild it. Mathematically, direct reciprocity can come about only if the probability of playing another round with the same partner exceeds the cost-to-benefit ratio of the cooperative act.
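If I recall the published formulation of that rule correctly (Nowak’s “five rules for the evolution of cooperation”), it’s usually written as:

```latex
% Direct reciprocity: w is the probability of another round with the same partner,
% c the cost to the donor, b the benefit to the recipient.
w > \frac{c}{b}
```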
Indirect reciprocity is a bit more complex. The Good Samaritan wasn’t thinking about direct repayment. Instead, he was thinking “if I help you, someone will help me.” This only happens when we have reputation. If A helps B, the reputation of A increases. The web is very good at reputation systems, but we’ve got simple offline systems as well. We use gossip to develop reputation systems. “For direct reciprocity, you need a face. And for indirect reciprocity, you need a name and the ability to talk about others.” In indirectly reciprocal systems, cooperation is possible if the probability of knowing someone’s reputation exceeds the cost-to-benefit ratio of the cooperative act. And this only works if the reputation system – the gossip – is conducted honestly.
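Again hedging on my recollection of the published rule, the condition parallels the one for direct reciprocity:

```latex
% Indirect reciprocity: q is the probability of knowing someone's reputation,
% c the cost to the donor, b the benefit to the recipient.
q > \frac{c}{b}
```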
In spatial selection, cooperation arises among neighbors – people who are close geographically or, more generally, in terms of graph theory. Graph selection favors cooperation when you have a few close neighbors – it’s much harder with lots of loose collaborators. A graph where you’re loosely and equally connected to a lot of people doesn’t tend towards cooperation.
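If I have the published result right (the Ohtsuki–Nowak rule for cooperation on graphs), the condition is that the benefit-to-cost ratio must exceed the average number of neighbors:

```latex
% Spatial (graph) selection: k is the average number of neighbors per individual.
\frac{b}{c} > k
```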
Nice summary of important work. One question, though. You write “The web is very good at reputation systems.” I’m curious what makes you say so. For dissenting views on this point, see e.g. Craig Newmark and Clay Shirky.
The ‘voucher’ system of parking in Canberra, Australia provides an interesting example of something that fits into none of the above categories. You park your car, go to a machine, put in money for an amount of time you select, and put the resulting ticket under the windshield in your car. Many people, if they have a reasonable amount of time left on the ticket when they return, will go to a modest amount of effort to find somebody to pass it on to (I will definitely walk 10 meters, probably not 50). I’ve been on both ends of this kind of transaction for decades, but never with people I’ve known anything about, so it’s indirect reciprocity without reputation. One interesting thing about it, I think, is that a clever investigator could probably measure the average maximum amount of effort that people will expend on this very pure form of altruism, in terms of distance walked and time spent.
Jonathan, that sentiment should be attributed to the speaker, not to me – I’m liveblogging his talk…