Overstating the true level of cooperation in a society can increase cooperative behavior overall, research finds.
Remember Napster? The peer-to-peer file sharing company, popular in the late 1990s and early 2000s, depended on users sharing their music files. To promote cooperation, such software “could mislead its users,” says Bryce Morsky, a postdoctoral researcher at the University of Pennsylvania.
Some file-sharing software companies falsely asserted that all of their users were sharing. Or, they displayed the mean number of files shared per user, hiding the fact that some users were sharing a great deal and many others were not. Related online forums promoted the idea that sharing was both ethical and the norm. These tactics were effective in getting users to share because they tapped into innate human social norms of fairness.
That got Morsky thinking. “Commonly in the literature on cooperation, you need reciprocity to get cooperation, and you need to know the reputations of those you’re interacting with,” he says. “But Napster users were anonymous, and so there should have been widespread ‘cheating’—people taking files without sharing—and yet cooperation still occurred. Evidently, obscuring the degree of cheating worked for Napster, but is this true more generally and is it sustainable?”
In a new paper in the journal Evolutionary Human Sciences, Morsky and Erol Akçay, an associate professor in the biology department, looked at this scenario: Could a cooperative community form and stabilize if the community’s behaviors were masked? And would things change if the community members’ true behaviors were eventually revealed?
Using a mathematical model to simulate the creation and maintenance of a community, the researchers found that, as in the example of Napster, a degree of deceit or obfuscation does not impede, and can even promote, the formation of a cooperative community.
The researchers’ modeling relied on an assumption that has been upheld time and time again, that humans are conditionally cooperative. “They will cooperate when others cooperate,” Akçay says.
But the threshold of when someone will start cooperating differs from individual to individual. Some people will cooperate even when nobody else is, while others require most of the community to cooperate before they will do so too. Depending on the number of people with different cooperation thresholds, a community can wind up with either very high or very low levels of cooperation. “Our goal was to figure out, How can obfuscation act as a catalyst to get us to a highly cooperative community?” says Morsky.
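The effect of these thresholds can be sketched with a short, hypothetical example in the spirit of classic threshold models of collective behavior (the threshold distributions below are made up for illustration and are not taken from the paper):

```python
import random

def equilibrium_cooperation(thresholds, start, steps=100):
    """Iterate the best-response map: the fraction cooperating next round
    is the fraction whose threshold the current level already meets."""
    frac = start
    for _ in range(steps):
        frac = sum(t <= frac for t in thresholds) / len(thresholds)
    return frac

random.seed(1)
# Easy-going community: low thresholds, so a little cooperation cascades up.
easy = [random.uniform(0.0, 0.5) for _ in range(1000)]
# Demanding community: high thresholds, so cooperation never gets going.
demanding = [random.uniform(0.5, 1.0) for _ in range(1000)]

print(equilibrium_cooperation(easy, start=0.1))       # settles at full cooperation
print(equilibrium_cooperation(demanding, start=0.1))  # collapses to zero
```

Starting from the same modest level of cooperation, the first community cascades to near-universal cooperation while the second collapses, which is the "very high or very low" outcome the researchers describe.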
To model this, the researchers envisioned a theoretical community in which individuals would join in a “naïve” state, believing that everyone else in the community is cooperating. As a result, most of them, too, begin cooperating.
At some point, however, the formerly naïve individuals become savvy and learn the true rate of cooperation in the community. Depending on their threshold of conditional cooperation, they may continue to cooperate, cheat, or get discouraged and leave the community.
In the model, when the researchers decreased the learning rate—or kept the true rate of cooperation in the group a secret for longer—they found that cooperation levels grew high, and savvy individuals quickly left the population. “And because those savvy individuals are the ones that don’t cooperate as readily, that leaves only the individuals who are cooperating, so the average rate of cooperation gets very high,” says Akçay.
Cooperative behavior could also come to dominate provided there was a steady inflow of naïve individuals into the population.
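The naive-to-savvy dynamic described above can be sketched as a toy simulation. All the parameters here (inflow, threshold distribution, departure rates) are illustrative assumptions, not values from the published model:

```python
import random

def simulate(learning_rate, inflow=20, steps=400, seed=42):
    """Return the average cooperation rate over the second half of the run."""
    rng = random.Random(seed)
    members = []        # (threshold, is_savvy) pairs
    coop_rate = 1.0     # newcomers' naive belief: everyone cooperates
    history = []
    for _ in range(steps):
        # Newcomers join naive. Thresholds above 1.0 stand in for members
        # who will never cooperate once they learn the truth (an assumption).
        members.extend((rng.uniform(0.0, 1.5), False) for _ in range(inflow))
        # Each naive member learns the true cooperation rate with
        # probability `learning_rate` per step, becoming savvy.
        members = [(t, s or rng.random() < learning_rate) for t, s in members]
        # Naive members cooperate; savvy members cooperate only if the
        # observed rate meets their personal threshold.
        cooperating = [(not s) or t <= coop_rate for t, s in members]
        coop_rate = sum(cooperating) / len(members)
        history.append(coop_rate)
        # Everyone drifts away slowly; non-cooperating savvy members
        # get discouraged (or are cheaters) and leave faster.
        members = [m for m, c in zip(members, cooperating)
                   if rng.random() > (0.02 if c else 0.07)]
    return sum(history[steps // 2:]) / (steps - steps // 2)

# A lower learning rate keeps the true cooperation level hidden longer.
print(f"slow learning: {simulate(0.02):.2f}")
print(f"fast learning: {simulate(0.5):.2f}")
```

In this sketch, slowing the learning rate keeps a larger share of the community in the naive, cooperative state at any moment, so the average cooperation rate comes out higher, echoing the paper's finding that obfuscation plus a steady inflow of newcomers can sustain cooperation.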
Akçay and Morsky note that their findings stand out from past research on cooperation.
“Typically when we and others have considered how to maintain cooperation, it’s been thought that it’s important to punish cheaters and to make that public to encourage others to cooperate,” Morsky says. “But our study suggests that a side effect of public punishment is that it reveals how much or how little people are cooperating, so conditional cooperators may stop cooperating. You might be better off hiding the cheaters.”
To continue exploring conditional cooperation, the researchers hope to follow up with experiments involving human participants, as well as further modeling, to reveal the tipping points that move a group toward or away from cooperation and how interventions could shift those tipping points.
“You can see how conditional cooperation factors into behavior during this pandemic, for example,” Akçay says. “If you think a lot of people are being careful (for example, wearing masks and social distancing), you might as well, but if the expectation is that not many people are being careful you may choose not to. Mask wearing is easy to observe, but other behaviors are harder, and that affects how the dynamics of these behaviors might unfold.
“This is a problem that humans have had to solve over and over again,” he says. “Some amount of cooperation is required to have a society be worthwhile.”
Funding for the work came from the University of Pennsylvania.
Source: Penn