Collaborations that bridge the political spectrum produce higher-quality Wikipedia articles than those edited by politically moderate or one-sided teams, according to a new study of more than 200,000 Wikipedia pages.
Many studies have found that political polarization in the United States is rapidly increasing, particularly online, where echo chambers and social media have inflamed partisanship.
But the new study finds that when editors with a broader range of political affiliations team up to create Wikipedia pages covering politics, social issues, and science, those pages rank higher on Wikipedia’s own quality scale, thanks to diverse perspectives, increased debate, and appeals to community guidelines.
The analysis, which appears in Nature Human Behaviour, suggests that ideological diversity, in a system with well-defined policies, can actually create more productive and higher-quality collaborations.
“This study doesn’t say we can always get along,” says James Evans, professor of sociology at the University of Chicago, director of Knowledge Lab, and a leading scholar in the quantitative study of how ideas and technologies emerge.
“But if we’re diverse along political lines, it actually means that we bring separate perspectives, and when we’re able to work together, then we’re able to produce a more complete and balanced perspective. If we’re imbalanced, then this study also suggests how bad it can be.”
Working across the aisle
The crowdsourced model of Wikipedia allows any user to edit most pages, as long as they follow the site’s guidelines on providing sources and avoiding bias. Other users enforce these policies in a decentralized fashion, and users conduct discussions over the legitimacy of edits on each article’s “talk page.” Articles on controversial events, topics, or figures, such as the Syrian Civil War, abortion, or George W. Bush, attract a higher rate of edits and discussion, and may include an extra level of protection where edits require community approval before appearing.
In the new study, the researchers first estimated the political affiliation of more than 600,000 Wikipedia contributors based on how often they contributed to liberal- or conservative-leaning articles. They then measured the overall political alignment of the editing community behind each of 232,000 Wikipedia pages, considering editor groups with a broader spread of ideological alignments to be more “polarized.”
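To make that measure concrete, here is a minimal Python sketch of one way such scoring could work, assuming each editor’s alignment is summarized on a scale from -1 (entirely liberal edits) to +1 (entirely conservative edits) and a page’s polarization is the spread of its editors’ scores. The specific formulas are illustrative assumptions, not the study’s actual method.

```python
# A minimal sketch (not the paper's exact method): score each editor's alignment
# from edit counts on liberal- vs. conservative-leaning articles, then measure a
# page's "polarization" as the dispersion of its editors' scores.

from statistics import pstdev


def editor_alignment(liberal_edits: int, conservative_edits: int) -> float:
    """Alignment in [-1, 1]: -1 means all liberal edits, +1 means all conservative."""
    total = liberal_edits + conservative_edits
    if total == 0:
        return 0.0
    return (conservative_edits - liberal_edits) / total


def page_polarization(editor_scores: list) -> float:
    """One possible dispersion measure: population standard deviation of alignments."""
    if len(editor_scores) < 2:
        return 0.0
    return pstdev(editor_scores)


# A mixed editing team spreads across the scale; a one-sided team clusters.
mixed = [editor_alignment(10, 0), editor_alignment(0, 12), editor_alignment(5, 5)]
one_sided = [editor_alignment(10, 0), editor_alignment(8, 1), editor_alignment(12, 2)]
print(page_polarization(mixed) > page_polarization(one_sided))  # True
```

Under this toy measure, a page whose editors sit at opposite ends of the scale scores as more polarized than one whose editors cluster on a single side.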
When they compared this measure of polarization against Wikipedia’s six-category scale for article quality (ranging from “stub” up to “featured article”), the authors found that higher polarization was associated with higher quality, not only for political articles but also for those on social issues and science topics.
“While political polarization is now regarded as toxic or brutal, it can work in our favor if it begets diversity of views, balanced engagements, and reasoned debates,” says Feng Shi, a data scientist with the Odum Institute for Research in Social Science at the University of North Carolina at Chapel Hill.
“Polarization of the editors is positively associated with the quality of their work, even controlling for article length, editing activity, previous editing experience, and other article and talk page attributes.”
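The kind of check Shi describes could be sketched as a regression of article quality on team polarization with controls. The column names, the coding of quality as 0 (stub) through 5 (featured article), the toy data, and the ordinary-least-squares form below are illustrative assumptions, not the study’s actual specification.

```python
# Hypothetical sketch: regress article quality (coded 0 = stub ... 5 = featured)
# on team polarization while controlling for article length, editing activity,
# and prior editor experience. Data and model form are assumptions.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "quality": [0, 1, 2, 3, 4, 5, 2, 4],                          # coded quality class
    "polarization": [0.1, 0.2, 0.4, 0.5, 0.7, 0.8, 0.3, 0.6],     # team dispersion score
    "length": [500, 900, 2000, 3500, 8000, 12000, 1500, 6000],    # article length (characters)
    "n_edits": [10, 25, 60, 120, 400, 900, 45, 300],              # editing activity
    "experience": [1.0, 1.5, 2.0, 2.5, 3.5, 4.0, 1.8, 3.0],       # mean prior editing experience
})

model = smf.ols("quality ~ polarization + length + n_edits + experience", data=df).fit()
# A positive polarization coefficient would mirror the association the study reports.
print(model.params["polarization"])
```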
Rules matter
The analysis also found that polarization drives different styles of discussion on article “talk pages.” By analyzing the content of these pages, researchers found that polarized teams engage in more debate but with less toxic conflict than ideologically uniform editing communities, where the efforts of lone, contrarian editors to “de-bias” articles provoke charged disputes. Polarized teams also refer to Wikipedia’s policies and guidelines more frequently, a structure that protects against the raw emotions and abuse found in many less-regulated online communities.
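One of those talk-page signals, appeals to policies and guidelines, could be approximated by counting mentions of policy shortcuts such as WP:NPOV or named core policies. The shortcut pattern and example below are an illustrative sketch, not the study’s measurement instrument.

```python
# A rough sketch of one talk-page signal: how often editors appeal to
# Wikipedia policies and guidelines. The regex and policy list are illustrative.

import re

POLICY_PATTERN = re.compile(
    r"\bWP:[A-Z]{2,}\b|\b(?:neutral point of view|verifiability|no original research)\b",
    re.IGNORECASE,
)


def count_policy_references(talk_page_text: str) -> int:
    """Count mentions of policy shortcuts (e.g., WP:NPOV) or named core policies."""
    return len(POLICY_PATTERN.findall(talk_page_text))


sample = "This edit violates WP:NPOV and WP:RS; see the verifiability policy before reverting."
print(count_policy_references(sample))  # 3
```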
“Our work suggests that increasing oversight and bureaucracy can be highly beneficial for content,” says Misha Teplitskiy, a postdoctoral fellow at Harvard University’s Laboratory for Innovation Science.
“Another way in which Wikipedia is different is its well-known and well-publicized commitment to discourse and consensus. Strongly signaling such a mission upfront may induce self-selection of only those individuals who are willing to cooperate for a common good.”
“It’s important, and perhaps surprising, to note that Wikipedia’s guidelines are generative of not just quality articles, but a sustained culture,” says Eamon Duede, a PhD student in the University of Chicago’s Committee on the Conceptual and Historical Studies of Science.
“These are not just rules concerning what can be said and in what manner in an article. These stipulate acceptable social conduct, how editors treat one another in talk page debate. But, also how we, as researchers, engage with the community. It required enormous effort to earn the right to conduct this study. We had to become members of the community in order to understand the community.”
Though the current study focused solely on polarization within Wikipedia, the authors suggest that its conclusions could extend to other collaborative sites, or even to the formation of ideologically diverse teams in the offline world.
“Wikipedia works because it has a culture where people can appeal to guidelines and recommendations, and they do, they rely on the laws of the community,” Evans says.
“In a community or media environment without laws, or with reducing norms, it becomes potentially a toxic environment where there are shorter conversations, less collaboration, and lower quality.”
Funding came from the National Science Foundation, the John Templeton Foundation, and the Air Force Office of Scientific Research.
Source: University of Chicago