
Verified by Psychology Today

When Polarization Is Beneficial

Evidence shows that some polarization, if well navigated, is useful.

Key points

  • Polarized Wikipedia teams with ideologically diverse editors produce articles of a higher quality than homogeneous teams.
  • The social media platform Polis provides one example of a design that encourages polarized actors to engage constructively.
  • It’s important that discussions of polarization not turn into calls for everyone to just agree with one another.

So much is being said these days about “fake news,” “culture wars,” and resulting polarization. A 2021 investigation by a team at the University of Oxford found 81 countries running coordinated campaigns on social media to manipulate public opinion. They documented governments using techniques like bots, carefully designed memes, and false information. There is also now real-world evidence of YouTube users being driven toward more extreme and hateful ideologies.

Attempts to shape and control the beliefs of populations are ancient, but they’re being turbocharged by the internet, which seems to be an important force driving some ideological camps further and further apart.

Each of us now faces a challenge if we want to avoid false or heavily biased information and get a more honest, complete picture of our world. What can we do?

We might start thinking less like Twitter and more like Wikipedia. The site is built from the shared work of volunteers from all over the world who hold very different views, and who don’t know or trust each other. They come together to edit articles, even on highly contentious political issues. How is this possible? Because of strong community guidelines that can be used to navigate disagreements.

A study on the effects of polarized teams

What Wikipedia shows is that, if embraced in the right ways, some types of polarization can be beneficial.

A study of Wikipedia articles found that “polarized teams consisting of a balanced set of ideologically diverse editors produce articles of a higher quality than homogeneous teams.”

Why is that? Imagine that you’re trying to learn about the current situation in Cuba and happen upon this statement by the Britain Cuba Solidarity Campaign:

On Sunday 11 July, some street protests took place against the scarcity of food, medicines and power supplies. The vast majority of these protesters have genuine concerns regarding these shortages. President Miguel Díaz-Canel travelled to San Antonio de los Baños, site of the original demonstration, and spoke to people about their grievances ...

Thousands of Cubans supporting the government have taken to the streets across the island in counter-demonstrations against US interference ...

The current emergency is a result of the ongoing US blockade, an additional 243 sanctions imposed by the Donald Trump administration, and the impact of the COVID-19 pandemic.

The Campaign writes about misinformation being secretly used against the Cuban government, coordinated from abroad. They state that protests in Cuba have gotten too much attention from biased corporate media, compared to other world events of a similar scale that have been ignored. They charge that while the US embargo on Cuba is the root cause of the country’s problems, it hasn’t been discussed in most international media coverage.

Protesters in Miami, Florida call for "help for Cuba," July 2021.
Source: Luis F. Rojas/Wikimedia Commons

A Wikipedia article edited only by these authors would clearly provide you with some insights about Cuba. But the article would also omit significant information.

The Britain Cuba Solidarity Campaign didn’t make any mention of the Cuban government doing anything wrong. (To me this framing seems to unintentionally infantilize or dehumanize the Cuban leadership, making them one-dimensional and incapable of making a fully human range of choices, which includes harmful ones. The US is portrayed as the only actor with actual agency.)

Here’s a different framing:

On July 11, the people’s demands weren’t received with attentive listening… instead they were met with scorn, vilification and, worst, violent repression. In recent days, Cuban people have been brutally chased through the streets by the police, regular military forces and undercover agents… In the rare and shocking videos leaked despite internet restrictions, anyone who dares to watch could see the patrolling troops running, threatening unarmed people and beating them…. After three days of protests, an estimated 200 Cubans are considered disappeared, presumably arrested. The killing of one man has been officially disclosed.

It’s not that the factors the Britain Cuba Solidarity Campaign chose to focus on—the US embargo on Cuba and COVID-19—are unimportant. Learning of them is worthwhile. But that single viewpoint is obviously of a lower quality than taking both of the two viewpoints shared above and trying to work out, in one Wikipedia article, what is established fact and what needs to be said.

Similarly, reading the viewpoint of just one media outlet that chooses not to mention the US embargo would provide us with significantly less insight than if we took that outlet’s viewpoint together with the writing of the Britain Cuba Solidarity Campaign.

So even if a Wikipedia article has dozens of caring editors, if they share beliefs, they’ll tend to think of the same information as important and worth including—like the impacts of the US embargo—and to omit or be unaware of the same things—like the violence of the Cuban police. Obvious gaps in the article will not be obvious to these authors, because of their particular positions within information bubbles and echo chambers. Adding more editors who think similarly won’t help. But polarized teams can.

The study found that polarized teams spend more time in serious discussion about the article they’re editing and make frequent use of Wikipedia’s policies to navigate their disagreements. This process helps guide them toward a more valuable article in the end.

So it’s important that discussions of polarization not turn into calls for us all to just agree with one another or to give up our passions for certain issues or causes. In fact, people with different values, beliefs, and expertise can be great for problem-solving, as decades of research have shown, so long as there’s a process for them to talk and engage with each other.

A digital platform that boosts points of commonality

Consider another example of a digital space that’s designed to promote good quality interactions between people with very different views: the social media platform Polis. Here’s a description of it being used to debate controversial policy decisions in Taiwan:

As the debate began, Polis drew a map showing all the different knots of agreement and dissent as they emerged. As people expressed their views, rather than serving up the comments that were the most divisive, it gave the most visibility to those finding consensus—consensus across not just their own little huddle of ideological fellow-travellers, but the other huddles, too. Divisive statements, trolling, provocation—you simply couldn’t see these.

The way that Facebook and Twitter are designed incentivizes users to be negative about out-groups: That gets them more likes and interactions. Polis, on the other hand, is designed not to boost sensational lies or outraged negativity about perceived enemies, but to boost points of commonality that can be discussed further. And what the platform boosts is what most people see, which in turn changes how they behave on Polis. This is consistent with experimental evidence that how information is presented to parties in conflicts makes a big difference to the quality of the conflict that ensues.
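To make the contrast concrete, here is a minimal sketch of the idea of ranking by cross-group consensus rather than raw engagement. This is an illustration only, not Polis’s actual algorithm (which clusters voters into opinion groups using dimensionality reduction before scoring); the group names, vote counts, and function are all hypothetical.

```python
# Illustrative sketch only. Polis's real system first discovers opinion
# groups from voting patterns; here we assume the groups are already given.

def cross_group_consensus(votes_by_group):
    """Score a comment by its *lowest* agreement rate across opinion groups.

    votes_by_group: dict mapping group name -> (agrees, total_votes).
    A comment scores high only if every group tends to agree with it,
    so statements that delight one side and enrage the other sink.
    """
    rates = [agrees / total
             for agrees, total in votes_by_group.values() if total > 0]
    return min(rates) if rates else 0.0

# Hypothetical comments with per-group vote tallies (agrees, total).
comments = {
    "fund more local transit": {"group_a": (40, 50), "group_b": (40, 45)},
    "the other side is lying": {"group_a": (48, 50), "group_b": (2, 45)},
}

# Rank so cross-group consensus rises to the top, regardless of how
# popular a divisive statement is within its own camp.
ranked = sorted(comments,
                key=lambda c: cross_group_consensus(comments[c]),
                reverse=True)
print(ranked[0])  # the consensus statement comes first
```

Note how the divisive comment is nearly unanimous within one group (48 of 50) yet still ranks last, because scoring by the minimum across groups rewards only what bridges the divide.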

The design of Polis sets a more constructive tone, bringing people with different ideologies together in ways where they can actually have more complex interactions, learn something from each other, and come up with better quality solutions to problems. Users can still express aggressive and combative views; they just get marginalized by the Polis algorithm, so they don’t become the center of attention.

The fact that Polis can do this suggests that other social media can too. What could get them to make this change is, of course, a very different question. But there is a glimmer of hope: many ways to successfully build understanding and overcome hate and extremism have already been studied and developed. And many organizations are working on the problem of affective polarization, where we don't just disagree with an out-group but feel negatively about them as people. More needs to be done now to put what we already know into practice.
