Group membership biases perceptions of truth
Posted Mar 31, 2019
It's no secret that 'fake news' was a prominent feature of the most recent U.S. presidential election. Estimates suggest that fake news stories were shared around 30 million times on Facebook alone in the three months prior to the election.
Fake news is information that is spread despite being known by its source to be untrue. It can be relatively benign (like a parody article on theonion.com) or more serious (like the false information Russia spread to try to influence the U.S. election in favor of Donald Trump, or the false claims about EU laws that surrounded the Brexit vote in the United Kingdom).
It is not the same thing as bias or political interpretation. People can hold different views and emphasize different aspects of things that have objectively happened (an interpretation of true events); it only becomes fake news when it leads to the spreading of false information. If I think the border wall is a bad idea and I say something like, "Trump is exaggerating the threat of immigration," that isn't fake news. If I say, "Trump kicks a Mexican baby to try to force Mexico to help fund the wall," that is fake news. As much as I don't like the guy, he hasn't been kicking Mexican babies, not literally with his own shoes at least.
This all gets a bit tricky, though, because what we interpret as true is often biased by our personal values and group memberships. That isn't to say the truth is relative or changes; what happened happened. But what we come to believe is shaped by these things.
The earliest study to test this directly was conducted by researchers at Stanford University. They had undergraduate students who were either for or against the death penalty read short statements about research conclusions that either matched or opposed their views. Both sides were more likely to believe evidence supporting their viewpoint than evidence opposing it, even though the evidence was presented in largely identical ways.
A recent study tested this specifically using fake news spread during the U.S. election of Donald Trump. Participants read actual news stories taken from Facebook and then rated whether they thought each was true. Across studies, Trump supporters were more likely to believe fake news that was supportive of Donald Trump or negative about Hillary Clinton. Conversely, Clinton supporters were more likely to believe fake news that was supportive of Clinton or hostile toward Trump.
There was a limit to this, however. People high in "cognitive reflection", a measure of the tendency to override one's "gut level" responses when making a decision, were less likely to believe fake news than people low in cognitive reflection, regardless of the source and their pre-existing preferences.
In terms of the actual election outcome, these findings cannot speak directly to whether fake news influenced actual votes. Such a study would be extremely difficult to conduct because these biases often exert their influence outside conscious awareness, and people are motivated to deny their susceptibility to social influence and persuasion in general.
But the results do show that what we interpret as real or fake news is partially shaped by our pre-existing beliefs and what we want to believe. They also show that, perhaps contrary to what many people believe about themselves, fake news can be persuasive.