The Psychological Connection of Bias to Censorship
Does bias easily become censorious?
Posted September 12, 2020 | Reviewed by Devon Frye
Bias and censorship are often treated as entirely different phenomena, for good reasons. The term “bias” has so many different meanings that each of them, by itself, can constitute (and has constituted) an entire subfield of research (consider bias as “prejudice” and bias as “heuristics”). Censorship, too, has many different aspects, including legal issues, moral issues, political issues, and self-censorship. However, bias and censorship intersect in socially, politically, and psychologically important and mutually reinforcing ways. In this post, I focus on academia, though the analysis probably has some relevance to other contexts, including law, media, and government.
Censorship requires power over others. Power means the ability to reward, withhold rewards, and punish. At its core, academia is a social-reputational system, meaning that one’s success is determined only indirectly by one’s performance, and directly by others’ evaluations. This provides insight into how bias can produce censorship: others’ evaluations constitute the basis for receiving (or not receiving) rewards and punishments. Certain fields in academia can become, and have become, theoretical or ideological echo chambers, in which claims become dominant and even canonized without much (or, in some cases, any) supporting data. Go here for examples from social psychology, astronomy, and medicine. Those reaching different conclusions, even when those conclusions are based on high-quality evidence, may have a very difficult time getting those conclusions into the published literature.
Similarly, in fields that have embraced social or political activism, there is a risk of activist goals trumping science. For example, some intellectuals may view climate change as such an important and pressing issue that anyone presenting evidence that contests any aspect of climate arguments may be denounced and ostracized. See this case, in which a scientist had strong data showing that extreme weather was not increasing. And in this paper, we tell the story of a zoologist who lost a faculty position for arguing that polar bear populations were not actually declining (contra dramatic claims otherwise), and we provide links to stories by Canadian Geographic and the World Wildlife Fund showing she was right.
Similarly, when fields become ideologically homogeneous, as have most social sciences and humanities (which have become 80 to 90 percent left-wing and, at elite universities, sometimes 100 percent), it becomes far easier to sneer at one’s ideological opponents and to express values shared only by one’s comrades-in-arms. This creates a hostile environment for anyone who might not share those values, as well as obstacles to their entry into, or continuance in, the field.
This recent article tells the story of a medical researcher in the UK who claims to be having unusual difficulty publishing her work showing that one source of the ability to survive coronavirus infection is being exposed to it when young. She claims that her articles are rejected, not because reviewers claim they are wrong, but because at least some describe her work as "dangerous" (as if believing medical falsehoods could ever be safer than knowing the truth).
I have several prior posts at Psych Today that describe attempts (some successful, some not), to punish people for expression, such as here, here, here, here, and here. A living (ongoing, updated as these occur) list of such attempts targeting academics can be found here.
"Exit" is a concept developed by Harvard legal scholar and former Obama administration member, Cass Sunstein, in his book Going to Extremes (on how polarization produces political extremism). As he describes it, "... moderate members leave the group because they dislike the direction in which things are heading." He continues, "... only the true believers remain. Those believers regard themselves as 'best friends' and a substitute for family. These are the most dangerous conditions of all: The groups include extremists, unified by bonds of affection and solidarity, and prone to discussions only among themselves."
When all think alike, dissent does not merely seem disagreeable; it seems outlandish, wrong, and sometimes morally repulsive. Unfortunately, data only rarely fit neatly into simple left/right, racist/anti-racist, patriarchy/feminist narratives, so the refusal to admit certain types of ideas or evidence into discussion impoverishes our ability to actually understand human social and psychological processes, and even the sources of, and solutions to, social problems.
Even in academia, where tenure supposedly provides bedrock protection against firing, tenured professors have in fact been fired for their speech; furthermore, tenured faculty can be, and have been, removed or successfully pressured to step down from administrative positions (such as dean or provost). Such events do not need to occur frequently to create a chilling effect. Once people see that they may be denounced and potentially fired (or pressured to step down, deplatformed, or have their papers retracted), they may decide to self-censor: by avoiding certain topics altogether, by not reporting findings expected to court controversy, or by not publicly expressing ideas known to be sources of trouble in the academy.
These processes are likely exacerbated by a slew of psychological biases: ideological extremists routinely view their opponents as holding more extreme views than they really do; confirmation and misanthropic biases can lead people to see their opponents as evil and immoral; and myside and preference (aka "desirability") biases can lead people to essentially tribalistic views (if someone on “my side” said it, it's fine, but if your side makes the same claim, it is evil incarnate).
These biases may not be a serious problem for a field that is reasonably well-balanced, and includes ample people with competing biases. However, when the political identities of those making up particular fields become heavily skewed, there is little to prevent or limit punishment of those one sees as enemies.
This gets us back to the "scientific game," which is mostly played by acceptance and rejection of peer-reviewed papers and grants. Bias then produces censoriousness in either of two ways:
- Those holding the most extreme views are often most certain of their views, no matter how wrong they are, so they can "honestly" (in some twisted sense) reject papers as "erroneous."
- They know how the game is played, and will make bald declarations that papers have "errors" without ever identifying them (because the papers often contain no errors, so none can actually be identified).
Even in the first case, the bias-masquerading-as-scientific-rejection-producing-censorship often manifests as claims of "errors" without actually identifying any, as described in case after case here.
One can see this witch's brew on full display here. Norman Wang published an article critical of the effectiveness of affirmative action programs and highlighting unintended negative side effects (such as producing higher dropout rates among those admitted). The article was retracted in response to a classic academic outrage mob that used name-calling, misrepresentation, and UFEs* to impugn the paper and the author without refuting it. All of this is visible in the responses to the journal's announcement that it had retracted the paper. The journal's obsequious apology appears here, and the retracted paper, which is filled with data and over 100 references, can be found here.
Those subscribing to contrarian or dissident views, therefore, may be effectively silenced, either because it becomes inordinately difficult to get their papers published or grants funded, or because they self-censor in order to avoid punishment. As Michael Inzlicht stated, “What if I felt that overemphasis on oppression is a terrible idea, hurts alleged victims of oppression, and is bad for everyone? What if I was outspoken about this? I suspect I would face a lot more opposition. Even though not much could happen to my job security, I’d have a lot of people screaming at me, making my life uncomfortable. And truly, I wouldn’t do it, because I’d be scared.”
*UFEs = Unidentified Flying Errors. These are accusations of error that fail to actually identify any.
As usual, please read my Guidelines for Engaging in Controversial Discourse before replying here. In short, no snark, no personal insults, and please keep it short and on topic.