Biased Science Makes Bad Policy
Organizations frequently apply scientific findings to policy-making prematurely.
Posted June 29, 2021 | Reviewed by Abigail Fagan
- Bias among liberal social scientists may explain—at least in part—various false positive findings and failed interventions.
- Psychology professors reported that they would be punished by peers if they reported data portraying women or minorities in an unflattering way.
- Most organizations are liberal or answer to liberals, so their biases likely resemble those of liberal scholars.
A tempting intuition is that human reasoning works in precisely the way that we feel it does. We are exposed to certain facts and information; we think about them in a cool, rational manner, and then we decide whether and how to assimilate that new information to form more empirically accurate beliefs.
However, human cognition (like all animal cognition) evolved to promote fitness: to increase the odds of producing many high-quality offspring. Holding accurate beliefs can often promote fitness (e.g., correctly distinguishing nutritious food from toxic substances), but humans face other challenges as well. Social acceptance and social status are critical for human fitness, and so humans evolved to reason in ways that help them attain and maintain status and belonging.
For humans, social ostracism can be extremely costly: it can eliminate one's ability to secure a high-quality mate and threaten one's physical well-being and survival. Consider, for example, intragroup religious similarity and intergroup religious differences. Throughout much of human history (and even in the present day), one could be persecuted, exiled, or killed for holding the wrong religious beliefs. And so people evolved to process and assimilate information in ways that conform to important local beliefs.
Similarly, social status can be extremely beneficial: it can increase one's ability to secure high-quality mates and other resources. One of the best ways to attain social status is to provide benefits to other group members, and so humans evolved to signal loyalty and value to their social groups by being fierce defenders of the group's interests.
In recent decades, secular tribal identities have risen to prominence in many advanced Western societies. Political ideology is now a primary source of ingroup identity and intergroup conflict, which makes political preferences and beliefs particularly vulnerable to tribal cognition.
Political tribal cognition mainly works in two ways. First, people selectively pursue information that supports their group's beliefs and avoid information that challenges those beliefs (e.g., by exposing themselves to a biased set of media outlets). Second, once people are exposed to information, they are highly credulous and accepting toward congenial information and excessively skeptical and critical of uncongenial information.
In the laboratory, political tribal cognition has been demonstrated dozens of times. For example, people evaluated the exact same policies as more effective when they were proposed by ingroup politicians than when they were proposed by outgroup politicians. And people evaluated the exact same scientific methods as higher quality when they supported rather than challenged their own political beliefs. Although scientific evidence may be the ideal basis for forming effective policy, our evaluations of evidence quality are biased such that we evaluate evidence as higher quality when it concords with our political groups’ beliefs.
A recent meta-analysis that my colleagues and I conducted, which summarized the findings of over 50 political bias studies, found that both liberals and conservatives are politically biased and to virtually identical degrees. Although scholarship often paints an unflattering portrait of political conservatives, it seems liberals are similarly prone to allowing their political identities to distort their reasoning.
However, liberals and conservatives have slightly different biases. Biases tend to reveal themselves for topics, values, and policy preferences that are highly central to a group’s identity. Generally, whichever group cares more about a particular issue—whichever group has stronger attachment to their preferred narrative—will demonstrate stronger biases.
Liberals are particularly prone to biases when evaluating information with significance to relatively lower-status groups (e.g., women, ethnic and racial minorities). Liberals, on average, are more egalitarian than conservatives and have stronger desires to protect relatively low-status groups. Consequently, they demonstrate biases against any information that portrays low-status groups unfavorably (e.g., as less skilled at math, less proficient leaders, less intelligent, or more violent) compared to high-status groups, whereas conservatives treat such information more evenhandedly.
For example, across a couple of sets of studies, people evaluated science on sex differences more favorably when women were portrayed more positively than men (e.g., as better at drawing, less prone to lying, and more intelligent) than when men were portrayed more positively than women. And these tendencies grew stronger the more politically left-wing participants were.
In an ongoing project I'm working on, psychology professors reported that they likely would be punished or ostracized by their peers if they reported data that portrayed women or racial minorities in an unflattering way or explained group differences (i.e., the underrepresentation or underperformance of certain groups) with any explanation other than discrimination. Many also asserted a high degree of confidence that discrimination is not the only cause of group disparities, and those who did so were more likely to report self-censoring their own views.
This bias among liberal social scientists likely explains—at least in part—various false positive findings and failed interventions. For example, the existence of implicit racial bias, which conforms to liberals’ preferences for discriminatory explanations for group differences, was once thought to explain disparate outcomes in various domains (e.g., educational and career outcomes). However, recent meta-analyses of this literature reveal that implicit bias may be virtually unrelated to discriminatory behavior. And despite serious efforts to create implicit bias trainings and interventions, there is little to no evidence that these have had or will have any positive effect. Cases of unreliable research like this plague the social sciences.
These false positives and expensive failed interventions (and their apparent basis in left-wing political bias) have caused mistrust of social scientists and of higher education broadly. Moreover, a growing body of research has uncovered discrimination in academia against conservative scholars, further elucidating reasons for the stark political imbalance in the social sciences. Politically homogeneous communities not only deter original thinkers from entering those communities but also intimidate those within the community who might occasionally disagree. This can create false consensuses, the elevation of expedient but empirically inaccurate ideas, and the stifling of better ones.
Credulity in Organizations
Over the past several years, most organizations have become increasingly politically left-wing. This is not a problem in principle, but the ideological homogeneity of such groups can create environments that encourage groupthink, foster systematic biases, and suppress dissent. Left unchecked, politically homogeneous communities can become engines for false narratives, creating an inefficient and unreliable basis for effective policy.
Just as experts can be biased in their research, manipulating methods and analyses to confirm their preferred hypotheses, organizations and policy-makers can be biased in their recruitment of research and expertise, selectively attending to (and ignoring) research to support their political agenda or bottom line. Organizations frequently apply scientific findings to policy-making prematurely, based on unjustified assumptions about how laboratory findings generalize to real-life situations. Moreover, organizations and the broader public tend to falsely equate scientific support with truth and certainty. In reality, when Scientific Report A says X, there is often a Scientific Report B saying Y. People and organizations can then choose to believe whichever report is more convenient.
Most organizations are liberal (or answer to liberals) and so their biases likely resemble those of liberal scholars (and likely include discrimination against conservatives and conservative perspectives). To minimize these concerns in organizational decision making, organizations might consider the reforms that science is seeking to adopt. First, organizations can increase accountability by promoting transparency in all decision making. Second, they can adopt an adversarial collaboration approach: solving problems by engaging both advocates and dissenters of particular perspectives. This could mean engaging adversaries directly, funding research that takes an adversarial collaboration approach, or consulting diverse expertise.
Any human-guided endeavor that addresses social issues will be vulnerable to biases that can steer people away from accuracy. Humans conceive questions and problems; humans collect data to address them; humans disseminate findings to other humans; and humans apply those findings to create social change. Just as in the game of telephone, where an original message is corrupted as it passes from person to person, so too in science, biases distort each stage of the research process. Despite these flaws, consulting experts and collective knowledge remains the best basis for making many kinds of decisions. When we do consult with experts, however, we should be sure to ask a critical question: "Which other experts disagree with you on this?"
This essay also appeared on Cast from Clay’s Vantage Point.
Brown, A. (2018, July 26). Most Americans say higher ed is heading in wrong direction, but partisans disagree on why. Pew. https://www.pewresearch.org/fact-tank/2018/07/26/most-americans-say-hig…
Clark, C. J. (2021). Are liberals really more egalitarian? Psychology Today.
Clark, C. J. (2021). How we empower political extremists. Psychology Today.
Clark, C. J., Fjeldmark, M., Baumeister, R. F., German, K., Tice, D., von Hippel, B., Winegard, B. M., & Tetlock, P. E. (2021). Taboos and self-censorship in the social sciences [Unpublished manuscript]. Department of Psychology, University of Pennsylvania, Philadelphia, PA.
Clark, C. J., Honeycutt, N., & Jussim, L. (2021). Replicability and the psychology of science. In S. Lilienfeld, A. Masuda, & W. O’Donohue (Eds.), Questionable Research Practices in Psychology. New York: Springer.
Clark, C. J., Keighley, D., & Vasiljevic, M. (2021). Being bad to look good: Competence reputational stakes can increase unethical behavior [Unpublished manuscript].
Clark, C. J., Liu, B. S., Winegard, B. M., & Ditto, P. H. (2019). Tribalism is human nature. Current Directions in Psychological Science, 28(6), 587-592.
Clark, C. J., & Tetlock, P. E. (in press). Adversarial collaboration: The next science reform. In C. L. Frisby, R. E. Redding, W. T. O’Donohue, & S. O. Lilienfeld (Eds.), Political Bias in Psychology: Nature, Scope, and Solutions. New York: Springer.
Clark, C. J., & Winegard, B. M. (2020). Tribalism in war and peace: The nature and evolution of ideological epistemology and its significance for modern social science. Psychological Inquiry, 31(1), 1-22.
Clark, C. J., Winegard, B. M., & Farkas, D. (2020). A cross-cultural analysis of censorship on campuses [Unpublished manuscript].
Cohen, G. L. (2003). Party over policy: The dominating impact of group influence on political beliefs. Journal of Personality and Social Psychology, 85(5), 808-822.
DeMarree, K. G., Clark, C. J., Wheeler, S. C., Briñol, P., & Petty, R. E. (2017). On the pursuit of desired attitudes: Wanting a different attitude affects information processing and behavior. Journal of Experimental Social Psychology, 70, 129-142.
Ditto, P. H., Clark, C. J., Liu, B. S., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F. (2019). Partisan bias and its discontents. Perspectives on Psychological Science, 14(2), 304-316.
Ditto, P. H., Liu, B. S., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F. (2019). At least bias is bipartisan: A meta-analytic comparison of partisan bias in liberals and conservatives. Perspectives on Psychological Science, 14(2), 273-291.
Duarte, J. L., Crawford, J. T., Stern, C., Haidt, J., Jussim, L., & Tetlock, P. E. (2015). Political diversity will improve social psychological science. Behavioral and Brain Sciences, 38, e130.
Durkee, P. K., Lukaszewski, A. W., & Buss, D. M. (2020). Psychological foundations of human status allocation. Proceedings of the National Academy of Sciences, 117(35), 21235-21241.
Eitan, O., Viganola, D., Inbar, Y., Dreber, A., Johannesson, M., Pfeiffer, T., ... & Uhlmann, E. L. (2018). Is research in social psychology politically biased? Systematic empirical tests and a forecasting survey to address the controversy. Journal of Experimental Social Psychology, 79, 188-199.
Ellsworth, P. C. (2021). Truth and advocacy: Reducing bias in policy-related research. Perspectives on Psychological Science, 1745691620959832.
Fernbach, P. M., & Light, N. (2020). Knowledge is shared. Psychological Inquiry, 31(1), 26-28.
Finkel, E. J., Bail, C. A., Cikara, M., Ditto, P. H., Iyengar, S., Klar, S., ... & Druckman, J. N. (2020). Political sectarianism in America. Science, 370(6516), 533-536.
Forscher, P. S., Lai, C. K., Axt, J. R., Ebersole, C. R., Herman, M., Devine, P. G., & Nosek, B. A. (2019). A meta-analysis of procedures to change implicit measures. Journal of Personality and Social Psychology, 117(3), 522-559.
Frimer, J. A., Skitka, L. J., & Motyl, M. (2017). Liberals and conservatives are similarly motivated to avoid exposure to one another's opinions. Journal of Experimental Social Psychology, 72, 1-12.
Hanania, R. (2021, April 21). Why is everything liberal? https://richardhanania.substack.com/p/why-is-everything-liberal
Honeycutt, N., & Jussim, L. (2020). A model of political bias in social science research. Psychological Inquiry, 31(1), 73-85.
Inbar, Y. (2020). Unjustified Generalization: An Overlooked Consequence of Ideological Bias. Psychological Inquiry, 31(1), 90-93.
Inbar, Y., & Lammers, J. (2012). Political diversity in social and personality psychology. Perspectives on Psychological Science, 7, 496-503.
Janis, I. (1991). Groupthink. In E. Griffin (Ed.) A First Look at Communication Theory (pp. 235-246). New York, NY: McGraw Hill.
Kaufmann, E. (2021). Academic freedom in crisis: Punishment, political discrimination, and self-censorship. Center for the Study of Partisanship and Ideology, 2, 1-195.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098-2109.
Lucas, B. J., & Kteily, N. S. (2018). (Anti-) egalitarianism differentially predicts empathy for members of advantaged versus disadvantaged groups. Journal of Personality and Social Psychology, 114(5), 665-692.
Miguel, E., Camerer, C., Casey, K., Cohen, J., Esterling, K. M., Gerber, A., ... & Van der Laan, M. (2014). Promoting transparency in social science research. Science, 343(6166), 30-31.
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359-1366.
Singal, J. (2021). The quick fix: Why fad psychology can’t cure our social ills. New York, NY: Farrar, Straus and Giroux.
Stewart‐Williams, S., Chang, C. Y. M., Wong, X. L., Blackburn, J. D., & Thomas, A. G. (2021). Reactions to male‐favouring versus female‐favouring sex differences: A pre‐registered experiment and Southeast Asian replication. British Journal of Psychology, 112(2), 389-411.
Stroud, N. J. (2010). Polarization and partisan selective exposure. Journal of Communication, 60(3), 556-576.
Tetlock, P. E. (1994). Political psychology or politicized psychology: Is the road to scientific hell paved with good moral intentions? Political Psychology, 15, 509-529.
Winegard, B. M., & Clark, C. J. (2020). Without contraries is no progression. Psychological Inquiry, 31(1), 94-101.
Winegard, B. M., Clark, C. J., Hasty, C. R., & Baumeister, R. F. (2018). Equalitarianism: A source of liberal bias [Unpublished manuscript].