Social Barriers to Critical Thinking
Thinking about the application of critical thinking in public settings.
Posted August 13, 2024 Reviewed by Michelle Quirk
Key points
- We must acknowledge our biases when evaluating research presented via media and strive to find the source.
- In situations that have important consequences, how we deal with the bias-based conflict is what matters.
- Those living in places where free speech is protected are lucky; this right should not be taken for granted.
I recently wrote a research paper on cognitive barriers to critical thinking (CT), discussing flaws in thinking associated with intuitive judgment, emotion, bias, and epistemological misunderstanding, as well as inadequate CT skills and dispositions (Dwyer, 2023). A colleague extended this line of thinking by asking me a number of specific questions about social barriers to CT. After considering these questions for a bit, I thought it useful to answer some of them here as a consideration of CT in social situations.
What happens when people believe they’re thinking critically, but they are just repeating some party line?
The simple answer is that because the individual isn’t thinking critically and is simply telling you what they believe, it’s up to you to decide whether or not it’s worth the effort to tell them. This depends on who the person is and how open they are to changing their mind—something people are quite hesitant to do; so, this might well be a futile endeavour. I would probably avoid engaging unless it’s someone I care about who’s about to make an important decision based on erroneous information. Context is important here.
Of course, this folly is an example of in-group bias. The individual likely believes that their "group" has thought critically about the topic in question because they believe said party is credible with respect to the information it presents. Thus, the individual might fail to evaluate the claim themselves because they are using their party’s thinking as some form of "expert opinion," even when there might be no relevant expertise to cite.
But, let’s say some research has been cited. Though the individual is right to invoke research in the sense that research represents the most credible source of evidence, that does not ensure that this particular piece of research is credible. For example, consider how most people hear about new research. Academics know to read the relevant peer-reviewed journals, but not everyone is an academic. Most people hear about research from the news. It’s easy for a TV program or news radio show to talk about new research, but how sure can we be that such sources know how to properly interpret said research? Moreover, how do we know that the research was adequately conducted? We are hearing about research from a secondary source as opposed to the people who conducted it. This is problematic because a lot can be lost in the translation from the initial source, through the "middleman," and on to the public. As consumers of information, we must acknowledge our own potential biases when evaluating research presented to us through media outlets and strive to find the source of the research to ensure that we’re getting the full story.
Does one’s ideology and self-interest play a role in CT?
Ideology and self-interest are essentially bias-based cognitive structures; so, yes, they can affect one’s CT. However, if your decision is made in light of ideology and/or self-interest, then what you’re doing is not CT. If the information a person is presented with aligns with their pre-existing worldviews, they are likely to treat it as new information or additional knowledge. Simply put, if the information supports what we already believe, we are more likely to trust it (i.e., consider confirmation bias). However, if the information contradicts such worldviews, we’re more likely to declare it "fake news" without looking into it much further or, instead, to pick flaws in it. This happens to the best of us from time to time, especially if the stakes aren’t particularly high (i.e., the decision doesn’t bear any important consequences).
But, in situations that have important consequences, how we deal with the bias-based conflict is what matters. Our intuitive judgment will always tell us our gut feeling on a matter, but whether or not we engage in reflective judgment and dig deeper into the matter will determine whether or not we think critically. A critical thinker will look further into an important idea that they initially considered silly and might find that it’s actually well-supported by evidence (or it may not be, but at least they made the effort to further evaluate). Such evidence might lead them to further question the perspective and, ultimately, change their mind.
Is it worth sharing one’s CT in environments that punish CT?
This is a tough question because there are two equally acceptable answers—an idealistic one (yes) and a practical one (no)—the application of which, again, comes down to context. Some environments might discourage or even punish CT if the conclusions drawn contradict what is deemed "acceptable" (be it socially, politically, or even legally). In such cases, staying "quiet" seems like a practical and prudent move (even though it contradicts what many might view as intellectual integrity). That is, what’s more important, being right or avoiding punishment? Another way of looking at this is to ask whether speaking up is just a matter of being right, or whether the other party’s mistake is going to impact you in an important way. Is that "important way" worth the potential punishment? Context is a key consideration here. Of course, environments where free speech is encouraged change things a bit; but if your CT contradicts the status quo, though you may not be "punished" for your conclusions, you might risk other negative knock-on effects. Sure, the ideal might seem more palatable in this context (i.e., sharing your CT), but there are many who might well stay quiet for reasons of practicality. Again, it depends on their own personal context (e.g., are you only risking offending someone, or could you potentially put your employment in danger by stating your conclusions?).
All in all, each situation requires evaluation and appraisal of whether or not it is worth sharing one’s CT. From an idealistic perspective, this is a shame. Ideally, one should always feel free to share their thinking if CT has been applied. However, this is not always the practical strategy. Ultimately, what one can actually gain from sharing their conclusions (relative to what is likely to be lost) is what should determine whether or not such thinking is shared (e.g., are you in a meaningful position to genuinely elicit positive change?). The only real conclusion I can draw in this context is that those living in places where free speech is protected are truly very lucky, and this right should not be taken for granted. It should be practised and maintained, but it is also imperative that it is well informed. If it’s not, someone else with the right to speak freely, who has engaged in CT, will hopefully call out that erroneous information. Of course, I recognise how that might seem a bit idealistic because, unfortunately, as discussed above, many people often believe they have thought critically, even when they have not.
References
Dwyer, C. P. (2023). An evaluative review of barriers to critical thinking in educational and real-world settings. Journal of Intelligence, 11(6), 105. https://doi.org/10.3390/jintelligence11060105