Cognition
Why I Stopped Caring About Enhancing Everyone’s Thinking
The natural selection of critical thinking and the Socratic paradox.
Posted February 20, 2025 Reviewed by Michelle Quirk
Key points
- Critical thinking requires acknowledging what you don’t know.
- When events occur in the world, we often think in terms of a narrative chain to make sense of them.
- It takes intellectual confidence and integrity to refrain from inferring a conclusion when not appropriate.
It might be a symptom of some bias associated with self-reflection, but I’ve noticed a distinct "vibe" lately that comes through in my writing about critical thinking (CT). In a variety of instances, it seems to me that my writing might convey some sense of being jaded by the whole project of "helping the cause" of getting people to think critically. So, I thought about it, because I don’t want readers to get the wrong impression if my writing does indeed convey such a vibe. I’m not jaded… but at the same time, I’m not unrealistically optimistic either. With that, I’ve come to conclude that though the enhancement of CT is an important goal of mine, enhancing everyone’s CT is not.
I often make the point on this page that we should only really think critically in scenarios that are important to us or if the topic under the microscope is something we truly care about. Of course, much of this "importance" is subjective. With that comes personal responsibility. As the term implies, it’s personal: It’s your business, not mine. If you regularly make poor decisions, that’s on you—especially if you claim to have "done your research" (insert eye roll) or thought critically about it (even though you haven’t or maybe even don’t know what it actually means in practice). If you need to up your CT game, you need to seek out practice and opportunities to do it. CT researchers are not household names; you’ll only know about the research if you look for it. It’s ridiculous for researchers like me to believe we have a chance of reaching everyone. Academics and researchers are not rockstars, and, unfortunately, there are large numbers of people out there who distrust science and research. With that, if you do want to work on your thinking and get better at it, then that’s my business—you’re the type of person for whom I work.
CT isn’t about knowing something or not knowing it. It’s about knowing the difference between the two and applying thinking processes to efforts toward gaining that knowledge. You have to acknowledge what you don’t know.
Unfortunately, that’s not very common in the real world (e.g., consider the Dunning–Kruger effect; Kruger & Dunning, 1999). There are people who apply CT (some more often than others) and then there are some who don’t do it at all. I used to get frustrated by this because of some idealistic sense of duty to the world: that I should try to reach out to everyone in an effort to make this planet a better place through enhancing CT. But, as the years went by and youthful idealism faded, I found myself no longer concerned with the people who don’t value good thinking. That is not a jaded perspective, just a realistic one. Some people cannot be helped—that is, not until they want the help. It’s also a practical perspective. Indeed, the most practical focus I can think of, with respect to my reach, is that cohort of people who know how to think critically but don’t do it as much as they should (yet want to), or people who value it but struggle to do it.
The Socratic Paradox
I think the crux of this point really boils down to the aforementioned concept of knowing the difference between knowing something and not knowing it. Of course, I write about epistemology frequently on this blog, but I think the notion of the Socratic paradox, specifically, best exemplifies what I mean.
I always thought Socrates was a badass. He didn’t care what people thought about him. He stood by his intellectual integrity, and he was killed for it. Ironically, perhaps it’s because of what people "thought" that he didn’t care. Simply, they didn’t know anything. Socrates thought at length about knowledge and the knowledge of others. He knew he didn’t know anything, but neither did anyone else. According to this take on the Socratic paradox, when the Oracle of Delphi declared Socrates the wisest person in Athens, Socrates believed the Oracle. But how could a man who knows nothing be the wisest man? Socrates concluded that if he knew nothing and was wiser than everybody else, then it must be because he was the only person in Athens who recognised his own ignorance.
It is also ironic that the most credible accounts of Socrates are those from his pupil, Plato. Socrates didn’t capture his thoughts in writing. Essentially, we are depending on a story told by Plato—a narrative—as evidence of Socrates’ teaching. Did Socrates really say/think that? I don’t know. But the ability to say "I don’t know" in this manner is such a boss move when it comes to CT—it takes intellectual confidence and integrity to admit that and refrain from inferring a conclusion when it is not appropriate to do so… it seems like such a move would make Socrates proud.
Narrative Fallacy
Fast-forward more than two millennia, and we have the work of the recently passed Daniel Kahneman, work that reinforces many similar Socratic teachings; for example, his observation that the "comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance" (Kahneman, 2011). Similarly, in my first book (Dwyer, 2017), I wrote about humanity’s desire, in the midst of information processing, to tie things up into nice, neat little packages. Indeed, this stance is well reflected in what we know about the narrative fallacy (Taleb, 2007). When events occur in the world, we often think in terms of a narrative chain to make sense of them. We dislike uncertainty; we want closure. Narratives provide a cause-and-effect tale for what happened: a linear, sequential means of processing information for remembering, understanding, or some other cognitive application… a nice, neat little story.
Unfortunately, these stories are often fictionalised accounts of the events in question. When information is missing, we implicitly fill in the gaps with our own thoughts, feelings, and attitudes. We draw links that may not exist (e.g., inferring causation from mere correlation). This tendency creates a barrier to our understanding that, sometimes, random events are just that—random; "black swans" in our everyday lives (Taleb, 2007). If the story is corroborated through our "makes-sense epistemology" (Eigenauer, 2024), we’re likely to go along with it. (A nice example of this is the timeline featured in this article by Barry Ritholtz.)
Of course, we’d rather "know" than not know, so we create "a knowledge." However, such knowledge isn’t necessarily fact. Through CT, we can evaluate such scenarios and determine that the truth might well be that we don’t know at all.
If you know you don’t know, great. Socrates would approve. But, if you’re the type who rarely acknowledges a lack of knowledge in various scenarios, perhaps it’s time to seek out some method for enhancing your CT.
References
Dwyer, C. P. (2017). Critical Thinking: Historical Perspectives and Practical Guidelines. Cambridge, UK: Cambridge University Press.
Eigenauer, J. (2024). Mindware: Critical Thinking in Everyday Life. Journal of Intelligence, 12(2), 17.
Kahneman, D. (2011). Thinking, Fast and Slow. London, UK: Penguin.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.
Taleb, N. N. (2007). The Black Swan: The Impact of the Highly Improbable. New York: Random House.