Seven Steps Toward Better Critical Thinking
How to avoid knowing what isn't so.
Posted Sep 06, 2016
It's been said that the problems you encounter in life stem not so much from what you don't know, but from what you know for sure that isn't so. Who said it? We don't know, although many people are certain that it was Mark Twain. More on that later. For now, why would it be less hazardous—to your health, to productivity, to happiness—to not know a whole bunch of things than to believe things that aren't true? Because if you're sure that you know something, you act on it with the strength of conviction and resolve.
If you're sure that an alternative treatment will help cure your cancer better than "Western medicine," you'll forgo the traditional treatment. This is exactly what happened to Steve Jobs—after being diagnosed with pancreatic cancer, he pursued a kind of new age, Northern-California alternative diet in lieu of medical treatment. By the time he realized it wasn't working, it was too late for medicine to help him.
If you're sure that your choice of political candidate is right, if you know it for sure, you're not likely to be open-minded about any new evidence that might come in that could—or should—cause you to change your mind.
I am a college professor, and one of the things I do for a living is train Ph.D. students in science. They come into my laboratory full of confidence. After all, they have been at the top of every class they've been in all throughout their school lives. If they hadn't been, they wouldn't have gotten into a first-rate college, and if they hadn't been at the top of their classes there, they wouldn't have gotten into the very competitive graduate programs at the universities where I've taught and those like them—the Stanfords, Berkeleys, Dartmouths, and McGills. But here's the problem: They come in thinking that they are hot stuff. They have learned massive amounts of information, and unfortunately, they are so sure that their knowledge is correct, they are wont to add new knowledge without questioning the foundations of the old. During their time under my tutelage, most of my effort goes into teaching them that they don't know what they think they do. I don't teach graduate students so much as unteach them. This takes four to six years. In some cases, eight.
When a graduate student comes to me and says, "I just realized I don't know anything about cognitive neuroscience," I congratulate them and tell them they're now ready to receive the Ph.D. The Ph.D. is effectively a license for someone to become a lifelong learner, certifying the kind of open-mindedness and critical thinking skills necessary to become a creator of knowledge. Knowledge can't be created in an environment where everything is already known. It can only be created in an environment where we're open to the possibility that we're wrong. For those of you steeped in Eastern philosophy, you'll recognize the Zen connection. The philosopher Alan Watts wrote a book about this, The Wisdom of Insecurity.
I wrote A Field Guide to Lies because I think that all of us are capable of this kind of critical thinking, regardless of our educational background. The kind of inquisitiveness and curiosity I'm talking about is innate. Every four-year-old asks a series of incessant "why" questions: Why is there rain? Because of condensation. Why is there condensation? Because of changing temperature conditions. Why are there changing temperature conditions? Et cetera. We have this beaten out of us early on by worn-down parents and teachers. But this why mode is the key to all critical thinking. Think like a four-year-old. Ask "why" and "how." Ask them often.
- Don't believe something just because everyone else does. If you like Latin, this is called argumentum ad populum. Yes, there is such a thing as the wisdom of the crowds, but it has limited applicability, especially when the crowds aren't thinking critically. Believe something because you find the evidence compelling. (Think slavery.) As Tolstoy said, “Wrong does not cease to be wrong because the majority share in it.” Or St. Augustine: “Right is right even if no one is doing it; wrong is wrong even if everyone is doing it.”
- Don't believe something just because it is backed by a fancy website, or scientific terms or equations. Pseudo-science hijacks the words of science without using the methods of science to get you to believe things that aren't so. Too many of us are bamboozled by fancy terms, bold headlines, and testimonials. Take a moment to look more carefully at the evidence being presented. There is no miracle pill that will enhance brain function, no magnet bracelet that will enhance stamina.
- Don't reject a source just because it is occasionally wrong. Don't accept a source just because it got one or two high-profile things right. The New York Times is one of the most reliable and rigorously fact-checked news sources in the world. They do make mistakes and they print corrections every day. But on the whole, if you read something there, it has a very high likelihood of being true. Supermarket tabloids do occasionally get stories right, but on the whole, if you read something there, it is unlikely to be true. Elvis is not alive on a spaceship circling the moon, and Michelle Obama does not have a newly discovered identical twin sister.
- Check for plausibility. Many claims are just impossible; many more are improbable. A car that needs no fuel and can generate its own power seems to contradict the laws of physics. A 200-year-old woman living in China whose secret to longevity is smoking two packs of cigarettes a day flies in the face of medical science. One widely reported statistic was that 150,000 girls in the U.S. die each year of anorexia. That can't be true: The total number of deaths for girls from all causes in a single year is only about 8,500 (or 55,000 if your definition of "girls" includes women under the age of 44). You'd find that out by checking reputable sources, such as the CDC.
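The logic of that sanity check is simple enough to sketch in a few lines of code. The figures below are the article's rounded numbers, used purely for illustration, not current CDC data:

```python
# Plausibility check: could 150,000 U.S. girls die of anorexia each year?
# Figures are the rounded totals cited in the article, not live CDC data.
claimed_anorexia_deaths = 150_000      # the widely reported statistic
total_deaths_girls = 8_500             # deaths from ALL causes, girls, per year
total_deaths_women_under_44 = 55_000   # all causes, with a broader definition

def is_plausible(claimed: int, ceiling: int) -> bool:
    """A claimed subset can never exceed the total it is part of."""
    return claimed <= ceiling

print(is_plausible(claimed_anorexia_deaths, total_deaths_girls))          # False
print(is_plausible(claimed_anorexia_deaths, total_deaths_women_under_44)) # False
```

The check fails under either definition of "girls," which is exactly the reasoning a reader can do mentally: a part cannot be larger than its whole.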
- Correlation is not causation. Two things can change together, but it doesn't mean that one caused the other. Ice-cream sales tend to increase during months when people are wearing short pants, but you wouldn't want to conclude that eating ice cream causes people to wear shorts, or that wearing shorts causes people to eat ice cream. A third factor, high temperatures, could be said to cause both. But not all things that occur together are influenced by a third factor, either: The day that the stock market reached an all-time peak, I saw a whale jump out of the ocean in Washington state. I don't think he was celebrating the increase in his portfolio. The two events are probably unrelated.
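A toy simulation (all numbers invented for illustration) makes the ice-cream example concrete: generate two variables that each depend on temperature and not at all on each other, and they still correlate strongly.

```python
import random

random.seed(0)

# A hidden third factor (temperature) drives both observed variables.
temps = [random.uniform(0, 35) for _ in range(1000)]        # daily temp, degrees C
ice_cream = [3 * t + random.gauss(0, 5) for t in temps]     # sales rise with heat
shorts = [2 * t + random.gauss(0, 5) for t in temps]        # shorts-wearing rises too

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Strongly positive, even though neither variable causes the other:
print(round(pearson(ice_cream, shorts), 2))
```

The correlation comes out well above 0.8, yet by construction there is no causal link between the two series; the shared dependence on temperature does all the work.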
- Does the evidence actually support the conclusion? Fast-talking, loose purveyors of information may flummox you with a whole bunch of data that aren't related to the claim. Sometimes they do this intentionally; sometimes they don't know that they're doing it. Consider an investment manager's claim that he can double your money in three years. He cites as evidence his academic degrees and a new system he has developed. Those are not evidence that he can do what he says—they may add credibility, but they are not evidence. Even previous high-yield performance is not evidence—investments are inherently risky, conditions can change, and the economic climate is a complex and unpredictable system.
- Look for a missing control condition. A new pill claims to cure headaches within four hours. A look at the evidence reveals that people with headaches were given the pill and reported that their headaches got better. What we don't know is how many headaches would have gotten better on their own in that time. To know that, you'd need a controlled experiment—that is, an experiment in which a randomly assigned control group receives a placebo pill (no active treatment) and is compared with the treatment group.
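A minimal simulation shows why the control group matters. The recovery rate below is invented for illustration; the pill in this sketch is deliberately given zero effect:

```python
import random

random.seed(42)

SPONTANEOUS_RECOVERY = 0.60  # assumed: fraction of headaches that resolve anyway
PILL_EFFECT = 0.0            # the pill does nothing in this simulation
N = 1000                     # participants per group

def trial(extra_effect: float) -> float:
    """Fraction of participants whose headache resolves within four hours."""
    recovered = sum(
        random.random() < SPONTANEOUS_RECOVERY + extra_effect for _ in range(N)
    )
    return recovered / N

treatment = trial(PILL_EFFECT)  # took the (useless) pill
control = trial(0.0)            # placebo group

# Without the control, "about 60% of pill-takers got better" sounds impressive;
# compared against the control, the pill's apparent effect is near zero.
print(round(treatment - control, 2))
```

The treatment group alone looks like a success story—most headaches resolved—but the difference from the control group, which is the only number that measures the pill itself, hovers around zero.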
A wealthy white socialite may not believe claims of police brutality because her interactions with those nice officers have always been so pleasant. But she is not controlling for the fact that her economic class, neighborhood, and race may be contributing to those interactions. In the language of science, she has not controlled for those factors in forming her opinion. White journalists Ray Sprigle and John Howard Griffin pioneered the study of such interactions by posing as African Americans and documenting very different treatment.
Allowing ourselves to realize that we don't always know what we think we know opens our minds to new knowledge, and allows us to navigate the world more effectively, choosing among options (or political candidates) that are more likely to maximize our success and well-being. In the current election climate, many people decided early on which candidate they wanted to support, based either on a gut feeling or the information they had back then. If they're not open to new information as it becomes available, they may support someone who is unlikely to embody the principles they value.
Mark Twain is widely cited as stating some version of the phrase that opened this article, that it ain't what you don't know, but what you know for sure that ain't so that will get you in trouble. Many people believe he said it. A thorough search of sources reveals that he not only didn't say it, but didn't say anything like it. The source of the quote is unknown. Sometimes you don't know what you think you do.
Daniel J. Levitin is James McGill Professor at McGill University, Dean of the College of Social Sciences at the Minerva Schools at KGI, and Distinguished Faculty Fellow at the Haas School of Business, UC Berkeley. His current book is A Field Guide to Lies: Critical Thinking in the Information Age.