18 Common Logical Fallacies and Persuasion Techniques
The information bombardment on social media is loaded with them.
Posted Aug 25, 2017
It has been suggested that approximately five exabytes (i.e. about 5,000,000,000 pickup truck beds full of information typed on paper) of data are created each day. What is tougher to decipher is how much of this information consists of spurious assertions, conspiracy theories, or misinterpreted information.
Navigating this bombardment of information and processing it appropriately requires not only acquiring knowledge, but also adapting in light of existing knowledge, through critical thinking. For example, when we engage with information on social media, we must decide whether or not what has been presented in the post is a legitimate claim. However, what often increases "cognitive load" (Sweller, 2010) is the multitude of arguments presented, by social media users, in the ensuing comment threads.
The quality of each argument in a thread varies from comment to comment, with respect to credibility, relevance, logical strength, the balance of evidence and the level of bias. Generally, users will present an argument so as to persuade you to ‘see their side’. There is nothing wrong with trying to persuade someone else to look at a topic from your perspective, particularly if you present credible evidence. Quite often, however, users will not have credible evidence and will use other devices of argumentation to sway thinking, such as logical fallacies.
Social media is many things: entertainment, education and networking, just to name a few. Unfortunately, it is also a vehicle for promoting faulty thinking. Below, I have compiled a list of 18 forms of persuasion techniques, illogical argumentation and fallacious reasoning that I commonly encounter in my use of social media. By learning about these devices, you will be more likely to recognise their use, avoid using them yourself, and better assess arguments presented to you.
1. Ad hominem (‘to the man’) refers to an attack on the person; for example, regarding their past or personal traits, as a means of undermining/opposing their argument, without having to provide any evidence. Loaded questions evoke a similar effect.
2. Anecdotal Evidence is personal experience. Anecdotes can be a very powerful tool of persuasion but are a weak basis for an argument. We cannot generalize one person’s experience to the population at large. Other people may have had very different experiences. If we account for many experiences (e.g. 1,000 instead of 1), then we might be able to make some generalizations.
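The statistical intuition behind this point can be illustrated with a short simulation. The sketch below is hypothetical: it assumes a made-up "population" of individual experiences scored 0–100 with a true average of 50, just to show how a single anecdote can stray far from the truth while 1,000 pooled experiences cluster tightly around it.

```python
import random

random.seed(0)  # fixed seed so the demonstration is reproducible

# Hypothetical population of individual experiences, scored 0-100,
# with a true average of 50 and plenty of person-to-person variation.
POPULATION_MEAN = 50

def experience():
    """One person's experience: noisy, and possibly far from typical."""
    return random.gauss(POPULATION_MEAN, 20)

one_anecdote = experience()                     # n = 1
many = [experience() for _ in range(1000)]      # n = 1,000
sample_mean = sum(many) / len(many)

print(f"single anecdote: {one_anecdote:.1f}")
print(f"mean of 1,000:   {sample_mean:.1f}")
```

Run it a few times with different seeds: the single anecdote bounces around wildly, while the 1,000-experience average stays close to 50, which is exactly why generalizing from one story is weak while aggregating many is defensible.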
3. The Appeal to Authority can be tricky, because it’s not always illogical. It would be wrong to think something is true just because an authority figure said it is; however, if it was an authority who is expert in the field relevant to the issue, then it might be illogical to believe the opposite. Expert opinions are a strong source of credibility, given that these opinions are often based on empirical evidence. However, experts do not always agree when it comes to evaluating the evidence; and sometimes, an expert makes a bold statement that lacks credibility because it lacks supportive evidence (in which case the appeal to authority would be a fallacy).
4. An Appeal to Emotion aims to manipulate emotions or evoke an affective response to gain acceptance, as opposed to using logically compelling evidence. Appeals to pity and compassion are among the most common forms of this argument.
5. The Bandwagon Argument is simply an appeal to popularity. For example, “Everyone else is doing it, so why don’t you?” or “Most people believe x, so x must be true.” The bandwagon argument is often based on common belief statements (e.g. “Everyone knows that opposites attract,” a common adage that is actually not the case), which are generally weak with respect to credibility.
6. Begging the Question is based on circular reasoning (e.g. “We need to cut spending as too much money is being spent”), generally resulting from an individual taking a certain premise for granted.
7. The Black-or-White Fallacy is the provision of only two alternatives in an argument, when there are actually more options available. That is, numerous ‘shades of grey’ are also possible, but are not addressed.
8. The Burden of Proof Fallacy occurs when a claim is made and expected to be accepted because it has not been disproved or even adequately disputed. However, this does not mean the claim is true. As this issue often rests on potential (un)certainty, in such cases, it will require reflective judgment (King & Kitchener, 1994).
9. Card-stacking is a method of argumentation in which important counter-arguments are purposefully omitted, creating an imbalance of evidence in an effort to bias the argument.
10. The Fallacy Fallacy refers to dismissing a claim (which may be true) altogether solely because it has been poorly argued (e.g. illogical or with suspect evidence) or because a fallacy was used in arguing its case.
11. The False Cause Argument, or mistaking correlation for causation, refers to the assumption that because two things are related, one must cause the other. For example, 100% of murderers drink water; therefore, drinking water causes people to kill.
12. The Gambler's Fallacy refers to the belief that streaks affect statistically independent phenomena. Simply, there is a one in two chance of a coin landing tails up, so based on this assessment, some might say if heads comes up on the first flip, then it seems likely the coin will come up tails on the second flip. This would be an incorrect assessment of probability, as coins do not have a memory. The same goes for roulette wheels. Every flip and every spin is new and is not dictated by what happened previously. Thus, the probability of flipping a coin and getting tails eight times in a row is the very same as getting HTHTHTHT. The conceptualisation of the gambler’s fallacy is quite similar to the Representativeness Heuristic (Kahneman, 2011; Tversky & Kahneman, 1974).
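Both claims in this entry can be checked numerically. The sketch below (a minimal illustration, not tied to any particular source) first computes that any specific 8-flip sequence has probability (1/2)^8 = 1/256, and then empirically confirms that after a run of three heads, the next flip is still roughly 50/50.

```python
import random

random.seed(42)  # fixed seed so the demonstration is reproducible

# Any specific sequence of 8 fair flips (TTTTTTTT, HTHTHTHT, ...)
# has exactly the same probability: (1/2) ** 8 = 1/256.
p_specific = (1 / 2) ** 8
print(f"P(any specific 8-flip sequence) = {p_specific}")  # 0.00390625

# Empirical check of "the coin has no memory": look at flips that
# follow a streak of three heads and count how often tails comes up.
n_trials = 100_000
streaks = 0
tails_after_streak = 0
for _ in range(n_trials):
    flips = [random.choice("HT") for _ in range(4)]
    if flips[:3] == ["H", "H", "H"]:          # three heads in a row
        streaks += 1
        if flips[3] == "T":
            tails_after_streak += 1

print(f"P(tails | three heads) ≈ {tails_after_streak / streaks:.2f}")
```

The conditional probability stays at about 0.5 no matter how long the preceding streak, which is precisely what the gambler's fallacy gets wrong.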
13. The Middle Ground Fallacy is almost the exact opposite of the black-or-white fallacy. For example, where two alternatives are proposed (generally extremes), the middle ground fallacy incorrectly supposes that the truth must rest somewhere in between (i.e. a shade of grey). However, it could very well be the case that truth rests in one of the two ‘extremes’.
14. Moving the Goalposts refers to adding related propositions with just enough content altered to continue an argument, in order to avoid conceding after the initial claim had been successfully counter-argued. Similar argument types that fall under this umbrella of fallacies include Special Pleading and No True Scotsman.
15. Personal Incredulity refers to the dismissal of a claim by an individual due to a lack of understanding of either the claim itself or the supports for that claim (e.g. an individual’s dismissal of evolution because they don’t understand it).
16. The Slippery Slope Argument concludes that if an action is taken, other negative consequences will follow. For example, “If event X were to occur, then event Y would (eventually) follow; thus, we cannot allow event X to happen.” This is often difficult to refute because it is not possible for us to see into the future and guarantee that the subsequent event won’t occur. Often, after critically thinking about patterns in human history, it may be that the subsequent event is likely to happen, in which case, the slippery slope argument may not be illogical. However, such judgment depends on the context of the argument. Regardless, what makes the argument fallacious is that it avoids engaging the argument at hand. It adds a component that isn’t necessarily relevant to the initial argument. Furthermore, the added component is generally emotionally loaded (e.g. fear-evoking).
17. The Strawman Fallacy involves misrepresenting an argument to make it easier to attack. For example, someone in opposition to your argument refutes it, often irrelevantly, by claiming that you are actually arguing in favor of something else. In this case, the ‘something else’ is the strawman the opposition has purposefully built in order to make it easier to refute your stance, even though the ‘something else’ was never argued for in the first place. Simply, a strawman is built so it can be knocked down.
18. Tu Quoque (translated from Latin as ‘you too’), or the argument of hypocrisy, refers to avoiding refutation or critique by reverting the same criticism back on to the accuser, without addressing the initial refutation. Another way of looking at this fallacy is as challenging a claim by asserting that the claimant’s behavior is inconsistent with the conclusion they have drawn. In this context, it is a type of ad hominem that rejects a proposition based on the traits of the claimant. For example, in response to the claim that “Eating fast food is unhealthy”: “But I saw you eat a burger and fries for lunch only a few hours ago!”
These are not the only logical fallacies or persuasion techniques out there—just the most common in my experience. If you’re interested in learning more about fallacies, I recommend checking out yourlogicalfallacyis.com. Given the ever-expanding ocean of worldwide information, it is important to learn about these argumentation devices so that you can become better able to navigate it.
Darling-Hammond, L. (2008). How can we teach for meaningful learning? In L. Darling-Hammond (Ed.), Powerful Learning (pp. 1–10). San Francisco: Jossey-Bass.
Dwyer, C. P. (2011). The evaluation of argument mapping as a learning tool. Doctoral thesis, National University of Ireland, Galway.
Dwyer, C. P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. UK: Cambridge University Press.
Kahneman, D. (2011). Thinking, fast and slow. UK: Penguin.
King, P. M., & Kitchener, K. S. (1994). Developing reflective judgment: Understanding and promoting intellectual growth and critical thinking in adolescents and adults. CA: Jossey-Bass.
Sweller, J. (2010). Cognitive load theory: Recent theoretical advances. In J. L. Plass, R. Moreno, & R. Brünken (Eds.), Cognitive Load Theory (pp. 29–47). New York: Cambridge University Press.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
Van Eemeren, F. H., Grootendorst, R., Henkemans, F. S., Blair, J. A., Johnson, R. H., Krabbe, E. C. W., Plantin, C., Walton, D. N., Willard, C. A., Woods, J., & Zarefsky, D. (1996). Fundamentals of argumentation theory: A handbook of historical backgrounds and contemporary developments. Mahwah, NJ: Erlbaum.