Karen Yu, Ph.D. and Warren Craft, Ph.D., MSSW

Choice Matters

Seeking Information in a Time of Uncertainty

Some less obvious yet significant challenges and some ways forward.

Posted Mar 17, 2020

In times of uncertainty, seeking additional information makes sense and can reduce anxiety. Yet information can take many forms, including misinformation and disinformation. Even when there is well-established, useful information available from reputable sources, misinformation and disinformation abound.

In cases where information is still emerging—e.g., in a rapidly evolving, unprecedented situation such as the current COVID-19 pandemic—misinformation and disinformation can rush in to fill the void. Most, if not all, of us have a general sense that the quality of our decisions and actions is influenced by the quality of the information that informs them. Yet there are a number of challenges to the quality of our information and beliefs that we may be far less aware of.

  • True and false information does not spread to the same extent or with the same speed. The ready access we have to so many information sources and the amplifying power of social media apply both to accurate, reliable information and to misinformation and disinformation, and it seems reasonable to assume that this amplification would apply equally to true and false information. Yet research suggests that this is not so: Misinformation actually spreads more widely and more quickly than accurate information. Indeed, a large-scale study1 that examined a vast data set of over 126,000 stories, tweeted and retweeted on Twitter from 2006 to 2017 by some 3,000,000 unique users, found that false stories were consistently propagated more widely, deeply, and quickly than true stories. For example, false stories spread to as many as 100,000 people and were retweeted by more unique users, while true stories rarely reached more than 1,000 people and took longer to do so. (One interesting question is how false information might differ systematically from true information in ways that lead to more sharing—e.g., might it be more emotional, more surprising... something else?)
  • Warnings alerting people to false claims can have unintended consequences. Even steps to reduce misinformation and disinformation require care. A recent study2 shows that while placing warnings on stories that have been disputed by fact-checkers can indeed reduce people’s belief in those stories, as well as the likelihood that they will share them, such warnings can also have unintended effects: Placing warnings on only a subset of stories can increase both the perceived truth of unflagged false stories and the likelihood that people will share them. Note that stories might not be flagged with a warning for at least two quite different reasons: (1) because their truth or falsity has yet to be determined, or (2) because they are in fact true. Interestingly, research tells us that people tend to assume the latter. That is, when some stories are tagged as false, people seem to implicitly interpret untagged stories as true: the implied truth effect. (One approach that media platforms might consider, then, is indicating not only stories that have been disputed as false but also those that have been validated as true.)
  • Repetition alone can influence what we believe to be true. Being intentional about the information we seek and interact with is also important given yet another phenomenon documented by research: the illusion of truth. Multiple studies across a wide range of conditions show that, all other things being equal and independent of the actual truth of the information, we are more likely to interpret information we have encountered before as true, oftentimes regardless of the credibility of the source. Even more challenging is the fact that the illusion of truth occurs largely outside of our conscious awareness: Without even realizing it, we can misinterpret the ease of processing that comes from repeatedly encountering particular information as an indication of its truth. Repetition also occurs in ways we might not immediately recognize: Even stating "It's not true that..." or "It's a myth that..." ends up repeating the falsehood and thus potentially increasing its perceived truth. And at least one study3 suggests that the illusion of truth can occur not just for repeated statements about topics we know little about, but even for repeated statements about topics for which we have pre-existing knowledge that clearly contradicts the repeated information.

Given the above, what can we do? Here are just a few suggestions—you'll likely think of others:

(1) Seek out reputable, evidence-based sources—sources known for their emphasis on investigative approaches and their reliance on peer-reviewed scientific, medical, and social science evidence, and organizations with a mission and track record of working to protect the public, independent of other influences (economic, political, and more). Seek out the direct communications of those sources rather than second- or third-hand reports—messages can become distorted in the retelling, as the childhood game of telephone reminds us. Increasing the likelihood that you are getting accurate information and analysis will also decrease the likelihood that you will encounter distorted or untrue statements (which, as noted above, can come to seem even more true with repeated exposure).

(2) Take care with phrasing and wording. Rather than saying, "It's not true that the earth is flat," which ends up repeating "the earth is flat" and—given the illusion of truth—potentially increasing its perceived truth, say what is true—e.g., "The earth is round," or, perhaps even better, "The earth is an oblate spheroid." This is an opportunity for intentionality and creativity, especially given that sometimes we don't yet know what is true, only that some specific statement is false.

(3) Share cautiously, and verify before you share. It's so easy to share information these days, and evidence suggests many people share information based on headlines alone. The potential combined effects of the phenomena described above highlight why verifying information before we share it and being circumspect about what we share are so important: Even with all good intentions, we can quite unknowingly be actively sharing a considerable amount of false information. Verifying before we share is both more important and more challenging in times of crisis and uncertainty, when we are so urgently trying to keep our loved ones, friends, and fellow citizens well-informed. That urgency can lead us to share too quickly; we need to share responsibly.

Knowing these things alerts us to the importance of carefully curating our information intake. This does not necessarily mean eliminating social media, especially given that scientists and other scholars are increasingly using these platforms to share valuable information. It does mean exercising care in what we read and follow, lest we come to believe and act upon information that is untrue.

But this is about even more than the accuracy of our individual beliefs—it is also about our social responsibility to others. Upholding this responsibility means being circumspect about the information we share, especially in light of the effects described above. Being a responsible consumer and sharer of information is especially challenging in times of crisis, when our individual and collective need and desire for information and the reassurance it can provide are especially great. And this is even more so in a time when information—both accurate and inaccurate—is for many of us so readily accessed and shared. In this time of uncertainty surrounding the COVID-19 pandemic, seeking and treating information with intentionality and care have never been more important.

References

1. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359, 1146-1151. https://doi.org/10.1126/science.aap9559

2. Pennycook, G., Bear, A., Collins, E. T., & Rand, D. G. (2020). The implied truth effect: Attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Management Science. https://doi.org/10.1287/mnsc.2019.3478

3. Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144, 993-1002. https://doi.org/10.1037/xge0000098