
Seeking Information in a Time of Uncertainty

Some less obvious yet significant challenges and some ways forward.

In times of uncertainty, seeking additional information makes sense and can reduce anxiety. Yet even when there is well-established, useful information available from reputable sources, misinformation and disinformation abound.

Particularly in situations where information is still emerging—e.g., in a rapidly evolving, unprecedented situation such as the current COVID-19 pandemic—misinformation and disinformation can rush in to fill the void. Most of us have a general sense that the quality of our decisions and actions is influenced by the quality of the information that informs them. Yet there are a number of challenges to the quality of our information and beliefs that we may not be so aware of.

  • False information spreads more quickly and more extensively than true information. The ready access we have to so many information sources, and the amplifying power of social media, apply to accurate, reliable information and to misinformation and disinformation alike. It seems reasonable to assume that this amplification would affect true and false information equally. Yet research suggests that this is not so: Misinformation actually spreads more widely and more quickly than accurate information.

Indeed, a large-scale study1 that examined a vast data set of over 126,000 stories, tweeted and retweeted on Twitter from 2006 to 2017 by some 3,000,000 unique users, found that false stories were consistently propagated more widely, more deeply, and more quickly than true stories. For example, false stories spread to as many as 100,000 people and were retweeted by more unique users, while true stories rarely reached more than 1,000 people and took longer to do so. One interesting question is how false information might differ systematically from true information in ways that lead to more sharing—e.g., might it be more emotional, more surprising... something else?

  • Warnings alerting people to false claims can have unintended consequences. Even steps to reduce misinformation and disinformation require care. A recent study2 shows that while placing warnings on stories that have been disputed by fact-checkers can reduce people’s belief in those stories and the likelihood they will share those stories, such warnings can also have unintended effects. In fact, placing warnings on only a subset of stories can increase the perceived truth of false stories that go unflagged. As if that weren't bad enough, it also increases the likelihood that people will share those false stories.

Note that there are at least two quite different reasons why stories might not be flagged with a warning: (1) because their truth or falsity has yet to be determined, or (2) because they are in fact true. Interestingly, research shows that when some stories are tagged as false, people implicitly interpret untagged stories as true: the implied truth effect. Knowing this, media platforms with an interest in accuracy should perhaps indicate not only stories that have been disputed as false but also those that have been validated as true.

  • Repetition alone can influence what we believe to be true. Being intentional about the information that we seek and interact with is also important given yet another phenomenon documented by research: the illusion of truth. Multiple studies across a wide range of conditions show that, all other things being equal and independent of the actual truth of the information, we are more likely to interpret information that we have encountered before as being true, oftentimes regardless of the credibility of the source.

And the illusion of truth occurs largely outside of our conscious awareness: Without even realizing it, we misinterpret the ease of processing that comes from repeatedly encountering particular information as indicating the truth of that information. Repetition occurs in ways we might not immediately realize: statements that "It's not true that..." or "It's a myth that..." repeat the falsehood they seek to refute, thus potentially increasing its perceived truth. Moreover, at least one study3 suggests that the illusion of truth can occur not just for topics we know little about, but even for topics about which we have pre-existing knowledge that clearly contradicts the repeated information.

Given the above, what can we do? Here are just a few suggestions—you'll likely think of others:

(1) Seek out reputable, evidence-based sources—sources known for their emphasis on investigative approaches and reliance on peer-reviewed scientific, medical, and social science evidence, and organizations with a mission and track record of working to protect the public, independent of other influences (economic, political, and more). Seek out the direct communications of those sources, rather than second- or third-hand reports—messages can become distorted in the re-telling and we need only think of the childhood game of telephone to remind us of that. Increasing the likelihood that you are getting accurate information and analyses will reduce the likelihood of encountering distorted or untrue statements—which, as noted above, can come to seem even more true with repeated exposure.

(2) Take care with phrasing and wording. Rather than saying, "It's not true that the earth is flat," which ends up repeating "the earth is flat" and—given the illusion of truth—potentially increasing its perceived truth, say what is true—e.g., "The earth is round," or, perhaps even better, "The earth is an oblate spheroid." This is an opportunity for intentionality and creativity, especially when we don't yet have a clear sense of what is true, only that some specific statement is false.

(3) Share cautiously, and verify before you share. It's so easy to share information these days, and evidence suggests many people share information based on headlines alone. The combined effects of the phenomena described above highlight why verifying information before we share it, and being circumspect about what we share, is so important: Even with the best of intentions, we can quite unknowingly spread a considerable amount of false information. Verifying before we share is both more important and more challenging in times of crisis and uncertainty, when we are urgently trying to keep our loved ones, friends, and fellow citizens well-informed. This urgency can lead us to share too quickly; we need to share responsibly.

Knowing these things alerts us to the importance of carefully curating our information intake. This needn't mean eliminating social media, especially given that scientists and other scholars are increasingly using these platforms to share valuable information. It does mean exercising care in what we read and follow, lest we come to believe and act upon information that is untrue.

But this is about more than the accuracy of our individual beliefs—it is also about our social responsibility to others, about being circumspect not only about the information we consume, but also about the information we share. Being a responsible consumer and sharer of information is especially challenging in times of crisis when the desire for information and the reassurance it can provide is especially great. And this is even more so in a time when information—both accurate and inaccurate—is for many of us so readily accessed and shared. In this time of uncertainty surrounding the COVID-19 pandemic, seeking and treating information with intentionality and care have never been more important.


1. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359, 1146-1151.

2. Pennycook, G., Bear, A., Collins, E. T., & Rand, D. G. (2020). The implied truth effect: Attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Management Science, 66(11), 4944-4957.

3. Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144, 993-1002.