


Do You Really Know What You Think You Know?

Do you trust yourself too much?

Key points

  • We are surprisingly likely to jump to wrong conclusions, often without ever realising it.
  • Research shows that we often fail to consider whether there is anything we don’t know when assessing situations.
  • It is helpful to get into the habit of questioning whether we really have all the facts.

Do you ever confidently arrive at a conclusion based on inadequate information? This was the question I ended up asking myself in light of some interesting research findings, which I am about to share. I definitely think of myself as someone who wouldn’t do that. I would carefully weigh up all sides of a situation before deciding that something is right or wrong, or that one thing or another is the cause. After all, this is the very thing I spend much time helping others do in therapy.

And then I read the research findings and was not so very confident after all.

As social psychologist Hunter Gehlbach, from Johns Hopkins University, and his co-researchers put it in their recent paper, “People fail to account for the unknown unknowns. Accordingly, they navigate their social worlds confidently assuming that they possess adequate information … often without pausing to wonder how much they do not know.

“For example, many drivers have pulled up behind a first car at a stop sign only to get annoyed when that car fails to proceed when traffic lulls at the intersection. Drivers of these second cars may assume they possess ample information to justify honking. Yet, as soon as a mother pushing her stroller across the intersection emerges from beyond their field of vision, it becomes clear that they lacked crucial information which the first driver possessed.”1


Can you put your hand on your heart and be sure that you have never tutted at someone at the head of a long supermarket queue who seems to be taking an unnecessary length of time packing their plentiful purchases instead of getting on with paying? Or fumed at a customer standing chatting to the cashier, who in turn seems to have given up scanning items in favour of giving the customer’s story full attention, both oblivious to the frustration of the people behind – only to discover that they are waiting for another assistant to return with a replacement for a faulty item?

I have certainly done it and, while this may not be that important in the big scheme of things, it signifies a tendency which, according to Gehlbach and colleagues, can have significant consequences.

In their study, they gave 1,261 people a short piece to read, entitled “Our school water is disappearing”. It described how the source of water local to a particular school was drying up and the school governors had to decide whether to bank on more rainfall soon or merge with a school in a better supplied area.

A third of participants saw arguments for both staying and merging; a third saw only arguments for staying; and the final third saw only arguments for merging. All were asked afterwards whether they felt they had enough information on which to base a decision and whether they were confident in it. Then half of the participants in the latter two groups were shown the information they had not yet seen, giving the other side of the argument, and were asked if they wanted to change their decisions.

Now, here’s the thing. The majority of participants felt that they had had enough information, regardless of how much they had seen. And, even when shown the extra information, most opted to stay with their first decision.

The researchers, surprised that more did not change their minds, make the case for a little more humility: “Teaching individuals to pause and question how much information they know that they know about a situation – and, importantly, how much they might not know – could be a helpful disposition to cultivate.”

Jumping to conclusions is especially a risk if the information we see or read only ever gives one side of the story – websites or newspapers, for instance, which have a particular political stance, or influencers who promote certain views.

This finding strongly validates the more rounded perspective that human givens therapists are taught to take, when working with clients convinced that their partner is uncaring (“He forgot about our anniversary – it means nothing to him”); that their boss has behaved unfairly (“She always gives the best jobs to my colleague because she likes her better”); that their child’s behaviour is inexcusable (“It is pure rudeness. He just shouted me down and banged the door”), etc.

“I wonder if there is another way of looking at this,” we might say, and encourage the person to come up with some different possible reasons for the offending party’s behaviour. At first, the offerings are likely to be negative: “He thinks he has more important things to do”; “She thinks my colleague is a better worker”; “He doesn’t care about my feelings”. And then other possibilities slowly surface: “Maybe he is super-worried about his mum right now and I haven’t recognised how much it is affecting him”; “Perhaps my boss thinks my colleague needs the easier jobs, because she is less experienced than I am”; “Maybe he felt defensive when he confessed what he had done at school and I immediately got angry instead of listening to why…”

In the future, I am going to remember to do a lot more of this myself, outside the therapy room.

References

1. Gehlbach, H., Robinson, C. D., & Fletcher, A. (2024). The illusion of information adequacy. PLOS ONE, 19(10), e0310216. https://doi.org/10.1371/journal.pone.0310216

More from Denise Winn