Journalists Hannah Allam and Mark Seibel have published an analysis of the evidence that Syria used chemical weapons, and they have been taking their case to other media outlets. I have no particular opinion as to the truth of the claim that Syria used chemical weapons on its people, in part because I don't particularly care (and my reasons for not caring are well expressed in a recent Oatmeal comic). However, as a psychologist, I think Allam and Seibel's analysis deserves serious attention, because there are psychological phenomena highly relevant to their conclusions that should be considered by anyone trying to make sense of the situation. To summarize Allam and Seibel's argument (all my words):

John Kerry told the American people, in effect: "A) US intelligence detected preparations for an attack, B) we then detected missiles being fired, C) we then had evidence of the attack as it was happening." He also claimed that the evidence was incontrovertible. However, much of the evidence is shaky, and the timeline is startling: the US has a history of warning its allies when it thought chemical attacks were imminent, and no such warning was issued in this case. In contrast to what you might conclude from Kerry's presentation, it appears that what really happened was: "A) We were pretty sure they had used chemical weapons, and thought it was sarin gas, B) we went back and saw some evidence that might have been a relevant missile launch, C) we went back further and saw some evidence that might have been preparation for a chemical attack."

While it is tempting to call this "hindsight bias," that's not what is happening. Hindsight bias occurs when you revise your beliefs in light of new facts and assert that your new belief was your old belief. For example, if you did not think that Obama would win the election, but after the fact thought (or told people) that you knew he would win all along, that would be hindsight bias (see Fischhoff, 2007). What might be happening here is a related phenomenon called "confirmation bias." Confirmation bias occurs when you hold a belief and then, while investigating it, pay much more attention to confirmatory information than to contradictory information (Wason, 1959).

While all sources seem to agree that many people died in the attack in question, and most sources seem to agree that chemical weapons of some sort were used, the more detailed claims of the US intelligence agencies are very much in question, and they should be. If Allam and Seibel are correct about the timeline, which they probably are, then the investigators looking for evidence of a sarin attack almost certainly suffered from confirmation bias. Their task, apparently, was to dig through past data looking for evidence to confirm that an attack had occurred. Given such a goal, investigators are not only likely to ignore evidence against the desired conclusion, they are also quite likely to experience as evidence information that they would not have counted as evidence had they been given a more neutral task.
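To make that asymmetry concrete, here is a toy simulation (my own illustration, not anything from Allam and Seibel's reporting, and the thresholds are arbitrary assumptions): two reviewers score the same stream of pure-noise "signals." One applies symmetric standards of evidence; the other accepts weak supporting signals while demanding very strong contradicting ones. The biased reviewer ends up with a lopsided pile of "confirmation" even though the data contain no information at all.

```python
import random

def evidence_review(signals, threshold_for, threshold_against):
    """Count how many signals a reviewer scores as supporting or
    contradicting a hypothesis, given (possibly asymmetric) thresholds."""
    pro = sum(1 for s in signals if s > threshold_for)
    con = sum(1 for s in signals if s < threshold_against)
    return pro, con

random.seed(42)
# Pure noise centered on zero: these "signals" carry no real information.
noise = [random.gauss(0, 1) for _ in range(1000)]

# A neutral reviewer applies the same standard in both directions...
neutral_pro, neutral_con = evidence_review(noise, 1.0, -1.0)

# ...while a confirmation-seeking reviewer accepts weak supporting
# signals (> 0.3) but demands overwhelming contradiction (< -2.0).
biased_pro, biased_con = evidence_review(noise, 0.3, -2.0)

print("neutral:", neutral_pro, "for vs", neutral_con, "against")
print("biased: ", biased_pro, "for vs", biased_con, "against")
```

The point of the sketch is only that the lopsided tally comes entirely from the asymmetric standard of evidence, not from the data; a neutral task produces roughly balanced counts from the same signals.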

As I said at the beginning, this does not mean that the US is wrong, but it does mean that critics are quite justified in treating the intelligence reports with suspicion. A much higher level of scrutiny is required if this evidence was found in hindsight than if the events had been predicted, or confirmed in real time. Personally, I think we should make a decision by examining the overall actions of the regime and the rebels, and I do not feel informed enough for my personal opinion to hold weight. However, if you are someone who agrees with the current US attitude that chemical weapons are a "red line" that must not be crossed, hopefully this analysis will help you examine the evidence more critically.


Allam, H., & Seibel, M. (September 2, 2013). Retrieved from:

Fischhoff, B. (2007). An early history of hindsight research. Social Cognition, 25, 10-13.

Wason, P. C. (1959). The processing of positive and negative information. Quarterly Journal of Experimental Psychology, 11, 92-107.

About the Author

Eric Charles, Ph.D., runs the research lab at CTRL, the Center for Teaching, Research, and Learning, at American University.

You are reading

Fixing Psychology
