Disaster Psychology

It Can't Happen to Me

Why people are complacent and so often unprepared when disasters strike.

Key points

  • The human brain is structured to turn past experience into habit.
  • Habit can reduce the ability to understand and plan for novel dangers.
  • Original thinking requires energy that habituation is designed to conserve.
Source: Fumiste Studios/Used with permission

Are humans simply unequipped to deal with large-scale catastrophes?

I started to wonder about this during a recent talk at NYU's King Juan Carlos Center, at which climate scientist Sonali McDermid and NYU cultural theorist Mary Louise Pratt described the unprecedented destructive power of recent wildfires in Los Angeles, California, and Fort McMurray, Alberta.

Citing the 2024 book Fire Weather, by John Vaillant, about the 2016 Fort McMurray wildfire—which burned with such vicious intensity that the heat reignited the fire's own smoke and reduced ceramic toilet bowls to heaps of ash—McDermid and Pratt speculated that humans might not be good at predicting, let alone dealing with, such large-scale catastrophes. They cited, in particular, the overall lack of concern in North American societies regarding global warming, which almost certainly contributed to the unprecedented destruction in L.A. and Alberta, and which is likely to fuel wildfires worldwide that are bigger and deadlier than any seen before.

They might be right. And one reason may simply have to do with how our brains work, because complacency about future catastrophe is essentially a form of habit: the sense that this thing, whether a wildfire, a car crash, or a flavor of ice cream, is something we have experienced before and have liked or disliked but at least survived. As long as the event bears some similarity to what we've already seen, we file it in a part of the brain that deals with familiar events. And thus repetition becomes habituation.

Structurally, it works like this: The brain's prefrontal cortex (PFC) processes a new event and relays the resulting information, in the form of neural signals, to a decision-making center called the corpus striatum. The striatum then decides what to do about it. If the event is repeated, however—especially if it becomes commonplace—the PFC-striatum channel, like a familiar and well-trodden footpath, becomes easier and easier for signals to travel, and so fewer and fewer neurons are required to activate it. The brain is always interested in conserving energy (a fancy way of saying that original thought is exhausting), so if a new event is similar enough to one we've seen before, it will assume we can deal with it the same way we dealt with the last one.
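The routing described above can be sketched as a toy program. This is purely illustrative, not a neuroscience simulation: the pathway "strength," the cost formula, and the similarity cutoff are all invented for the sketch. The point it demonstrates is the article's: once an event looks similar enough to a stored habit, it gets the cheap habitual route, and the cost of processing it keeps falling with each repetition.

```python
def similarity(a, b):
    """Crude similarity between two event-feature tuples, from 0.0 to 1.0."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

class HabituatingBrain:
    # Arbitrary cutoff for "similar enough to something we've seen before."
    SIMILARITY_THRESHOLD = 0.6

    def __init__(self):
        self.habits = {}  # stored event features -> pathway strength

    def process(self, event):
        # If a stored habit is similar enough, take the habitual route:
        # the pathway strengthens and the processing cost drops.
        for known, strength in self.habits.items():
            if similarity(event, known) >= self.SIMILARITY_THRESHOLD:
                self.habits[known] = strength + 1   # well-trodden footpath
                cost = 1.0 / (1 + strength)         # cheaper each repetition
                return ("habitual response", cost)
        # Novel event: full-cost deliberation, then lay down a new pathway.
        self.habits[event] = 1
        return ("deliberate response", 1.0)

brain = HabituatingBrain()
storm = ("hurricane", "atlantic", "category2")
print(brain.process(storm))   # novel event: full-cost deliberation
print(brain.process(storm))   # repeated event: habitual, cheaper
# A Category 3 storm looks "similar enough" and gets the habitual route
# too, even though it is actually a different, more dangerous event:
print(brain.process(("hurricane", "atlantic", "category3")))
```

The last line is the trap the article describes: the sketch's brain saves energy by reusing the old response on an event that only resembles the familiar one.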

I was exposed to this kind of habit-thinking while researching the loss of a huge American freighter, the SS El Faro, off the Bahamas in Hurricane Joaquin in 2015. El Faro's captain, Michael Davidson, was an experienced and competent professional, but he assumed Joaquin was an ordinary hurricane, something he was used to dealing with in this part of the world.

Davidson had also steered El Faro through bad storms off Alaska and figured she could weather Joaquin as she had the others. The same kind of weather and the same ship meant he could rely on the same strategies he'd always used. So he filed Joaquin in the "been there, done that" part of his brain, and went to sleep during the crucial period when he could have monitored what the storm was doing and had his ship take evasive action.

Because the sector of the southern North Atlantic Joaquin was crossing was unprecedentedly warm (again, most likely due to climate change), the storm grew unusually fast in power and speed. Hurricane Joaquin turned into a Category 3 storm that overtook El Faro off Samana Cays. The ship, Davidson, and 32 other crew disappeared on October 1, 2015.

What research energy has been devoted to understanding this kind of problem has come largely from systems engineering, focused in particular on preventing industrial accidents like the Fukushima nuclear disaster in Japan or the oil spill caused by a fire on BP's Deepwater Horizon rig in the Gulf of Mexico. The problem here, according to an article in the International Journal of Disaster Risk Reduction, is one of impaired "risk perception." This is another way of describing habit-based behavior, in that perceived risk declines with the sense that one already knows how to deal with a given kind of problem, resulting, as one trade journal puts it, in "self-satisfaction which may result in non-vigilance based on an unjustified assumption of satisfactory system state."

Whether it's called "habit" or "unjustified assumption," and whether it comes down to complacency, faulty "risk perception," or, as the National Transportation Safety Board inquiry into El Faro's sinking concluded, a lack of "situational awareness," the problem is as old as the first proto-humans who faced forest fires, floods, or blizzards: how do you keep an open mind about what's happening when your brain is telling you that you've seen it all dozens of times before, so why not save the energy you're about to spend on new and original thought?

References

Foy, George Michelsen (2018). Run the Storm: A Savage Hurricane, a Brave Crew, and the Wreck of the SS El Faro. New York: Scribner/Simon & Schuster.

Vaillant, John (2024). Fire Weather. New York: Viking Press.
