We depend on government agencies to manage emergency responses to incidents such as major fires, earthquakes, tornadoes, terrorist attacks, and the like. No single agency can cover it all. Emergency services need to draw on law enforcement, firefighters, medical resources, communications and other technology specialists, and possibly national guard units. Even within each of these communities we find sub-groups; a large-scale incident may call upon police, fire, and medical resources from several different municipalities.

Therefore, one problem is to ensure interoperability: the capacity for different agencies to communicate and coordinate with each other. In the natural course of events, each specialty and agency formulates its own tactics, acquires its own types of equipment, and tailors its strategies to the most common types of problems it faces, not to the (thankfully) rare events requiring multi-agency coordination.

To counter this balkanization, governments try to impose standard procedures for how the different agencies are to talk to each other and coordinate during a large-scale emergency. The confusion and time pressure of emergencies make it difficult to rely on just-in-time adaptations. You want the procedures worked out in advance. As Baber and McMaster (2017) state, “The interoperability SOPs [Standard Operating Procedures] are partly seen as necessary to discourage improvisation…” (p. 172, italics in the original).

Unfortunately, as Baber and McMaster point out, large-scale emergencies often require the agencies to innovate and improvise: create new command roles, new tactics, and complex coordination arrangements.

The procedural mindset encourages agencies to train and exercise together, so that they can master the SOPs. However, actual emergencies often bring in new organizations and new and unfamiliar people.

Thus, the procedural mindset can result in a false sense of security. Organizations may believe that all they need to do is ensure that everyone follows the rules. In reality, complex situations can’t be handled by rules that are established in advance.

For example, the Department of Homeland Security has set up communications protocols for ensuring interoperability during emergencies. The protocols seem reasonable at first glance, but Baber and McMaster question some of the major assumptions:

The protocols kick in when there is a major incident, but it may not be obvious from the outset that the agencies are facing a major incident.

The protocols depend on a shared understanding of the incident, but there is a good chance that the different agencies will see the incident in different ways.

The protocols depend on having the agencies agree on what information needs to be exchanged, but in most major incidents each agency has its own beliefs about what counts as critical information.

The protocols depend heavily on digital communications, but most major incidents rely on voice communications.

Emergency services organizations are aware of the possibility — the likelihood — that voice channels will get overloaded during an emergency, and so they have contingency plans. Thus, the City of London has an access overload control plan to restrict extraneous communications via cell phones. However, in one actual crisis the police unilaterally triggered this plan without anticipating how it would affect other emergency service providers. As a result, the plan created more confusion than it resolved, and it made it much harder for the police decision makers to communicate with responders on the scene.

Perhaps the issue here is that a procedural mindset is no match for a chaotic and unpredictable event. And the training and preparation can mask the need to improvise and adapt once the incident begins.

One of the clichés in Command and Control is that the plan means little — what counts is the practice of planning. This cliché gets repeated so often because it is quickly forgotten. Thus, the lessons learned from one incident, or even from a practice exercise, are captured as new SOPs and the debrief emphasizes what procedures would have helped, rather than what each agency learned about the operation of other agencies.

Organizations may be better served by training to handle unexpected glitches and failures that render the SOPs obsolete. Following the notions of resilience engineering (Hollnagel, Woods, & Leveson, 2006), building resilience (rapid adaptability) matters more than constructing more and more SOPs. Organizations may also benefit from reviewing the historical records of previous emergencies, and from running multi-agency exercises that require adapting the SOPs or even abandoning them. That way they can learn interdependencies, not procedures. The essence of coordination is predictability, which means having each agency gain a richer mental model of how the other agencies operate: understanding why another agency would choose one approach rather than another, the constraints it is facing, the priorities it is pursuing, and the types of information it possesses.

One reason to set up procedures is to spare each agency from having to learn too much about the others, but interdependency and multi-agency adaptation depend on this kind of learning.

Of course, some amount of proceduralization is necessary. We don’t want to re-invent all the protocols on the spot.

So — procedures are necessary, but they aren’t sufficient. Organizations shouldn’t fool themselves into thinking that with enough procedures they’ll be OK.

I think that the important question is how to determine how much to proceduralize. Which types of activities should organizations proceduralize, and when should they try to build more expertise and richer mental models? The answer probably is that it depends — it depends on the complexity of the situational challenges and how that context will alter the procedures. It depends on the stability of the situation. And it depends on the quality and the turnover rates of the decision makers.

References

Baber, C., & McMaster, R. (2017). Grasping the moment: Sensemaking in response to routine incidents and major emergencies. CRC Press. 

Hollnagel, E., Woods, D.D., & Leveson, N. (2006). Resilience engineering: Concepts and precepts.  CRC Press. 
