Drone Policy: Reducing the Human Cost

Social psychology can help us understand and address our drone policy problems.

Posted Oct 04, 2016

Source: Naval Surface Warriors/Flickr

If you had a 10% success rate at work for five months, how much longer do you think your employer would keep you around? Most of us wouldn’t be allowed to fail at that rate for five months, let alone receive additional time to make up for those shortcomings. And we generally hold those who are responsible for human life, such as medical professionals and law enforcement officials, to even higher standards of accuracy in their work.

Yet, as revealed in The Drone Papers last year, America’s drone warfare programs have at times operated at this low level of success. During one five-month period between January 2012 and February 2013, almost 90% of those killed in American drone strikes were not the intended targets. This amount of collateral damage – of innocent lives lost – should not be acceptable. Nevertheless, neither substantive changes to drone programs nor individual punishments for lethal errors have occurred in the wake of these reports.

There are many factors that shape America’s drone policy, most notably political and economic concerns, but a social psychological perspective offers another way to understand and potentially address some of the failures of drone policy. For instance, considering two psychological phenomena, the bystander effect and moral disengagement, and how they may function within our drone bureaucracies reveals ways that policymakers can improve the current infrastructure and save lives.

The Bystander Effect

The bystander effect, which has been defined as “the phenomenon that an individual’s likelihood of helping decreases when passive bystanders are present in a critical situation,” has been studied in laboratory and naturalistic settings for over 40 years (see Fischer et al., 2011 for an examination of over 50 studies). Three psychological processes have been highlighted as the main contributors to this effect: diffusion of responsibility, evaluation apprehension, and pluralistic ignorance. Fischer et al. (2011) defined each of these terms as follows:

Diffusion of Responsibility - “the tendency to subjectively divide the personal responsibility to help by the number of bystanders. The more bystanders there are, the less personal responsibility any individual bystander will feel.”

Evaluation Apprehension - “the fear of being judged by others when acting publicly. In other words, individuals fear to make mistakes or act inadequately when they feel observed, which makes them more reluctant to intervene in critical situations.”

Pluralistic Ignorance - “the tendency to rely on the overt reactions of others when defining an ambiguous situation. A maximum bystander effect occurs when no one intervenes because everyone believes that no one else perceives an emergency.”

Essentially, these findings can be boiled down to this: in a critical situation, we are less likely to assist a person in need when we are uncertain about the appropriate action to take, are afraid of making a mistake or looking foolish, and are witnessing others who are not helping.

Much of the literature on the bystander effect focuses on how individuals react in everyday situations, and a litany of YouTube videos demonstrates how we can pass by strangers in public who need assistance. However, much less attention has been paid to the impact the bystander effect may have on everyday operations within organizations like the military.

Bystander Effects in Drone Bureaucracies

While it is important to note that the effects of diffusion of responsibility, evaluation apprehension, and pluralistic ignorance have not been empirically studied within drone operation units, what we know about how these psychological processes inhibit people from taking action should make us concerned about the depersonalized and dehumanizing components of drone warfare.

Even before the Drone Papers, investigations of the situational and psychological pressures affecting teams of drone operators found that “it is most appropriate to approach [drone warfare] as a form of killing that has an elaborate and intentional bureaucratized structure” (Asaro, 2013). Such bureaucratic chains of command literally diffuse responsibility throughout the social system and away from any individual operator, making it difficult to hold any one person responsible for a mistake.

For instance, when the LA Times covered the unreleased Pentagon review of a friendly-fire drone incident that resulted in the deaths of two enlisted service members in Afghanistan, it found that no individuals involved were held culpable for the killings. Instead, the report concluded that the incident resulted from a “fatal mix of poor communications, faulty assumptions and ‘a lack of overall common situational awareness.’”

Furthermore, accounts from former drone operators attest to the difficulties in trying to report concerns about fellow operators or mission objectives to their superiors. Thus, it seems likely that evaluation apprehension, in the form of anxiety or reluctance to voice alternative plans of action, factored into these friendly-fire deaths.

Feeling unable to voice dissent likely contributed to the pluralistic ignorance displayed in this incident as well, for the poor communication from those involved in these missions probably arose because many of them “believed that they had come to the same conclusions” despite having reservations about the selected targets (Asaro, 2013).

These three psychological processes seem to have contributed to bystander effects that allowed drone teams to make fatal mistakes, but there are additional psychological factors contributing to unintended deaths in drone operations.

Moral Disengagement and Euphemistic Language

Preeminent psychology researcher Dr. Albert Bandura notes that the aforementioned psychological processes are far from the only forms of moral disengagement discernible in military operations. For Bandura, moral disengagement encompasses the personal, behavioral, and environmental processes by which individuals enable themselves to violate their own moral norms while still feeling okay about doing so.

For example, most of us would feel guilt or remorse if we stole money out of someone’s hand. But it is a lot easier to justify pirating music online from “file sharing” sites because we can more readily convince ourselves that nobody is really being harmed by our actions. The depersonalized nature of online interactions, the abstract nature of the victim, and many other factors contribute to why the latter scenario seems more morally ambiguous than the personal and concrete nature of snatching a purse or wallet.

Another mechanism of moral disengagement found in both civilian and military contexts is the use of euphemistic language. In his new book, “Moral Disengagement: How People Do Harm and Live with Themselves,” Bandura examines the literature on euphemistic language and how it is often used to “depersonalize doers from harmful activities” (2016). Research has shown that the acceptability of certain actions is influenced by what those actions are called, and using sanitized, agent-less, and technical terms instead of clear-cut, agentic, and common phrases enables us to do things we may not be comfortable with otherwise. “File sharing” seems more justifiable than stealing in the same way that “servicing the target,” “visiting a site,” and “coercive diplomacy” seem more justifiable than bombing.

The Euphemistic Language of Drone Warfare

In addition to his concerns about the depersonalized and abstract nature of drone operations, Bandura worries that using agent-less jargon like “unmanned aerial vehicles” contributes to the lack of individual accountability reported in drone operations. However, the more troubling use of euphemistic language in drone operations, revealed in The Drone Papers, comes from the way targets on the ground are classified by military officials thousands of miles away:

“The documents show that the military designated people it killed in targeted strikes as EKIA — 'enemy killed in action' — even if they were not the intended targets of the strike. Unless evidence posthumously emerged to prove the males killed were not terrorists or 'unlawful enemy combatants,' EKIA remained their designation, according to the source. That process, he said, 'is insane. But we’ve made ourselves comfortable with that. The intelligence community, JSOC, the CIA, and everybody that helps support and prop up these programs, they’re comfortable with that idea.'”

From a moral and ethical standpoint, classifying potentially innocent victims as “enemies” by default is reprehensible; however, from an operational standpoint it makes perfect sense. Operators would be much more hesitant to pull the trigger if they were completely aware of how often they had killed innocent civilians during their missions. Bandura and others note the stress drone operators report despite being removed from the front lines, and that “Having to turn one’s morality off and on, day in and day out, between lethal air strikes and prosocial home life makes it difficult to maintain a sense of moral integrity” (2016, p. 67). Officials in the intelligence community may justify their use of these designations as a way to protect the already strained psyches of their drone teams.

Feedback Loop of Moral Disengagement

An unintended consequence of these euphemisms is the implicit message that’s been conveyed down the chain of command: collateral damage isn’t a concern. Former drone operator and instructor Michael Haas claims that he was punished for failing a student on a training mission in which the student insisted his targets were suspicious despite having no evidence to back up the judgment:

“Short on operators, his superiors asked him to explain his decision. 'I don’t want a person in that seat with that mentality with their hands on those triggers,' Haas says. 'It’s a dangerous scenario for everyone to have someone with that bloodlust.' But the student’s detached outlook wasn’t as important as training new recruits. Haas was ultimately punished for failing the student and barred from teaching for 10 days.”

On some level Haas’ superiors surely want to limit the number of civilians killed in their attacks, but the euphemistic language by which their policy objectives are gauged allows them to dismiss Haas’ concerns and carry on with training a potentially dangerous recruit. When collateral damage is a hidden statistic, there’s no reason to be concerned when looking at the stat sheet: sanitizing the language of war continually enables fatal mistakes to be overlooked or go unpunished.

It is problematic that individuals within drone bureaucracies morally disengage while on the job and thereby maintain the policy status quo, but the bigger problem is that the policies and systems of drone warfare internally manufacture these kinds of moral disengagement where they might not otherwise arise.

Simple Suggestions for Policy Improvements

Tackling these failures of drone policy seems like a daunting task, but psychological insights suggest that the solutions may be relatively simple.

In a wonderful TEDx talk, Dr. Ken Brown shares something important that many people miss when they talk about the bystander effect: while the largest bystander effects are seen when participants are encouraged to remain passive, instructing one bystander to step into action can completely reverse the observed bystander effect. The data show that when one person actively helps, others are more likely to step in and help as well.

Organizations like the Heroic Imagination Project are already educating individuals about the social and psychological skills required to assess critical situations and take action when needed. Such programs encourage students of all ages to be “everyday heroes,” to lead by example and help others even when helping is difficult or costly.

Although teaching drone teams to recognize and address moral disengagement and bystander effects would be a good start, rooting out these problems will require institutional changes to drone bureaucracies that enable teams and individuals to speak out when warranted. For example, creating team positions focused on anonymously collecting and disseminating strategic concerns throughout an operation would allow drone operators to voice hesitations and questions without fearing punishment. Policymakers will need to figure out how to implement these kinds of changes within the traditional hierarchies of defense and intelligence organizations.

Additionally, personalized and literal language needs to replace the current agent-less and euphemistic verbiage of drone policies. This is only one tactic for instilling a greater sense of personal and moral responsibility for mission outcomes, a sense that is surely lacking given the preceding accounts. Combating moral disengagement within drone units, from inside these teams and from outside policymakers, requires honesty about civilian casualties and a shared language of responsibility and agency.

Summary

Given the political and economic advantages of drone warfare, it is likely to be a part of the American arsenal well beyond our lifetimes. It is obviously preferable to keep our soldiers out of harm’s way when we can, but we must be willing to acknowledge and contend with the weaknesses of this form of killing. War, by definition, reduces humanity, but in war we must aim to retain as much of our humanity as possible.

As Albert Bandura explains, “To function humanely, societies must establish social systems that uphold compassion and curb cruelty. Regardless of whether social practices are carried out individually, organizationally, or institutionally, it should be made difficult for people to delete humanity from their actions” (2016).

Systemic changes are needed within drone infrastructures to embolden and protect individuals brave enough to dissent in the face of immense institutional pressure to remain passive. Policymakers need to openly address the psychological challenges and needs of drone operating teams in order to reduce the amount of collateral damage within control rooms and on the battlefield. 

Versions of this article originally appeared on The Decision Lab.


The Decision Lab is a non-profit aimed at promoting discourse around how decision science can positively impact business, tech and policy. 
