Crime and Punishment

Knowing what punishment to mete out is no simple feat. Some forms of punishment may send the wrong message. For example, is it okay to solve problems with more violence? Here are some thoughts on this tough topic.

Tunnel Vision in the Criminal Justice System

Do cognitive biases increase wrongful convictions?

In 1982, a man named Marvin Anderson was convicted of robbery, forcible sodomy, abduction, and the rape of a woman in Virginia, despite very weak evidence supporting the prosecution's case, questionable eyewitness identification, and four alibi witnesses who testified to seeing him elsewhere, nowhere near the crime scene. Twenty years later, DNA evidence conclusively proved that Anderson was innocent, and pointed to the true offender, Otis Lincoln; much evidence available during Anderson's trial also indicated Lincoln was the likely attacker, but this was never investigated after Anderson was chosen as the main suspect. Even after Lincoln confessed, the judge who presided over Anderson's trial refused to credit Lincoln's confession (finding it false), and Anderson served out his prison term and parole until DNA testing conclusively identified Lincoln as the attacker.

This appalling miscarriage of justice is recounted in a book chapter titled "Tunnel Vision" by University of Wisconsin Law School professor Keith A. Findley, also the co-director of the Wisconsin Innocence Project and president of the Innocence Network. The chapter is forthcoming in the book Conviction of the Innocent: Lessons from Psychological Research (edited by B. Cutler; APA Press), and is based on an earlier and more detailed law review article written with his UW colleague Michael S. Scott. (No, not the guy from The Office—sit down, Dwight.)

Findley uses this case as an example of tunnel vision in the criminal justice system, identifying mistakes made at each stage of the process that can be traced to common cognitive biases. In Findley's words, "tunnel vision is the product of a variety of cognitive distortions, such as confirmation bias, hindsight bias, and outcome bias, which can impede accuracy in what we perceive and in how we interpret what we perceive" ("Tunnel Vision," p. 6).

Findley discusses many such biases in his chapter, but I will focus on two (both of which have been discussed often on other Psychology Today blogs). Confirmation bias describes the natural human tendency to interpret new information in a way that confirms our pre-existing beliefs, to remember previous events in a way that confirms those beliefs, and to discount or discard information that challenges them (also a part of belief perseverance). Confirmation bias led the various actors in the criminal justice system (from the investigating officers and detectives to the prosecutors to the trial judge), once they had focused on Anderson as the key suspect, to exaggerate the relevance of evidence supporting his guilt, and to downplay contradictory evidence supporting his innocence (namely, that which pointed away from Anderson and toward Lincoln).

Working hand-in-hand with confirmation bias, hindsight bias describes another natural tendency to regard a past event as inevitable, or at least much more likely than originally thought, after it is confirmed by later information. Otherwise known as the "knew-it-all-along" effect, it stems from the way we construct our memories of events, using all of the information gathered since the original occurrence to arrive at a much more definite causal chain of events than is objectively warranted. Naturally, we want to make sense of things, but that drive may lead us to reach conclusions too hastily and stick to them too resolutely.

So once a suspect like Anderson becomes the focus of an investigation, and evidence is gathered (and interpreted) to confirm that focus, in hindsight that decision will be regarded as inevitable and correct. Also, an eyewitness may only vaguely remember who she saw at the crime scene, but after her identification of a suspect like Anderson is confirmed, she'll later think she remembered him better than she actually did at the time. Finally, the longer and more frequently that the authorities reaffirm Anderson's guilt, the more confident they become in their original judgment, to the extent that even presented with DNA evidence and a confession by another suspect, Anderson's judge was reluctant to set him free, since he'd been regarded as guilty for so long.

In addition to these natural human tendencies, there are also institutional factors within the criminal justice system which reinforce these effects, many of which can be attributed to a mismatch between the goals of the system itself and the incentives given to persons within that system to help reach those goals. The goal of the criminal justice system is, most generally, justice itself: to apprehend, convict, and punish the guilty and not the innocent. The problem is that guilt and innocence are never known with certainty (except to the suspects themselves), so the police, prosecutors, judges and juries have to make the best decisions with the information they have.

Also, as human beings, we can't count on all of these actors to pursue the cause of justice singlemindedly, so the system provides them with incentives, such as rewarding prosecutors according to conviction rates. But now we're back to the original information problem: justice is not measured by conviction rates per se, but by how many of the guilty—and how few of the innocent—are convicted. Since we can never know this (at least not at the time of trial), conviction rates (and similar measures throughout the system) may be the best incentive possible given the information available.

But therein lies the problem: if prosecutors are focused solely on convicting the defendant on trial, then they may be blind to any new information that frustrates that goal. The biases contributing to tunnel vision compound this problem, so even the most ethical prosecutor may unconsciously discount or dismiss information suggesting that the defendant may not be guilty, leading to what some call "conviction psychology" (see Findley and Scott's law review article, p. 328). But if there is no way to know if criminal defendants are truly guilty or innocent—that's why we have jury trials, after all—there's no way to reward prosecutors (and other actors in the process) for achieving justice other than to use an inaccurate proxy for it.

This brings up an important point that Findley is careful to emphasize: nothing described here should be taken to indict anybody in the criminal justice system as unethical or irresponsible, since these cognitive biases are natural tendencies that all of us experience in our day-to-day lives. One may suggest that what needs to be done is to educate people in these positions regarding the existence and pervasiveness of these biases, but studies on the effect of education on the incidence of such biases are not encouraging. Certainly, if a detective knows that his thought processes in a particular instance are being swayed by, say, confirmation bias, and he deliberately chooses to ignore it, that's unethical. But we can't expect everyone in the criminal justice system, no matter how well-intentioned and informed, to be able to eradicate all effects of these biases.

If we can't design better reward mechanisms for the various actors in the system, and we can't eliminate the natural biases in our thinking, how do we deal with these problems that lead to wrongful convictions? Findley suggests fostering increased transparency at each level in the process, relying on studies that suggest people submit to their biases less often when they know they are being observed, and reforming institutional factors that reinforce tunnel vision (to the extent this is possible). Another answer—and it isn't a popular one—is to rely on lawyers for the defense to counter the biases from the police, prosecutors, and judges.

Defense lawyers get an incredibly bad rap in the press and popular media. Many people think they exist simply to "get the bad guys off," to craft clever arguments to get drug pushers and murderers acquitted, and to introduce outlandish alternative explanations of a crime to plant that invaluable "reasonable doubt" in the minds of the jurors (what they called "Plan B" on the television show "The Practice"). But a better way to think about the role of defense lawyers is that they ensure the prosecution does its job right, by forcing prosecutors to make the best case they can, and also by making juries consider all the evidence that suggests their clients are not guilty (including evidence that another person is).

In other words, defense lawyers are the watchdogs of the criminal justice system—"institutionalized whistle-blowers," if you will. If the prosecution has a rock-solid case (and presents it well), no amount of clever maneuvering by a defense lawyer is going to save his or her client. (In fact, such cases will rarely reach trial due to pretrial plea bargaining.) But if the prosecution does not have such an open-and-shut case, then the facts leave the defendant's guilt significantly in doubt, and it is the defense lawyer's responsibility—both to his client and to the criminal justice system as a whole—to push that prosecutor to make the best case she can. If the prosecutor secures a conviction under the watchful eye of the defense lawyer, that conviction will be more likely to stand up on appeal—thanks, ironically, to that defense lawyer.

In regard to tunnel vision, one thing the defense lawyer can do is point out the ways that cognitive biases influenced the case against his client. If the biases cannot be eliminated themselves, then at least their effects can be pointed out during the trial and can be negated when appropriate. I'm not claiming that more actively diligent defense lawyers are a magic bullet—in their law review article (pp. 331-3), Findley and Scott also discuss instances of tunnel vision on the part of defense counsel that work against their clients—but it would seem that defense lawyers have a more direct incentive to combat these biases, given their specific responsibility to their clients.

Finally, none of what I've discussed here should be taken as being "soft" on criminal justice or the fight against crime. Wrongly convicting and punishing innocent persons has nothing to do with being "tough on crime." In fact, wrongful convictions may actually encourage crime: if innocent defendants are convicted more frequently, a potential criminal may think he's more likely to go to jail whether he breaks the law or not, reducing his incentives to stick to the straight and narrow. The criminal justice system should be designed and implemented to maximize the likelihood of punishing the guilty while minimizing the chance of punishing the innocent. This cannot be done simply by convicting more or fewer defendants, but only by convicting with more certainty—something that, as Findley and Scott point out, is crucially compromised by the effects of tunnel vision.

