Why the Typical Performance Review Is Overwhelmingly Biased
Even the assumption that managers accurately rate without bias reflects bias
Posted May 02, 2018
By David Rock and Beth Jones
The traditional performance review is a confidential, closed-door meeting between just two people. Research suggests it is also totally misguided.
Though we may think we’re making accurate, objective assessments during a performance review, the social and brain sciences have shown that bias is still baked into the brain.
Studies, for instance, have indicated that as much as 62% of a rater’s judgment of an employee is a reflection of the rater, not the person getting reviewed. Despite this, survey data from a recent summit we hosted on performance management indicated that 57% of companies weren’t doing anything to remove bias from their performance reviews. It’s no wonder companies that prize traditional reviews are quickly becoming dinosaurs.
Yet these evaluations are among the most important tasks a manager performs, and the consequences can make or break not only an employee's contributions to a team, but a career. Digging into the research, it's clear a smarter review process needs a less-biased approach: one based on crowdsourcing.
What the science says
The traditional review structure assumes that leaders who have tracked an employee’s behavior over a certain period of time are the best authorities to judge whether the employee has missed, achieved, or surpassed his or her goals. This assumption itself is biased.
It’s experience bias, or the all-too-human tendency to believe our own interpretations of the world constitute the whole, objective truth. In reality, people perceive the world differently from one another, and no one interpretation is objectively correct.
Experience biases can include the false consensus effect, in which we assume more people agree with our beliefs than is actually true; the blind-spot bias, in which we can pick out biases in other people but not in ourselves; and many others. All of them demonstrate that isolated experiences aren't enough to get at a full picture of the truth.
Biases like these get us into trouble for a couple of reasons. The first is that they operate outside conscious awareness, so it's difficult to address them on our own. The second is that they compel us to reject the beliefs of people who see things differently, since we conflate different with wrong.
How to conduct a less-biased review
To conduct smarter reviews, managers should solicit the perspectives of other people. Ideally, these will be people who don’t work in the same capacity as the leader or think along the same lines.
Bias is built into brain function, which means it can be hedged against, but not erased. Everybody has their own subjective biases, but by surveying across people, you can get closer to an objective, more rigorous version of the truth. A manager can look for patterns in feedback rather than relying on his or her own singular, biased view.
Granted, even collective feedback will have an element of bias in it, but if five of seven colleagues notice X about Tom, then it’s a reasonable bet that Tom should address X. It’s like seeking multiple opinions for a medical diagnosis, or how a reporter interviews a range of sources to get to the facts that matter for a story.
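The "five of seven colleagues" logic above can be sketched as a simple aggregation: collect each reviewer's observations, count how often each theme recurs, and surface only the themes a majority agree on. The reviewers, themes, and 50% cutoff below are illustrative assumptions, not anything prescribed by the article.

```python
from collections import Counter

def consensus_themes(responses, threshold=0.5):
    """Return themes noted by more than `threshold` of reviewers."""
    # Count each theme at most once per reviewer.
    counts = Counter(theme for r in responses for theme in set(r))
    cutoff = threshold * len(responses)
    return sorted(t for t, c in counts.items() if c > cutoff)

# Hypothetical feedback about one employee, one list per reviewer.
feedback = [
    ["misses deadlines", "mentors juniors"],
    ["misses deadlines", "clear communicator"],
    ["mentors juniors", "misses deadlines"],
    ["clear communicator"],
    ["misses deadlines", "mentors juniors"],
    ["misses deadlines"],
    ["mentors juniors"],
]

print(consensus_themes(feedback))
# → ['mentors juniors', 'misses deadlines']
```

Here "misses deadlines" (5 of 7) and "mentors juniors" (4 of 7) clear the majority bar, while a theme mentioned by only two reviewers does not; the manager then weighs the consensus patterns rather than any single rater's view.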
Leaders who crowdsource reviews can ask about a person’s value outside of the typical “key” metrics, in order to get a fuller, qualitative understanding of the roles the person plays. Perhaps a team member who missed his sales target actually uplifted several other people to hit their targets. A manager would miss this insight unless he or she asked around.
On its face, this may feel like a breach of privacy (aren't performance reviews supposed to be confidential?), but in fact, publicizing the data-gathering works in everyone's favor.
The long-term benefits of asking around
The most immediate benefit of crowdsourced reviews is that employees feel more comfortable knowing their good work will be seen. Having more advocates generally translates to greater praise. But people can also take solace in the entire team being held accountable in the long run. Asking around gives leaders extra intel that may reveal that seemingly competent employees are actually underperforming.
For crowdsourced reviews to be effective, it’s essential for managers to let their teams know prior to review cycles that others will be asked about performance. This creates a positive social pressure to do well, in addition to boosting transparency across an organization.
Beth Jones is Lead Consultant and David Rock is Director at the NeuroLeadership Institute.
This article originally appeared on Quartz.