Want People to Behave Better? Give Them More Privacy

Workers often do their best when bosses don't monitor their every move

Would you tell your six-year-old there's no Santa Claus, and then describe exactly how you bought and wrapped all the Christmas gifts? If your spouse asked who was on the phone, would you say "it's your best friend about the surprise party we're planning for your birthday"? Would you tell your boss about your plans to test your new idea, with its 50-50 odds of success, at work next Monday?

If you answer "no" to these questions, then you have a healthy and normal appreciation for the value of hiding some facts, from some people, some of the time. It's part of life as an autonomous adult to create these hidden pockets of information, which give people room to try new things, to make difficult decisions, to protect others or to make them happy. There are many good, even noble, reasons for this kind of version-control. When you protect the boss from blame for your gamble, it's a form of loyalty. When you don't disclose where the Christmas presents come from, it's a form of loving care.

So why do people often throw this instinct out the window when they think about companies, governments and other institutions? Individuals who protect their personal privacy nonetheless assume that in groups of people, total openness must be the most efficient, effective and only morally right way to operate.

As a consequence, even privacy's defenders often concede that it's a kind of sand in the gears—gears that would turn better if nothing were hidden. We only accept the sand, they say, to preserve other things we value. We could catch terrorists if the government spied on all communications, but you don't want cops reading your email. You'd get more out of workers if you tracked their every move 24/7, but you don't want companies snooping on you in the bathroom. "Commercial aviation would be safer if we were all required to fly stark naked," the journalist Nicholas Kristof wrote some years ago, "but we accept trade-offs — such as clothing — and thus some small risk."

This kind of argument is so familiar that it comes as a shock to wonder whether it's true. Yet these are exactly the unquestioned assumptions that need examining as privacy comes under mounting pressure from increasingly powerful technology. It is well worth asking: Must a government or business lose competence when it gives people some power to decide what is known about them, and who knows it?

Ethan S. Bernstein, a Harvard Business School professor, expected to find that the answer is yes. He had, as he wrote a few years ago, accepted what he calls "the gospel of transparency" in business. But his research on companies in China, the U.S. and elsewhere changed his mind. He found, in fact, that giving workers a certain measure of privacy led them to perform better than their more heavily monitored peers.

There's a natural state of heightened attention to the self when we know we're being watched, Bernstein notes. "Our practiced responses become better," he told me, "our unpracticed responses become worse." So actions that have been drilled by the boss may well turn out better when everyone believes the boss is watching. On the other hand, for behavior that isn't already learned—where the best response needs unselfconscious focus on the problem, and the chance to try something new without fear—being watched makes things harder. Attention that could have gone to one's actions goes, instead, to managing the appearance of one's actions.

The "gospel of transparency" declares that this is not a problem, because workers should stick to management's script. But in one vast Chinese factory that Bernstein studied, workers who craftily deviated from standard procedure often improved the plant's productivity.

For example, after Bernstein embedded Chinese-speaking Harvard undergrads as workers in the factory, the students soon discovered something surprising about behavior that looked to outsiders like employees just chatting and fooling around. Within each team, workers were quietly training others to do their jobs. That way, the team could keep things moving on a day when someone was missing or falling behind.

Yet the workers were determined to keep their actions secret. In the plant, where every worker was visible to anyone, and whiteboards and computer terminals displayed data about how they were doing, experienced employees quickly taught Bernstein's undercover researchers how to hide their deviance whenever an outsider swung into view. It was a triumph of craftiness, he recalls. "They weren't necessarily hiding specifically from managers. They were also hiding from their peers as well."

Why wouldn't the workers want management to know they were, in effect, helping the company? The answer is obvious to anyone who has ever been an employee. Their innovations weren't in their job descriptions. When they were being watched, they had to play to their audience.

Workers were well aware of this paradox, and they didn't like it. One of Bernstein's observers overheard a worker say the team would do better if they could be hidden from the rest of the factory.

Bernstein decided to test that idea. He got the factory management to curtain off—literally, with a curtain—four of 32 production lines that made mobile data cards. Over the next five months, those curtained lines were 10 to 15 percent more productive than their more exposed counterparts.

Team members had a kind of collective privacy—they were hidden from constant scrutiny by management and other workers, even though within their shared workspace they were still visible to one another. The effect, Bernstein writes, was to shrink the size of the surveillance "audience," and confine it to people the workers had a personal connection with. This kind of "privacy within team boundaries," he says, has been associated with better results in many workplaces, from Google to hospital emergency rooms.

Of course, in the modern workplace, observation doesn't just mean literally being watched. It also involves the relentless collection and analysis of data about the worker. Many a boss has used these "digital bread crumbs," as Bernstein calls them, to help determine whether employees get raises, reprimands or termination notices. But there's a privacy-preserving alternative: Limit who can see the data, and limit how it can be used to affect employees' lives.

For example (as I mention in this piece, just out in the May/June issue of PT), many trucking companies use cameras that automatically record a driver whenever there's sudden braking, swerving or speeding up. But in one company Bernstein studied, the videos never go to management and are not used in performance reviews (unless the driver is doing something as dangerous as texting at the wheel). Instead, they go to a team of coaches whose only job is to help drivers improve. Drivers, he says, like the system and trust that it is there to help them, because it keeps their mistakes within a trusted circle of people who hold no power over their lives.

Another form of privacy that enhances work is to reduce the degree to which every decision is observed by management. So, for example, a retail chain that uses an algorithm to assign work shifts now lets individual store managers revise the algorithm's schedule without having to clear the decision with headquarters. After it granted its local managers this modicum of privacy, the chain's profits rose.

Yet another form of restraint on surveillance is a decision not to monitor time as closely as the technology allows. Though it is now possible to measure exactly how each minute of work time is used, many organizations decline to track that closely. Giving employees hours, days or even months in which to work without close scrutiny, Bernstein writes, has enhanced productivity instead of harming it.

In instituting these four forms of privacy—privacy within team boundaries, privacy limits on employee data, privacy in decision-making, and privacy about time—the organizations Bernstein studied refused the temptation to observe (or try to observe) everything. That refusal did not cost them profits or effectiveness. Instead, respect for privacy enhanced their success.

That suggests that privacy's defenders should not concede that total surveillance is safest (or most efficient, or most profitable) before going on to say that it would be creepy to have to fly naked. That matters in a society where monitoring technology is ever cheaper and ever more powerful, and where the notion is spreading that surveillance, and the data it generates, can solve any problem. Privacy, so often depicted as the enemy of efficiency in public life, can be its friend.
