In a famous experiment you probably studied in school, Stanley Milgram demonstrated the average person’s obedience to authority by inducing two-thirds of his subjects to administer what they believed were extremely painful shocks to other subjects in what they believed was a memory training exercise. None of the subjects left to check on the welfare of the screaming and moaning “subject” in the next room; none insisted that the experiment end. Milgram wondered if the Nazi horrors were partly explained by widespread obedience to authority—in this case, a scientist in a white lab coat—even when people were instructed to do something that conflicted with their value systems. Generally, the experiment stands for the proposition either that New Haven adults at the time were too obedient to authority or, for the less squeamish, that we all are too obedient to authority. “Obedience to authority” might be translated as a belief that the authority knows better than the individual how to behave.
Here’s another interpretation of the results: When someone who is clearly a bona fide scientist assures you that no harm will come from a procedure, you can trust science over your own lying eyes. The great, often overlooked fact about Milgram’s experiment is that, indeed, no harm came to the apparently suffering person in the next room—the scientist could be trusted after all. In other words, the belief that scientists know better than the individual remains intact after an experiment in which, just as advertised, no one was hurt.
Science, like a parent, is always telling us things about the world that conflict with our own perceptions—the molecular composition of ordinary objects, the movement of an earth that feels stationary, the finite speed of light, the irrelevance of previous outcomes in games of chance, to name a few. In this respect (and in darn few others), science is like any other belief system; it asks members of its community to defer to community standards. Of course, in the culture of science, the community standards are supposed to be based on evidence and logic, whereas all other cultures hold some tenets (based on tradition, revelation, faith, and so on) more dear than those based on evidence and logic. The culture of science—not always all scientists, who are all-too-human, but science itself—is concerned only, in Skinner’s phrasing, with generating statements that lead to effective action. This limited purpose assures that, in science’s name, intentional harm to other people is rare (but it does happen, as in the Tuskegee experiments). When it does happen, it’s done in the name of science only if the intent of the harm was to increase knowledge; otherwise, it’s done in the name of power.
So I draw two inferences from the fact that Milgram’s subjects were right, after all, to trust the scientist. One, evidence and reason often produce truths that make us uncomfortable, but evidence and reason teach us to trust the process (think of all the good that has come from science—including, possibly, your very existence, if medicine ever saved or agricultural science ever fed one of your ancestors). Two, if you are going to trust someone in authority, first inquire into the overt, stated, openly endorsed values the person espouses. You should subsequently find out about the authority’s covert values, but many, many authority figures can be disqualified on the overt ones alone, especially on the issue of how they suggest treating outsiders. If outsiders (people who don’t obey the same authorities you obey) are to be treated badly, now or in the hereafter, you know you are dealing with a tribal system, medieval at best, designed to empower one group of people over others. This is what Dostoyevsky meant by saying that you can tell how civilized a society is by looking at its prisons—criminals being people who are confined for disobeying the state’s authority but who otherwise need not be treated badly.