Some of our most significant scientific revolutions have led to revelations about the place of human beings in the cosmos, and those insights have led consistently to what I call “anthropo-diminution,” the recognition that we simply aren’t as central to the universe as many people — especially those committed to the Abrahamic Big Three — crave to believe. After the Copernican revolution came the Darwinian revelation that we, along with the rest of the living world, aren’t the products of Special Creation, but rather the results of a natural, material process that physically connects us to all other organisms. Even now, opponents of evolution cling desperately to the illusion that human beings — and, in some cases, living things generally — are so special that only a benevolent Creator could have produced them.
The third major leg of this troublesome triad was initiated by Freud, who (despite his occasional crackpot flights of fancy) came up with at least one solid and highly consequential discovery: the existence of the unconscious. Regardless of what one thinks of “penis envy,” the “Oedipus complex,” and so forth, there is general agreement that the human mind is like an iceberg, with much of its mass hiding below the conscious waterline.
So not only have we been kicked out of our presumed astronomical centrality (Copernicus, Kepler, Galileo, et al.), immersed in a world of materiality, and deprived of our widely assumed creaturely uniqueness (Darwin), but we also aren’t even masters in what seemed to be left to us, our pride and joy — our rational, conscious minds (Freud).
Of course, there are many people for whom the more we learn about the natural world, the more wonderful it is revealed to be and, thus, the more magnificent its Creator. It is likely, nonetheless, that the more human beings are perceived as “natural,” and thus explicable in terms of generally applicable scientific principles rather than uniquely fashioned by supernatural intervention, the more resistance will be evoked among those committed not just to human specialness, but also to perceiving this specialness as evidence for divine power and intervention. It is hard enough to adjust your opinion — think of how much easier it is to change your clothes than to change your mind — harder yet to relinquish a cherished perspective. Especially one that has the blessing of religious belief. As a remark long attributed to Jonathan Swift has it, “You cannot reason a person out of a position he did not reason himself into in the first place.”
The only constant, nevertheless, is change. The story is told of an ancient ruler who tasked his advisers to come up with a statement that would be true at all times and for all occasions. Their response: “This too shall pass.” But although the world’s factual details are constantly shifting (as the philosopher Heraclitus pointed out, you cannot step in the same river twice, and, as Buddhists note, all things are impermanent), the basic rules and patterns underlying these changes in the physical and biological world are themselves constant. So far as we know, light traveled at the same speed during the age of dinosaurs, during the Peloponnesian War, and today. The Second Law of Thermodynamics is true, and was true long before Carnot articulated the principle, just as special and general relativity were valid before Einstein formulated them.
Compared to the apparently unchanging nature of physical law, our insights are always “evolving,” along with living things themselves, although recognizing and understanding these insights often requires a major paradigm shift. Interestingly, although much has been learned (and more yet, hypothesized!) about how science proceeds to generate reliable knowledge, relatively little is known about how and why people — including scientists themselves — change their personal beliefs. On the one hand, we have Max Planck’s famous quip, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” And on the other, the more optimistic and probably widespread notion that, eventually, the truth will out (despite fervent efforts by some to deny reality that doesn’t comport with their preferred worldview).
To be clear, I am not claiming that clinging to factual error is necessarily the result of benighted religious prejudice, malignant narcissism, or the simple psychology of denial. Sometimes, incorrect scientific ideas enjoy popularity because they are a good fit with current empirical data. Initially, nearly all competent astronomers resisted Copernicus’s model, at least in part because it didn’t accord any better with astronomical observations than did the regnant Ptolemaic one. However, at least some of that resistance was due, as well, to the painful emotional and theological reorientation necessitated by its acceptance.
“All truths are easy to understand once they are discovered,” wrote Galileo. “The point is to discover them.” Much as I revere Galileo, I am not at all sure that in this regard he was correct. Sometimes, the problem isn’t simply to discover truths, but to accept them, which is especially difficult when such acceptance requires overcoming the bias of anthropocentrism, whereby people put their own species at the center of things. Although my hope is that seeing Homo sapiens through the bright glass of science will contribute to humanity understanding and accepting itself, given the stubborn persistence of anthropocentric thinking, I cannot promise success. The writings of the “new atheists” offer a possible parallel: Dawkins, Harris, Dennett, and Hitchens do not appear to have converted people to atheism so much as they have helped initiate a discussion, such that even though atheism is not nearly (yet?) mainstream, it has become more respectable.
Thomas Kuhn famously suggested that each science operates within its own paradigm, which limits the ability of its practitioners to conceive other approaches — until a “paradigmatic revolution” supplants the prior intellectual system with a new one, which in turn is similarly limiting. A related problem is that of “unconceived alternatives,” whereby our ability to make sense of natural phenomena is restricted by our own failures of imagination. (After all, an evolutionary perspective suggests that the human mind has evolved to maximize the fitness of its possessor, not necessarily to provide accurate information about the world.) To this must be added a seemingly focused resistance to anthropo-diminution, whereby alternatives that demote humanity’s already wounded self-image are especially hard to conceive.
I must emphasize, au contraire, that our naturalness does not diminish us, except with respect to the notion that human distinctiveness derives from being in some sense cut off from the rest of the natural world. By contrast, pointing out humanity’s materiality — and, thus, our profound linkage to everything else — only enlarges and thereby enhances our situation: not as lords of creation, but as full-fledged members of the wonderfully natural world.
David P. Barash is professor emeritus of psychology at the University of Washington and the author of Through a Glass Brightly: Using Science to See Our Species as It Really Is, forthcoming in 2018 from Oxford University Press.