Are we selfish creatures who can never trust one another, or kind souls who respond empathetically to one another’s distress and are capable of being inspired by the example of unselfish role models? Is there a true underlying human nature that’s fundamentally selfish, with whatever “better angels” we can summon being an overlay cooked up to make us look better in others’ eyes?
Economics as a field has long identified itself with the assumption of self-interest. Adam Smith set the science on its voyage with his memorable declaration of 1776 that “It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest.” Three generations later, John Stuart Mill declared in the definitive economics text of the mid-19th century that the starting point of “political economy” is the assumption of the rational economic man, the figure later famously dubbed Homo economicus. Francis Edgeworth, a pioneer of the more mathematical approach that came to dominate the academic discipline in the 20th century, wrote in 1881 that “The first principle of Economics is that every agent is actuated only by self-interest.”
But Smith was also a proponent of a theory that featured innate feelings of sympathy. In his 1759 work The Theory of Moral Sentiments, he had written: “How selfish soever man may be supposed, there are evidently some principles in his nature, which interest him in the fortune of others, and render their happiness necessary to him, though he derives nothing from it except the pleasure of seeing it.” Nor was this merely a young man’s sentimentality; he continued to revise and reissue the book even after publication of his economics tract, The Wealth of Nations. And in a previous posting, I quoted Edgeworth’s contemporary, Alfred Marshall, as affirming that “No doubt men … are capable of much … unselfish service [to others]”.
The assumption that we’re fundamentally selfish and display care towards others only as an afterthought, due to upbringing, or to make a good impression, receives support from a variety of observations. Our families and societies make enormous efforts to inculcate more moral and altruistic behaviors in us but seem to have little need to indoctrinate selfishness, which evidently springs up of its own accord. Rewards and punishments of the child, such as the giving or withholding of candy and other treats, and promises and threats to the adult, e.g. of a heavenly or hellish afterlife, are used to induce less selfish behaviors. Since the converse, using moral rewards to induce selfishness, is never observed, it certainly seems that selfishness is the bedrock, morality a socially prescribed overlay.
The Smith quote suggests, though, that at least one foundation of morality and kindness springs up of its own accord out of human nature. Not only are we naturally pained by the suffering of others whom we perceive as innocent and as sharing in our humanity (which explains the donations that follow on the publication and broadcasting of compelling photos of earthquake, tsunami and famine victims); we also naturally long for others’ approval and take pride in whatever attributes we can muster that we think others might admire.
A recent publication in Nature by David Rand, Joshua Greene and Martin Nowak provides other observations that seem to warn us against overly simple conclusions. The authors studied social dilemma games including the voluntary contribution problem I’ve discussed in several postings. Subjects are placed in groups and asked to allocate funds either to a group account, which makes them jointly better off, or to private accounts, from which they alone benefit. Focusing on the timing of decisions, the authors find that individuals who acted more quickly were more likely to cooperate, contrary to the assumption that the selfish action comes to mind first and must be overridden by conscious deliberation and self-control. While parts of the study involved reanalyzing the data of previous experiments in which decision timing varied spontaneously, they also conducted new experiments in which they forced some participants to make their allocation decisions quickly, while making others wait several seconds before choosing. Again, those forced to act with less deliberation were actually more cooperative than those required to wait.
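The incentive structure of such a voluntary contribution game can be sketched in a few lines. The parameter values below (endowment, multiplier, group size) are illustrative assumptions of my own, not the ones used in the Rand, Greene and Nowak study; what matters is that the group is jointly better off with full contribution, while each individual does better still by free-riding.

```python
# Sketch of a linear public goods (voluntary contribution) game.
# Endowment and multiplier are illustrative assumptions, not the
# parameters of the experiments discussed in the text.

def payoffs(contributions, endowment=10.0, multiplier=1.6):
    """Each player keeps whatever she doesn't contribute, plus an
    equal share of the multiplied group account."""
    n = len(contributions)
    group_account = sum(contributions) * multiplier
    share = group_account / n
    return [endowment - c + share for c in contributions]

# Four players, everyone contributes fully: each earns 16.0,
# better than the 10.0 each would keep by contributing nothing.
print(payoffs([10, 10, 10, 10]))  # [16.0, 16.0, 16.0, 16.0]

# But a lone free-rider earns 22.0 while the contributors get 12.0,
# which is the dilemma: private incentives cut against the group.
print(payoffs([0, 10, 10, 10]))   # [22.0, 12.0, 12.0, 12.0]
```

The tension in the last line is exactly what makes the subjects’ quick, cooperative choices noteworthy: the privately optimal move is always to allocate nothing to the group account.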
The results are reminiscent of those in the Trust Game on which I reported in my posting “On the Past, Present and Future of Fairness” (July 21, 2012). Ordinarily, this game of potential cooperation is played by having a first decision-maker send some or no money (a potentially trusting move), having those funds be tripled by the experimenter (representing potential gains from cooperation), and having a second decision-maker send back some or none of the tripled funds (a potentially trustworthy or reciprocating move). We conducted a novel version of the experiment that left out all social “code words” and that translated the decisions into a purely geometric form. The first mover simply selected a line to highlight; the second mover chose a point on that line. While some of the “first movers” seemed to react with greater fear that their counterparts would forsake social obligations in this more socially sterile game setting, “second movers” who had been given the opportunity to make a choice corresponding to the returning of funds were in fact as reciprocating or trustworthy on average as were players in the non-geometric version of the game. They seemed to “get” that their decision was about sharing, and many didn’t hesitate to share despite the excuse of the seemingly morals-free geometric setting.
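The payoff logic of the standard version can likewise be sketched. The endowment of 10 below is an illustrative assumption on my part; the tripling follows the description above.

```python
# Sketch of trust game payoffs: the first mover sends some of her
# endowment, the experimenter triples it, and the second mover
# returns any fraction. The endowment of 10 is an illustrative
# assumption, not a parameter from the experiment discussed here.

def trust_game(sent, returned_fraction, endowment=10.0, multiplier=3.0):
    """Return (first mover's payoff, second mover's payoff)."""
    assert 0 <= sent <= endowment and 0 <= returned_fraction <= 1
    received = sent * multiplier            # tripled by the experimenter
    returned = received * returned_fraction
    return endowment - sent + returned, received - returned

# Full trust met with an even split leaves both better off than
# no trust at all (which yields 10.0 and 0.0):
print(trust_game(10, 0.5))  # (15.0, 15.0)

# Trust betrayed: the first mover loses everything sent.
print(trust_game(10, 0.0))  # (0.0, 30.0)
```

A purely self-interested second mover would always return nothing, so anticipating this, a purely self-interested first mover would send nothing; observed trusting and reciprocating play is what the experiments are probing.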
Neither of these experiments provides the last word on the “good” or “bad”-natured question. The Nature authors, for instance, find that those who proved more cooperative when acting more quickly also reported finding cooperation to be reciprocated in their everyday lives, whereas those answering that cooperation isn’t helpful in their lives showed no difference in behavior in the slow versus fast decision settings. So observance of social norms seems to depend, at least in part, on what works and what doesn’t work in our everyday lives. Nonetheless, the fact that cooperativeness can become so automatic to so many may be related to the evidence of the natural sympathies that Smith noted over two centuries ago and that much recent scientific work has corroborated.* Such findings add weight to the view that it’s who we are, and not just something bribed and threatened into us, that makes those “better angels” part of our nature.
* Studies supporting the idea of innate human empathy are reported, for instance, in the reader-friendly book The Age of Empathy by primatologist Frans de Waal.