Source: Prison Images/Public Domain Pictures

You’d probably agree that in an ideal universe you’d do best living your life in accord with the golden rule. But in the real world attempting such an existence is (to say the least) precarious. For almost daily you’re likely to be confronted with situations that warrant suspicion: circumstances in which other people or institutions have shown themselves to be untrustworthy, their underlying motives or intentions dubious. In these instances, ought you, self-protectively, to oppose them—or simply turn the other cheek (and by doing so, leave yourself wide open to being taken advantage of a second time)?

Historically, fundamentalist tenets of most religions, directly or indirectly, have recommended the latter choice. And admittedly, if reacting passively when you’ve been deceived ultimately guarantees an eternity of bliss, not retaliating or defending yourself makes very good sense. Yet if we examine this reaction to an external provocation rationally and secularly, it hardly makes sense at all.

That is, viewed empirically, steadfastly adhering to the golden rule could be understood as either masochistic . . . or downright obtuse. And by “obtuse” I mean that you’re consciously deciding to act against your rudimentary self-interest, your inborn right to safeguard your survival. Perhaps that’s why in the Old Testament—which focuses more on revenge than on love or compassion—you’re given the option to counter-attack (as in, “an eye for an eye . . .”).

If, on the other hand, in a relationship you’re intent on pragmatically pursuing your own advantage, might it not make sense to disregard your partner’s preferences, and not so much cooperate with them but capitalize on their possible trust in you?

Considering all these thorny questions, what might the so-intriguing field of Game Theory—which at once relates to behavioral economics, mathematics, evolutionary biology and psychology, political science, social psychology, and moral philosophy—say about all of this? After all, conceptualizations and speculations on human nature in this ever-expanding research area have over the past half-century received substantial academic attention. And the various experimental “games” devised to explore elemental questions of decision-making are unquestionably relevant here. Moreover, the researchers’ findings—specifically as regards morality and ethics—also deserve serious attention.

The so-called “Prisoner’s Dilemma” game, extensively studied and explored in many different forms, involves two individuals given certain cooperative/competitive options. To describe the classic—and most colorful—version of this dilemma (Albert Tucker’s version, based on earlier work by Merrill Flood and Melvin Dresher in 1950):

Source: Krystian Olszanski/Interrogation Room/Flickr

You and an accomplice have been arrested for robbing a bank, and you both care more about your own freedom than that of your accomplice. The district attorney makes you the following offer: “You can either confess, or remain silent. If you confess, and your accomplice remains silent, I will drop all charges against you and see that your partner is put away for some serious time. If they confess and you don’t, they go free and you do the time. If you both confess, you’ll both be convicted, but I’ll see to it that you get an early parole. If neither of you confess, I’ll prosecute you both for firearms possession, and you’ll get small sentences.” You cannot communicate with your accomplice by any means: you must make your decision alone. (As characterized in Chris Bateman, “Tit for Tat,” June 01, 2007, OnlyaGame.typepad.com.)

The more generalized version of this dilemma, though, doesn’t pertain to different degrees of punishment but to financial gain. Here each individual can decide to work cooperatively with the other for some medium-sized, shared reward. Or they can be governed by narrow self-interest or greed, and so choose to exploit the other—and if successful, get the entire reward for themselves (the other person walking away with nothing). Lastly, if both try to take advantage of the other, each of them will end up with a tiny fraction of what they might otherwise have received. This is, in fact, the essence of the pivotal academic experiments conducted in the early ’80s by political scientist Robert Axelrod (Univ. of Michigan), whose computer “tournaments” pitted strategies submitted by many academics against one another over repeated rounds.
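The reward structure just described can be sketched as a simple payoff table. The point values below are illustrative, not drawn from Axelrod’s actual tournaments; what matters is the ordering: exploiting a cooperator pays best, mutual cooperation next, mutual defection next, and being exploited worst of all.

```python
# Illustrative Prisoner's Dilemma payoffs (points earned per round).
# The specific numbers are hypothetical; only the ordering matters:
# temptation (5) > mutual reward (3) > mutual punishment (1) > sucker's payoff (0).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # medium-sized, shared reward
    ("cooperate", "defect"):    (0, 5),  # exploited vs. exploiter
    ("defect",    "cooperate"): (5, 0),  # exploiter vs. exploited
    ("defect",    "defect"):    (1, 1),  # both grab, both get a tiny fraction
}

def play_round(move_a, move_b):
    """Return the (player A, player B) payoffs for a single round."""
    return PAYOFFS[(move_a, move_b)]
```

Note that whatever the other player does, defecting yields the higher individual payoff in a single round—yet mutual defection leaves both players worse off than mutual cooperation, which is precisely the dilemma.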

Note that in this game which alternative is chosen hinges both on trust (without which no amount of cooperation can be expected) and on levels of selfishness or egoism. But note, too, that each of these games presents a “one-time-only” scenario.

In the real world, however, relationships typically aren’t limited to single engagements. Consequently, later versions of the Prisoner’s Dilemma, by Axelrod and others, mostly depict repeated or (as more commonly termed) “iterated” encounters. And this is where crucial considerations of tit for tat come into play. For while the first time neither party can know what the other will do, going forward both of them can remember how the other person acted, or reacted, previously, which will affect their subsequent moves. And they’ll also be mindful of how their present reaction may affect the other player’s move toward (or against) them in the following round. So, considerations of retaliation and reward—and, most of all, trust—become increasingly prominent in the strategy they choose.

In Axelrod’s tournaments, of all the strategies (many of which were quite intricate) he later assessed, the one that regularly culminated in the most successful outcomes was, unexpectedly, tit for tat. This simple strategy, devised by mathematical psychologist Anatol Rapoport (Univ. of Toronto), involves cooperating with your partner on the first round, then adjusting your behavior to match your partner’s (as in, you do to them what they just did to you—admittedly, a less selfless tactic than the golden rule would prescribe). If, reciprocally, your partner cooperates, you continue to cooperate; if they defect, you respond in kind by immediately retaliating against them. This formula has, ironically, been characterized as “conditional niceness,” since it advocates for a kind of provisional golden rule.
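The whole strategy fits in a few lines. Here is a minimal sketch of an iterated match, with illustrative payoff values (“C” for cooperate, “D” for defect) and a hypothetical `always_defect` opponent for comparison:

```python
# Illustrative payoffs in the standard ordering; not Axelrod's exact values.
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Cooperate on the first round; thereafter copy the opponent's last move.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    # A purely "mean" opponent, included here only for comparison.
    return "D"

def iterated_match(strategy_a, strategy_b, rounds=10):
    """Play repeated rounds; each strategy sees only the other's past moves."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)  # A reacts to B's history
        move_b = strategy_b(hist_a)  # B reacts to A's history
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b
```

Against a pure defector, tit for tat loses only the single opening round and then matches defection with defection—which is why, over many rounds, no exploitative strategy can beat it by more than that one round’s margin.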

And it’s a constant winner. No completely selfish strategy (and many ingenious ones have been contrived!) is able to beat it.

Needless to say, the connotations of tit for tat are decidedly negative. They suggest childish vengefulness and payback; a lack of empathy or willingness to see as valid any position other than one’s own; and an escalating cycle of revenge, retribution, and hostility. But curiously, game theory has “redeemed” this conventionally unfavorable notion of tit for tat—even coming to see it in many instances as the most viable ethical alternative to a broad and complicated array of less humane, or more self-interested and manipulative approaches.

It should be added that the tit-for-tat strategy is greatly enhanced by including an element of forgiveness. For absent this, it can eventuate in a vengeful cycle of each person’s defecting ad infinitum. But in “tit for tat with forgiveness” the innocent participant generously provides the other a second chance to cooperate after they’ve initially chosen to defect.
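A minimal sketch of that forgiving variant follows. The `forgive_prob` parameter is a hypothetical knob of my own choosing—the probability of pardoning a defection isn’t specified in the work discussed here:

```python
import random

def tit_for_tat_with_forgiveness(opponent_history, forgive_prob=0.1):
    """Mirror the opponent's last move, but after a defection occasionally
    cooperate anyway, offering a chance to break a revenge cycle.
    `forgive_prob` is an illustrative parameter, not from the literature."""
    if not opponent_history:
        return "C"  # still "nice": never the first to defect
    if opponent_history[-1] == "D" and random.random() < forgive_prob:
        return "C"  # extend a second chance to cooperate
    return opponent_history[-1]
```

With `forgive_prob=0` this reduces to plain tit for tat; any positive value guarantees that two forgiving players locked in mutual retaliation will eventually break the cycle.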

Chris Bateman, concisely summing up Axelrod’s conclusions from his experiments with the Prisoner’s Dilemma, notes that the most successful strategy requires that a player “be nice (never the first to defect), [be] retaliating (willing to defect), [be] forgiving (willing to attempt to regain trust by breaking a defection cycle), and [be] non-envious (not specifically attempt to outscore individual opponents).” (“Tit for Tat,” June 01, 2007, OnlyaGame.typepad.com.)

It’s reasonable, then, to deduce that someone who is basically selfish can still best pursue their self-interest simply by deciding to be nice—even if it’s only a stratagem. Another way of putting this is that reduced to its essentials cooperation can beat competition . . . or that, well, nice guys can finish first. (See Axelrod’s The Evolution of Cooperation, Basic Books, 1984.)

Admittedly, all of the above needs to be qualified, since in the real world any two (or more!) humans can work harmoniously together to achieve dishonorable or corrupt ends—as in prisoners plotting an underground escape, corporate heads colluding to dupe the public, or students “cooperatively cheating” on an exam (e.g., see, especially, Ben Y. Hayden, “Rethinking the Morality of the Prisoner’s Dilemma,” Psychology Today online, July 28, 2013). But barring such negative, real-life possibilities, the tit-for-tat manner of dealing with others described above handily triumphs over the less practical golden rule—which (strictly defined at least) doesn’t sanction any vengeful retaliation. On the contrary, tit for tat:

  • protects the individual against “mean” opponents, never allowing them to be brutally exploited for being too nice [i.e., turning the other cheek, and so setting themselves up for further exploitation];
  • avoids the ongoing losses that come from combating another’s “evil” strategy with one’s own [i.e., mutual vengefulness leads both individuals to sacrifice their self-interest for the non-meritorious purpose of getting even];
  • through reciprocating—then forgiving—rewards others for cooperating while punishing them for defecting, thereby prompting them to play fair;
  • by making its intent crystal clear, devoid of any hypocrisy or double-dealing, ends up being the most “trustworthy” of strategies—and so typically wins the confidence, and cooperation, of other participants.

Tit for tat is very much in line with evolutionary theory in that it supports the contention that cooperation—or at least measured cooperation—is instrumental not only in helping humans co-exist peacefully but, by extension, in ensuring the species’ survival. In an interview, Peter Singer, the prominent bio-ethicist currently at Princeton University and author of A Darwinian Left: Politics, Evolution and Cooperation (Yale Univ. Press, 2000), is quoted as declaring that “we have evolved not to be ruthless proto-capitalists, but to enter into mutually beneficial forms of co-operation” (see Francis Steen, “Peter Singer: Ethics in the Age of Evolutionary Psychology,” The Philosopher’s Magazine, March 7, 2000).

In his own voice, Singer’s interviewer Francis Steen concludes by stating:

Put crudely, if you model the survival prospects for different kinds of creatures with different ways of interacting with others—from serial exploiters to serial co-operators and every shade in between—it turns out that the creatures who thrive in the long run are those that adopt a strategy called 'tit for tat'. This means that they always seek to co-operate with others, but withdraw that co-operation as soon as they are taken advantage of. Because this is the attitude which increases the survival value of a species, it would seem to follow that humans have evolved an in-built tendency to co-operation, along with a tendency to withdraw that co-operation if exploited. Hence, it is argued [that an] essential feature of ethics—reciprocity—is explained by evolution.

And yet, one final—and major—reservation ought to be added here. And John Robinson, in his Web article “The Moral Prisoner’s Dilemma,” is just one of many theorists who make it. As he notes: “Because the [analogical Prisoner’s Dilemma] model is so abstract, and has artificial constraints against communication, its application to real-world problems must be done with care.”

All the same, to end this piece with one final quote that pointedly, though qualifiedly, recommends tit for tat over the golden rule (though the author’s name is never given): “Tit for Tat is not the best of ethical standards—that of Jesus, Gandhi and Dr. King, all murder victims [ahem!], may well be—but [it] may in fact be the best ethics available for those who wish to survive in our imperfect world.” (“An Ethic Based on the Prisoner’s Dilemma,” The Ethical Spectator, Sept. 1995).

For those readers interested generally in the golden rule, I've written a four-part series on this ethical ideal. Here are their (sub)titles and links:

"Part 1: Don't Take It Literally!"

"Part 2: What's It Missing?"

"Part 3: Its Uncanny Resilience"

"Part 4: Dreams of Utopia"

If you learned anything useful from this post, and think others you know might also, kindly forward them its link.

To check out other posts I’ve done for Psychology Today online—on a broad variety of psychological topics—click here.

© 2016 Leon F. Seltzer, Ph.D. All Rights Reserved.

Finally, to be notified whenever I post something new, I invite readers to join me on Facebook—as well as on Twitter where, additionally, you can follow my frequently unorthodox psychological and philosophical musings.
