
Verified by Psychology Today


Robo-Envy and Why People Flock to Authoritarian Leaders

It would be easy to program a computer to behave like a total jerk.

In the annual Turing test, people converse with a computer and a human and guess which is which. A prize goes to the most convincingly human-like computer program and to the most human human, the person who can best demonstrate that they’re not a computer.

In movies, the most convincing computers are often dolled up as the sexiest female robots. It gives them a leg up in convincing men that they're not computers.

Apparently, it’s not all about sophisticated programming. Our appetites, gullibilities, and vulnerabilities also influence whether we judge a machine to be worthy human company.

We’re ambivalent about human company too. In some ways we prefer computers. Computers are more reliable. They’re loyal to us without demanding reciprocation. They are easily reprogrammed. What you teach them stays taught. They don’t forget. Interacting with a computer, we get to make it be all about us, not them. No wonder corporations replace workers with computers and robots. No wonder there’s a trend away from dating and sex, toward video games and porn.

These days we're discovering our ambivalence about whether we prefer to interact with humans or computers. Though most debate focuses on whether computers will overrun or replace us, little attention goes into how much we'd like them to replace us, or at least the company we keep. Would you rather take driving guidance from GPS or a backseat driver?

Many people might wish sometimes that their friends, family and colleagues were programmable robots. Imagine a sexy female robot singing “Don’t you wish your girlfriend was bot like me?”

Robo-envy is real. We might even wish we were robots sometimes. Unlike us, they learn complex things instantly and they don’t make impulsive mistakes. When we talk about re-wiring or reprogramming our brains, that’s wishful robo-envy talking. We’d like to be insta-learners like computers.

Compare a growing attraction to computer company to some people’s preference for pets over partners. Sure, dogs ask more of us than computers do, and they’re slower learners. Still, they’re lopsidedly loyal and reliable. They don’t talk back. They make it all about us. Like sexy robots, they’re cuddly too.

Computer reliability has another advantage, alluring but terrible, exploited by corporations but applicable to political power struggles too. You can program computers as power-sensing homing devices.

Think of how authoritarian leaders gain power through a steadfastness grounded in empty rhetoric. It wouldn’t be difficult to program a "douche-bot"—a computer that acts like a total jerk, a kind of social rectifier.

A rectifier converts alternating current to direct current by flipping all negatives to positive: whenever negative current flows, it turns the tables, reversing the negative to positive. Tyrants do something similar. They turn the tables on good and bad so that all the good flows to them and all the bad flows to their opponents. It doesn’t even require any attention to what’s good or bad.
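The electrical idea above is simple enough to state in a few lines. Here is a minimal sketch in Python (the function name and sample values are my own, purely illustrative): full-wave rectification just flips every negative sample to positive, so the output all flows one way.

```python
def rectify(signal):
    """Full-wave rectification: flip every negative sample to positive."""
    return [abs(s) for s in signal]

# An "alternating" sequence of positive and negative values...
alternating = [3, -2, 5, -4]

# ...comes out all positive, flowing in one direction only.
print(rectify(alternating))  # → [3, 2, 5, 4]
```

The tyrant's trick, as the essay puts it, is the social version of that absolute-value operation: every signal, whatever its sign, ends up flowing the same direction.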

Programming a basic table-turning tyrannical rectifier douche-bot would be easy. Just collect two lists of words, words with positive connotations and words with negative connotations.

In conversation, debate or interviews, whenever one of the positive terms is referenced, the computer would ape common syntax (easily done with current AI) and declare that the computer possesses that quality. Whenever a negative term is used the computer declares that its opponents possess that quality.

For example, “transparency” and “The Constitution” have positive connotations. The douche-bot would therefore be programmed to say something like, “Well, it turns out I’m the most—and I think most of you would agree to this—I’m the most transparent president, probably in the history of this country,” or “No one values the constitution more than I do.”

The computer wouldn’t have to be programmed to know what transparency or the constitution are, just that they have positive connotations. That’s the heart of empty rhetoric—confident use of terms based only on their positive or negative connotations without any regard for what they mean.

The Turing test grants awards to the most human computer and the most human human. There ought to be a table-turning test too: a competition for the most computer-like human, the human "douche-bot" most effective at seducing people into loyal devotion the way an authoritarian leader does.

Because sexy robots aren’t the only way to fool people. You can fool some of the people all of the time just by turning the tables all of the time, rectifying the positive and negative signals so that all positives flow to the tyrant and all negatives flow to the tyrant’s opponents.

More from Jeremy E. Sherman Ph.D.