Sarah Langan

On a Social Dilemma: Regulation

Part II: It's not that hard.

There’s this thread in the Netflix documentary “The Social Dilemma,” where the actor who played Pete Campbell on “Mad Men” (Vincent Kartheiser) takes up residence inside the social network’s algorithm, representing different aspects of the capitalist machine and also reflecting the specific user it’s working to keep engaged, to that user’s detriment.

It’s a fun idea. What interests me even more than the algorithm, though, is the users of social media—the people typing and checking and updating and posting and liking and loving and sharing. Who are they, when they do these things?

In Incognito, neurologist David Eagleman posits that we’re not an entirely sentient species; we’re a collection of instincts. When we drink alcohol or otherwise alter our rational state, the less rational side of us might take over. So when people say, “She isn’t herself,” it might be literally true. The uninhibited, risk-taking side might have come out to play.

What side comes out, in the dark, when we’re looking at our screens? Who’s driving when we post?

Source: Netflix / The Social Dilemma / Fair Use

I’ve come to think of Facebook as a black box. It’s a room we can’t read. Because we’re unable to properly assess the level of threat that negative comments on our posts might represent (we have no idea who’s seeing them; Facebook posts don’t convey tone or state of mind; there is no socially agreed-upon protocol), our hindbrains tend to assume maximum danger. Consequently, our responses to negative comments are often charged. A fight ensues.

If two rabbits fight, especially if they draw blood, they never forget it. They remember the scent of the enemy rabbit for the rest of their lives, and if they smell it again, they attack. It’s not conscious. They never evaluate this instinct. It lives in their hindbrains.

Something similar happens when we fight online, I think. The memory goes to a deep place that we can’t access. I’d argue this is one reason we keep checking Facebook so often, particularly after making a post. It’s not simply the dopamine hit of a like; it’s our confused survival instincts checking for danger.

There isn’t much data on gender and Facebook, and I think there’s a reason for that. Having been active on the platform for years, I can offer anecdotal evidence: guys are total dicks online. Whenever I post a real opinion, I get trolled. Even by friends, and always by guys. Which is why I don’t often post opinions. I’ve been silenced.

Geena Davis founded the Geena Davis Institute on Gender in Media. In interviews, she talks about how she’d tell studio execs that more women needed to be in films, and they’d say: We fixed that already. We’re woke. It’s not a problem. She funded a study that showed women were severely underrepresented. The execs were shocked.

I wonder what we’d find if we ran a study of the percentage of negative comments made on other people’s pages. Would the angry people mostly be white guys? Would the people with mean comments posted on their pages mostly be women and minorities?

My theory is that Facebook isn’t talking to our conscious minds. It’s talking to our hindbrains. Something odd happens, even to “woke” America, once it enters the black box. Those conscious efforts we make to be “woke” fall away. Our ingrained, tribal behavior emerges.

These interactions have infected us on subconscious levels, reinforcing our least useful beliefs: Women are hothouse flowers. Men are dicks. Liberals are arrogant. Conservatives are morons. At some point, these reinforced beliefs enter a black box. We don’t take them back out. They’re too upsetting. We just believe them.

We now fight with people in real life over politics and lose friends over it. But we’re not even arguing about real facts. We’re arguing about personalities. Snippets we read in links we never followed. It’s shorthand for the national unconscious. Word salad. It leads nowhere.

Facebook spent nearly $17 million on lobbyists in 2019, according to the Center for Responsive Politics. I’m curious about what it’s lobbying for.

According to The Guardian, an engineer at Facebook found evidence of company-wide gender bias in 2017: women’s code was rejected 35 percent more often than men’s. Facebook buried the finding, claiming that news of such a study would damage its hip, woke brand. It then conducted a new study and said there was no bias whatsoever. At the time, only 17 percent of Facebook’s technical employees were women.

In 2018, The New York Times reported that Myanmar’s military employed 700 people to pose as celebrities and real news outlets on Facebook, Twitter, and other social networks, posting inflammatory rhetoric and false rape accusations through what purported to be news websites but were, in fact, just Facebook pages. These actions spurred the genocide of at least 10,000 Rohingya Muslims; 700,000 more fled the country.

Facebook has admitted it acted too slowly to remove the accounts that led to the murder of those 10,000 people, in what the UN characterized as ethnic cleansing. But they can’t be too sorry about those murders, because in 2020, Time magazine reported that the West African nation of The Gambia tried to hold Myanmar’s military accountable. The Gambia asked Facebook to provide all evidence, even the evidence its platform had taken down. Facebook demurred. Specifically, according to Time:

Earlier this month, the company filed its opposition to The Gambia’s application. Facebook said the request is “extraordinarily broad,” as well as “unduly intrusive or burdensome.” Calling on the U.S. District Court for the District of Columbia to reject the application, the social media giant says The Gambia fails to “identify accounts with sufficient specificity.” The Gambia was actually quite specific, going so far as to name 17 officials, two military units, and dozens of pages and accounts.

Facebook is saying: It’s too hard. All we do is analyze, package, and sell data, but we can’t do it for free for you! Underneath, it’s also saying: we’re not going to help you blame us for 10,000 dead and at least 700,000 displaced. Because we know that means we’ll one day have to pay for it. And when this blight comes to America, as it inevitably will, we’ll also pretend it’s not our fault, we didn’t know, and anyway, it’s not our job. We’re a platform. You’re the users. And the consumers. And the product. And sometimes, the bodies.

Facebook could easily have a strict interaction policy. It’s not too hard to police. They have all the data. They made more than $70 billion in revenue in 2019. Surely, they could hire some people and pay them benefits. They could make transparent how often hostile posts happen, and to whom. They could flag false articles. They could lead instead of playing dumb. This leading isn’t supplying more content, like, “Are you registered to vote?” It’s changing the algorithms. It’s making those algorithms manifest and reporting to a robust FCC. It’s accepting that this medium has the capacity to do great harm, and correcting for it.

About the Author
Sarah Langan

Sarah Langan has received three Bram Stoker Awards, and her work has been included in numerous best-of-the-year anthologies. Her upcoming book is Good Neighbors.
