
Modern-Day Sin-Eaters

Low-paid workers consuming the worst horrors of the Internet

Key points

  • Social media content moderators are developing PTSD.
  • There is evidence that some content moderators are becoming radicalised.
  • Social media companies must do far more to protect their workforce.

Chris Gray is awake in bed thinking about the baby. Was it dead? He thought so at the time, and can still picture it on the ground, an adult’s boot pressed against its tiny chest. But maybe he was wrong. If Chris’s auditor concludes that the baby wasn’t dead, the mistake will count against his quality score and he’ll be a step closer to getting fired. So he lies awake late into the night, seeing the image over and over, trying to formulate an argument to keep his job.

This is one of the stories Chris shares with me as he tries to explain the impact that working as a content moderator for Facebook has had on his mental health. He's now suing the company in the Irish courts for psychological trauma.

Source: Bhaveshgoswami / Pexels on Pixabay

Let’s go back a step: When was the last time you looked at your social media? Probably not so long ago. Maybe it’s open right now in another tab even as you read this. But how often do you think about the people whose job it is to ensure your feed is filled with pictures of amusing cats and interesting articles on psychology, rather than the unspeakably nightmarish horrors that lurk in the darkest corners of the web? ‘Content moderators,’ says Dr. Jennifer Beckett, a lecturer in Media and Communications at the University of Melbourne, ‘are the modern-day sin-eaters of the Internet. Their job is to soak up the worst of us so that we don’t have to deal with that.’

Thousands of moderators spend their working days sifting through masses of flagged content, including sexual assault, suicide, revenge pornography, and murder. Perhaps unsurprisingly, many of them are becoming unwell. Dr. Beckett believes that most moderators will go on to develop some form of post-traumatic stress disorder (PTSD) and that adequate support is not being offered.

Certainly, Chris believes it was his job as a moderator that resulted in his PTSD, although it took him a while to accept this. ‘I’m a grown man,’ he tells me. ‘I’ve been all over the world. It never occurred to me that this kind of stuff could be traumatic.’ It was only after leaving the job, and as his symptoms became overwhelming – a constant feeling of being on edge, breaking down in tears, getting into angry disputes – that he finally sought the help he needed.

Getting radicalised

For Chris, one of the most difficult parts of the job wasn’t the disturbing posts that he took down, but rather the posts that he believed were racist or intended to incite hatred but that didn’t technically break the platform’s rules.

It’s a dilemma I raise with Dr. Beckett, who points out that the effects are even worse for moderators who themselves belong to a minority group being attacked on the site. ‘You can see that someone is inflaming hate or being racist,’ she explains. ‘But the platform says no, it isn’t. Then you, as a member of that minority group, are forced to leave it up. And that just works to reinforce the dehumanizing message. Imagine you’re seeing all of this trauma, all of these bad things; it’s a constant daily grind. And now you’re being forced to leave things up that are damaging to people like you. The amount of cognitive dissonance that’s required in that role is mind-boggling to me.’

What of the flip side? Might the very act of viewing so much radical content result in moderators themselves becoming radicalised? According to Dr. Beckett, there’s evidence that this is happening. Content moderation is mostly a low-paid job in the gig economy, and the people doing it are often already at a low mental ebb: they feel disenfranchised and exhausted, and they lack agency over their own lives. In short, they are the perfect cohort for radicalisation.

Lolcats

It's clear that social media companies must do far more to address this problem, which experts agree is only getting worse with greater numbers of moderators processing ever-greater amounts of harmful content.

Source: Tranmautritam from Pexels

Ryan Broderick, a journalist who investigates web culture, is unequivocal in his assessment: If a social media company can’t adequately protect its users and workforce, then it doesn’t deserve to exist. He draws an analogy with a restaurant. ‘If you run a restaurant and you can’t tell people that the food won’t make them sick, you can’t have that restaurant!’ The same applies to other businesses. ‘Welcome to the mall,’ he jokes. ‘Certain stores are on fire. We can’t tell you which ones until you’re inside. Good luck!’ Such a mall would obviously not be allowed to operate, and rightly so.

It’s a humorous example, but it makes a serious point: We’ve come to think of the big social media companies as necessarily existing, as though this were an immutable law of physics. It may be time to rethink that. When you next receive a cute picture of a cat, spare a thought for the modern-day sin-eaters working to keep us all safe, and ask: at what cost?

References

You can hear more on this story, including how Chris Gray’s ongoing legal case is progressing, on my podcast Modern-Day Sin-Eaters.
