Do You Rely on Facebook to Police Online Behavior?

Digital citizenship starts by creating social media policies at home.

Posted Jan 21, 2021

Between Jan. 6 and Jan. 20, we saw that social media policies and community standards could be enforced. Although it was a bit of ‘backs against the wall’ action, companies such as Twitter, Facebook, Twitch, Reddit, Google, and YouTube flexed their muscles and banned or restricted accounts for violations of their community standards, such as spreading misinformation and inciting violence. The Cyber Civics kids say, “It’s about time,” and we agree. But better now than never, if it gets us taking action.

Source: Robin Worrall @robin_rednine/Unsplash

Why are we so frustrated by how social media companies censor or monitor (depending on your point of view) content? Do we expect them to be the content and behavior police? Or are we offloading our responsibilities as citizens and parents by expecting the Facebooks and YouTubes to identify and remove misinformation or inciting and abusive content?

I think we are. Here’s why.

Social media policies need to start at home, and they need to reflect our values—those early lessons we teach our kids: honesty, fairness, determination, consideration, and love. It’s time we rolled up our sleeves and pitched in on the fight. Social media isn’t going away, nor should it. Let’s reclaim it. If ever there was a time to be an ‘upstander’ rather than a bystander and combat inappropriate, abusive, or fraudulent content, it is now, when we can build on the lessons most recently learned.

The last two weeks raised a couple of disparate questions for me:

  1. Did it have to get to outright lies and incitement to violence before social media platforms took action? Whatever happened to trying to prevent simple threats and bullying?
  2. How many people have actually read the platforms’ community standards or think about what’s OK when it comes to online behaviors?

Community standards are available for all the platforms if you care to look. Facebook, for example, publishes its Community Standards and outlines its approach to balancing free speech and giving everyone a voice with the responsibilities of good citizenship, such as safety, privacy, authenticity, and respect. No easy task. And as every parent knows, laying out guidelines is not the same thing as enforcement. But what about our social media policies and community standards?

Who carries the burden of enforcement? The New York Times reported that Mark Zuckerberg did not feel that Facebook should police content. Many disagreed with his position and argued that social media companies should not only monitor content but also be held accountable for it. Aside from the problems of operationalizing all those terms for a court of law, there’s a gap between ‘what should be’ and ‘what is,’ between what’s aspirational and what’s achievable. Not legally, but in practice.

The Pew Research Center (2019) reported that nearly 75% of Facebook users visit the site every day, a share unchanged from 2018. Users are apparently undaunted by concerns about privacy, political censorship, abuse, and fake news. They carry on reading, posting, and sharing even though the majority of users believe all of these problems exist online.

If you consider the volume of content being posted to social media platforms, you can start to see the challenge. Every day, roughly 350 million photos are uploaded to Facebook, over 400,000 hours of video are posted to YouTube (about 300 hours every minute), 500 million tweets are sent, and 95 million photos and videos are shared on Instagram. And that’s only a few of the major social media platforms. Even the best algorithms and most vigilant review teams stand little chance of getting it all, much less getting it right. The task is further complicated by the fact that content is subjective, culturally bound, and idiomatic, and that no human, however ‘perfect’ and evolved, is without some form of cognitive bias. It’s how our brains work.
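For a sense of scale, here is a quick back-of-envelope check in Python. It assumes the approximate rates cited above (about 300 hours of YouTube video per minute and 500 million tweets per day); the real figures shift from year to year, so treat this as a rough sketch rather than a precise accounting.

    # Rough sketch of daily content volume, using the approximate figures cited above.
    # These rates are assumptions and change over time.
    youtube_hours_per_minute = 300        # hours of video uploaded each minute
    minutes_per_day = 60 * 24             # 1,440 minutes in a day
    youtube_hours_per_day = youtube_hours_per_minute * minutes_per_day
    print(f"YouTube: ~{youtube_hours_per_day:,} hours of video per day")   # ~432,000

    tweets_per_day = 500_000_000
    tweets_per_second = tweets_per_day / (24 * 60 * 60)
    print(f"Twitter: ~{tweets_per_second:,.0f} tweets per second")         # ~5,787

Even at those rough numbers, a review team working around the clock could only ever sample a sliver of the stream, which is the practical point here.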

Solid values are the best gift we can give our children to keep them safe on and offline.
Source: Kelly Sikkema/Unsplash

What does this mean? 

  • Social media policies have to start at home, or at the very least at school. Social media policies and community standards need to be taught alongside the basic lessons and values parents work so hard to teach their children: responsibility, respect, safety, and self-regulation. It's the best gift we can give our children. Digital citizenship, digital etiquette, and digital safety all carry offline values into the digital landscape.
  • Social media companies need to do their best to monitor abuses, but there will always be a lot more of us than of them. We need to partner with social media companies and take some of the responsibility for flagging inappropriate and abusive content. We are the experts in the subjective content we see. Kids are surprisingly proactive reporters of misdeeds, not yet having become jaded to the ‘ways of the world.’ Yet we hear from kids all the time that nothing is done when they report hate speech, bullying, and other forms of abuse. The appropriate burden on social media companies is to beef up their ability not just to flag but to verify and respond to reports of abuse.

References

Perrin, A., & Anderson, M. (2019). Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018. Pew Research Center. https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/