What Makes Social Media Harmful?
Facebook and Instagram misuse basic psychology mechanisms for commercial gain.
Posted September 17, 2021
- An investigation revealed that Instagram kept secret evidence that the platform harms teenage girls' mental health.
- The design of social media platforms works on the principles of personalized persuasion and has negative effects on the most vulnerable.
- Legislation is necessary to protect young social media users from the unethical and abusive design of these platforms.
Once again, Facebook is making international media headlines, this time because its sister platform, Instagram, withheld evidence of harm. A Wall Street Journal investigation revealed that the company had data showing that teenage girls reported increased mental health problems due to their use of the platform, but kept it secret.
The story is frustrating on several levels. It is frustrating because research is supposed to be transparent and publicly available, and Facebook's consumer research is a glaring exception to open-science norms. It is frustrating that Facebook does not act on accumulating evidence of harm and leaves vulnerable young people at the receiving end of its addictive design. And it is frustrating that the harmful effect is not unique to this news story, or to the Instagram platform. It is happening on all social media ("SoMe") platforms designed to pander to the "so me" culture.
The effects of social media
Social media rely on visual content, with images and videos shared anytime and anywhere, from an individual's hand to a global audience. Researchers have documented both the positive effects of social media use (including finding connection, support, and inspiration) and the negative effects (including anxiety, depression, and feelings of loneliness). The balance between positive and negative effects continues to be debated, but there is no doubt that the negative effects could be significantly reduced had the platforms been designed differently.
What makes social media addictive?
Whether you look at the design of Facebook or Instagram, you will see that the platforms run on personal data and personalized algorithms. They were not designed to advance collective and collaborative thinking but to splinter individuals' experiences into data points.
Each photo, written word, or tap by a user is a data point that the algorithms use to compute a representation of the individual. The algorithms are optimized for commercial profit, so they use personal data to sort individuals into market segments. Whether you watch videos of cats, slimming programs, or suicidal content, the algorithms will push content similar to what you engaged with before. This locks you into an echo chamber of your own interests, offering more and more content within that category, with increasing specificity and ever-greater potential for extreme engagement.
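The engagement-driven loop described above can be sketched in a few lines of Python. This is a hypothetical simplification, not any platform's actual code: the real ranking algorithms are proprietary, and the category names and ranking rule here are purely illustrative.

```python
# Hypothetical sketch of engagement-driven ranking. All names and the
# ranking rule are illustrative; real platform algorithms are secret.

def record_engagement(profile, category):
    """Every tap, like, or view becomes a data point in the profile."""
    profile[category] = profile.get(category, 0) + 1

def recommend(profile, candidates):
    """Rank candidates purely by past engagement, so the feed keeps
    serving more of whatever the user engaged with before."""
    return max(candidates, key=lambda c: profile.get(c, 0))

profile = {}
for tap in ["cats", "cats", "diets"]:
    record_engagement(profile, tap)

# The most-engaged category wins again and again: an echo chamber.
print(recommend(profile, ["news", "sports", "cats", "diets"]))  # cats
```

Even this toy version shows the feedback dynamic: every recommendation the user accepts strengthens the very signal that produced it, so the feed narrows rather than broadens.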
One would think that users quickly get bored with receiving the same content over and over, but psychologists know that the opposite is true. The SoMe algorithms are designed around the principles of persuasion and habituation, both of which are implicated in repeated, impulsive, and addictive behaviors. While these principles can be found in many websites and commercial products, SoMe platforms supercharge both processes with personal data and personalized algorithms. The design sends users into a spiral of precisely personalized content.
How does personalized persuasive design work?
The Russian psychologist Eugene N. Sokolov studied habituation in 1963 in relation to survival strategies. For Sokolov, habituation was an autonomous bodily response: Humans get used to their environment, but they automatically respond to new and surprising elements (an unknown sound in the forest) to protect themselves from potential danger (an approaching bear). Fast forward to social media: When your phone beeps with a notification, you feel compelled to check it immediately. The surprise and novelty factors are too strong to ignore, even for adults with typically high self-control.
Social media use the principles of novelty and surprise in the form of persuasive design. Nir Eyal described four "hooks" that direct users' desire and attention toward an external stimulus (in this case, the social media platform):
- Triggering mechanisms (think of a notification when someone clicks on your Facebook profile).
- An action linked to a reward (for example, the more likes you give to others, the more likes you are likely to receive back).
- Variable reward (the images and news in your feed sometimes relate to what you followed in the past, sometimes to your friends, sometimes to your friends’ friends).
- An investment that becomes a trigger for the next loop (the “engagement” of others in the form of comments or likes on your photo is the trigger for posting more).
The devil of addiction embedded in SoMe platforms lies in deep personalization: The algorithms not only use all four hooks but intensify them with personal data. The notifications are not random alerts about the world's news but personal notifications relevant to your profile. The comments and likes you read are those of your friends and acquaintances, people you know and care about. The exact workings of the platforms' algorithms are top secret, fueling even more interest in why some of your posts attract more views and likes than others. In brief, the Hook Model in Facebook and Instagram would not be as persuasive if it were not personalized. The sad reality is that SoMe platforms recruit users' attention to make money.
Heavy users of social media display some symptoms of drug addiction, but the analogy puts the spotlight on the victim rather than the harmful substance. We need to focus on the harmful design features that exacerbate negative behaviors and environmental influences. Unlike drugs, SoMe platforms cannot be removed without losing their benefits. Facebook and Instagram are so deeply ingrained in teenagers' everyday communication and socializing that quitting them altogether would remove the positive socializing function they serve for many teens (and their parents and grandparents). 'Scrollfree' and 'ScreenFree' initiatives might help temporarily, but the way to take back control is not to shun technology altogether.
Instead, we need to speed up the implementation of legislation that protects young users from unethical and abusive design (the Children’s Digital Act in the U.K. is an excellent example of this). We need to support developers who design more ethical algorithms and safer social media. Above all, we all need to be more aware of how the platforms harm our everyday interactions, how they work, and that we have a choice in using them.
Kucirkova, N. (2021). Quantity and Complexity. In The Future of the Self: Understanding Personalization in Childhood and Beyond. Emerald Publishing Limited.
Sokolov, E. N. (1963). Higher nervous functions: The orienting reflex. Annual Review of Physiology, 25(1), 545-580.