
Verified by Psychology Today


Did Facebook Go Too Far?

A controversial new study shows how much we’re influenced by Facebook.


How do you feel when you read the Facebook posts of your friends? A recent experiment manipulated emotionally laden Facebook posts to see the effects on users. Though the ethics of this study have come into question, what can we learn from this study about our own state of mind?

We’ve all been subject to the process of social comparison, in which our feelings about ourselves are influenced by the successes or failures of our friends. When things aren’t looking so good for you, it’s likely that you can console yourself by figuring that you’re better off than others. Conversely, you might feel bad when you are constantly exposed to the success (if not bragging) of those in your social circle. Fear of missing out is another threat to the well-being of Facebook users. Your friends post photos and minute-by-minute reports of the action at the best party ever. Unfortunately, you weren’t invited and now feel the acute pain of rejection.

Just how much your mood is influenced by Facebook postings became the focus of a study of hundreds of thousands of users who, unwittingly, became part of a large-scale social experiment. Facebook researcher Adam Kramer teamed up with UCSF researcher Jamie Guillory and Cornell communication and information scientist Jeffrey Hancock to investigate whether the emotions of Facebook users would vary according to the emotional tone of their news feeds. Though it wasn't published in a psychological journal, the article was edited by Princeton University social psychologist Susan Fiske.

Psychologists have long known that people engage in social comparison among their networks of friends and associates, but it’s also an established fact that our moods are influenced by the moods of the people around us. Known as “emotional contagion,” this is the reason that you laugh more heartily when you’re watching a movie in the theater than you do when you’re home viewing it by yourself. This is the phenomenon that Kramer and his team were investigating on Facebook. The question was whether emotional contagion could spread through a social network according to the type of information being shared.

Prior research established a correlation between people’s mood and the positive or negative content of their Facebook feeds. However, as any psychologically sensitive individual knows, correlation does not equal causation. People may feel more negatively when viewing sad Facebook postings by their friends because they generally hang out with a more depressed group of people. It’s also possible that your being in a sad mood much of the time leads others in your group to post content that they think you’ll understand and appreciate.

It’s only through an experiment, then, that you can establish a causal link between the emotional content of a Facebook post and the reaction of the user. In designing their experiment, Kramer and his team used three conditions: 1. deleting 10% of news feed posts containing positive-emotion words, 2. deleting 10% of news feed posts with negative-emotion words, and 3. deleting a comparable percentage of random posts as a control. Because twice as many Facebook posts contained positive-emotion words as negative ones (yes, we do tend to share our good news!), matching the control to the positive-emotion condition meant deleting twice the percentage of posts as in the negative-emotion condition.
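The three screening conditions can be sketched in a few lines of Python. This is purely illustrative: the word lists, function names, and the flat 10% deletion rate are assumptions for the sketch, not Facebook's actual filtering code.

```python
import random

# Illustrative emotion-word lists; the real study used a much larger dictionary.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible"}

def contains_any(post, words):
    """True if the post contains at least one word from the given set."""
    return any(w in words for w in post.lower().split())

def filter_feed(posts, condition, rate=0.10, seed=None):
    """Omit a fraction of posts from a viewer's feed per the condition.

    condition: "reduce_positive", "reduce_negative", or "control".
    """
    rng = random.Random(seed)
    kept = []
    for post in posts:
        if condition == "reduce_positive":
            eligible = contains_any(post, POSITIVE_WORDS)
        elif condition == "reduce_negative":
            eligible = contains_any(post, NEGATIVE_WORDS)
        else:  # control: any post may be randomly omitted
            eligible = True
        if eligible and rng.random() < rate:
            continue  # this post never reaches the viewer
        kept.append(post)
    return kept
```

The key design point, as the paragraph above notes, is that the control condition deletes posts at random regardless of content, so any mood difference between conditions can be attributed to the emotional tone of what was withheld.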

Researchers gauged the impact on users' emotional state using the percentage of all words expressing positive or negative emotions in their posts. Over the one-week period of the study, the authors analyzed over 122 million words in over 3 million posts. People who had their positive news feed items reduced did in fact post fewer positive words; comparable results were seen for those in the negative-screening condition. Though the effects were slight (yet significant, because the sample was so massive), the authors concluded that their results show “emotional contagion.”
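The outcome measure described above, the percentage of all words that express positive or negative emotion, can be sketched as follows. The word lists here are stand-ins; the study itself relied on a standard word-count dictionary, so treat this only as a minimal illustration of the metric.

```python
# Illustrative word lists; a real analysis would use a validated dictionary.
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "angry", "hate"}

def emotion_percentages(posts):
    """Return (positive %, negative %) of all words across a user's posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    total = len(words)
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100 * pos / total, 100 * neg / total
```

Because the metric is a percentage over an enormous corpus (122 million words), even a shift of a few hundredths of a percentage point can reach statistical significance, which is exactly why the authors' effects were tiny yet significant.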

These statistically significant but small effects, the authors conclude, can add up considerably when you consider the millions of people around the world who rely on this form of social media. Just tinkering with a tiny percentage of those posts can spread either positive or negative emotions in a way that, they believe, could have global health implications. If you’re feeling less positive, you might engage in riskier health-related behaviors or increase your risk of negative outcomes such as heart disease.

In the aftermath of the study’s publication, however, critics are crying foul, arguing that the authors violated the privacy of the participants who never knew they were the subjects of any kind of research.

Normally, research involving the manipulation of any psychological variable requires that participants provide informed consent in which they learn the benefits and risks of the study. Their involvement in the study is voluntary and they are free to discontinue at any time without risk of negative consequences. After they’re through with the study, the experimenter provides them with an explanation of the study’s purpose and is there to manage any negative reactions.

The Facebook study provided no such opportunity. The rationale for not doing so was that when users sign up for Facebook, they accept terms of service that include the gathering of data. It’s probably fair to say that most of us who sign this agreement figure that Facebook is passively gathering information (location, advertiser “likes,” and so on). Far less obvious is that Facebook actively controls what appears in our feeds.

Even if you consider yourself a Facebook pro, you may not realize that its engineers filter your feed according to posts that they think are most relevant to you. As stated by Kramer et al., “Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interests of showing viewers the content they will find most relevant and engaging” (p. 8788). In a sense, the experiment on emotions fits into this context of helping Facebook determine what’s most “engaging” for its users.

If we are to believe the study’s findings, these users could have suffered a number of ill effects. Their moods were surreptitiously altered not so much by what they were told, but what they were not told. They didn’t get the good news from their friends about, say, a job promotion, a child’s achievement, or another joyous personal or family event.

Taking this one step further, people who are vulnerable to a depressive disorder were also deprived of potentially mood-enhancing information. Missing four of your friends’ posts in a week may not seem like a lot, but depending on their content, they could have taken the form of the tiny stress-busters we call “uplifts.” By the same token, people whose posts reflected unhappiness may have been looking for support from the people in their social circle. The risk is small, but their posts might have been missed by one or two people from whom they were really hoping to get help.

The even larger issue, though, is related to social engineering. If Facebook engineers wanted to manipulate your moods, or the moods of a nation—or the world—this experiment would have given them the ammunition they need to get them started. The authors, it’s interesting to note, stated that there was no “conflict of interest” in their involvement in this research. However, the main author works for Facebook, so it’s difficult to see how this could be completely true. Facebook wouldn’t have invested the resources it needed to do the study if they didn’t see some practical value.

For you, the consumer and possible Facebook user, the findings have three clear implications:

1. With each new security-related change on Facebook, be sure to read the fine print. The odds are small that you were actually part of this study, but who’s to say you won’t be in the future?

2. Maybe you should pay more attention to the good news and happy stories—the cheer may rub off on you.

3. If you’re feeling down, reading about the travails of your social circle probably won’t make you feel better. Those are times you might want to skim, or if it’s a truly close friend, send a private message of consolation.

As we know, Facebook can be put to good use and bad. Learning about its potential uses can make you a more informed, if not happier, consumer.

Follow me on Twitter @swhitbo for daily updates on psychology, health, and aging. Feel free to join my Facebook group, "Fulfillment at Any Age," to discuss today's blog, or to ask further questions about this posting. Copyright Susan Krauss Whitbourne, Ph.D. 2014

References:

Kramer, A.D.I., Guillory, J.E., & Hancock, J.T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. PNAS, 111, 8788-8790.
