This post is co-written with James Pennebaker, Professor and Chair of the Department of Psychology at The University of Texas at Austin.
Since the publication of the Facebook experiment in the highly regarded Proceedings of the National Academy of Sciences (PNAS), a firestorm has erupted about its science and ethics. Although the scientific and ethical issues surrounding the paper have been discussed at length, it is instructive to use this as a case study for future collaborations between science and business.
A closer look suggests that this was good science, very good in fact. Ethically, the study raises some red flags. Nevertheless, it is also the kind of study that good companies should do.
The Facebook study reflects a revolution that is taking place in the social sciences. Using Big Data and text analysis methods, Adam Kramer, Jamie Guillory, and Jeffrey Hancock studied almost 700,000 Facebook users to see how their posts were affected when their News Feeds were subtly altered. Tal Yarkoni has written an in-depth and thoughtful analysis of the research methods.
The crux of the study was that people received subtly different displays of their friends’ status updates in their News Feeds. One group was randomly assigned to see a reduced number of positively valenced posts (e.g. friends’ statuses using words like happy, nice, sweet), whereas another group saw fewer negative posts (e.g. friends’ statuses using words like sad, bad, worried). These tweaks led users to shift the emotional tone of their own status updates. The findings support the existence of emotional contagion.
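The emotion measures in the study came from simple dictionary-based word counting (the LIWC approach). As a rough illustration only, here is a minimal sketch of that idea; the word lists below are tiny, made-up stand-ins for LIWC’s actual dictionaries, and the function names are hypothetical, not from the paper’s code.

```python
import re

# Hypothetical word lists for illustration; the real LIWC dictionaries
# contain thousands of entries per category.
POSITIVE = {"happy", "nice", "sweet", "love", "great"}
NEGATIVE = {"sad", "bad", "worried", "hurt", "angry"}

def emotion_counts(post):
    """Return (positive, negative) emotion-word counts for one status update."""
    words = re.findall(r"[a-z']+", post.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos, neg

def is_positive(post):
    """Tag a post as positive if it contains at least one positive word."""
    pos, _ = emotion_counts(post)
    return pos > 0
```

In the experiment, posts tagged this way were probabilistically omitted from one condition’s News Feed, and the same counting was then applied to the users’ own subsequent status updates to measure the contagion effect.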
Statistically, the effects were quite weak compared to those from earlier research conducted in highly controlled lab studies. But this is precisely what made the study important: it shows how a robust laboratory effect behaves in the wild. The effect certainly exists among a diverse array of people, but the patterns are subtle.
Most of the outcry about the study has centered on the fact that Facebook systematically altered some people’s News Feeds without telling them. On social media, many have expressed a sense of betrayal by Facebook: they were in an experiment without their consent. Facebook countered that participation in research is covered by the Terms of Service that users agreed to. Although technically true, it is little wonder that many people balked at so brief a response.
As discussed in multiple places, people were not given the option of opting in or out of the study, were not told about the study afterwards, and the ethical review procedures were done in-house at Facebook. The reality, however, is that most ethics review boards would likely have approved the study without serious questions. Technically, such a study did not place people at heightened risk of emotional or physical harm.
But there’s a bigger question. Facebook is always testing which News Feed algorithms result in the most engagement from its users. Little experiments are always ongoing, leading Kashmir Hill, in her exposé in Forbes, to wonder “what other kind of psychological manipulation users are subjected to that they never learn about because it isn’t published in an academic journal?”
But wait a minute. Isn’t the constant testing of a product good business? Virtually every vibrant company is constantly conducting little experiments to see how to improve the sales, service, or efficiency of its product. The difference here is that Facebook took the initiative to publish an important psychological finding that benefits scientific thinking.
The Path Forward
We should encourage collaborations like this with the goal of more rigor and more awareness.
Most of us never hear about the thousands of little tests that well-meaning (and not-so-well-meaning) companies run on us every day. We should be impressed that Facebook was willing to share its findings, working with credible scientists through the peer-review publication process. This could have been a confession on Secret or a white paper for marketing purposes. Instead, the scientific enterprise moved forward and Facebook learned more about increasing the engagement of its users.
This should be a case study for future collaborations between social science and business. Each side has highly desirable assets to offer the other. With greater openness about research methods and more explicit informed consent from clients or customers, the goals of both business and science will be advanced.
The kerfuffle over this experiment has brought academics, journalists, entrepreneurs, technophiles, and a handful of other pundits to the table. Let’s continue talking to ensure we bring businesses and academics together in the right way.