Like a Rat in a Twitter Box

An experiment to see whether you are being manipulated.

Posted Nov 28, 2020

Have you ever felt your social media behaviour is not entirely under your own control—like you’re pressing levers for outcomes, and you don’t really know why? What is keeping you responding, and where is the control coming from? You may think you enjoy doing it—but, as B.F. Skinner noted1, getting people to like the conditions controlling them is one of the best ways to control them. Are you responding like a rat in a ‘Skinner Box’—a human in a ‘Twitter box,’ if you will?

There has been much discussion regarding the ways social media companies may, or may not, manipulate people—in terms of what is seen, advertisements presented, and even ways users feel2-4. A further issue is the manner in which ‘likes,’ and similar ‘social’ reinforcers, impact users’ moods and behaviours. The questions that arise are: to what extent is manipulation occurring; how might it be achieved; and what can be done about it? There is ample knowledge available about the ways in which these outcomes alter behaviour, and there is a simple experiment that you can do for yourself to see whether you are, indeed, being manipulated.

The focus of this type of debate has typically been on the use of algorithms to present material that seems related to what the person has previously looked at. In itself, this seems an innocuous enough marketing strategy—albeit one that may tend to restrict individuals’ horizons5. More problematic is the way in which this practice could develop and reinforce echo chambers5, which may serve to promote potentially dangerous collective narcissism by forming ‘in’ and ‘out’ groups6—a practice that can have rather sinister political overtones5,6. Similarly sinister, to some, was the covert experiment conducted on users of Facebook (without the smallest effort to solicit consent), in which newsfeeds were manipulated to produce either positive or negative moods in those individuals3. All of this debate is well-rehearsed, and people will take differing views on the importance of these manipulations.

The impact of the behaviour of other social media users, in ‘liking’ or ‘disliking’ posts, has been studied, and this is as important to understand as the behaviours of the social media companies. However, the question is: are social media companies manipulating the manner and timing of the presentations of ‘likes’? Do you get a ‘like’ immediately after somebody has ‘liked’ your post, or do you get it when the social media company feels it will be most helpful to them? Are ‘likes’ just a way to keep you using the platform?

What started a train of thought about this possibility, and its potential relationship to what we know about the effects of schedules of reinforcement on rat behaviour, was a post on Reddit (now some years old): “I started posting really great photos…..Immediately post likes went up from 3/4 to 20-50 likes…..The next few days the page got 600 new likes…..Since then, I've continued posting great pics and info hoping for a repeat. The likes are creeping up slowly ... 3 a day or so. ... Is it possible or likely that Facebook gave me a lot of exposure that first week just to give me an addictive taste for likes?”7 The answer is ‘possibly,’ and there would be very sensible business reasons for social media companies to make this manipulation.

The description in the Reddit post looks, for all the world, like a description of a Progressive Ratio schedule of reinforcement—that is, a schedule where you have to make a number of responses to get your reward. The number varies from one reward to the next—in this case, the number required increases with each reward8. First, you need to emit a few responses for a reward, then a few more, then a few more still, and, before you know it, you’re emitting huge numbers of responses for the reward that you initially ‘bought’ for a few responses. For those interested in conditioning, this is old news, and the application to understanding drugs of addiction has long been known9; but maybe the social media companies know about this too?   
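The arithmetic behind a Progressive Ratio schedule is easy to make concrete. The sketch below is a minimal illustration, not taken from any particular study: it assumes the simplest arithmetic variant, in which the response requirement grows by a fixed step after every reward8.

```python
# Minimal sketch of an arithmetic progressive-ratio (PR) schedule:
# the number of responses required grows by a fixed step per reward.
def pr_requirements(step: int, n_rewards: int, start: int = 1) -> list[int]:
    """Responses required to earn each successive reward."""
    return [start + step * i for i in range(n_rewards)]

def total_responses(step: int, n_rewards: int, start: int = 1) -> int:
    """Total responses emitted across all n_rewards rewards."""
    return sum(pr_requirements(step, n_rewards, start))

# With a step of 3, the first reward costs 1 response, the tenth
# costs 28, and earning all ten takes 145 responses in total.
print(pr_requirements(3, 10))   # [1, 4, 7, 10, 13, 16, 19, 22, 25, 28]
print(total_responses(3, 10))   # 145
```

The point of the illustration is the escalation: the early rewards are cheap, but the cumulative cost of staying on the schedule climbs quickly, exactly the pattern the Reddit poster described.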

Is this possible? Can a simple ‘thumbs up’ symbol on your social media screen serve as a reinforcer for sophisticated technologically-savvy humans, like a food pellet in a conditioning chamber does for the humble rat? Of course it can. There are many well-researched models of how social media ‘likes’ act as reinforcers10, and, concerningly, this may impact the vulnerable, socially-isolated, depressed young person, more than others. As one researcher puts it: “There is ample evidence that receiving likes serves in the same way as any other reinforcer. Social media users search for happiness and validation each time they post, share, like, comment, or send an invitation online.”11 Seeing a ‘like’ attached to an image on a post, even an image associated with harm, like alcohol or drugs, increases adolescents’ liking for that image, when it is shown to them—the apparent popularity of a photo changes the manner in which others perceive that photo12. Photos with more ‘likes’ from peers are more liked. The fantastic thing, from social media companies’ points of view, is that they do not even have to rely on their own manipulation to generate the ‘likes’; the very act of sending a ‘like’ triggers some of the same neural pathways of reward that receiving the ‘like’ triggers13—you are doing it to yourselves!

So, if ‘likes’ are readily available without the companies needing to produce them (although some unscrupulous firms do this too4), what is left for social media companies to do? Perhaps it is in the way ‘likes’ are scheduled to be presented. The recent move to hide ‘likes’ from some Facebook screens14, ostensibly to combat their negative impacts on mental health, is not really that comforting. It merely shows the control social media companies can exert over what we see of these ‘likes,’ and illustrates that the delivery of ‘likes’ can be manipulated—perhaps in the same way as the companies tried to exert control over their users’ emotions3.

This may all sound like something out of the programme The Social Dilemma, but think of the above question posed on Reddit. In fact, the great thing about science is that you can run the experiment yourself. Do not just take the ‘dopamine rush’ that you receive when you get a ‘like’—chart it! See the patterns in which your ‘likes’ come to you as a result of your posts. Ask yourself whether there are any patterns to their presentation. Are you simply ‘a rat in a Twitter box,’ responding more and more, for less and less, on a social media company’s Progressive Ratio schedule?
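One hypothetical way to run this self-experiment: keep a simple log of when you post and when each ‘like’ arrives, then count how many likes each post earns over time. The data and the helper below are entirely made up for illustration; the idea is just that a steadily falling likes-per-post count would look, from the inside, like a ratio requirement creeping upward.

```python
# Hypothetical self-logged data: (timestamp, event) pairs recorded by
# hand or exported from a platform, in chronological order.
log = [
    ("2020-11-01 09:00", "post"),
    ("2020-11-01 09:05", "like"),
    ("2020-11-01 09:06", "like"),
    ("2020-11-02 10:00", "post"),
    ("2020-11-02 18:00", "like"),
    ("2020-11-03 11:00", "post"),
    ("2020-11-04 20:00", "like"),
]

def likes_per_post(events):
    """Count the likes received after each post, in posting order."""
    counts = []
    for _, kind in events:
        if kind == "post":
            counts.append(0)          # start a new tally for this post
        elif kind == "like" and counts:
            counts[-1] += 1           # credit the most recent post
    return counts

# A falling trend (fewer likes per post, i.e. more posts per like) is
# what a progressive-ratio schedule would look like from the inside.
print(likes_per_post(log))  # [2, 1, 1]
```

Plot those counts over a few weeks and any systematic thinning of the reward rate, of the kind the Reddit poster suspected, should be visible at a glance.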

One thing we know about being human, and being exposed to a schedule, is that our awareness can modify our behaviour15. It is also worth noting that ‘likes’ only work if you need them to. People with a strong inner purpose are not so susceptible to the effects of external validation from ‘likes.’16 Just as a rat will reduce responding for food if it is not hungry, a teenager will reduce their search for validation from social media if they have real friends; and we won’t really care whether others ‘like’ what we are doing, if we know what we want to do for ourselves.

All in all, this seems like a fairly bleak assessment of the way people can be, and are, manipulated by social media companies. Remember, however, knowledge and awareness can help you navigate around any potential manipulation. Social media companies are not providing their services for our sakes—they are there to make money—and there is nothing necessarily wrong with that, as long as it doesn’t exploit or hurt anybody. Firms have always attempted to control and manipulate their customers—that’s what marketing is all about—it’s just that social media companies are very, very good at it, and have harnessed the power of Learning Theory. Luckily, they didn’t invent the theory, and that theory can be used to show their customers what could be going on.


1. Skinner, B. F. (1971). Beyond freedom and dignity. Hackett Publishing.

2. The Guardian (19.6.17). Facebook and Twitter are being used to manipulate public opinion – report.

3. The Guardian (30.6.14). Facebook reveals news feed experiment to control emotions.

4. Tech2 (6.12.19). Fake likes can be bought and social media sites are doing fairly little to prevent this manipulation.

5. Reed, P. (2019). Are Echo Chambers a Threat to Intellectual Freedom? Psychology Today.

6. Reed, P. (2020). How to Spot Collective Narcissism in Social Media Posts.

7.  Does Facebook manipulate likes?

8. Hodos, W. (1961). Progressive ratio as a measure of reward strength. Science, 134(3483), 943-944.

9. Bradshaw, C. M., & Killeen, P. R. (2012). A theory of behaviour on progressive ratio schedules, with applications in behavioural pharmacology. Psychopharmacology, 222(4), 549-564.

10. Lindström, B., Bellander, M., Chang, A., Tobler, P. N., & Amodio, D. M. (2019). A computational reinforcement learning account of social media engagement. PsyArXiv, (78mh5).

11. Lindgren, S. (2019). Why We Like Likes. BU Well, 4(1), 6.

12. Sherman, L. E., Payton, A. A., Hernandez, L. M., Greenfield, P. M., & Dapretto, M. (2016). The power of the like in adolescence: effects of peer influence on neural and behavioral responses to social media. Psychological science, 27(7), 1027-1035.

13. Sherman, L. E., Hernandez, L. M., Greenfield, P. M., & Dapretto, M. (2018). What the brain ‘Likes’: neural correlates of providing feedback on social media. Social cognitive and affective neuroscience, 13(7), 699-707.

14. Marketing tech (20.1.20) Do we need to kiss goodbye to social media likes? Exploring visibility and health.

15. Bradshaw, C. A., & Reed, P. (2012). Relationship between contingency awareness and human performance on random ratio and random interval schedules. Learning and Motivation, 43(1-2), 55-65.

16. Burrow, A. L., & Rainone, N. (2017). How many likes did I get?: Purpose moderates links between positive social media feedback and self-esteem. Journal of Experimental Social Psychology, 69, 232-236.