The "Vegas Effect" of Our Screens
There are many reasons we check our screens. The "Vegas Effect" is one of them.
Posted Jan 04, 2019
We all know what it feels like to be tethered to technology these days. It's all of us, not just the kids. As of 2018, there were about 2.5 billion smartphone users in the world. Considering there are about 7.7 billion people on the planet, that’s a LOT of smartphones! Then there’s social media. Facebook alone has over 2.25 billion users. Instagram has about 400 million users and Snapchat has about 200 million.
The typical American spends about 1,460 hours per year on their smartphone. Assuming 8 hours of sleep per night, that translates into 91 waking days per year spent just on the smartphone. Smartphones have a strange power over us that makes them hard to resist. This is why people text and drive, ignore their children, friends, and partners, and otherwise disengage from the people and the world around them to check their phones. Whether screens are truly addictive is still up for debate, but it's worth noting that many drug-related references have been coined in relation to them, such as Crackberry, Snapcrack, World of Warcrack, and news junkies.
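For the curious, the arithmetic behind that 91-day figure is simple. A quick back-of-the-envelope check, assuming 8 hours of sleep leaves 16 waking hours per day:

```python
# Back-of-the-envelope check on the screen-time figure above.
hours_per_year = 1460          # reported annual smartphone use
waking_hours_per_day = 24 - 8  # assuming 8 hours of sleep per night

waking_days = hours_per_year / waking_hours_per_day
print(round(waking_days, 2))   # 91.25 waking days per year
```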
Why do we have such a curiously difficult time resisting our screens? In previous blogs, I discussed how both classical conditioning and supernormal stimuli are mechanisms that can, in part, explain their virtually irresistible pull. There’s another mechanism that hooks us into compulsively checking our screens for the latest news feed, text, social media post, or email. It is also the draw of "loot boxes" within many video games, such as Counterstrike and Star Wars Battlefront II. This mechanism is known as a variable ratio reinforcement schedule.
A Little About Reinforcement Schedules
If you ever took an introductory psychology course, chances are you ran across B.F. Skinner. He was a psychologist and behaviorist who looked at how behavioral responses are established and strengthened by different schedules of reinforcement. For instance, a rat in a cage that is taught to press a lever to earn a food pellet (reward) might be taught that it gets one food pellet for every three presses of the lever. This would be an example of a fixed ratio reinforcement schedule.
Although there are a variety of types and subtypes of reinforcement schedules that can affect the likelihood of different behavioral responses, let's take a closer look at variable ratio reinforcement schedules, because they can explain some of the pull of our screens.
Variable Ratio Reinforcement Schedule
A variable ratio reinforcement schedule delivers a reward after an unpredictable number of actions. Using the rat example, the rat doesn’t know how many presses of the lever will produce the food pellet. Sometimes it takes one press, sometimes five, sometimes 15. The researcher randomizes the distribution so that the rat never knows how many presses will yield the food pellet. The rat soon learns, though, that the faster it presses the lever, the sooner it will receive a pellet.
Researchers have found that variable ratio schedules tend to result in a high rate of responding (refer to the VR line in the graph above). Variable ratio schedules are also extremely resistant to extinction. In the case of the rat, if the researcher stops giving food pellets after lever presses, the rat will keep pressing the lever frequently for a very long time before it finally gives up (that giving up is the extinction part). Slot machines, like most games of chance, are real-world examples of a variable ratio schedule.
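The rat-and-lever setup can be sketched in a short simulation. This is a toy model, not taken from Skinner's experiments: the function name and the press-requirement range (uniformly 1 to 15 presses per pellet) are illustrative assumptions.

```python
import random

def variable_ratio_session(total_presses, min_presses=1, max_presses=15, seed=0):
    """Toy model of a variable ratio schedule: each pellet arrives after a
    randomly chosen number of lever presses, so the 'rat' can never predict
    when the next reward will come."""
    rng = random.Random(seed)
    pellets = 0
    presses_since_pellet = 0
    # Draw the hidden requirement for the first pellet.
    next_requirement = rng.randint(min_presses, max_presses)
    for _ in range(total_presses):
        presses_since_pellet += 1
        if presses_since_pellet >= next_requirement:
            pellets += 1
            presses_since_pellet = 0
            # Re-randomize the requirement for the next pellet.
            next_requirement = rng.randint(min_presses, max_presses)
    return pellets

# Over many presses the payoff averages out to roughly one pellet per
# 8 presses (the mean of 1..15), but no single press is predictable --
# which is exactly why the fastest strategy is simply to keep pressing.
print(variable_ratio_session(10_000))
```

Because only the total number of presses matters, not their timing, the schedule rewards rapid, persistent responding; and since long dry streaks are normal, a stretch with no pellets looks no different from ordinary bad luck, which is one way to think about why extinction takes so long.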
Variable Reinforcement in Our Daily Lives
It turns out, variable ratio reinforcement schedules are involved in many behavioral addictions, such as gambling. Yes, that’s right. In a sense, compulsively checking our phones is much like compulsive gambling. We could call this the "Vegas Effect," meaning that we can experience an almost feverish compulsion to engage in a particular behavior. In fact, many “obsessions” and hobbies also involve this variable ratio reinforcement schedule, such as:
- Basically any type of collecting (e.g., collecting Pokemon cards, stamps)
- Looking for bargains while shopping at the mall, flea markets, eBay, or garage sales
- Channel surfing on TV (it seems that Internet surfing has largely supplanted that pastime)
Why Are Variable Reinforcement Schedules Powerful?
Variable reinforcement schedules are not bad. They are an important part of the motivation and learning systems within our brains. We learn causal relationships by “connecting the dots,” and from an evolutionary perspective, learning causal connections enhances our chances of survival. For instance, if I perform “Action A,” it can be important for me to learn whether “Outcome B” is the likely result. In a variable relationship, performing “Action A” only sometimes produces “Outcome B.” The reward system in the brain releases dopamine in these variable situations to motivate the organism to pay attention so that it might learn the causal connection. This is sometimes referred to as incentive salience. In essence, the brain is motivating the organism to “crack the code.”
Importantly, this dopamine reward system tends to be more involved in wanting than in liking. Dopamine is released more in anticipation that something might happen. Let's say Johnny just bought a pack of Pokemon cards and is about to open it. The dopamine is being released prior to actually opening the card pack. In effect, the dopamine is incentivizing little Johnny to open the card pack (well, and possibly to buy the cards to begin with!).
Variable Reinforcement and Screens
It is easy to see how technologies such as social media, texting, and gaming work on a variable reinforcement schedule (Sidebar: Some screens might pull on us more through variable interval than variable ratio schedules, but it's likely the same result). Like a box of chocolates, we never know what we are going to get. Who posted to Facebook? What did they post? Who commented on my post? What did they say? I need to check my email—something important might have come in! My cell is buzzing—what could this be about? What's the latest on Trump? Let me check my news feed just one more time...
The moment our smartphones buzz or chime, this dopamine reward system is activated. Again, it is the anticipation phase that is key to the activation of this reward system. We just HAVE to find out this information, whatever it is. It feels like an itch that needs to be scratched or a thirst that needs to be quenched. Like the rat pressing the lever in hopes of getting a food pellet, we keep checking our phones. As much as we'd like to believe we are above getting hooked into such compulsive behaviors, we often behave like these rats in a cage.