In my last blog post I wrote about using a continuous reinforcement schedule when you want to establish a new behavior. And I hinted that you should change that schedule after the behavior is established.
One of the reward “schedules” that B.F. Skinner researched is called a variable ratio schedule. It’s called “variable” because you don’t reward the behavior every time; you vary how often the person gets a reward when they do the target behavior. And it’s called “ratio” because you give a reward based on the number of times a person has done the behavior, rather than based on time (for example, giving a reward the first time the person does the behavior after 5 minutes have elapsed).
In a variable ratio schedule you might decide to reward the behavior, on average, every fifth time the person does it, but you vary it: sometimes you give the reward on the third instance, sometimes the seventh, sometimes the second, and so on. It averages out to every five times.
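If it helps to see the mechanics spelled out, here is a minimal sketch of that averaging idea as a Python simulation. The function name and the choice of drawing each reward threshold uniformly from 2 through 8 (which averages to 5) are my own illustrative assumptions, not anything from Skinner’s work:

```python
import random

def variable_ratio_rewards(responses, mean_ratio=5, seed=0):
    """Simulate a variable ratio schedule: reward after a randomly
    varying number of responses that averages out to mean_ratio."""
    rng = random.Random(seed)  # seeded so the simulation is repeatable
    rewards = []
    count = 0
    # Hypothetical choice: draw each threshold uniformly from a range
    # centered on mean_ratio (2..8 averages to 5, as in the post).
    next_threshold = rng.randint(2, 2 * mean_ratio - 2)
    for _ in range(responses):
        count += 1
        if count >= next_threshold:
            rewards.append(True)   # this response earns a reward
            count = 0
            next_threshold = rng.randint(2, 2 * mean_ratio - 2)
        else:
            rewards.append(False)  # no reward this time
    return rewards

# Over many responses, roughly 1 in 5 ends up rewarded,
# even though the gap between rewards keeps changing.
```

The key property is that the person can never predict which response will pay off, only that responding eventually does.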
Let’s take the example of trying to get your employee to turn in expense reports on time. At first you would reward them every time they turn in the expense report on time (as discussed in the previous blog post on continuous reinforcement).
Once the behavior is established, however, you would then switch to only rewarding them every three or five or seven times on average. This is the variable ratio schedule.
Skinner found that variable ratio schedules have two benefits:
a) they produce more instances of the behavior than any of the other schedules (i.e., people will keep handing in the expense report on time), and
b) they result in behaviors that Skinner said were “hard to extinguish,” which is psychology-speak for the idea that the behavior persists over time, even when rewards aren’t being given anymore.
If you want to see another example of a variable ratio schedule, go to a casino. Slot machines run on a very effective variable ratio schedule. The casinos have studied the science of rewards, and they use it to get people to play and keep playing.
Can you think of any more variable ratio schedule examples that you’ve experienced or tried?