Table of Contents
How Does a Variable-Interval Schedule Work?
Characteristics of a Variable-Interval Schedule
Examples
Operant Conditioning
In operant conditioning, variable interval refers to a schedule of reinforcement where a response is rewarded after an unpredictable amount of time has passed. A variable-interval schedule is the opposite of a fixed-interval schedule. This schedule produces a slow, steady rate of response.
At a Glance
A variable-interval schedule is just one way to deliver reinforcement when trying to teach or change a behavior. By only offering a reward sporadically, people tend to respond at a moderate but steady speed.
One major plus about using a variable-interval schedule is that it leads to more extinction-proof behavior. That means that the things that are learned are more likely to stick.
How Does a Variable-Interval Schedule Work?
A variable-interval schedule means that reinforcement is delivered at varying and unpredictable time intervals.
Imagine that you are training a pigeon to peck at a key to receive a food pellet. You put the bird on a variable-interval 30 (VI-30) schedule. This means that the pigeon will receive reinforcement an average of every 30 seconds.
It is important to note that this is an average, however. Sometimes the pigeon might be reinforced after 10 seconds; sometimes, it might have to wait 45 seconds. The key is that the timing is unpredictable.
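For readers who like to see the numbers, here is a minimal sketch of how a VI-30 schedule could be simulated. It is an illustration only, not code from any of the studies discussed, and the function name is made up for this example. It assumes each wait is drawn from an exponential distribution, one common way to generate unpredictable intervals that still average out to about 30 seconds.

```python
import random

def simulate_vi_schedule(mean_interval=30.0, n_reinforcers=1000, seed=42):
    """Simulate the waits produced by a variable-interval (VI) schedule.

    Each wait until reinforcement becomes available is drawn from an
    exponential distribution whose mean is `mean_interval` seconds, so
    individual waits vary widely but average out to the target interval.
    """
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / mean_interval) for _ in range(n_reinforcers)]

waits = simulate_vi_schedule(mean_interval=30.0)
print(f"shortest wait: {min(waits):.1f} s")
print(f"longest wait:  {max(waits):.1f} s")
print(f"average wait:  {sum(waits) / len(waits):.1f} s")  # close to 30 s
```

Running a sketch like this shows individual waits ranging from a few seconds to well over a minute, even though the overall average stays near 30 seconds. That unpredictability is exactly what keeps responding steady.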
Characteristics of a Variable-Interval Schedule
A variable-interval schedule has a few important characteristics that distinguish it from other reinforcement schedules: reinforcement is delivered after unpredictable amounts of time that vary around an average interval, responding tends to be moderate but steady, and the behavior that is learned is highly resistant to extinction.
One possible downside is that response rates tend to be more moderate than on some other schedules. However, because responding is steady and hard to extinguish, this can also be seen as a plus.
Examples of Variable-Interval Schedules
To understand more about how variable-interval schedules work, it can be helpful to look at a few real-world examples:
Checking Your Email
Typically, you check your email at random times throughout the day instead of checking every time a single message is delivered. The thing about email is that, in most cases, you never know when you will receive a message.
Because of this, emails roll in sporadically at entirely unpredictable times. When you check and see that you have received a message, it acts as a reinforcer for checking your email.
Your Employer Checking Your Work
In many workplaces, a supervisor may stop by to check on your progress at unpredictable times, so you never know exactly when the next check-in will happen. Because a check could come at any moment, you tend to keep working at a steady pace. Immediately after one of these check-ins, you might briefly pause and take a short break before resuming your steady work pace.
Pop Quizzes
Your psychology instructor might issue periodic pop quizzes to test your knowledge and to make sure you are paying attention in class. While these quizzes occur with some frequency, you never know precisely when your instructor might give one.
One week you might end up taking two quizzes, but then go a full two weeks without one. Because you never know when you might receive a pop quiz, you will probably pay attention and stay focused on your studies to be prepared.
Schedules of Reinforcement in Operant Conditioning
Operant conditioning can either strengthen or weaken behaviors through reinforcement and punishment. This learning process involves forming an association between a behavior and the consequences of that action.
Psychologist B.F. Skinner is credited with introducing the concept of operant conditioning. He observed that reinforcement could be used to increase a behavior and that punishment could be used to weaken a behavior. He also noted that the rate at which a behavior was reinforced had an effect on both the strength and frequency of the response.
What This Means For You
If you are trying to change a behavior, using the right reinforcement schedule is important. A variable-interval one means you’ll only give or receive a reward after a random period of time. When you use this schedule, you’re more likely to plug along at a steady, moderate pace because you never know quite when you’ll finally get rewarded.