A schedule of reinforcement is a rule that describes which occurrences of a behavior will be reinforced.  At the two ends of the spectrum of schedules of reinforcement are continuous reinforcement (CRF) and extinction (EXT).

Continuous reinforcement provides reinforcement each and every time the behavior is emitted.  If every time you answer the doorbell there is someone on the other side of the door with a package for you, answering the door is on a schedule of continuous reinforcement.

With extinction, a previously reinforced behavior is no longer reinforced at all; all reinforcement is withdrawn.  For example, suppose that every time you go to the grocery store with your child and they ask for a treat, you give it to them.  One day, you decide to reduce the “asking for candy” behavior by no longer giving the treat.  You are now putting the behavior on extinction, which can have the side effect of temporarily increasing the behavior and provoking aggression (an extinction burst).

Intermittent schedules of reinforcement (INT) reinforce some, but not all, instances of a behavior.  An intermittent schedule can be described as either a ratio or an interval schedule.  A ratio schedule delivers reinforcement after a certain number of responses have been emitted.  An interval schedule reinforces the first response emitted after a certain amount of time has elapsed since the last reinforcement.  Either kind of schedule can be fixed or variable: a fixed schedule keeps the number of responses or the amount of time constant, while a variable schedule varies the number of responses or the amount of time around an average.
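The four basic intermittent schedules are simple rules, so they can be sketched as small Python decision functions.  This is a hypothetical illustration, not part of the source; the function names are invented, and a variable-interval schedule would follow the same pattern as the variable-ratio one.

```python
import random

def fixed_ratio(n):
    """FR-n: reinforce every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcement delivered
        return False
    return respond

def variable_ratio(mean, rng=random.Random(0)):
    """VR-mean: reinforce after a varying number of responses
    that averages out to `mean` (drawn uniformly from 1..2*mean-1)."""
    count = 0
    target = rng.randint(1, 2 * mean - 1)
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count = 0
            target = rng.randint(1, 2 * mean - 1)
            return True
        return False
    return respond

def fixed_interval(seconds):
    """FI-t: reinforce the first response emitted after `seconds`
    have elapsed since the last reinforcement."""
    last = 0.0
    def respond(now):
        nonlocal last
        if now - last >= seconds:
            last = now
            return True
        return False
    return respond

fr5 = fixed_ratio(5)
print([fr5() for _ in range(10)])  # only responses 5 and 10 are reinforced
```

Note that the fixed-interval rule reinforces a response, not the mere passage of time: if no response occurs after the interval elapses, nothing is delivered until the next response.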

Post-reinforcement pauses are associated with fixed schedules of reinforcement.  Both fixed-ratio and fixed-interval schedules show a post-reinforcement pause, but they differ in what follows it: on a fixed-ratio schedule the pause is followed by a high, steady rate of responding (“break and run”), while a fixed-interval schedule shows a scalloped effect when graphed.  The scallop occurs because responding decreases immediately after reinforcement is delivered and accelerates as the next scheduled opportunity approaches.  Post-reinforcement pauses and scalloped graphs are not present with variable schedules and conjunctive schedules of reinforcement.

Compound schedules of reinforcement

Concurrent schedule (conc)
Occurs when two or more contingencies of reinforcement operate independently and simultaneously for two or more behaviors.
Involves choice making.
Follows the matching law: responding is distributed across the alternatives in proportion to the reinforcement each one produces.
Three types of interactions associated with concurrent schedules are:

  1. the frequency of reinforcement (i.e., the more frequently a behavior receives reinforcement, the more likely responding is to increase),
  2. reinforcement vs. punishment (i.e., the behaviors associated with the punishment schedule will decrease, while the behaviors associated with the reinforcement schedule will increase), and
  3. reinforcement vs. aversive stimuli (i.e., the rate of avoidance responding will increase with the intensity and frequency of the aversive stimulus schedule).
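The matching law mentioned above can be stated quantitatively: the proportion of responses allocated to one alternative matches the proportion of reinforcement that alternative produces.  A minimal sketch, with hypothetical reinforcement rates:

```python
def matching_proportion(r1, r2):
    """Matching law: B1 / (B1 + B2) = R1 / (R1 + R2),
    where R1 and R2 are the reinforcement rates of the two alternatives."""
    return r1 / (r1 + r2)

# If alternative 1 yields 60 reinforcers per hour and alternative 2 yields 20,
# the matching law predicts 75% of responding goes to alternative 1.
print(matching_proportion(60, 20))  # 0.75
```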

Multiple schedule (mult):

  1. alternates two or more component schedules of reinforcement for a single response
  2. only one schedule is in effect at any time
  3. uses a discriminative stimulus (Sd) to signal which schedule is in effect

Chained schedule (chain): presents the component schedules in a specific order; the elements of the chain may use the same or different behaviors.

Mixed schedule (mix)

  1. alternates two or more component schedules of reinforcement for a single response
  2. only one schedule is in effect at any time
  3. NO discriminative stimulus (Sd) signals which schedule is in effect

Tandem schedule (tand): identical to a chained schedule, but with no discriminative stimulus signaling which component is in effect.

Alternative schedule (alt): reinforcement is delivered when either a ratio or an interval requirement is met, whichever comes first.

Conjunctive schedule (conj): reinforcement is delivered only when both a ratio and an interval requirement have been met.

Progressive schedule: systematically thins each successive reinforcement opportunity (for example, by increasing the ratio requirement), regardless of the learner’s behavior.
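Under the definition above, a progressive-ratio schedule raises the response requirement after every reinforcer, no matter what the learner does.  A hypothetical sketch (the starting requirement and step size are invented parameters):

```python
def progressive_ratio(start=1, step=2):
    """PR: the ratio requirement grows by `step` after each reinforcer,
    independent of the learner's behavior."""
    requirement = start
    count = 0
    def respond():
        nonlocal requirement, count
        count += 1
        if count >= requirement:
            count = 0
            requirement += step   # thin the schedule for next time
            return True
        return False
    return respond

pr = progressive_ratio(start=1, step=2)
print([i + 1 for i in range(9) if pr()])  # reinforced on responses 1, 4, and 9
```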
