
reinforcement value. Behavior can be shaped and responses learned with generalized
conditioned reinforcers supplying the sole reinforcement.


Schedules of Reinforcement


Any behavior followed immediately by the presentation of a positive reinforcer or
the removal of an aversive stimulus tends thereafter to occur more frequently. The
frequency of that behavior, however, is subject to the conditions under which training
occurred, more specifically, to the various schedules of reinforcement (Ferster &
Skinner, 1957).
Reinforcement can follow behavior on either a continuous schedule or an
intermittent one. With a continuous schedule, the organism is reinforced for every
response. This type of schedule increases the frequency of a response but is an
inefficient use of the reinforcer. Skinner preferred intermittent schedules not only
because they make more efficient use of the reinforcer but also because they produce
responses that are more resistant to extinction. Interestingly, Skinner first began
using intermittent schedules because he was running low on food pellets (Wiener,
1996). Intermittent schedules are based either on the behavior of the organism or
on elapsed time; they can either be set at a fixed rate or vary according to a
randomized program. Ferster and Skinner (1957) recognized a large number of
reinforcement schedules, but the four basic intermittent schedules are fixed-ratio,
variable-ratio, fixed-interval, and variable-interval.


Fixed-Ratio With a fixed-ratio schedule, the organism is reinforced intermittently
according to the number of responses it makes. Ratio refers to the ratio of responses
to reinforcers. An experimenter may decide to reward a pigeon with a grain pellet
for every fifth peck it makes at a disc. The pigeon is then conditioned at a fixed-
ratio schedule of 5 to 1, that is, FR 5.
Nearly all reinforcement schedules begin on a continuous basis, but soon the
experimenter can move from continuous reward to intermittent reinforcement. In
the same way, extremely high fixed-ratio schedules, such as 200 to 1, must begin
with a low response requirement and gradually build to a higher one. A pigeon can
be conditioned to work long and rapidly in exchange for one food pellet, provided
it has been previously reinforced at lower ratios.
Technically, almost no pay scale for humans follows a fixed-ratio or any other
schedule because workers ordinarily do not begin with a continuous schedule of
immediate reinforcement. An approximation of a fixed-ratio schedule would be the
pay to bricklayers who receive a fixed amount of money for each brick they lay.
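
For readers who want a computational illustration, the following brief Python
sketch (an illustration added here, not part of Ferster and Skinner's procedure)
captures the arithmetic of a fixed-ratio schedule: on FR 5, one reinforcer is
delivered for every five responses, so the reinforcers earned equal the response
count divided by the ratio. The function name and example numbers are assumptions
made for this sketch.

    # Minimal sketch (illustrative assumption, not from the text): counting
    # reinforcers earned on a fixed-ratio schedule. On FR 5, every fifth
    # response produces a reinforcer, so reinforcers = responses // ratio.
    def fixed_ratio_reinforcers(responses: int, ratio: int = 5) -> int:
        return responses // ratio

    # A pigeon that pecks the disc 23 times on FR 5 has earned 4 grain pellets.
    print(fixed_ratio_reinforcers(23, ratio=5))  # -> 4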


Variable-Ratio With a fixed-ratio schedule, the organism is reinforced after every
nth response. With the variable-ratio schedule, it is reinforced after the nth response
on the average. Again, training must start with continuous reinforcement, move to
a low response requirement, and then gradually increase to a higher one. A pigeon
rewarded every third response on the average can build to a VR 6 schedule, then
VR 10, and so on; but the mean number of responses must be increased gradually
to prevent extinction. After a high mean is reached, say, VR 500, responses become
extremely resistant to extinction. (More on rate of extinction in the next section.)
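
By way of contrast, the sketch below (again an added illustration, not a
description of Ferster and Skinner's apparatus) simulates a variable-ratio
schedule: each reinforcer requires a randomly drawn number of responses whose
long-run average equals the schedule value, here VR 6. Drawing each requirement
uniformly from 1 to 2*mean - 1 is one simple way to realize that average and is
an assumption of the sketch.

    import random

    # Minimal sketch (illustrative assumption, not from the text): a variable-ratio
    # schedule delivers a reinforcer after the nth response on the average. Each
    # requirement is drawn uniformly from 1..(2*mean - 1), which averages to `mean`.
    def variable_ratio_reinforcers(responses: int, mean: int = 6, seed: int = 0) -> int:
        rng = random.Random(seed)
        earned, count = 0, 0
        required = rng.randint(1, 2 * mean - 1)
        for _ in range(responses):
            count += 1
            if count >= required:          # requirement met: deliver reinforcer
                earned += 1
                count = 0
                required = rng.randint(1, 2 * mean - 1)  # draw next requirement
        return earned

    # 600 responses on VR 6 earn roughly 100 reinforcers, unpredictably spaced.
    print(variable_ratio_reinforcers(600, mean=6))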
