Encyclopedia of Sociology

BEHAVIORISM

Reinforcement may depend on the number of responses or the passage of time. A schedule of reinforcement is a procedure that states how consequences are arranged for behavior. When reinforcement is delivered after each response, a continuous schedule of reinforcement is in effect. A child who receives payment each time she mows the lawn is on a continuous schedule of reinforcement. Continuous reinforcement produces a very high and steady rate of response, but as any parent knows, the behavior quickly stops if reinforcement no longer occurs.


Continuous reinforcement is a particular form of ratio schedule. Fixed-ratio schedules state the number of responses per reinforcement; these schedules are called fixed ratio because a fixed number of responses is required for reinforcement. In a factory, piece rates of payment are examples of fixed-ratio schedules. Thus, a worker may receive $1 for sewing twenty elastic wristbands. When the ratio of responses to reinforcement is high (value per unit output is low), fixed-ratio schedules produce long pauses following reinforcement: Overall productivity may be low, leading plant managers to complain about ‘‘slacking off’’ by the workers. The problem, however, is the schedule of reinforcement that fixes a high number of responses per payment.
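The fixed-ratio rule described above can be written out as a small simulation. The sketch below is a hypothetical illustration, not part of the original article; the FR-20 values come from the wristband piece-rate example.

```python
def fixed_ratio(n, responses):
    """Fixed-ratio (FR-n) schedule: every nth response is reinforced.

    Returns a list with True for each response that earns
    reinforcement and False otherwise.
    """
    return [(i + 1) % n == 0 for i in range(responses)]

# The piece-rate example: $1 per twenty wristbands sewn (FR-20).
outcomes = fixed_ratio(20, 60)
print(outcomes.count(True))  # 3 payments for 60 responses
```

Note that on this rule reinforcement depends only on the count of responses, never on elapsed time, which is what distinguishes ratio from interval schedules in the paragraphs that follow.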


Reinforcement may be arranged on a variable, rather than fixed, basis. The schedule of payoff for a slot machine is a variable-ratio schedule of reinforcement. The operant involves putting in a dollar and pulling the handle, and reinforcement is the jackpot. The jackpot occurs after a variable number of responses. Variable-ratio schedules produce a high rate of response that takes a long time to stop when reinforcement is withdrawn. The gambler may continue to put money in the machine even though the jackpot rarely, if ever, occurs. Behavior on a variable-ratio schedule is said to show negative utility, since people often invest more than they get back.
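A variable-ratio schedule can be sketched the same way. In the hypothetical simulation below, each response is reinforced with a fixed probability, so the number of responses per payoff varies unpredictably around a mean; the VR-50 value for the slot machine is illustrative, not from the article.

```python
import random

def variable_ratio(mean_ratio, responses, rng=None):
    """Variable-ratio (VR) schedule: each response is reinforced with
    probability 1/mean_ratio, so payoffs arrive after an unpredictable
    number of responses that averages mean_ratio."""
    rng = rng or random.Random()
    return [rng.random() < 1.0 / mean_ratio for _ in range(responses)]

# A slot machine paying off about once per 50 pulls (VR-50).
pulls = variable_ratio(50, 1000, random.Random(0))
```

Because no single pull signals the jackpot, there is no pause after reinforcement to exploit, which is one way to see why responding on this schedule is steady and hard to extinguish.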


Behavior may also be reinforced only after an interval of time has passed. A fixed-interval schedule stipulates that the first response following a specified interval is reinforced. Looking for a bus is behavior that is reinforced after a fixed time set by the bus schedule. If you just missed a bus, the probability of looking for the next one is quite low. As time passes, the rate of response increases, with the highest rate occurring just before the bus arrives. Thus, the rate of response is initially zero but gradually rises to a peak at the moment of reinforcement. This response pattern is called scalloping and is characteristic of fixed-interval reinforcement. In order to eliminate such patterning, a variable-interval schedule may be stipulated. In this case, the first response after a variable amount of time is reinforced. If a person knows by experience that bus arrivals are irregular, looking for the next bus will occur at a moderate and steady rate because the passage of time no longer signals reinforcement (i.e., arrival of the bus).
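The fixed-interval rule can likewise be made explicit. The sketch below is hypothetical, with times in minutes; only the first response after each interval elapses is reinforced, as in the bus example.

```python
def fixed_interval(interval, response_times):
    """Fixed-interval (FI) schedule: only the first response made
    after `interval` time units have elapsed since the previous
    reinforcement is reinforced; earlier responses earn nothing."""
    reinforced = []
    next_available = interval
    for t in sorted(response_times):
        if t >= next_available:
            reinforced.append(t)
            next_available = t + interval
    return reinforced

# Buses every 10 minutes: glances at minutes 2, 5, and 9 go
# unreinforced; the glances at minutes 11 and 25 catch a bus.
print(fixed_interval(10, [2, 5, 9, 11, 12, 25]))  # [11, 25]
```

A variable-interval version would simply draw each successive interval from a distribution rather than holding it fixed, removing the temporal cue that produces scalloping.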

The schedules of reinforcement that regulate human behavior are complex combinations of ratio and interval contingencies. An adjusting schedule is one example of a more complex arrangement between behavior and its consequences (Zeiler 1977). When the ratio (or interval) for reinforcement changes on the basis of performance, the schedule is called adjusting. A math teacher who spends more or less time with a student depending on the student’s competence (i.e., number of correct solutions) provides reinforcement on an adjusting-ratio basis. When reinforcement is arranged by other people (i.e., social reinforcement), the level of reinforcement is often tied to the level of behavior (i.e., the greater the strength of response, the less the reward from others). This adjustment between behavior and socially arranged consequences may account for the flexibility and variability that characterize adult human behavior.
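An adjusting schedule makes the response requirement itself a function of performance. The article does not specify the adjustment rule, only that it tracks competence, so the sketch below uses a hypothetical rule: the requirement rises after each success and falls after each failure.

```python
def adjusting_ratio(initial_ratio, outcomes, step=1):
    """Adjusting-ratio schedule: the number of responses required for
    reinforcement changes with performance. Here (a hypothetical
    rule) the requirement grows by `step` after each success and
    shrinks after each failure, never dropping below 1."""
    ratio = initial_ratio
    requirements = []
    for succeeded in outcomes:
        requirements.append(ratio)
        ratio = ratio + step if succeeded else max(1, ratio - step)
    return requirements

# A student solving problems: the required run of correct solutions
# grows with competence and relaxes after an error.
print(adjusting_ratio(3, [True, True, False, True]))  # [3, 4, 5, 4]
```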

Human behavior is regulated not only by its consequences. Contingencies of reinforcement also involve the events that precede operant behavior. The preceding event is said to ‘‘set the occasion’’ for behavior and is called a discriminative stimulus, or Sd. The ring of a telephone (Sd) may set the occasion for answering it (operant), although the ring does not force one to do so. Similarly, a nudge under the table (Sd) may prompt a new topic of conversation (operant) or cause the person to stop speaking. Discriminative stimuli may be private as well as public events. Thus, a headache may result in taking a pill or calling a physician. A mild headache may be a discriminative stimulus for taking an aspirin, while more severe pain sets the occasion for telephoning a doctor.
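The headache example amounts to a mapping from discriminative stimuli to the operants they set the occasion for. A minimal sketch, with hypothetical string labels encoding the article's two cases:

```python
def operant_for(sd):
    """Map a discriminative stimulus (Sd) to the operant it sets the
    occasion for, following the headache example: mild pain sets the
    occasion for taking a pill, severe pain for calling a physician.
    The labels are a hypothetical encoding for illustration."""
    responses = {
        "mild headache": "take an aspirin",
        "severe pain": "telephone a doctor",
    }
    # An Sd makes the operant more probable; it does not force it,
    # so an unrecognized stimulus occasions no response here.
    return responses.get(sd, "no response occasioned")

print(operant_for("mild headache"))  # take an aspirin
```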

Although discriminative stimuli exert a broad
range of influences over human behavior, these