W. N. SCHOENFELD
"response conditioning," and they did not hesitate to say so. While
the casualness of Pavlovian researchers about choosing the R for
their theories would eventually have produced criticism like the
present paper, it has been more the developments in the field of
operant conditioning which have forced the need to review the
place of the R term in general behavior theory. These develop-
ments have brought to the fore old problems that now can be more
clearly perceived than they once were. They are getting compla-
cent behavior theorists to rub their eyes for a new look at their
term "response."
Even before these developments, however, the need to review
the status of R in our behavior theories had been indicated by the
variety of "measures" that laboratory workers had come to use as
their R term. Response latency, duration, amplitude, and other
indices of R were treated as equally valid, and even interchange-
able, in behavior-theoretical formulations. Several decades ago, a
few theorists became concerned over whether the several presumed
measures of R actually co-varied. Their thought was that such co-
variation, if it could be substantiated, would make defensible the con-
cept of "reflex strength" which Skinner had voiced, since it would
justify the pooling of different "responses" and different response
"measures" into a generalized R term for conditioning theory to
use. But the actual laboratory findings did not reveal any such co-
variation to a significant degree, and the concept of "reflex strength"
disappeared from the literature leaving behind its original prob-
lems still unresolved. Moreover, convention has added still other
novel usages for R which rest easily today in the minds of many
operant conditioners but which have only made the original prob-
lems all the worse. Thus, having begun with lever-pressing, oper-
ant theorists have extended R to a range of "responses" like an
author "writing a novel," a couple "getting married," a parent
"helping his child," a researcher "doing an experiment," a social
reformer "designing a society." Propositions once formulated about
lever-pressing by rats have been given parallels in the cases of all
activities, or hehavior segments, which may be said to be "reinforce-
able," when that term means only, in some not wholly specified
way, that the organism can be encouraged to repeat the action,
to try again.
But the problems raised by operant conditioning, even from
its beginnings, have not all been so literary. On a more technical
and operational level, some questions of theoretical force are gen-
erated by two of operant conditioning's traditional hallmarks. The
first of these was the use of rate as a response measure; the second
was the paradigmatic practice of making the delivery of a reinforc-
Pav. J. Biol. Sci.
132 SCHOENFELD July-Sept. 1976
But the temporal relations called "before" and "after" are not lim-
ited to single trials, or to any single event in a series of (postulated)
identical response events. In reality, given the whole course of a
conditioning session or experiment, covering many trials and span-
ning many responses, every occurrence of either UCS or SR pre-
cedes some responses and follows others. A similar neglect to con-
sider the whole course of a training session or experiment affected
the theoretical treatment of "aversive" conditioning. In "avoidance"
responding, for example, whether the training is trial-by-trial or
"free operant," a "successful" response does not permanently avert
the "aversive" stimulus, but only delays it; that is to say, the avoid-
ance training begins with at least one "escape" response (i.e., an
"unsuccessful" avoidance trial), and as training continues "unsuc-
cessful" trials or responses occasionally recur with the result that
the aversive stimulus also recurs but as a delayed one.
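The point that a "successful" avoidance response only delays, and never permanently averts, the aversive stimulus can be made concrete with a small simulation of free-operant (Sidman-type) avoidance timing. The sketch below is illustrative and is not a reproduction of any procedure cited here; the function name and the parameter names ss (shock-shock interval) and rs (response-shock interval) are assumptions of the sketch.

```python
# Minimal sketch of free-operant ("Sidman") avoidance timing.
# Assumption: shocks recur every `ss` seconds unless a response occurs,
# in which case the next shock is postponed to `rs` seconds after the
# response. Responses postpone shock; they never cancel it.

def avoidance_trace(response_times, ss=5.0, rs=10.0, session_end=60.0):
    """Return the times at which shock is delivered during the session."""
    shocks = []
    next_shock = ss                 # session begins on the shock-shock clock
    responses = sorted(response_times)
    i = 0
    while next_shock <= session_end:
        # consume any response occurring before the scheduled shock
        if i < len(responses) and responses[i] < next_shock:
            next_shock = responses[i] + rs   # postponement, not cancellation
            i += 1
        else:
            shocks.append(next_shock)
            next_shock += ss                 # shock-shock interval resumes
    return shocks

# With no responding, shocks recur on the 5-s shock-shock clock.
print(avoidance_trace([]))            # [5.0, 10.0, 15.0, ...]
# Responses at 4 s and 12 s push the first shock out to 22 s,
# after which the shock-shock clock resumes: delayed, but still delivered.
print(avoidance_trace([4.0, 12.0]))
```

As the second trace shows, even sustained responding only lengthens the interval to the next aversive stimulus, which is exactly the relation described in the text.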
It has been noted elsewhere (Schoenfeld and Cole, 1975) that
all "contingent" schedules of reinforcement are really rules for
response identification (that is, for identifying the response-to-be-
reinforced), and only "non-contingent" schedules can be said to be
really schedules of reinforcement. Leaving such considerations
aside, however, the contingency of SR upon R, which operant con-
ditioning theory emphasizes, involves a temporal relation between
SR and R which is imposed by the experimenter. In practical
terms, it means that the experimenter stipulates something about R
as a precondition for his delivery of SR. The "something" can be
any property or feature of R (often, what the apparatus requires
to detect the occurrence of an R at all): a stipulated amplitude, or
duration, or locus, or whatever. Such stipulations, and the fact that
any stipulations of any kind are set by the experimenter, not only
affect the experimental findings, but also raise for operant theory
problems of a rational sort. Among the latter is the question of
what are the dependent and independent variables in such a pro-
cedure (Schoenfeld and Farmer, 1970; Schoenfeld, Cole, et al., 1972;
Schoenfeld et al., 1973; Schoenfeld and Cole, 1975). When rein-
forcement is made contingent or "response-dependent," the response
feature which is the stipulated requirement for SR delivery may
be reported as the dependent variable, but it also partakes of the
character of an independent variable; and conversely, the delivery
of SR may be described as the independent variable, but it also
partakes of the character of a dependent variable. The matter
stands the same in the so-called R "shaping" procedures that are
popular today: these procedures merely broaden the problem to
a succession of different "responses" before that R occurs which the
experimenter had decided to measure explicitly. In any operant
tioner may opt to pin his work to a rate measure, but behavior
theory cannot be content with that. Nor can behavior theory long
continue to use the term "conditioning" for both Pavlovian and
operant procedures when the researchers themselves offer no reason
for believing that the same behavioral principles will be found to
hold throughout. Theory will rest content with a common language
for the two only when the cases are shown to be the same on actual
analysis. Workers with the "response" of salivation, and the "re-
sponse" of lever-pressing, are entitled to call both studies of "con-
ditioning" only if these responses are equivalent by explicit criteria
which are relevant to the term "conditioning." Behavior theory, in
the strongest and most rational form it has yet been able to assume,
tells us that reflex criteria are the only ones which confer upon
those workers the right to use "conditioning" for their two labo-
ratory exercises.
(c) Although it is often loosely said that an operant R has its
rate raised by conditioning, operant procedures can be designed
which lower the rate of the R under observation (see Skinner,
1938; Wilson and Keller, 1953). The so-called DRL and DRO
procedures, among others, are of this sort. Accordingly, it is more
accurate to say in general that operant conditioning aims at control
over the R, that is, control over whatever measure of R is chosen.
While this measure is usually that of rate, workers in the field have
successfully controlled other measures of R, such as its latency (in
trial-by-trial procedures), amplitude, site, "topography," and so on.
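The rate-lowering character of a procedure such as DRL (differential reinforcement of low rate) can be sketched directly: reinforcement is delivered only for a response whose inter-response time (IRT) meets or exceeds a stipulated minimum, so reinforcement selectively favors slow responding. The simulation below is an illustrative sketch under that single assumption, not a reproduction of the cited experiments; the function name and the treatment of the session's first response are conventions of the sketch.

```python
# Sketch of a DRL (differential-reinforcement-of-low-rate) contingency:
# a response is reinforced only when the inter-response time (IRT) since
# the preceding response is at least `drl_t` seconds.

def drl_reinforced(response_times, drl_t=10.0):
    """Return the subset of response times that meet the DRL requirement."""
    reinforced = []
    last = None
    for t in sorted(response_times):
        # convention of this sketch: the session's first response qualifies
        if last is None or (t - last) >= drl_t:
            reinforced.append(t)
        last = t   # every response resets the IRT clock, reinforced or not
    return reinforced

fast = [1, 2, 3, 4, 5, 6]    # IRTs of 1 s: rapid responding
slow = [12, 25, 40, 58]      # IRTs of 12-18 s: spaced responding
print(drl_reinforced(fast))  # only the first response qualifies: [1]
print(drl_reinforced(slow))  # every response qualifies: [12, 25, 40, 58]
```

Because every premature response resets the IRT clock without producing reinforcement, the contingency works to lower, not raise, the measured rate of R.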
(d) The procedure of response "shaping," which so often is an
early stage in an operant conditioning experiment, may seem to
contradict the traditional picture of the experimenter patiently
waiting for his selected R to be "emitted" so that he can reinforce it.
On inspection, however, the term "shaping," like so many other
everyday usages of laboratory workers, does not mean what it says.
It seems to tell us that the R is being created piecemeal, is being
put together somehow, and once put together is having its ampli-
tude raised to the threshold of the recording machine so that in-
stances of its occurrence could be recognized. But that is not what
the "shaping" consists of. Rather, when the desired R does appear
in the process of shaping, it appears full-blown. Perhaps it is the
organism's response repertory which should be thought of as being
shaped; or perhaps we ought to imagine that the operant level of
the desired response has been raised. Perhaps it is easy for us to
metaphorize the process as "shaping" because it is on occasion a
slow one, and because labeling the shaping procedures as "selective
reinforcement" and "successive approximations" makes us imagine
"RESPONSE" IN BEHAVIOR THEORY
References
Farmer, J. and Schoenfeld, W. N.: Inter-reinforcement times for the bar-
pressing response of white rats on two DRL schedules. J. Exp. Anal.
Behav. 7:119-122, 1964.
Keller, F. S. and Schoenfeld, W. N.: Principles of Psychology. Appleton-
Century-Crofts, New York, 1950.
Notterman, J. M. and Mintz, D. M.: Dynamics of Response. Wiley, New York,
1965.
Schlosberg, H.: The relationship between success and the laws of conditioning.
Psychol. Rev. 44:379-394, 1937.
Schoenfeld, W. N.: Some old work for modern behavior theory. Cond. Reflex,
1:219-223, 1966.
Schoenfeld, W. N.: Conditioning the whole organism. Cond. Reflex, 6:125-128,
1971.
Schoenfeld, W. N.: Problems of modern behavior theory. Cond. Reflex, 7:33-
65, 1972.
Schoenfeld, W. N., Antonitis, J. J., and Bersh, P. J.: Unconditioned response
rate of the white rat in a bar-pressing apparatus. J. Comp. Physiol.
Psychol. 43:41-48, 1950.
Schoenfeld, W. N. and Cole, B. K.: Behavioral control by intermittent stim-
ulation. In Reinforcement: Behavioral Analyses. R. M. Gilbert and J. R.
Millenson, eds., Academic Press, New York, 1972.
Schoenfeld, W. N., Cole, B. K., et al.: Stimulus Schedules: The t-τ Systems.
Harper and Row, New York, 1972.
Schoenfeld, W. N., Cole, B. K., Lang, J. and Mankoff, R.: "Contingency" in
behavior theory. In Contemporary Approaches to Conditioning and Learn-
ing. F. J. McGuigan and D. B. Lumsden, eds., V. H. Winston, Washing-
ton, D. C., 1973.
Schoenfeld, W. N. and Cole, B. K.: What is a "schedule of reinforcement"?
Pav. J. Biol. Sci., 10:52-61, 1975.
Schoenfeld, W. N. and Farmer, J.: Reinforcement schedules and the "behavior
stream." In The Theory of Reinforcement Schedules. W. N. Schoenfeld,
ed., Appleton-Century-Crofts, New York, 1970.
Schoenfeld, W. N., Kadden, R. M., and McMillan, J. C.: Sympathetic and
parasympathetic control of the cardiac conditional response in Macaca
mulatta. In preparation.
Skinner, B. F.: The generic nature of the concepts of stimulus and response.
J. Gen. Psychol., 12:40-65, 1935.
Skinner, B. F.: Behavior of Organisms. Appleton-Century-Crofts, New York,
1938.
Skinner, B. F.: The concept of the reflex in the description of behavior. J. Gen.
Psychol., 5:427-458, 1931. See also the added foreword to this paper in
B. F. Skinner, Cumulative Record. Appleton-Century-Crofts, New York,
1959, pp. 319-320.
Whitehead, A. N.: Science and the Modern World. Macmillan, New York,
1937.
Wilson, M. P. and Keller, F. S.: On the selective reinforcement of spaced
responses. J. Comp. Physiol. Psychol., 46:190-193, 1953.