
Shaunna Mae Acedo

Anna Veronica Maglinte

CONNECTIONIST THEORY
B. F. Skinner
General Paradigm: behavior is the result of its consequences
I. LIFE

Burrhus Frederic Skinner was born on March 20, 1904, in Susquehanna, Pennsylvania, the first child of William Skinner and Grace
Mange Burrhus Skinner. He lived in a comfortable, happy, upper-middle-class home where his parents practiced the values of temperance,
service, honesty, and hard work. The Skinners were Presbyterian, but Fred began to lose his faith during high school and thereafter never
practiced any religion. As a child, Fred was inclined toward music and literature. When he was 2½ years old, a younger brother, Edward, was born.
Fred felt that Eddie was loved more by his parents, yet he did not feel unloved; he was simply more independent and less emotionally
attached to his parents. Eddie died suddenly during Fred's first year in college. His parents wanted him to be the "family boy" and succeeded in
keeping him financially obligated even after he became a well-known name in American psychology. After Fred took his bachelor's degree
in English, he set about to realize his ambition of becoming a creative writer. Robert Frost, who had read one of his writings, sent Skinner an
encouraging letter. Skinner returned to his parents' home after college, built a study room in the attic, and every morning went to work at
writing, but nothing happened. This "dark year," which actually stretched to 18 months, exemplified a powerful identity confusion in his life. At the end of his unsuccessful dark
year, Skinner was faced with the task of looking for a new career, and psychology beckoned. After reading some works of Pavlov and
Watson, he became determined to be a behaviorist. Although Skinner had never taken an undergraduate psychology course, Harvard
accepted him as a graduate student in psychology. After completing his PhD in 1931, Skinner received a fellowship from the National
Research Council to continue his laboratory research at Harvard. At the end of his 3-year term as a Junior Fellow, he was again in the position of
looking for a job. He had a PhD in psychology and 5½ years of laboratory research, but he was ill prepared to teach within the mainstream of
psychology, having "never even read a text in psychology as a whole."
In 1936 he began a teaching and research position at the University of Minnesota, where he stayed for 9 years. Soon after moving to Minneapolis,
and following a short and erratic courtship, he married Yvonne Blue; they had two daughters, Julie (born 1938) and Deborah (born 1944). At Minnesota, he
published his book The Behavior of Organisms and got involved in two of his most interesting ventures, the pigeon-guided missile and the baby
tender (made for his second daughter, Debbie). Alan Elms believed that the frustration and disappointment Skinner experienced over these two
projects led to his second identity crisis.
In 1945 he left Minnesota to become chair of the Psychology Department at Indiana University, a move that added frustrations. During the summer
of 1945, Skinner wrote Walden Two, a utopian novel that portrayed a society in which problems were solved through behavioral engineering.
In 1948, Skinner returned to Harvard, where he taught mostly in the College of Education and continued with some small experiments with
pigeons. In 1964, when he was 60 years old, he retired from teaching but retained faculty status. In 1974, he retired as Professor of
Psychology but continued as Professor Emeritus.
On August 18, 1990, he died of leukemia. One week before his death, he delivered an emotional address to the APA convention, where
he received an unprecedented citation for Outstanding Lifetime Contribution to Psychology, the only person to receive such an award in the
history of the APA.

II. WORKS

 Pigeon-Guided Missile
 Baby Tender

During his Minnesota years (1936-1945), Skinner published his first book, The Behavior of Organisms (1938), and became involved in two of his most interesting
ventures: the pigeon-guided missile and the baby-tender built for his second daughter, Debbie. Both projects brought frustration and
disappointment, emotions that may have led to a second identity crisis.
Skinner's Project Pigeon was a clever attempt to condition pigeons to make appropriate pecks on keys that would maneuver an explosive
missile into an enemy target. Almost two years before the United States entered the war, Skinner purchased a flock of pigeons for the
purpose of training them to guide missiles. To work full-time on Project Pigeon, Skinner obtained a grant from the University of Minnesota
and financial aid from General Mills, the food conglomerate headquartered in Minneapolis.
In an effort to secure the needed funds, he prepared a film of trained pigeons pecking at the controls of a missile and guiding it toward a
moving target. After viewing the film, government officials rekindled their interest and awarded General Mills a substantial grant to develop
the project. In 1944, Skinner dramatically demonstrated the feasibility of the project to government officials by producing a live pigeon that
unerringly tracked a moving target. Despite the spectacular demonstration, some observers laughed and most remained skeptical. Finally,
after 4 years of work, more than two of which were full-time, Skinner was notified that financial support could no longer be continued, and the
project came to a halt.
Shortly after Skinner abandoned Project Pigeon, and immediately after the birth of his second daughter, Debbie, he became involved in
another venture, the baby-tender. The baby-tender was essentially an enclosed crib with a large window and a continual supply of fresh
warm air. It provided a physically and psychologically safe environment for Debbie, one that also freed the parents from unnecessarily
tedious labor. After Ladies' Home Journal published an article on the baby-tender, Skinner was both condemned and praised for his
invention. When Debbie outgrew the baby-tender at age 2½, Skinner unceremoniously fashioned it into a pigeon cage.
In the summer of 1945, while on vacation, Skinner wrote Walden Two, a utopian novel that portrayed a society in which problems were
solved through behavioral engineering. Although not published until 1948, the book provided its author with immediate therapy in the form of
an emotional catharsis. At last Skinner had done what he had failed to accomplish during his dark year nearly 20 years earlier. Skinner
admitted that the book's two main characters, Frazier and Burris, represented his attempt to reconcile two separate aspects of his own
personality. Walden Two was also a benchmark in Skinner's professional career. No longer would he be confined to the laboratory study of
rats and pigeons; thereafter he would be involved with the application of behavioral analysis to the technology of shaping human
behavior. His concern with the human condition was elaborated in Science and Human Behavior (1953) and reached philosophical
expression in Beyond Freedom and Dignity (1971).
In 1948, Skinner returned to Harvard, where he taught mostly in the College of Education and continued with some small experiments with pigeons.
After he retired from teaching in 1964, Skinner wrote several important books on human behavior that helped him attain the status of
America's best-known living psychologist. In addition to Beyond Freedom and Dignity (1971), he published About Behaviorism (1974),
Reflections on Behaviorism and Society (1978), and Upon Further Reflection (1987). During this period, he also wrote a three-volume
autobiography: Particulars of My Life (1976), The Shaping of a Behaviorist (1979), and A Matter of Consequences (1983).

III. LEARNING PARADIGM


Operant Conditioning --- Positive & Negative Reinforcement

A. Animal Subject
 Pigeon
 Rat

B. Experimental Apparatus

 Skinner box - an apparatus that allowed Skinner to formulate important
principles of animal learning. An animal placed inside the box is rewarded
with a small bit of food each time it makes the desired response, such as
pressing a lever or pecking a key. A device outside the box records the
animal's responses.

C. Learning Theories
 Behaviorism - Basic idea: stimulus-response. All behavior is caused by external stimuli (operant conditioning) and
can be explained without the need to consider internal mental states or consciousness.
- Also called the learning perspective (where any physical action is a behavior), behaviorism is a philosophy
of psychology based on the proposition that all things that organisms do, including acting, thinking, and
feeling, can and should be regarded as behaviors.

B. F. Skinner's entire system is based on operant conditioning. The organism is in the process of "operating" on the environment,
which in ordinary terms means it is bouncing around its world, doing what it does. During this "operating," the organism encounters a
special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing
the operant -- that is, the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a
consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future."
Imagine a rat in a cage. This is a special cage (called, in fact, a "Skinner box") that has a bar or pedal on one wall that, when pressed,
causes a little mechanism to release a food pellet into the cage. The rat is bouncing around the cage, doing whatever it is rats do, when he
accidentally presses the bar and -- hey, presto! -- a food pellet falls into the cage! The operant is the behavior just prior to the reinforcer,
which is the food pellet, of course. In no time at all, the rat is furiously pressing away at the bar, hoarding his pile of pellets in the corner
of the cage.
A behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future.
What if you don’t give the rat any more pellets?  Apparently, he’s no fool, and after a few futile attempts, he stops his bar-pressing
behavior.  This is called extinction of the operant behavior.
A behavior no longer followed by the reinforcing stimulus results in a decreased probability of that behavior occurring in the future.
Now, if you were to turn the pellet machine back on, so that pressing the bar again provides the rat with pellets, the behavior of bar-
pushing will “pop” right back into existence, much more quickly than it took for the rat to learn the behavior the first time.   This is
because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time
the rat was reinforced for pushing on the bar!
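This acquisition-extinction-reacquisition cycle is mechanical enough to simulate. The Python sketch below is not Skinner's own model; the probabilities and step sizes are invented for illustration. Reinforcement raises the press probability, extinction lowers it, and a prior reinforcement history makes re-acquisition faster.

```python
import random

class Rat:
    """Toy model with illustrative numbers: the probability of a bar press
    rises when the press is reinforced and falls during extinction."""

    def __init__(self):
        self.p_press = 0.05           # baseline: occasional accidental presses
        self.ever_reinforced = False  # the rat's reinforcement history

    def trial(self, food_available: bool) -> bool:
        pressed = random.random() < self.p_press
        if pressed:
            if food_available:
                # a reinforcement history makes re-acquisition faster
                step = 0.30 if self.ever_reinforced else 0.15
                self.p_press = min(0.95, self.p_press + step)
                self.ever_reinforced = True
            else:
                # press but no pellet -> extinction, behavior weakens
                self.p_press = max(0.05, self.p_press - 0.10)
        return pressed

rat = Rat()
for phase, food in [("acquisition", True), ("extinction", False),
                    ("re-acquisition", True)]:
    for _ in range(100):
        rat.trial(food)
    print(f"{phase}: p(press) = {rat.p_press:.2f}")
```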

Schedules of reinforcement
Skinner likes to tell about how he "accidentally -- i.e. operantly --" came across his various discoveries.  For example, he talks about
running low on food pellets in the middle of a study.  These were the days before "Purina rat chow" and the like, so Skinner had to
make his own rat pellets, a slow and tedious task.  So he decided to reduce the number of reinforcements he gave his rats for whatever
behavior he was trying to condition, and, lo and behold, the rats kept up their operant behaviors, and at a stable rate, no less.  This is how
Skinner discovered schedules of reinforcement!

Continuous reinforcement is the original scenario:  Every time that the rat does the behavior (such as pedal-pushing), he gets a rat
goodie.
The fixed ratio schedule was the first one Skinner discovered:  If the rat presses the pedal three times, say, he gets a goodie.  Or five
times.  Or twenty times. Or “x” times.  There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc.   This is a little
like “piece rate” in the clothing manufacturing industry:  You get paid so much for so many shirts.
The fixed interval schedule uses a timing device of some sort.  If the rat presses the bar at least once during a particular stretch of time
(say 20 seconds), then he gets a goodie.  If he fails to do so, he doesn’t get a goodie. But even if he hits that bar a hundred times during
that 20 seconds, he still only gets one goodie!  One strange thing that happens is that the rats tend to “pace” themselves:  They slow down
the rate of their behavior right after the reinforcer, and speed up when the time for it gets close.
Skinner also looked at variable schedules.  Variable ratio means you change the “x” each time -- first it takes 3 presses to get a goodie,
then 10, then 1, then 7 and so on.  Variable interval means you keep changing the time period -- first 20 seconds, then 5, then 35, then 10
and so on.
In both cases, it keeps the rats on their rat toes.  With the variable interval schedule, they no longer "pace" themselves, because they
can no longer establish a "rhythm" between behavior and reward.  Most importantly, these schedules are very resistant to extinction.  It
makes sense, if you think about it.  If you haven't gotten a reinforcer for a while, well, it could just be that you are at a particularly "bad"
ratio or interval!  Just one more bar press, maybe this'll be the one!

This, according to Skinner, is the mechanism of gambling. You may not win very often, but you never know whether and when you'll
win again.  It could be the very next time, and if you don't roll the dice, or play that hand, or bet on that number this once, you'll miss out on
the score of the century!
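Under the hood, each schedule is just a different rule for deciding whether a given response earns the reinforcer. Here is a minimal Python sketch of the rules described above; the class names and parameter values are invented for illustration:

```python
import random

class FixedRatio:
    """FR-n: every nth response earns the goodie (like piece rate)."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self) -> bool:
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """VR-n: the required number of responses varies around n
    (the gambler's schedule, very resistant to extinction)."""
    def __init__(self, n):
        self.n = n
        self.count, self.target = 0, random.randint(1, 2 * n)
    def respond(self) -> bool:
        self.count += 1
        if self.count >= self.target:
            self.count, self.target = 0, random.randint(1, 2 * self.n)
            return True
        return False

class FixedInterval:
    """FI-t: only the first response after t clock ticks is reinforced;
    extra responses inside the interval earn nothing. (A variable-interval
    schedule would randomize t the way VariableRatio randomizes n.)"""
    def __init__(self, t):
        self.t, self.clock = t, 0
    def tick(self):
        self.clock += 1
    def respond(self) -> bool:
        if self.clock >= self.t:
            self.clock = 0
            return True
        return False

# 100 responses on an FR-5 schedule yield exactly 20 reinforcers
fr5 = FixedRatio(5)
print(sum(fr5.respond() for _ in range(100)))  # -> 20
```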

Internal Events: Thoughts, Feelings, and Drives

Thoughts

 Skinner, 1974: thoughts are behaviors


o e.g. internal talk, planning of a chess move
 problem: thoughts cannot be measured ... science is only concerned with public events
o main weakness associated with internal events: they are used as a hypothetical "cause" of behavior
o much better to deal with the behavior and the consequences that follow ... avoid the attempt to invent reasons

Feelings

 both thoughts and emotions are possible, but they do not "cause" behavior
 Skinner, 1974: emotions are products of reinforcements
o e.g. feeling of satisfaction after being rewarded for successful performance of a sport skill
o e.g. poor motivation right after being rewarded is related to the knowledge that it will be some time before
we will be rewarded again
 Bijou & Baer, 1961: consider the possibility that persistent emotional responses are related to consequences which are
reinforcing

Drives

 despite the usefulness of the concept of drives (e.g. hunger, thirst), Skinner avoided them
o e.g. Skinner, 1953: defined drives in terms of the number of hours of deprivation
 Premack, 1961: possible solution - reinforcement defined in terms of the probability of a response at that moment
o e.g. high-probability behavior reinforces lower-probability behavior (children at play can be reinforced to eat
so that they can play some more); a minimal sketch follows below
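Premack's proposal is operational enough to sketch: measure each activity's momentary probability and let a more probable activity serve as the reinforcer for a less probable one. A hypothetical illustration; the activities and numbers are invented:

```python
def premack_reinforcer(baseline: dict) -> str:
    """Return the highest-probability activity at this moment; under the
    Premack principle it can reinforce any lower-probability activity."""
    return max(baseline, key=baseline.get)

# illustrative momentary response probabilities for a child at play
baseline = {"play": 0.7, "eat": 0.2, "tidy_up": 0.1}
reinforcer = premack_reinforcer(baseline)
print(f"'{reinforcer}' can reinforce eating: eat first, then go play")
```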

Species-Specific Behavior

 Skinner, 1969: some constraints on behavior of different species:


o e.g. rats are difficult to teach to let go of things
o e.g. verbal behavior is difficult to teach to nonhumans
 solution: responses classified in terms of their topography
o i.e. responses can be constrained by species limitations
o primary focus is on the reinforcement contingencies
 long term view: species-specific behavior seen as having been under control of environment over generations
o e.g. a Darwinist concept in which the "most useful" behaviors for a species have been selectively rewarded

A question Skinner had to deal with was how we get to more complex sorts of behaviors.  He responded with the idea of shaping, or “the
method of successive approximations.”  Basically, it involves first reinforcing a behavior only vaguely similar to the one desired.   Once
that is established, you look out for variations that come a little closer to what you want, and so on, until you have the animal performing
a behavior that would never show up in ordinary life.  Skinner and his students have been quite successful in teaching simple animals to
do some quite extraordinary things.  My favorite is teaching pigeons to bowl!
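Written as a loop, shaping is: reinforce any response that falls within the current criterion of the target, then tighten the criterion and wait for the next closer variation. A minimal sketch with arbitrary numbers, not Skinner's actual procedure:

```python
import random

def shape(target: float, criterion: float, tolerance: float) -> int:
    """Successive approximation: reinforce any response within the current
    criterion of the target, then demand a slightly closer approximation.
    Behavior drifts toward whatever was last reinforced."""
    behavior, trials = 0.0, 0
    while criterion > tolerance:
        trials += 1
        response = behavior + random.uniform(-0.5, 0.5)  # natural variation
        if abs(response - target) < criterion:           # close enough -> reinforce
            behavior = response    # the reinforced variant becomes the new norm
            criterion *= 0.9       # tighten the criterion for next time
    return trials

# e.g. shaping a baseline response of 0 toward a target of 5
print("trials needed:", shape(target=5.0, criterion=5.0, tolerance=0.1))
```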
I used shaping on one of my daughters once.  She was about three or four years old, and was afraid to go down a particular slide.  So I
picked her up, put her at the end of the slide, asked if she was okay and if she could jump down.  She did, of course, and I showered her
with praise.  I then picked her up and put her a foot or so up the slide, asked her if she was okay, and asked her to slide down and jump
off.  So far so good.  I repeated this again and again, each time moving her a little up the slide, and backing off if she got nervous.  
Eventually, I could put her at the top of the slide and she could slide all the way down and jump off.   Unfortunately, she still couldn’t
climb up the ladder, so I was a very busy father for a while.
Beyond these fairly simple examples, shaping also accounts for the most complex of behaviors.  You don't, for example, become a brain
surgeon by stumbling into an operating theater, cutting open someone's head, successfully removing a tumor, and being rewarded with
prestige and a hefty paycheck, along the lines of the rat in the Skinner box.  Instead, you are gently shaped by your environment to enjoy
certain things, do well in school, take a certain bio class, see a doctor movie perhaps, have a good hospital visit, enter med school, be
encouraged to drift toward brain surgery as a specialty, and so on.  This could be something your parents were carefully doing to you,
a la the rat in the cage.  But much more likely, this is something that was more or less unintentional.

An aversive stimulus is the opposite of a reinforcing stimulus, something we might find unpleasant or painful.
A behavior followed by an aversive stimulus results in a decreased probability of the behavior occurring in the future.
This both defines an aversive stimulus and describes the form of conditioning known as  punishment.  If you shock a rat for doing x, it’ll
do a lot less of x.  If you spank Johnny for throwing his toys he will throw his toys less and less (maybe).
On the other hand, if you remove an already active aversive stimulus after a rat or Johnny performs a certain behavior, you are
doing negative reinforcement.  If you turn off the electricity when the rat stands on his hind legs, he'll do a lot more standing.  If you
stop your perpetual nagging when I finally take out the garbage, I'll be more likely to take out the garbage (perhaps).  You could say it
"feels so good" when the aversive stimulus stops, that this serves as a reinforcer!
Behavior followed by the removal of an aversive stimulus results in an increased probability of that behavior occurring in the future.
Notice how difficult it can be to distinguish some forms of negative reinforcement from positive reinforcement:  If I starve you, is the
food I give you when you do what I want a positive -- i.e. a reinforcer?   Or is it the removal of a negative -- i.e. the aversive stimulus of
hunger?
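These distinctions fit a 2x2 table: the consequence is either presented or removed, and either appetitive (pleasant) or aversive. The sketch below encodes that table. The fourth cell (removing something pleasant, as when tokens are withdrawn) is added here for completeness, and, as the starvation example shows, which cell applies depends on how you describe the stimulus:

```python
def classify(stimulus: str, change: str) -> str:
    """stimulus: 'appetitive' or 'aversive';
    change: 'presented' or 'removed' after the behavior occurs."""
    table = {
        ("appetitive", "presented"): "positive reinforcement -> behavior up",
        ("aversive", "removed"): "negative reinforcement -> behavior up",
        ("aversive", "presented"): "punishment -> behavior down",
        # the remaining cell, e.g. withdrawing tokens or privileges:
        ("appetitive", "removed"): "penalty / response cost -> behavior down",
    }
    return table[(stimulus, change)]

print(classify("aversive", "presented"))  # shock the rat for doing x
print(classify("aversive", "removed"))    # nagging stops -> garbage goes out
```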
Skinner (contrary to some stereotypes that have arisen about behaviorists) doesn’t “approve” of the use of aversive stimuli -- not because
of ethics, but because they don’t work well!  Notice that I said earlier that Johnny will maybe stop throwing his toys, and that I perhaps
will take out the garbage?  That’s because whatever was reinforcing the bad behaviors hasn’t been removed, as it would’ve been in the
case of extinction.  This hidden reinforcer has just been “covered up” with a conflicting aversive stimulus.  So, sure, sometimes the child
(or me) will behave -- but it still feels good to throw those toys.  All Johnny needs to do is wait till you’re out of the room, or find a way
to blame it on his brother, or in some way escape the consequences, and he’s back to his old ways.   In fact, because Johnny now only gets
to enjoy his reinforcer occasionally, he’s gone into a variable schedule of reinforcement, and he’ll be even more resistant to extinction
than ever!

Behavior modification -- often referred to as b-mod -- is the therapy technique based on Skinner's work.  It is very straightforward:
extinguish an undesirable behavior (by removing the reinforcer) and replace it with a desirable behavior by reinforcement.  It has been
used on all sorts of psychological problems -- addictions, neuroses, shyness, autism, even schizophrenia -- and works particularly well
with children.  There are examples of back-ward psychotics who hadn't communicated with others for years who have been conditioned
to behave in fairly normal ways, such as eating with a knife and fork, taking care of their own hygiene needs, dressing
themselves, and so on.
There is an offshoot of b-mod called the token economy.  This is used primarily in institutions such as psychiatric hospitals, juvenile
halls, and prisons.  Certain rules are made explicit in the institution, and behaving yourself appropriately is rewarded with tokens -- poker
chips, tickets, funny money, recorded notes, etc.  Certain poor behavior is also often followed by a withdrawal of these tokens.  The
tokens can be traded in for desirable things such as candy, cigarettes, games, movies, time out of the institution, and so on.   This has been
found to be very effective in maintaining order in these often difficult institutions.
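Mechanically, a token economy is a ledger of conditioned reinforcers: tokens are earned for target behaviors, withdrawn for poor behavior, and exchanged for backup reinforcers. A minimal sketch; the behaviors, items, and prices are invented:

```python
class TokenEconomy:
    """Ledger of conditioned reinforcers exchangeable for backup reinforcers."""

    def __init__(self, prices: dict):
        self.prices = prices            # backup reinforcer -> token cost
        self.balance = 0

    def reinforce(self, tokens: int):   # appropriate behavior earns tokens
        self.balance += tokens

    def fine(self, tokens: int):        # poor behavior costs tokens
        self.balance = max(0, self.balance - tokens)

    def exchange(self, item: str) -> bool:
        cost = self.prices[item]
        if self.balance >= cost:
            self.balance -= cost        # token traded for a desirable thing
            return True
        return False

ward = TokenEconomy({"candy": 2, "movie": 10, "day pass": 25})
ward.reinforce(3)     # ate with a knife and fork
ward.reinforce(3)     # dressed himself
ward.fine(1)          # broke a ward rule
print(ward.exchange("candy"), ward.balance)  # True 3
```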

D. Laws/Principles

– the probability of a response occurring increases if it is followed by a reward or positive reinforcer, such as food or praise.

Principles of Conditioning
Reinforcement and Extinction
 key point: behavior is influenced by reinforcing stimuli
o e.g. Lipsitt, 1975: more sucking behavior of infants when liquid is sweet
 primary reinforcers: inherently provide reinforcement (e.g. food)
 conditioned reinforcers: created from association with primary reinforcers (e.g. smiles, money)
 extinction: the "forgetting" of a behavior that is no longer reinforced
 spontaneous recovery: resumption of a behavior that had already been extinguished (e.g. in a new situation)
Immediacy of Reinforcement
 Skinner, 1953: reinforcement works best if it is given as close as possible to the behavior with which it is paired
o e.g. a reward given immediately after a behavior ... will result in that behavior being repeated
Discriminative Stimuli
 discriminative stimuli: those which affect the likelihood that a response will be given
o e.g. Skinner, 1953: pigeon reinforced for neck stretching, but only in the presence of a flashing light ... the light
becomes a signal for the stretching activity, because it has already been associated with reinforcement (simulated in the sketch below)
o e.g. redness of an apple becomes a discriminative stimulus for apple picking
o e.g. a smile usually signals a positive meeting, therefore a smile becomes a discriminative stimulus for greeting
someone
 note that the control of a discriminative stimulus is not automatic - just likely
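The pigeon-and-flashing-light example can be simulated directly: reinforcement is delivered only when the response occurs while the light is on, so responding becomes likely in the light and rare without it. A toy sketch with illustrative numbers:

```python
import random

def reinforce(response: str, light_on: bool) -> bool:
    """Reinforcement only for stretching while the light flashes,
    so the light becomes a discriminative stimulus for stretching."""
    return response == "stretch" and light_on

# illustrative response probabilities, keyed by whether the light is on
p_stretch = {True: 0.1, False: 0.1}
for _ in range(500):
    light = random.random() < 0.5
    if random.random() < p_stretch[light]:      # the bird stretches
        if reinforce("stretch", light):
            p_stretch[light] = min(0.9, p_stretch[light] + 0.05)
        else:
            p_stretch[light] = max(0.05, p_stretch[light] - 0.05)

# stretching becomes likely with the light on, rare without it
print(p_stretch)
```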
Generalization
 stimulus generalization: related stimuli are responded to in a similar fashion
o e.g. Skinner, 1953: in young children "Da da" is initially associated with all males
o requires further discrimination to only use the word "Da da" when referring to the father
 response generalization: a whole class of similar responses increases after one specific response has been
reinforced
o e.g. Lovaas, 1977: reinforcement of the use of plurals in one part of speech results in use of plurals in other
parts of speech
Behavior Chains
 complicated behaviors are a series of reinforcement/stimulus chains (a minimal sketch follows this list)
o e.g. batting chain:
 grip/stance -> reinforcement and signal for the swing
 hitting ball -> reinforcement and signal for base running
 etc.
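As promised above, a minimal sketch of the chain idea: completing each link both reinforces the preceding link and serves as the discriminative stimulus for the next. The links paraphrase the batting example:

```python
chain = ["grip and stance", "swing", "hit the ball", "run the bases"]

def run_chain(links):
    for i, link in enumerate(links):
        print(f"do: {link}")
        if i + 1 < len(links):
            # completing this link reinforces it and cues the next one
            print(f"  -> reinforces '{link}', signals '{links[i + 1]}'")

run_chain(chain)
```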
Schedules of Reinforcement
 Skinner, 1953: differences between behavior reinforced continuously or reinforced intermittently:
o continuous reinforcement: every time
o intermittent:
 fixed-ratio: e.g. after every nth peck
 fixed-interval: e.g. the first peck after n minutes
 ... but even pigeons know that after being reinforced, there will be no reinforcement for
a while -> leads to a reduction in behavior immediately after reinforcement
 variable-interval: e.g. varied lengths of time between reinforcements (distribution around an average
target time interval)
 variable-ratio: e.g. varied numbers of responses between reinforcements (distribution around an
average number)
o Bijou & Baer, 1961: effects vary between continuous and intermittent reinforcement
o faster early learning with continuous reinforcement
o slower extinction with intermittent reinforcement

Negative Reinforcement and Punishment


 positive reinforcement = presentation of a rewarding event after behavior -> leads to more behavior
 negative reinforcement = removal of an aversive event after behavior -> leads to more behavior
 punishment = presentation of an aversive event after behavior -> leads to less behavior
 problems with punishment
o Skinner, 1938: hitting rats' legs no more effective than extinction
o Skinner, 1953: punishment is associated with unwanted side-effects (e.g. over-concern about the actual
punishment ... therefore lack of focus on the behavior)
 best to combine extinction of one behavior with reward of a better one
o alternate view: Liebert et al, 1977: prompt, painful punishment combined with possibility of alternate
responses is effective

Skinner's Operant Conditioning Mechanisms

 Positive Reinforcement or reward: Responses that are rewarded are likely to be repeated. (Good grades reinforce
careful study.)
 Negative Reinforcement: Responses that allow escape from painful or undesirable situations are likely to be repeated.
(Being excused from writing a final because of good term work.)
 Extinction or Non-Reinforcement: Responses that are not reinforced are not likely to be repeated. (Ignoring student
misbehavior should extinguish that behavior.)

Punishment: Responses that bring painful or undesirable consequences will be suppressed, but may reappear if reinforcement
contingencies change. (Penalizing late students by withdrawing privileges should stop their lateness.)

IV. APPLICATIONS

A. Classroom Management


Skinner's theory, as well as other reinforcement techniques, was later applied to classroom settings with the idea that using
reinforcers could increase the frequency of productive behaviors and decrease the frequency of disruptive behaviors.
 Contingency Contracting This contract between the student and teacher specifies what behaviors are appropriate and
which are not by listing what types of rewards or punishments will be received.
 Token economy In a token economy, students are given some type of token for appropriate behaviors, and those
tokens can later be exchanged for prizes or privileges.
 Incentive System Applying an incentive system should involve all students in the classroom. It would be designed to
shape a misbehaving child's behavior. For example, this system could be set up to reward the whole class for total class
compliance.
 Encouragement System The teacher could focus on one target behavior to work on with the erring student, at first
ignoring his other misbehaviors. For instance, the teacher could give the offender a reward card. For every problem
that student completes correctly, he would get a hole punched in his card. After so many holes, the student would be
awarded some kind of prize, like candy. Make it sugar-free, please.

B. Human Behavior

Behavior Modification

 since behavior is the dependent variable, the principles can be applied to many behaviors
o Lovaas, 1987: autistic children
 elimination of undesirable behavior and reinforcement of appropriate behavior
 shaping used extensively (e.g. sounds -> words -> understandable speech)
 Parents often try to balance praise and punishment. To be effective, they should punish only behaviors they wish to
extinguish--they should not punish for not doing what should be done

Programmed Instruction

 Skinner, 1968: physical machines (the teaching machine) and an instructional system (programmed instruction); a minimal drill-loop sketch follows this list
o progression in small steps
o active involvement of the learner
o feedback provided within seconds of the response
o independent participation of student
 relationship between Skinner and Montessori:
o similarities:
 individualized instruction
 incremental
 starts from the level of the student
o differences:
 programmed instruction requires reading while Montessori method involves physical items
 programmed instruction created by adults ... therefore more adult direction than in Montessori
system
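The four features listed for programmed instruction (small steps, active responding, immediate feedback, independent pacing) map directly onto a simple drill loop. A hypothetical sketch; the frames are invented:

```python
# each frame is one small step requiring an active response
frames = [
    ("A consequence that strengthens behavior is a ____.", "reinforcer"),
    ("Withholding reinforcement until a behavior fades is ____.", "extinction"),
]

def programmed_instruction(frames):
    for prompt, answer in frames:
        while True:                       # the learner proceeds at his own pace
            response = input(prompt + " ").strip().lower()
            if response == answer:
                print("Correct!")         # feedback within seconds
                break
            print(f"Not quite; the answer is '{answer}'. Try again.")

if __name__ == "__main__":
    programmed_instruction(frames)
```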

Sources:

Eysenck, Michael W. Simply Psychology, 2nd Edition. Psychology Press Ltd. (pp. 100-107)

Feist, Jess & Feist, Gregory J. Theories of Personality, 6th Edition. McGraw-Hill. (pp. 434-437)

Learning Theories Knowledgebase (2010, June). Paradigms at Learning-Theories.com. Retrieved June 17th, 2010 from http://www.learning-theories.com/paradigms

Mergel, Brenda (1998, May). Instructional Design & Learning Theory. Retrieved June 16th, 2010 from
http://www.usask.ca/education/coursework/802papers/mergel/brenda.htm#The Basics of Behaviorism

Behaviorism. Retrieved June 16th, 2010 from http://en.wikipedia.org/wiki/Behaviorism

Crazy Joe's Psych 101 Notes II. Operant Conditioning (May 28, 2008). Retrieved June 16th, 2010 from
http://www.scribd.com/doc/4387620/Operant-Conditioning

Danielson, Richard R. (1996-2004). Ch 8 - Learning Theory: Pavlov, Watson, and Skinner. Retrieved June 17th, 2010 from
http://danielson.laurentian.ca/drdnotes/5106_crain_ch08.htm#PracticalApplications

SIL International (1998, July). Retrieved July 17th, 2010 from
http://sil.org/lingualinks/literacy/implementaliteracyprogram/behavioristtheoryoflearningski.htm

Reinforcement Theory (2008, September). Retrieved July 17th, 2010 from http://wik.ed.uiuc.edu/index.php/Reinforcement_theory

Boeree, Dr. George C. (1998-2000). Behaviorism. Retrieved July 17th, 2010 from http://webspace.ship.edu/cgboer/beh.html
