
B. F. Skinner

Burrhus Frederic Skinner
• Born March 20, 1904 – Died August 18, 1990 of leukemia
• From the small Pennsylvania town of Susquehanna
• His father was a lawyer
• His mother was a housewife
• Received his BA in English from Hamilton College in
upstate New York
• Skinner started his career as an English major, writing
poems and short stories, before turning to psychology.
• Skinner attended Harvard where he got his masters in
psychology (1930) and his doctorate (1931), and stayed
there to do research until 1936.
• Studied in the field of psychology (behaviourism)
• Skinner married Yvonne Blue in 1936 and they had their
first child, Julie, in 1938.
• In 1944, during World War II, Skinner worked on
“Project Pigeon,” which trained pigeons to direct bombs by
pecking at a target.
• In 1943, when Yvonne was pregnant for the second time,
Skinner designed the “baby tender,” a crib that was
designed to be safer than a normal crib.
What did Skinner study?
• Findings about the ways animals discover and learn things
• Most actions (or behaviours) are basically learned, and can be
unlearned and changed.

• “Personality is the result of measuring outside forces. Thus,
how we think and act can be modified by manipulating our
environment.” (ABC’s of the Human Mind)
Learning: Some Terms

Learning: A relatively permanent change in behavior due to
experience.

Does NOT include temporary changes due to disease,
injury, maturation, or drugs, since these do NOT
qualify as learning.

Reinforcement: Any event that increases the probability that a
response will recur.

Response: Any identifiable behavior

- Internal: Faster heartbeat
- Observable: Eating, scratching

• The study of operant conditioning
started with the experiments of Edward
Lee Thorndike.
• Thorndike referred to this as trial-and-
error learning.
• From these experiments, he
formulated the law of effect.
Operant Conditioning
• Changes in behavior are the outcome of an individual
responding to occurrences in the environment (stimuli)
• Also known as instrumental learning
• If the subject is correctly stimulated, it will give the suitable
response
• When a stimulus-response pattern is reinforced
(rewarded), the individual is conditioned to respond in a
certain manner
• E.g. A child who is rewarded for good behaviour and
effort in school will tend to follow this pattern in order to
keep being rewarded. If the child shows bad behaviour or
no effort in school and is punished for it (privileges taken
away), the child will realize that nothing is gained from this
type of behaviour and will stop.
Operant conditioning
• Law of effect: Thorndike’s theory that behavior
consistently rewarded will be “stamped in” as learned
behavior, and behavior that brings about discomfort will
be “ stamped out.”
• Involves selecting, from the many possible responses, the one
that will habitually be given in a stimulus situation, so the
learner discovers how behavior affects the environment and
vice versa.
• Motor Learning: Skinner (1959) and other psychologists
state that in this type of learning the outcome sought is skill,
which may be described as the adaptation of movement to
stimuli, resulting in speed and precision of performance.
Skill may vary from a simple muscular reaction to complex
motor processes. However, it always involves the
development of patterns of neuromuscular coordination
and adjustment to the perceptual situation.
• In some aspects of motor learning, the method of trial,
error and success is fundamental, usually because the
learner does not have a clear idea of the skill needed
2 Types of Learning
• Respondent Behaviors- those that occur
automatically and reflexively.
• Operant Behaviors- those under
conscious control. Some may occur
spontaneously and others purposely, but it is
the consequences of these actions that then
influence whether or not they occur again in
the future.

• A reinforcement is any stimulus event that
will maintain or increase the strength of a
response.
Timing of Reinforcement
Operant reinforcement is most effective when given immediately
after a correct response.

The effectiveness of reinforcement is inversely related to the time
elapsed after the correct response occurs.
• Continuous Reinforcement: Every time the rat does the appropriate
behavior, he gets a pellet.

• Fixed ratio schedule: If the rat presses the pedal three times, he gets a
pellet…or five times, or twenty times, or x times. There is a fixed ratio
between behaviors and reinforcers.

• Fixed interval schedule: If the rat presses the bar at least once during
a particular period of time, say 20 seconds, he gets a pellet. But
whether he presses the bar once or a hundred times within that 20
seconds, he only receives one reinforcer.

• Variable ratio schedule: You change the x each time. First it takes 3
presses to get a pellet, then 10, then 4, etc.

• Variable interval schedule: You keep changing the time period. First
10 seconds, then 35, then 5, then 40.
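The schedule rules above can be sketched as a small simulation. This is a minimal, illustrative sketch in Python; the function names and the press log are inventions for this example, not anything from Skinner's apparatus:

```python
import random

def fixed_ratio(presses, n):
    # Fixed ratio: one pellet for every n correct presses.
    return presses // n

def fixed_interval(press_times, interval):
    # Fixed interval: one pellet per interval that contains at least
    # one press, no matter how many presses occur in that interval.
    intervals_with_press = {int(t // interval) for t in press_times}
    return len(intervals_with_press)

def variable_ratio(presses, mean_n, rng):
    # Variable ratio: the number of presses required for each pellet
    # is redrawn around mean_n, so the next reward is unpredictable.
    pellets, count = 0, 0
    need = rng.randint(1, 2 * mean_n - 1)
    for _ in range(presses):
        count += 1
        if count >= need:
            pellets += 1
            count = 0
            need = rng.randint(1, 2 * mean_n - 1)
    return pellets
```

With a fixed ratio of 5, twenty presses earn four pellets; under the fixed-interval rule, a burst of a hundred presses inside one window still earns a single pellet, which matches the description above.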
Types of Reinforcers

Primary Reinforcer: Unlearned and natural; satisfies
biological needs (e.g., food, water, sex)
Secondary Reinforcer: Learned reinforcer (e.g., money,
grades, approval, praise)
Token Reinforcer: Tangible secondary reinforcer (e.g.,
money, gold stars, poker chips)
Social Reinforcer: Provided by other people (e.g.,
learned desires for attention and approval)
Positive and Negative Reinforcement, Positive
and Negative Punishment
Positive & Negative Consequences

Add or Subtract Stimuli

                 Add (+)       Subtract (-)
Reinforcer       money/gift    waive chores
Punisher         spanking      time-out/restriction
(weakens)
Types of Stimuli
• Appetitive stimulus: a stimulus that is pleasant
• Aversive stimulus: a stimulus that is unpleasant
• Positive reinforcement: reinforcement in which an appetitive
stimulus is presented.
• Positive punishment: punishment in which an aversive
stimulus is presented
• Negative reinforcement: reinforcement in which an aversive
stimulus is removed
• Negative punishment: punishment in which an appetitive
stimulus is removed
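The four terms just listed form a simple two-by-two lookup over stimulus type and operation. A minimal Python sketch (the function name `classify_consequence` is an illustrative invention, not standard terminology):

```python
def classify_consequence(stimulus, operation):
    # stimulus: "appetitive" (pleasant) or "aversive" (unpleasant)
    # operation: "presented" (added) or "removed" (subtracted)
    table = {
        ("appetitive", "presented"): "positive reinforcement",
        ("aversive", "presented"): "positive punishment",
        ("aversive", "removed"): "negative reinforcement",
        ("appetitive", "removed"): "negative punishment",
    }
    return table[(stimulus, operation)]
```

For example, removing an aversive stimulus (turning off a shock) maps to negative reinforcement, while removing an appetitive stimulus (taking away privileges) maps to negative punishment.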
Continuous & Partial
Continuous Reinforcement: A reinforcer follows every correct
response
Partial Reinforcement: Reinforcers do NOT follow every correct
response
Partial Reinforcement Effect: Responses acquired with
partial reinforcement are very resistant to extinction
Schedules of Reinforcement
Continuous Reinforcement – every time the rat does the
correct behavior, it is reinforced with food.
Fixed Ratio Schedule – there is a fixed ratio between
correct behaviors and reinforcement, i.e., 5 pedal pushes to
one food pellet.
Fixed Interval Schedule – the rat may push the pedal once or
many times during each 30-second interval, but will get only
one food pellet per interval.
Variable Schedules – variable ratio means you change the
amount of pedal pushes needed to receive the food pellet,
and variable interval means you change the time period.
Operant conditioning chamber or “Skinner Boxes”
- rats would press a bar to receive food (positive
reinforcement), or the box could be set up so that pressing
the bar turned off an electric shock (negative reinforcement).
- pigeons would peck to receive food if a light was on, but not
receive food when a light was off.
Extinction of the operant behavior occurs when you stop giving
the rat a food pellet as a reward for pushing the pedal.
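Extinction can be illustrated with a toy model in which reinforced presses raise response strength and unreinforced presses let it fade. The update rule and parameters here are illustrative assumptions for this sketch, not Skinner's data:

```python
def simulate_extinction(reinforced_trials, extinction_trials,
                        learn_rate=0.2, decay_rate=0.1):
    # Response strength grows toward 1.0 while pellets arrive,
    # then decays toward 0.0 once reinforcement stops (extinction).
    strength = 0.5
    history = []
    for _ in range(reinforced_trials):
        strength += learn_rate * (1.0 - strength)  # reward strengthens
        history.append(strength)
    for _ in range(extinction_trials):
        strength -= decay_rate * strength          # no reward: fades
        history.append(strength)
    return history
```

Plotting the returned history gives the familiar shape of a cumulative-record experiment: responding climbs while reinforcement continues and declines after the pellets stop.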
Cumulative recorder – records the rate of response of the animal
in the operant conditioning chamber.
Air Crib – a crib that maintained constant temperature and
humidity and was easy to clean.
An Air Crib
Skinner box
• A box often used in operant conditioning of
animals; it limits the available responses and
thus increases the likelihood that the desired
response will occur.
Skinner Box (conditioning chamber)

designed to study operant
conditioning in animals
Skinner’s Rats
• Skinner tested out the theory of operant conditioning on rats.
• Rats were placed in metal cages with a number of levers.
At first the rats would nose around the cage and
accidentally press the levers, an action that would cause
food or water to drop into a dish. After repeating the
action, the rats saw that they could receive food and
water by pressing the lever. (Learned this behaviour)
• So, when the rats were rewarded they were conditioned
to repeat this positive action to continue being rewarded
• Along with the rats, Skinner conducted experiments
on pigeons
• During World War II (1944) there were no missile
guidance systems, so Skinner decided to try to
develop one. Skinner got funding for a top-secret
project to train pigeons to guide bombs. He
trained pigeons to keep pecking at an image of the
target, which kept the missile headed toward it. The
pigeons pecked reliably, even when falling rapidly and
working with warlike noise all around them.
(Learned behaviour)
• Skinner trained the pigeons to peck at a particular
colored disk
• This is based on Skinner’s theory of Operant
Conditioning- behaviours that are rewarded are repeated,
and behaviours that are punished are avoided.
Why is it important to the
social sciences?

• Knowing how people learn behaviour is a necessity in our
society, so that we can control and promote good
behaviour, which benefits society as a whole.
• The theory of operant-conditioning helps us to control the
way humans learn behaviour and how society can be a
great influence on behaviour
• Helps us to understand how to improve behaviours
(people with problem behaviours and criminal histories)
Interesting facts
• Skinner is rumored to have tested his theory of operant
conditioning on his daughter, to prove that environment
affects behaviour (rumor)
• " I can remember growing up a very happy child. I can’t
exactly pin point any times in my young life that was
traumatizing." (Cohen 1987)

• "Education is what survives when what has been learnt has

been forgotten."