B.F. SKINNER

American psychologist B.F. Skinner is best known for developing the theory of behaviorism, and for his utopian novel Walden Two (1948).

Skinner referred to his own philosophy as 'radical behaviorism' and suggested that the concept of free will was simply an illusion. All human action, he instead believed, was the direct result of conditioning.

Born in Pennsylvania in 1904, psychologist B.F. Skinner began working on ideas of human behavior after earning his doctorate from Harvard. Skinner's works include The Behavior of Organisms (1938) and a novel based on his theories, Walden Two (1948). He explored behaviorism in relation to society in later books, including Beyond Freedom and Dignity (1971). Skinner died in Massachusetts in 1990.

EARLY LIFE

Burrhus Frederic Skinner was born on March 20, 1904, in the small town of Susquehanna, Pennsylvania, where he also grew up. His father was a lawyer and his mother stayed home to care for Skinner and his younger brother. At an early age, Skinner showed an interest in building different gadgets and contraptions.

As a student at Hamilton College, B.F. Skinner developed a passion for writing. He tried to become a professional writer after graduating in 1926, but with little success. Two years later, Skinner decided to pursue a new direction for his life. He enrolled at Harvard University to study psychology.

INVENTIONS

During his time at Harvard, Skinner became interested in studying human behavior in an objective and scientific way. He developed what he referred to as an operant conditioning apparatus, which later became known as a "Skinner box." The device was a chamber that contained a bar or key that an animal could press in order to receive food, water, or some other form of reinforcement.

It was during this time at Harvard that he also invented the cumulative recorder, a device that recorded responses as a sloped line. By looking at the slope of the line, which indicated the rate of response, Skinner was able to see that response rates depended upon what happened after the animal pressed the bar. That is, higher response rates followed rewards while lower response rates followed a lack of rewards. The device also allowed Skinner to see that the schedule of reinforcement that was used also influenced the rate of response.

Using this device, he found that behavior did not depend on the preceding stimulus as Watson and Pavlov maintained. Instead, Skinner found that behaviors were dependent on what happens after the response. Skinner called this operant behavior.

OPERANT CONDITIONING

In Skinner's operant conditioning process, an operant referred to any behavior that acts on the environment and leads to consequences. He contrasted operant behaviors (the actions under our control) with respondent behaviors, which he described as anything that occurs reflexively or automatically, such as jerking your finger back when you accidentally touch a hot pan.

Skinner identified reinforcement as any event that strengthens the behavior it follows. The two types of reinforcement he identified were positive reinforcement (favorable outcomes such as reward or praise) and negative reinforcement (the removal of unfavorable outcomes).

Punishment can also play an important role in the operant conditioning process. According to Skinner, punishment is the application of an adverse outcome that decreases or weakens the behavior it follows. Positive punishment involves presenting an unfavorable outcome (prison, spanking, scolding), while negative punishment involves removing a favorable outcome following a behavior (taking away a favorite toy, getting grounded).

PROJECT PIGEON

Skinner took a teaching position at the University of Minnesota following his marriage. While teaching at the University of Minnesota and during the height of World War II, Skinner became interested in helping with the war effort. He received funding for a project that involved training pigeons to guide bombs, since no missile guidance systems existed at the time.

In "Project Pigeon," as it was called, pigeons were placed in the nose cone of a missile and were trained to peck at a target that would then direct the missile toward the intended target. The project never came to fruition, since the development of radar was also underway, although Skinner had considerable success working with the pigeons. While the project was eventually canceled, it did lead to some interesting findings, and Skinner was even able to teach the pigeons to play ping-pong.

SCHEDULES OF REINFORCEMENT

In his research on operant conditioning, Skinner also discovered and described schedules of reinforcement:

1. Fixed-ratio schedules

The fixed-ratio schedule can be understood by looking at the term itself. Fixed refers to the delivery of rewards on a consistent schedule. Ratio refers to the number of responses that are required in order to receive reinforcement. For
example, a fixed-ratio schedule might be delivering a reward for every fifth response. After the subject responds to the stimulus five times, a reward is delivered.

So imagine that you are training a lab rat to press a button in order to receive a food pellet. You decide to put the rat on a fixed-ratio 15 (FR-15) schedule. In order to receive the food pellet, the rat must engage in the operant response (pressing the button) 15 times before it will receive the food pellet. The schedule is fixed, so the rat will consistently receive the pellet every 15 times it presses the lever.

Examples of Fixed-Ratio Schedules

Production Line Work: Workers at a widget factory are paid for every 15 widgets they make. This results in a high production rate and workers tend to take few breaks. It can, however, lead to burnout and lower-quality work.

Collecting Tokens in a Video Game: In many video games, you have to collect so many tokens, objects, or points in order to receive some type of reward.

Sales Commissions: A worker earns a commission for every third sale that they make.

Grades: A child is offered a reward after she earns five A's on her homework assignments. After her fifth A on a homework assignment, she gets to pick out a new toy.

Piecework: Jobs that require X amount of responses in order to receive compensation. For example, a worker receives X amount of dollars for every 100 envelopes they stuff or every 100 fliers they stick on windshields.

Farm Work: Farm workers are paid X amount of dollars for every basket of fruit that they pick.

2. Variable-ratio schedules

In a variable-ratio schedule with a VR 5 schedule, an animal might receive a reward for every five responses, on average. This means that sometimes the reward can come after three responses, sometimes after seven responses, sometimes after five responses, and so on. The reinforcement schedule will average out to a reward for every five responses, but the actual delivery schedule will remain completely unpredictable.

 Slot machines: Players have no way of knowing how many times they have to play before they win. All they know is that eventually, a play will win. This is why slot machines are so effective, and players are often reluctant to quit. There is always the possibility that the next coin they put in will be the winning one.
 Sales bonuses: Call centers often offer random bonuses to employees. Workers never know how many calls they need to make to receive the bonus, but they know that they increase their chances the more calls or sales they make.
 Door-to-door sales: The salesperson travels from house to house, but never knows when they are going to find an interested buyer. It could be the next house, or it might take multiple stops to find a new customer.
 Video games: In some games, players collect tokens or other items in order to receive a reward or reach the next level. The player may not know how many tokens they need in order to receive a reward or even what that reward will be.

3. Fixed-interval schedules

In order to better understand how a fixed-interval schedule works, let's begin by taking a closer look at the term itself. A schedule refers to the rate at which the reinforcement is delivered, or how frequently a response is reinforced. An interval refers to a period of time, which suggests that the rate of delivery is dependent upon how much time has elapsed. Finally, fixed suggests that the timing of delivery is set at a predictable and unchanging schedule.

For example, imagine that you are training a pigeon to peck at a key. You put the animal on a fixed-interval 30 schedule (FI-30), which means that the bird will receive a food pellet every 30 seconds. The pigeon can continue to peck the key during that interval but will only receive reinforcement for the first peck of the key after that fixed 30-second interval has elapsed.

Examples

 Imagine that you are training a rat to press a lever, but you only reinforce the first response after a ten-minute interval. The rat does not press the bar much during the first 5 minutes after reinforcement but begins to press the lever more and more often the closer you get to the ten-minute mark.
 A weekly paycheck is a good example of a fixed-interval schedule. The employee receives reinforcement every seven days, which may result in a higher response rate as payday approaches.
 Dental exams also take place on a fixed-interval schedule. People who go in for their regular six-month checkup and cleaning often take extra care to clean their teeth right before the exam, yet may not be as diligent on a day-to-day basis during the six months prior to the exam.

4. Variable-interval schedules

To understand how a variable-interval schedule works, let's start by taking a closer look at the term itself. Schedule refers to the rate of reinforcement delivery, or how frequently the reinforcement is given. Variable indicates that this timing is not consistent and may vary from one trial to the next. Finally, interval means that delivery is controlled by time. So, a variable-interval schedule means that reinforcement is delivered at varying and unpredictable intervals of time.

Imagine that you are training a pigeon to peck at a key to receive a food pellet. You put the bird on a variable-interval 30 (VI-30) schedule. This means that the pigeon will receive
reinforcement an average of every 30 seconds. It is important to note that this is an average, however. Sometimes the pigeon might be reinforced after 10 seconds; sometimes it might have to wait 45 seconds. The key is that the timing is unpredictable.

Examples

 Checking Your Email: Typically, you check your email at random times throughout the day instead of checking every time a single message is delivered. The thing about email is that in most cases, you never know when you are going to receive a message. Because of this, emails roll in sporadically at completely unpredictable times. When you check and see that you have received a message, it acts as a reinforcer for checking your email.
 Your Employer Checking Your Work: Does your boss drop by your office a few times throughout the day to check your progress? This is an example of a variable-interval schedule. These check-ins occur at unpredictable times, so you never know when they might happen. The chances are good that you work at a fairly steady pace throughout the day since you are never quite sure when your boss is going to pop in, and you want to appear busy and productive when she does happen to stop by. Immediately after one of these check-ins, you might briefly pause and take a short break before resuming your steady work pace.
 Pop Quizzes: Your psychology instructor might issue periodic pop quizzes to test your knowledge and to make sure you are paying attention in class. While these exams occur with some frequency, you never really know exactly when the instructor might give you one. One week you might end up taking two quizzes, but then go a full two weeks without one. Because you never know when you might receive a pop quiz, you will probably pay attention and stay caught up in your studies to be prepared.

SKINNER BOX

At Harvard, B.F. Skinner looked for a more objective and measured way to study behavior. He developed what he called an operant conditioning apparatus to do this, which became better known as the Skinner box. With this device, Skinner could study an animal interacting with its environment. He first studied rats in his experiments, seeing how the rodents discovered and used a lever in the box, which dispensed food at varying intervals.

Later, Skinner examined what behavior patterns developed in pigeons using the box. The pigeons pecked at a disc to gain access to food. From these studies, Skinner came to the conclusion that some form of reinforcement was crucial in learning new behaviors.

After finishing his doctorate and working as a researcher at Harvard, Skinner published the results of his operant conditioning experiments in The Behavior of Organisms (1938). His work drew comparisons to Ivan Pavlov, but Skinner's work involved learned responses to an environment rather than involuntary responses to stimuli.

BABY TENDER

In 1943, B.F. Skinner also invented the "baby tender" at the request of his wife. It is important to note that the baby tender is not the same as the "Skinner box," which was used in Skinner's experimental research. He created the enclosed heated crib with a plexiglass window in response to his wife's request for a safer alternative to traditional cribs. Ladies Home Journal printed an article on the crib with the title "Baby in a Box," contributing in part to some misunderstanding over the crib's intended use.

A later incident also led to further misunderstandings over Skinner's baby crib. In her 2004 book Opening Skinner's Box: Great Psychology Experiments of the Twentieth Century, author Lauren Slater mentioned the oft-cited rumor that the baby tender was actually used as an experimental device. The rumors were that Skinner's daughter had served as a subject and that she had committed suicide as a result. Slater's book pointed out that this was nothing more than a rumor, but a later review of the book mistakenly stated that her book supported the claims. This led to an angry and passionate rebuttal of the rumors by Skinner's very much alive and well daughter Deborah.

In 1945, Skinner moved to Bloomington, Indiana, and became Psychology Department Chair at Indiana University. In 1948, he joined the psychology department at Harvard University, where he remained for the rest of his life.

TEACHING MACHINES

Skinner also developed an interest in education and teaching after attending his daughter's math class in 1953. Skinner noted that none of the students received any sort of immediate feedback on their performance. Some students struggled and were unable to complete the problems, while others finished quickly but really didn't learn anything new. Instead, Skinner believed that the best approach would be to create some sort of device that would shape behavior, offering incremental feedback until a desired response was achieved.

He started by developing a math teaching machine that offered immediate feedback after each problem. However, this initial device did not actually teach new skills. Eventually, he was able to develop a machine that delivered incremental feedback and presented material in a series of small steps until students acquired new skills, a process known as programmed instruction. Skinner later published a collection of his writings on teaching and education titled The Technology of Teaching.