
OPERANT CONDITIONING

B.F. SKINNER

The History of Operant Conditioning
Operant conditioning was first described by behaviorist B.F. Skinner, which is why you may occasionally hear it referred to as Skinnerian conditioning. As a behaviorist, Skinner believed that it was not necessary to look at internal thoughts and motivations to explain behavior. Instead, he suggested, we should look only at the external, observable causes of human behavior.

Through the first part of the 20th century, behaviorism became a major force within psychology. The ideas of John B. Watson dominated this school of thought early on. Watson focused on the principles of classical conditioning, once famously suggesting that he could take any person regardless of their background and train them to be anything he chose.

Early behaviorists focused their interests on associative learning. Skinner was more interested in how the consequences of people's actions influenced their behavior.

Skinner used the term operant to refer to any "active behavior that operates upon the environment to generate consequences." Skinner's theory explained how we acquire the range of learned behaviors we exhibit every day.

His theory was heavily influenced by the work of psychologist Edward Thorndike, who had proposed what he called the law of effect. According to this principle, actions that are followed by desirable outcomes are more likely to be repeated, while those followed by undesirable outcomes are less likely to be repeated.

Operant conditioning relies on a fairly simple premise: Actions that are followed by reinforcement will be strengthened and more likely to occur again in the future. If you tell a funny story in class and everybody laughs, you will probably be more likely to tell that story again in the future.

If you raise your hand to ask a question and your teacher praises your polite behavior, you will be more likely to raise your hand the next time you have a question or comment. Because the behavior was followed by reinforcement, or a desirable outcome, the preceding action is strengthened.

Conversely, actions that result in punishment or undesirable consequences will be weakened and less likely to occur again in the future. If you tell the same story again in another class but nobody laughs this time, you will be less likely to repeat the story in the future. If you shout out an answer in class and your teacher scolds you, then you might be less likely to interrupt the class again.
Types of Behaviors
Skinner distinguished between two different types of behaviors:
● Respondent behaviors are those that occur automatically and reflexively, such as pulling your hand back from a hot stove or jerking your leg when the doctor taps on your knee. You don't have to learn these behaviors. They simply occur automatically and involuntarily.
● Operant behaviors, on the other hand, are those under our conscious control. Some may occur spontaneously and others purposely, but it is the consequences of these actions that then influence whether or not they occur again in the future. Our actions on the environment and the consequences of those actions make up an important part of the learning process.

While classical conditioning could account for respondent behaviors, Skinner realized that it could not account for a great deal of learning. Instead, Skinner suggested that operant conditioning held far greater importance.

Skinner invented various devices during his boyhood, and he put these skills to work during his studies on operant conditioning. He created a device known as an operant conditioning chamber, often referred to today as a Skinner box. The chamber could hold a small animal, such as a rat or pigeon. The box also contained a bar or key that the animal could press to receive a reward.

To track responses, Skinner also developed a device known as a cumulative recorder. The device recorded responses as an upward movement of a line so that response rates could be read by looking at the slope of the line.

Components of Operant Conditioning

Reinforcement in Operant Conditioning
Reinforcement is any event that strengthens or increases the behavior it follows. There are two kinds of reinforcers, and in both cases the behavior increases; a short illustrative sketch follows the list below.
1. Positive reinforcers are favorable events or outcomes that are presented after the behavior. In positive reinforcement situations, a response or behavior is strengthened by the addition of praise or a direct reward. If you do a good job at work and your manager gives you a bonus, that bonus is a positive reinforcer.
2. Negative reinforcers involve the removal of unfavorable events or outcomes after the display of a behavior. In these situations, a response is strengthened by the removal of something considered unpleasant. For example, if your child starts to scream in the middle of a restaurant, but stops once you hand them a treat, your action led to the removal of the unpleasant condition, negatively reinforcing your behavior (not your child's).
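The contrast between the two kinds of reinforcement is easy to blur in words, so here is a purely illustrative sketch in Python. Nothing in it comes from Skinner or from this article beyond the definitions above; the function names, the set standing in for the environment, and the numeric "strength" counter are assumptions invented for the example. The point it makes is simply that positive reinforcement adds a favorable stimulus, negative reinforcement removes an unfavorable one, and in both cases the behavior that came before is strengthened.

```python
# Toy illustration only (not a model from the article or from Skinner's work).
# The environment is a plain set of stimuli; "strength" is an arbitrary counter
# standing in for how likely the behavior is to be repeated.

def positive_reinforcement(environment, strength, reward):
    """Add a favorable stimulus after the behavior; the behavior strengthens."""
    environment.add(reward)
    return strength + 1

def negative_reinforcement(environment, strength, aversive):
    """Remove an unfavorable stimulus after the behavior; the behavior strengthens."""
    environment.discard(aversive)
    return strength + 1

env = {"screaming child"}          # an unpleasant condition is present
strength = 0                       # starting strength of the response

# e.g. a bonus after doing a good job at work
strength = positive_reinforcement(env, strength, "bonus")
# e.g. handing over a treat and the screaming stops
strength = negative_reinforcement(env, strength, "screaming child")

print(env)       # {'bonus'} -- one stimulus added, one removed
print(strength)  # 2 -- the behavior was strengthened in both cases
```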
Punishment in Operant Conditioning
Punishment is the presentation of an adverse event or outcome that causes a decrease in the behavior it follows. There are two kinds of punishment, and in both cases the behavior decreases.
1. Positive punishment, sometimes referred to as punishment by application, presents an unfavorable event or outcome in order to weaken the response it follows. Spanking for misbehavior is an example of punishment by application.
2. Negative punishment, also known as punishment by removal, occurs when a favorable event or outcome is removed after a behavior occurs. Taking away a child's video game following misbehavior is an example of negative punishment.

Reinforcement Schedules
Reinforcement is not necessarily a straightforward process, and there are a number of factors that can influence how quickly and how well new things are learned. Skinner found that when and how often behaviors were reinforced played a role in the speed and strength of acquisition. In other words, the timing and frequency of reinforcement influenced how new behaviors were learned and how old behaviors were modified.

Skinner identified several different schedules of reinforcement that impact the operant conditioning process; a rough simulation of each schedule is sketched after the list:
1. Continuous reinforcement involves delivering a reinforcement every time a response occurs. Learning tends to occur relatively quickly, yet the response rate is quite low. Extinction also occurs very quickly once reinforcement is halted.
2. Fixed-ratio schedules are a type of partial reinforcement. Responses are reinforced only after a specific number of responses have occurred. This typically leads to a fairly steady response rate.
3. Fixed-interval schedules are another form of partial reinforcement. Reinforcement occurs only after a certain interval of time has elapsed. Response rates remain fairly steady and start to increase as the reinforcement time draws near, but slow immediately after the reinforcement has been delivered.
4. Variable-ratio schedules are also a type of partial reinforcement that involves reinforcing behavior after a varied number of responses. This leads to both a high response rate and slow extinction rates.
5. Variable-interval schedules are the final form of partial reinforcement Skinner described. This schedule involves delivering reinforcement after a variable amount of time has elapsed. This also tends to lead to a fast response rate and a slow extinction rate.
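To make the differences between these schedules concrete, here is a minimal sketch in Python of a single animal pressing a lever while each rule decides which presses earn a reward. It is an illustrative toy, not Skinner's procedure; the press timings, the ratio of 5, the 10-second interval, and the random ranges are arbitrary assumptions chosen for the example.

```python
# Illustrative sketch only: a toy simulation of the five reinforcement
# schedules described above. The press times, the ratio of 5, the 10-second
# interval, and the random ranges are arbitrary assumptions for the example.
import random


def simulate(schedule, n_presses=50, seed=0):
    """Simulate lever presses at irregular moments and count rewarded presses."""
    rng = random.Random(seed)
    t = 0.0                                # simulated clock, in seconds
    presses_since_reward = 0
    last_reward_time = 0.0
    ratio_target = rng.randint(2, 8)       # used by the variable-ratio rule
    wait_target = rng.uniform(5.0, 15.0)   # used by the variable-interval rule
    rewards = 0

    for _ in range(n_presses):
        t += rng.uniform(0.5, 3.0)         # the animal presses at irregular times
        presses_since_reward += 1

        if schedule == "continuous":                 # every press pays off
            reinforced = True
        elif schedule == "fixed-ratio":              # every 5th press pays off
            reinforced = presses_since_reward >= 5
        elif schedule == "fixed-interval":           # first press after 10 s
            reinforced = (t - last_reward_time) >= 10.0
        elif schedule == "variable-ratio":           # a varying number of presses
            reinforced = presses_since_reward >= ratio_target
        elif schedule == "variable-interval":        # a varying wait
            reinforced = (t - last_reward_time) >= wait_target
        else:
            raise ValueError(f"unknown schedule: {schedule}")

        if reinforced:
            rewards += 1
            presses_since_reward = 0
            last_reward_time = t
            ratio_target = rng.randint(2, 8)         # redraw the variable targets
            wait_target = rng.uniform(5.0, 15.0)

    return rewards


for name in ("continuous", "fixed-ratio", "fixed-interval",
             "variable-ratio", "variable-interval"):
    print(f"{name:17s} -> {simulate(name)} rewards out of 50 presses")
```

The sketch simply encodes the split the article describes: ratio schedules pay off according to how many responses have occurred, interval schedules according to how much time has passed, and the fixed variants are predictable where the variable ones are not.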