
Theories of Teaching and Learning

Behaviorist - Associationist

1. Connectionism
Proponent of the theory: Edward Thorndike, an American Psychologist

> Organisms form an associative bond, or connection, between a stimulus and a response.

> Learning is the connection formed between the stimulus and the response.

Note:

1. The consequence of the response should be satisfying.

2. The connection is then stamped in because of the pleasure it brings.

3. Connections become strengthened with constant practice.

Example: Edward Thorndike's Puzzle Box

1. Hungry cats were individually placed into a box that could be opened by the animal via a
device such as a latch.
2. Once outside of the box, the cats gained access to food.
3. Thorndike found that the cats took less and less time to get out of the box as more training trials were given.

2. Behaviorist - Associationist: Connectionism


Application in human learning: Learning in humans is achieved when an individual is able to
form associations between a particular stimulus and response.

1. Law of Effect - pleasurable consequences lead to repetition, while unpleasant outcomes extinguish behavior.

Law of Effect in Teaching

✓ Students are more likely to attend a class if the instructor adopts a pedagogy that makes the subject interesting and easy for the students to learn.

✓ A student scoring well in exams is positively reinforced to study harder for the next set of exams.

✓ A word of appreciation from the instructor for an introverted student who participates in class motivates that student to participate again in the future.

✓ When a student solves problems correctly, he is encouraged to solve more questions.

✓ A student should be encouraged to study the subject that interests him most; studying a subject of one's own interest leads to greater success.

2. Law of Readiness - learners will be resistant to learning until they are ready. When students
feel ready, they learn more effectively and with greater satisfaction than when not ready.

Law of Readiness in Teaching

✓ Hook the learners before they ever begin the course. This can be done via a pre-work activity
or a short video introducing the content. By creating anticipation, you are building learner
excitement and motivating them before they ever even access the content.

✓ Let students know why it is important to learn a subject and what they can expect from the course. By sharing with the learners what they will learn, you are already motivating the
students to meet standards set forth. Be very clear about how the content will be organized and
lay out expected outcomes. This removes some anxiety and begins to get the learners excited
about accomplishing tasks set forth.

3. Law of Exercise - what is practiced is strengthened, while what is not practiced is weakened. Learning by doing is better than learning by watching others.

Law of Exercise in Teaching

✓ One learns by doing and one cannot learn a skill, for instance, by watching others. It is
necessary to practice the skill, because by doing so the bond between stimulus and response is
strengthened. In applying this to motor learning, the more often a given movement is repeated,
the more firmly established it becomes. The performance of drills attempts to utilize this law.

4. Law of Belongingness - if two elements are seen as belonging together, they are more easily learned.
Classic example: to open a door successfully, you need to turn the handle and pull or push the door. The combined actions of turning the handle and pushing or pulling open the door.

Law of Belongingness in Teaching

✓ The teacher should establish connections between the lessons and concepts taught in the classroom. This includes organizing the topics in chronological order. In medical laboratory science courses, the teacher needs to establish the connection between a particular skill and its purpose or principle so that students can easily comprehend it and draw out their own insights.

3. Operant Conditioning/Instrumental Conditioning


Proponent of the theory: Burrhus Frederic Skinner, an American Psychologist

> Employs rewards and punishments for behavior.

> An association is made between a behavior and the consequence of that behavior, which is either positive or negative.

Principles of operant conditioning

1. Reinforcement refers to any condition that strengthens a particular action.

A. Positive reinforcement - Reward for doing something well.

B. Negative reinforcement - A bad consequence is removed after a good behavior is exhibited.

2. Punishment - refers to any condition that weakens the behavior.

A. Positive punishment - Adding an aversive (unpleasant) consequence after an undesired behavior is displayed, to decrease future occurrences.

B. Negative punishment - Taking away a certain reinforcing item or a pleasant stimulus after the undesired behavior happens, to decrease future occurrences.
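The four consequences above differ only in whether a stimulus is added or removed and whether that stimulus is pleasant or aversive. As a minimal illustration (not from the source material; the function name and the classroom example are hypothetical), the Python sketch below maps those two dimensions onto the four labels:

```python
# A minimal sketch mapping the two dimensions of a consequence to the four
# categories of operant conditioning. Names and the example are hypothetical.

def classify_consequence(stimulus_is_added: bool, stimulus_is_pleasant: bool) -> str:
    """Return the operant-conditioning label for a consequence.

    stimulus_is_added    -- True if a stimulus is presented, False if it is removed
    stimulus_is_pleasant -- True if the stimulus is pleasant, False if it is aversive
    """
    if stimulus_is_added and stimulus_is_pleasant:
        return "positive reinforcement"   # reward added -> behavior strengthened
    if not stimulus_is_added and not stimulus_is_pleasant:
        return "negative reinforcement"   # aversive removed -> behavior strengthened
    if stimulus_is_added and not stimulus_is_pleasant:
        return "positive punishment"      # aversive added -> behavior weakened
    return "negative punishment"          # pleasant removed -> behavior weakened

# Example: a teacher waives a quiz (removes an aversive task) after good participation.
print(classify_consequence(stimulus_is_added=False, stimulus_is_pleasant=False))
# -> negative reinforcement
```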

3. Shaping - used to teach behaviors that a subject has never performed before, by means of rewards.

> Subjects perform behaviors that at first merely resemble the target behavior; through reinforcement, these behaviors are gradually changed, or shaped, to encourage the performance of the target behavior itself.

> Behaviors are broken down into many small, achievable steps.

How shaping works:

> Reinforce any response that resembles the target behavior.

> Then reinforce only the response that more closely resembles the target behavior; no longer reinforce the previously reinforced response.

> Next, begin to reinforce the response that even more closely resembles the target behavior.

> Continue to reinforce closer and closer approximations of the target behavior.

> Finally, reinforce only the target behavior.

Example of shaping:

> Using "shaping" to teach a child to clean a home

1. Child cleans up one toy and is rewarded.


2. Child cleans up five toys; then chooses whether to pick up ten toys or put her books and
clothes away; then cleans up everything except two toys.

Note: Through a series of rewards, she finally learns to clean her entire room.
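This procedure can be summarized as reinforcing successive approximations while steadily raising the criterion. The Python sketch below is a rough illustration only (the step names and trial data are hypothetical, not from the source): a response is rewarded only if it meets the current criterion, and each success raises the bar toward the target behavior.

```python
# Shaping as reinforcement of successive approximations (illustrative sketch).
# The target behavior is broken into ordered steps; only behavior that meets
# the current criterion is reinforced, and the criterion is raised after success.

steps = ["cleans 1 toy", "cleans 5 toys", "cleans 10 toys", "cleans whole room"]

def reinforce(step_reached: int, criterion: int) -> bool:
    """Reinforce only responses that meet or exceed the current criterion."""
    return step_reached >= criterion

criterion = 0
trial_data = [0, 0, 1, 1, 2, 3]          # hypothetical behavior observed on each trial
for trial, step_reached in enumerate(trial_data, start=1):
    rewarded = reinforce(step_reached, criterion)
    print(f"Trial {trial}: {steps[step_reached]} -> {'reward' if rewarded else 'no reward'}")
    if rewarded and criterion < len(steps) - 1:
        criterion += 1                    # earlier approximations are no longer reinforced
```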


4. Extinction refers to a diminishing response when it is not followed by a reward.

- The reinforcement that maintains the behavior is no longer provided.

- The strength of the behavior is decreased.



5. Generalization refers to responding to a new stimulus in a similar way as to a conditioned stimulus.

Example of generalization:

In the Little Albert experiment, researchers John B. Watson and Rosalie Rayner conditioned a little boy to fear a white rat; the fear then generalized to other white, furry objects such as a rabbit and a fur coat.


6. Discrimination - the ability to distinguish between one stimulus and similar stimuli. It means responding only to certain stimuli, and not responding to those that are similar.

1. The sound of a bell (neutral stimulus) is repeatedly paired with the presentation of food (unconditioned stimulus), which naturally and automatically leads to a salivary response (unconditioned response).

2. If another sound, such as a trumpet, is added to the experiment and the dog does not salivate, the dog is able to discriminate between the conditioned tone and the similar stimulus.

Note: Not just any noise will produce a conditioned response. Because of stimulus discrimination, only a very particular sound will lead to a conditioned response.

4. Classical Conditioning
Proponent of the theory: Ivan Petrovich Pavlov, a Russian Physiologist

> A neutral stimulus, when repeatedly paired with a stimulus that normally elicits a response, comes to elicit a similar or identical response.
5. Contiguity Theory
Proponent of the theory: Edwin Ray Guthrie, an American Psychologist

> A combination of stimuli which has accompanied a movement will on its recurrence tend to be
followed by that movement.

> All learning was a consequence of association between a particular stimulus and
response.

> Stimuli and responses affect specific sensory-motor patterns; what is learned are movements,
not behaviors.

>Rewards or punishment play no significant role in learning since they occur after the
association between stimulus and response has been made.

> Postremity - the principle that we always learn the last thing we do in response to a specific stimulus situation.

> Trial and error learning.

Classic experiment: Cats escaping a puzzle box by Guthrie and Horton, 1946

1. Guthrie used a glass paneled box that allowed him to photograph the exact movements of
cats.

2. These photographs showed that cats learned to repeat the same sequence of movements
associated with the preceding escape from the box.

3. Improvement comes about because irrelevant movements are unlearned or not included in
successive associations.

Classroom example
> Students associate a study technique with making good grades.

Principles

1. In order for conditioning to occur, the organism must actively respond (i.e., do things).

2. Since learning involves the conditioning of specific movements, instruction must present very specific tasks.

3. Exposure to many variations in stimulus patterns is desirable in order to produce a generalized response.

4. The last response in a learning situation should be correct, since it is the one that will be associated.

6. Human Associative Learning


Proponent of the theory: Hermann Ebbinghaus, a German Psychologist

> Pioneered the experimental study of learning and memory.

> A learning principle that states that ideas and experiences reinforce each other and can be
mentally linked to one another. In a nutshell, it means our brains were not designed to recall
information in isolation; instead, we group information together into one associative memory.

> Example: It is difficult to recall just one eyebrow without seeing the whole face.

Hermann Ebbinghaus, the Forgetting Curve

1. Ebbinghaus conducted a series of tests on himself, which included memorization and forgetting of meaningless three-letter nonsense syllables.

2. Ebbinghaus memorized different nonsense syllables such as "WID", "ZOF" and "KAF", and then he tested himself to see if he could retain the information after different time periods.

3. The results thus obtained were plotted on a graph, which is now referred to as the forgetting curve.

4. Ebbinghaus found the forgetting curve to be exponential in nature. Memory retention is 100%
at the time of learning any particular piece of information.

5. However, it drops rapidly to about 40% within the first few days, after which the decline in memory retention slows down.

6. In simple words, the forgetting curve is exponential because memory loss is rapid and large within the first few days of learning, but the rate of forgetting becomes much slower from then on.
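The forgetting curve is commonly summarized with the exponential approximation R = e^(-t/S), where R is retention, t is the time elapsed since learning, and S is the relative strength of the memory. The sketch below uses that simplification with arbitrary strength values; it illustrates the shape of the curve and is not Ebbinghaus's original data.

```python
import math

def retention(t_days: float, strength: float) -> float:
    """Approximate retention R = exp(-t/S): 1.0 at the moment of learning,
    dropping steeply at first and then levelling off."""
    return math.exp(-t_days / strength)

# Strength values are arbitrary illustrations: a weak, unreviewed memory versus
# one strengthened by review. A larger S flattens the forgetting curve.
for label, S in [("weak memory", 1.0), ("reviewed memory", 5.0)]:
    curve = ", ".join(f"day {t}: {retention(t, S):.0%}" for t in (0, 1, 3, 7))
    print(f"{label:>15}: {curve}")
```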
Factors affecting the rate of forgetting:

1. Meaningfulness of the information.

2. The way the information is presented

3. Physiological factors (examples: Stress, lack of sleep etc.)

Ways of increasing memory strength, as asserted by Ebbinghaus:

1. Better memory representation (e.g., mnemonics).

2. Repetition based on active recall (spaced repetition) - here lessons are retaken at increasing
intervals until knowledge is fully embedded in long-term memory.
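As an illustration of spaced repetition, the sketch below uses a simple, hypothetical scheduling rule in which each successful active recall doubles the interval before the next review; real spaced-repetition systems use more elaborate rules, but the principle of widening intervals is the same.

```python
from datetime import date, timedelta

def review_schedule(start: date, first_interval_days: int = 1, reviews: int = 5):
    """Yield review dates whose spacing grows after each successful recall."""
    interval = first_interval_days
    when = start
    for _ in range(reviews):
        when = when + timedelta(days=interval)
        yield when
        interval *= 2   # widen the spacing as the memory gets stronger

# Hypothetical start date, for illustration only
for review_date in review_schedule(date(2024, 1, 1)):
    print(review_date)   # 2024-01-02, 2024-01-04, 2024-01-08, 2024-01-16, 2024-02-01
```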

7. Systematic Behavior Theory/Drive Theory


Proponent of the theory: Clark L. Hull, an American Psychologist

Drive refers to increased arousal and internal motivation to reach a particular goal.

Two types:

• Primary drives are directly related to survival and include the need for food, water, and oxygen.

• Secondary or acquired drives are those that are culturally determined or learned, such as the
drive to obtain money, intimacy, or social approval.

- Drive theory holds that these drives motivate people to reduce desires by choosing responses
that will most effectively do so.

> For instance, when a person feels hunger, he or she is motivated to reduce that drive by
eating.

> When there is a task at hand, the person is motivated to complete it.

Experiment on drive theory in rat behavior by Hull's students, Charles T. Perin and Stanley B. Williams

1. The rats were trained to run down a straight alleyway to a food reward.

2. Thereafter, two groups of rats were deprived of food, one group for 3 hours and the other for 22 hours.

3. Hull proposed that the rats that had been without food the longest would have more motivation, and thus a higher level of drive to obtain the food reward at the end of the alley.

4. Furthermore, he hypothesized that the more times an animal was rewarded for running down
the alley, the more likely the rat was to develop the habit of running.

5. As expected, Hull and his students found that length of deprivation and number of times
rewarded resulted in a faster running speed toward the reward. His conclusion was that drive
and habit equally contribute to performance of whichever behavior is instrumental in drive
reduction.
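In Hull's formal theory this relationship is usually written as a multiplicative rule, reaction potential = habit strength × drive (sEr = sHr × D). The sketch below plugs made-up numbers into that rule to show how either longer deprivation (higher drive) or more rewarded trials (stronger habit) raises the predicted strength of the instrumental response.

```python
# Hull's multiplicative rule E = H * D (reaction potential = habit strength * drive).
# All numeric values below are made up purely for illustration.

def reaction_potential(habit_strength: float, drive: float) -> float:
    """Excitatory potential grows with both habit strength and drive."""
    return habit_strength * drive

conditions = {
    "3 h deprived,  few rewarded trials":  (0.3, 0.2),   # (habit, drive)
    "3 h deprived,  many rewarded trials": (0.8, 0.2),
    "22 h deprived, few rewarded trials":  (0.3, 0.9),
    "22 h deprived, many rewarded trials": (0.8, 0.9),
}
for label, (habit, drive) in conditions.items():
    print(f"{label}: E = {reaction_potential(habit, drive):.2f}")
```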
8. Stimulus Sampling Theory
Proponent of the theory: William Kaye Estes, an American Psychologist

> A mathematical learning theory stating that stimuli are composed of hypothetical elements
and that on any given learning trial one learns only a sample of those elements

> The theory suggested that a particular stimulus-response association is learned on a single trial; however, the overall learning process is a continuous one consisting of the accumulation of discrete S-R pairings. On any given learning trial, a number of different responses can be made, but only the portion that is effective (i.e., rewarded) forms associations.

> Thus, learned responses are a sample of all possible stimulus elements experienced.
Variations (random or systematic) in stimulus elements are due to environmental factors or
changes in the organism.
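One way to make this concrete (an illustrative simulation, not Estes' exact formulation) is to model the stimulus as a pool of elements, draw a random sample of elements on each trial, and condition the sampled elements to the rewarded response. The probability of a correct response is then the proportion of conditioned elements, which rises gradually across trials even though each individual element is conditioned in a single trial.

```python
import random

# Stimulus sampling, sketched as a toy simulation. The element counts and the
# sample size are hypothetical choices made only for illustration.

random.seed(0)
N = 100                      # total number of stimulus elements
sample_size = 10             # elements available (sampled) on any one trial
conditioned = set()          # elements already associated with the correct response

for trial in range(1, 21):
    sample = random.sample(range(N), sample_size)
    conditioned.update(sample)   # a rewarded trial conditions the sampled elements
    if trial in (1, 5, 10, 20):
        print(f"after trial {trial:2d}: P(correct response) ≈ {len(conditioned) / N:.2f}")
```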

Application

1. Over time, different stimulus elements become available or unavailable for sampling due to external or internal variations. Hence, some of the stimuli that have been conditioned in S-R pairs (i.e., memory traces) may not be available at a given time when we wish to make use of the pairing.

2. On the other hand, something we have temporarily forgotten may be remembered when the
relevant stimuli happen to be included in the sample. The stronger the memory (i.e., the more
pairings created), the higher the likelihood that relevant stimuli are included in the
current sampling.
