
Models of the Reading Process


By Murray Peglar, B.A., B.Ed.

To be able to teach reading, it is important to understand what happens when we read.

Reading basically involves transforming a text, which is a graphic representation, into thought, or meaning. It used to be thought that
this was simply a matter of combining letters into words, words into
sentences and sentences into meanings. However, over the last
thirty years, psychologists and linguists, using a variety of
experimental techniques, have discovered that things are much
more complex. Several models of the reading process have been
put forward to account for the experimental findings. A key element
in explaining reading is the extent to which what the brain already
knows affects perception of what is being read (top-down
processing). This idea was initially thought to be in contrast to earlier
ideas that reading was a linear progression from page to
understanding (bottom-up processing), but newer research seems to
indicate that both elements play important parts in reading.

The following sections outline some of the key research and ideas in our understanding of reading:

Kenneth Goodman
David Rumelhart
Rayner and Pollatsek
Short Circuits
Teaching Implications

Kenneth Goodman
In the early 1960s Kenneth S. Goodman began studying the
reading of authentic texts by urban and rural young people. His
earliest miscue research, published in 1965, is probably the most
widely replicated study in reading research history. But it was his
article, "Reading: a Psycholinguistic Guessing Game" (1967), that
began a revolution moving away from a view of reading as rapid
accurate sequential word recognition to an understanding of reading
as a process of constructing meaning - making sense - of print. That
research is part of the basis for the whole language movement and
disagreements over his conclusions about the nature of reading fuel
the current "reading wars." (Stenhouse Publishers, 2003)

Goodman defined reading as a receptive psycholinguistic process wherein the actor uses strategies to create meaning from text (Goodman, 1988). Basically, the study of reading looks at translating a linguistic surface representation (text) into thought.
Goodman based much of his theory on analysing miscues (mistakes) in texts being read aloud. He believed that efficient readers minimize dependence on visual detail, but focused his theories on the interactions of reader and text. Basic physical sensory information (the physiological process) is cycled into deeper levels of cognitive processes.

Cycles - readers move from text to understanding through cycles of deeper processing, moving from optical, to perceptual, to syntactic, to meaning
Cognitive Processes of the brain used in reading are:

recognition / initiation - the brain must recognise text and initiate reading

prediction - anticipates and predicts as it seeks order and significance in input

confirmation - verification of predictions, or disconfirmation

correction - reprocessing when it finds inconsistencies or disconfirmed predictions

termination - formal ending of the reading act

N.B.: Goodman treats these processes as sequential, whereas later models may not.
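Purely as an illustration of this sequential control flow (not of any cognitive claim), Goodman's five processes can be sketched as a loop; the sample words and the naive bigram "predictor" below are invented placeholders:

```python
def predict(context):
    """Toy prediction: guess the next word from a tiny, invented bigram table."""
    bigrams = {"the": "cat", "cat": "sat"}
    return bigrams.get(context[-1] if context else None)

def reading_cycle(words):
    meaning = []                    # the evolving understanding
    for word in words:              # recognition / initiation of each input unit
        guess = predict(meaning)    # prediction: anticipate the input
        if guess != word:
            guess = word            # correction: reprocess using the actual input
        meaning.append(guess)       # confirmation: prediction verified (or corrected)
    return " ".join(meaning)        # termination: formal end of the reading act

print(reading_cycle(["the", "cat", "sat"]))  # → "the cat sat"
```

The point of the sketch is only the ordering Goodman proposes: each cycle predicts before it samples, and corrects only on disconfirmation.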
This limited view, however, was still an improvement upon Noam Chomsky's generative grammar, which lacked an explanation of top-down processing. Goodman also promoted the use of natural texts, believing that language must be studied in context. This follows from his postulated three sources of linguistic information: symbols (characters), language structure (syntax), and semantics (meaning).

David Rumelhart
Rumelhart helped develop the field of cognitive science in the 1970s with his work on long-term memory and semantic mapping in the mind. He improved upon Goodman's model by creating a non-sequential model that relies heavily on the use of schemata and top-down processing to explain understanding.
A schema has been defined as something that "[can delineate] in a general manner, without limitation to any single determinate figure as experience, or any possible image that I can represent in concreto" (Kant, 1781); as an "abstract structure of information" (Anderson, 1984); and as meanings "[encoded] in memory in terms of the typical or normal situations or events that instantiate the concept" (Rumelhart, 1980).
A schema filled in with default values is called
a prototype. Whereas a schema is an organized abstract framework
of objects and relations, a prototype consists of a specified set of
expectations. A prototype is a highly typical instantiation or instance
of a schema (Langacker, 1987). If the instantiation (example)
matches our schema (idea), we comprehend. If understanding does
not occur, we can infer that the text does not have enough clues, or
that the reader does not have the appropriate schema. Learning
involves creating or changing schemata through:

accretion - filling in variables in general and specific schemata

tuning - changing the constraints on one variable

restructuring - building new schemata based on old models
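A minimal sketch of these three kinds of change, treating a schema as a set of variables with default values (a prototype is the schema with its defaults filled in); the restaurant slots and values are invented examples, not Rumelhart's:

```python
# Schema as a mapping of variables to values; None marks an unfilled slot,
# and filled-in defaults play the role of a prototype.
restaurant = {"food": None, "server": None, "payment": "cash"}

# accretion: a specific experience fills in variables of the schema
visit = dict(restaurant, food="soup", server="waiter")

# tuning: change the constraint (default value) on one variable
restaurant["payment"] = "card"

# restructuring: build a new schema based on the old model
cafe = dict(restaurant, server="counter staff")

print(visit["food"])      # → soup
print(cafe["payment"])    # → card
```

The design choice to copy rather than mutate for accretion and restructuring mirrors the idea that the general schema survives its instantiations.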

There is room for flux in a perceived schema, as variables can compensate for missing or altered factors. However, pre-reading to activate a schema may not really help, because schemata are still relatively fixed and solid, especially in common or familiar areas. Quick introductions may not undo years of solidifying schemata.
We can therefore think of schemata in terms of our:

Play schemata - with a script (schema) that is interpreted (instantiations)

Theory schemata - a predictable and useful reality is represented and continually recreated by the sum of our schemata

Procedure schemata - with (limitless?) subsets of meanings and processes

Parse schemata - they determine how legal a situation is (whether it fits with meaning)

Though schema theory would seem to explain only top-down and internal processing, it also operates at lower levels, using feature detectors that confirm attributes to interpret sensory data. These sensory schemata then activate higher and higher level schemata, eliminating erroneous possibilities, and narrowing understanding to the appropriate meaning.
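As a rough illustration only, this bottom-up narrowing can be sketched as elimination over candidate word schemata; the feature names and candidate words below are entirely invented:

```python
# Invented toy: each candidate word "schema" expects a set of low-level
# visual features; detected features eliminate candidates whose
# expectations are not met, narrowing toward the appropriate meaning.
CANDIDATES = {
    "cat": {"curve", "short"},        # hypothetical feature sets
    "tall": {"ascender", "short"},
    "cab": {"curve", "ascender"},
}

def narrow(detected):
    """Keep only candidates all of whose expected features were detected."""
    return sorted(w for w, need in CANDIDATES.items() if need <= set(detected))

print(narrow(["curve", "short"]))              # → ['cat']
print(narrow(["curve", "ascender", "short"]))  # ambiguous: all survive
```

With few detected features many candidates survive, which is where the higher-level (top-down) schemata would take over the narrowing.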

Rayner and Pollatsek

In their foundational work on reading psychology, Rayner and Pollatsek used experimental evidence to show that neither top-down nor bottom-up theories in isolation can fully account for reading data. This continues to be the main issue in modern research: how to intersect bottom-up and top-down models. Though Rayner and Pollatsek, and Clarke (see Short Circuits), have used experimental research to show these two theories pointing to an intersection that is reading, how these models can work together is still unknown. They have, however, created a detailed model of sentence reading that takes into account the interactions of initial encoding, long-term memory / knowledge, and the active processes of working memory and parsing.
In this model, the parser (the part of the brain that analyses
sentences for structure) is seen as a purely syntactic device. It uses
input from the lexicon (personal vocabulary of language and
morphemes) to produce a structural representation for the sentence.
The parser uses the principles of minimal attachment and late closure. An example of minimal attachment is illustrated by Rayner and Pollatsek (1989) in the sentences, "The girl knew the answer by heart" and "The girl knew the answer was wrong". The minimal attachment principle leads to a grammatical structure in which "the answer" is regarded as the direct object of the verb "knew". This works for the first sentence, but not the second, illustrating how late closure also has a bearing on the grammatical structure.
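The garden-path behaviour in this example can be mimicked with a toy incremental "parser"; everything here is an invented simplification of the idea, not Rayner and Pollatsek's model:

```python
# Toy illustration of minimal attachment: after "knew", the parser first
# attaches the following noun phrase as a direct object; if a verb then
# arrives, that analysis is disconfirmed and the phrase is reanalysed as
# the subject of a complement clause.
def parse(sentence):
    words = sentence.split()
    i = words.index("knew") + 1
    phrase = " ".join(words[i:i + 2])          # e.g. "the answer"
    rest = words[i + 2:]
    if rest and rest[0] in ("was", "is"):      # incoming verb disconfirms the guess
        return f"reanalysed: [{phrase}] is subject of a complement clause"
    return f"minimal attachment: [{phrase}] is direct object of 'knew'"

print(parse("The girl knew the answer by heart"))
print(parse("The girl knew the answer was wrong"))
```

The sketch hard-codes the verb and a two-word noun phrase; its only point is that the first-pass structure is committed to before the disconfirming input arrives.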
They also assume that the nature of temporary storage in the
working memory is phonological. Therefore, if comprehension fails,
the inner speech module can replay the message. There is little
mention of details about how meaning is represented.
Though there is a detailed mapping of cognitive processes during
reading, Rayner and Pollatsek also found that good readers are able
to recognise lexical forms at a processing speed faster than the time
required to activate context effects and conscious predicting. Thus,
their theories present a more integrated approach, involving both
bottom-up and top-down processing, as the interactive models,
attempting to be more comprehensive, rigorous and coherent, give
emphasis to the interrelations between the graphic display in the
text, various levels of linguistic knowledge and processes, and
various cognitive activities (Weber, 1984).

Short Circuits and Reading

Though one would assume that good readers would use larger
chunks of text, and rely on semantic (meaning) cues rather than
syntactical (grammar) ones, and that these differences would hold
for L1 and L2 reading, some surprising evidence has been found.
Strong L1 readers did rely more on semantic cues, and weak readers more on syntactic ones; however, both used syntactic cues equally in L2. During oral reading miscue tests, differences between
strong and weak readers also diminished, though the types of
mistakes made were different, with strong readers making more
semantically acceptable miscues. What this means is that good L1
readers appeared less able to use their reading strategies in L2. It is hypothesised that limited control over the language "short circuits" the good reader's system, causing him/her to revert to poor reader strategies in difficult L2 tasks (Clarke, 1980). This creates short circuits (gaps in reading understanding) in the following situations:

good L1 strategies + poor L2 competency = poor L2 reading

good L1 strategies + good L2 competency = good L2 reading

Thus, it appears that strategies and behaviours, not necessarily

knowledge, have a large effect on reading abilities. Strategies that
Clarke indicates as being useful are: concentration on passage-level
semantic cues; the formulation of hypotheses about the text before
reading, then reading to confirm, refine, or reject those hypotheses;
the de-emphasis of graphophonic and syntactic accuracy, that is,
developing a tolerance for inexactness, a willingness to take
chances and make mistakes. This being said, the importance of
language skills for effective reading should not be underestimated.
Especially for weaker L1 readers, explicit teaching of strategies, as
well as language, would be appropriate, whereas stronger L1
readers may only need reminders of effective reading strategies.
Short Circuit - any reading that does not end with meaning:

letter naming - spelling out words

recoding - print is matched to another code (i.e. sound) with no meaning

syntactic nonsense - approximating understanding when the load is too great

partial structures - alternating periods of productive reading creating partial understanding

(Goodman, 1984)

Teaching Implications
The balance between top-down and bottom-up processing, though identified as complementary, is still somewhat nebulous. Therefore, much of the recommended teaching practice based on these theories still centres on exercises that isolate and improve top-down and bottom-up skills. Patricia Carrell (1987) has categorised some such exercises:

Bottom-Up Exercises:

Grammatical Skills - basic grammar awareness will, of course, help in reading comprehension, but decoding skills should also include learning cohesive devices (substitution, ellipsis, conjunction, and lexical cohesion)

Vocabulary Development - with the introduction of schema theory, vocabulary acquisition is now seen to involve deeper understanding of words and their contexts, and should thus be taught with an eye to quality, not quantity, of learned words.

Top-Down Exercises:

Schema Activation - by building background knowledge, we can increase students' understanding of texts. Cultural and experiential knowledge gaps can create the impression of a language barrier, when it is simply that the student lacks the appropriate schema. Pre-reading exercises, realia in the classroom, bit-by-bit exposure to text, visual representations, semantic mapping, sub/superordinating, and comparisons with previous knowledge are all ways to create understanding of the concept before the language. For specific approaches, see also:
o The Language Experience Approach (Hall, 1981;
Stauffer, 1980)
o Extending Concepts Through Language
Activities (Smith-Burke, 1980)
o Directed Reading-Thinking Activity (Stauffer, 1980)
o The Experience-Text-Relationship Method (Au, 1979)
o The Pre-Reading Plan (Langer, 1980)
o The Survey-Question-Read-Recite-Review
Method (Robinson, 1941)

See also Barnitz (1985), and Teaching Reading, for a summary of each method.