
Drawing the Boundaries of Meaning: Neo-Gricean studies in pragmatics and semantics in

honor of Laurence R. Horn; edited by Betty J. Birner and Gregory Ward (John Benjamins
Publishing Company, 2006)

Martina Blei
University of Rijeka

Draft of the review published in the Croatian Journal of Philosophy (Vol. XI, No 31,
2011)



The book is a festschrift written in honor of Laurence R. Horn. The editors claim that
the volume is intended to bring together the best of the current work at the
semantics/pragmatics boundary from a neo-Gricean perspective, in honor of one of the
best-known representatives of this approach to the questions of the philosophy of
linguistics, namely Laurence R. Horn. He laid the ground for what would later become
known as the classic Gricean analysis of logical operators and presented to philosophers
and linguists his theory of scalar implicature, which would later come to be known as the
theory of Horn scales. He examined the problem of natural language negation from a
neo-Gricean perspective and formulated a well-known theory of Q- and R-based
conversational implicature. This is what we find out about L. R. Horn in the short
introduction to the book written by its editors, Betty J. Birner and Gregory Ward, and
these are the questions approached by the twenty-two contributors in their eighteen
contributions published in this volume. The shortness of the introduction can be a
problem for a reader not well versed in the subject matter of this book, because he is
forced to face a great number of different problems and approaches that are not easy to
follow without previous knowledge of Horn's work and, more broadly, of pragmatics and
semantics in general. A more explanatory introduction would have been appreciated.
Let us present the contributions to this volume and their authors. Due to restrictions of
space and the scope of my interest in this book, I will briefly present the
contributions that offer more strictly linguistic approaches and then proceed to a
more detailed presentation of the other papers. Betty J. Birner explores the relation
between inferential relations and noncanonical word order in the light of discourse-
old/new and hearer-old/new information. Greg Carlson and Gianluca Storto examine the
semantics and pragmatics of certain context-sensitive lexical items, such as the words
"protection", "clue" or "danger". Donka F. Farkas analyses the uses of the Romanian
equivalents of English free choice "any" within what she calls an indefinitist theory.
Anastasia Giannakidou examines three Greek counterparts to the English word "even"
that are all polarity-sensitive. Georgia M. Green examines attitudinal markers such as
"well" and "uh", which indicate something about how the speaker feels about what is
being said. Pauline Jacobson examines the construction "can't seem". She tries to explain
why it is that in constructions like "I can't seem to figure this out" "can" and "seem"
appear to reverse scope. Steven R. Kleinedler and Randall Eggert discuss the semantics
and pragmatics of personal pronouns and their lexicographical challenges. Sally
McConnell-Ginet focuses on instrumental definitions, which recommend a particular
understanding of certain words; the author analyzes the words "marriage" and
"family" against the background of the same-sex marriage debate. Frederick J.
Newmeyer argues for a formal (syntactic) account of negation in English that would lend
support to a modular account of grammatical phenomena. Barbara H. Partee explores
some puzzles about the meanings of Mandarin noun phrases like possessors, numerals
and demonstratives and relates them to puzzles concerning definiteness and partitivity in
English possessives. Ellen F. Prince argues that reference must be handled by relation to
its truth-conditional meaning and discourse anaphora possibilities, and then examines the
impersonal subject pronoun "one" in Yiddish and French. Scott A. Schwenter presents to
the reader Jespersen's Cycle, a term that refers to the cyclical process by which
sentence negators increase and reduce their formal complexity in different historical
steps, and then presents evidence which indicates that it needs adjustment.

Kent Bach presents to us what he considers to be "The top 10 misconceptions
about implicature". These misconceptions are the following: (1) sentences have
implicatures; (2) implicatures are inferences; (3) implicatures can't be entailments; (4)
Gricean maxims apply only to implicatures; (5) for what is implicated to be figured out,
what is said must be determined first; (6) all pragmatic implications are implicatures; (7)
implicatures are not part of the truth-conditional contents of utterances; (8) if
something is meant but unsaid, it must be implicated; (9) scalar "implicatures" are
implicatures; and (10) conventional "implicatures" are implicatures. He presents these
misconceptions in, as he puts it, an easy-to-follow sequence, without targeting their
sources.
As stated above, the first misconception is that sentences have implicatures. This
misconception arises from the confusion between implication, a logical term, and
implicature, a term coined by Grice for a phenomenon of natural language. We can
translate p → q as, for example, "q is implied by p" or "If p is true, then q is true".
By the statement p, q is implied: the important thing is that if p is true, q must be true.
If I say to a person with a broken car that there is a gas station around the corner, I may
be implicating that the station is open, sells gas etc., but that does not have to be the
case. As Bach puts it, what a sentence implies depends on its semantic content, while
what a speaker implicates is a matter of his communicative intention in uttering the
sentence.
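The contrast can be illustrated with a minimal sketch in Python (my own illustration, not from the book): material implication is fully truth-functional, determined by the truth values of p and q alone, whereas an implicature cannot be computed from truth values at all.

```python
def implies(p, q):
    # Material implication: p -> q is equivalent to (not p) or q.
    # Its value depends only on the truth values of p and q.
    return (not p) or q

assert implies(True, True)        # p true, q true  -> true
assert not implies(True, False)   # the only falsifying case
assert implies(False, True)       # vacuously true
assert implies(False, False)      # vacuously true
```

No such function could be written for implicature: whether "there is a gas station around the corner" implicates that the station is open depends on the speaker's intention and the context, not on truth values.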
The second misconception is that implicatures are inferences. Like the first, this
misconception is a product of the misuse of terms. Implicatures are produced by the
speaker; to figure out the implicature, the hearer needs to perform an inference, that is,
draw a conclusion that is not explicit in what is being said.
The third misconception is that implicatures can't be entailments. Here Bach
states that, even though entailment and implicature are distinct notions, the two can
sometimes coincide in a particular utterance.
The fourth misconception is that Gricean maxims apply only to implicatures.
Gricean maxims are guidelines for successful communication, and implicatures arise
exactly from the violation of a maxim.
The fifth misconception is that for what is implicated to be figured out, what is
said must be determined first. According to Bach, Grice never had an idea like that in
mind: all he wanted was to give a rational reconstruction of the process of understanding
an implicature; he did not want to give a psychological theory or a cognitive model of
the process in question.
The sixth misconception is that all pragmatic implications are implicatures. Not
everything that may be inferred from the fact that a speaker uttered a certain sentence is
an implicature. For there to be an implicature there needs to be an intention; even
without one we can infer some unsaid information about the speaker, but this is not a
case of implicature.
The seventh misconception is that implicatures are not part of the truth-
conditional content of utterances. An implicature carried by an utterance of a sentence is
not part of the semantic content of the sentence, but that does not mean that the
implicature is not part of the truth-conditional content of the utterance.
The eighth misconception is that if something is meant but unsaid, it must be
implicated. Here Bach turns to his notion of conversational impliciture. There are two
kinds of implicitures: expansion and completion. In the first case the hearer needs to add
more words to the speaker's utterance in order to understand it, as in the case of the
utterance "I will be home later (tonight)". In the case of completion a sentence does not
express a complete proposition; take for example the sentence "Fanny has finished"
(there must be something which Fanny is being claimed to have finished, says Bach). The
first question we could pose is whether there really is a difference between these two
examples. The second is whether we need the notion of impliciture at all. Recall
Grice's example:
A: I'm out of petrol.
B: There is a garage round the corner.
Couldn't we say that A needs to add more words to understand B's utterance? For
example, "There is a garage round the corner and it's open". But that would be strange:
A doesn't need to add words, he just needs to process the utterance in the right way, and
that is the case in Bach's examples too.
The ninth misconception is that scalar "implicatures" are implicatures. According
to Bach they are implicitures. But maybe they are neither. Bach says that
in the example "Some of the boys went to the party" the speaker could have added "but
not all". According to him that makes it an impliciture. But the sentence "Some of the
boys went to the party, but not all" is, from the point of view of language economy, not a
very good one. The added part is redundant; it does not add anything new to the sentence,
because the "but not all" part is already contained in the word "some" (at least in
everyday natural language).
The last misconception is that conventional "implicatures" are implicatures. Bach
denies the idea that conventional implicatures are implicatures at all.



In her contribution entitled "Where have some of the presuppositions gone?"
Barbara Abbott tries to explain why some presuppositions are easily neutralized, while
others are hard to neutralize. She begins her exposition with the distinction between
presupposition and conventional implicature. Broadly speaking, a presupposition is a
prior assumption, something which must belong to the common ground in order to make
an utterance felicitous. The assertion of the sentence "The king of France is wise"
presupposes that there is a king of France. Conventional implicatures, on the other hand,
are elements of meaning which are not part of the truth conditions of a
sentence/utterance, but which are conveyed conventionally rather than conversationally.
The assertion of the sentence "He is an Englishman; he is, therefore, brave" conveys the
content that his being
brave is a consequence of his being an Englishman. The word "therefore" is a trigger for
conventional implicature.[1] A trigger is a construction or item that signals the existence
of a conventional implicature. There are also presupposition triggers that signal the
presence of a presupposition (in the example above the definite description triggers the
presupposition of the existence of the king). The word "trigger" can be somewhat
misleading, because it seems to indicate that a trigger can give rise to an
implicature/presupposition but is not necessary for it, and that a trigger is somehow
external to the implicature/presupposition; neither is true. The crucial difference
between presuppositions and conventional implicatures is that presuppositions are
entailed by the sentence (they are part of the truth conditions of the sentence) while
conventional implicatures are not.

[1] We could disagree about the very existence of conventional implicature: why should
we say that the word "therefore" implicates anything? The word "therefore" carries a
sense of causation, but this is not a question of implicature; it is a matter of the basic
conventional meaning of the word.
Abbott continues her essay by exploring four ways in which we can get rid of a
presupposition or a conventional implicature: suspension, metalinguistic cancellation,
filtering out and contextual neutralization. Abbott argues that the phenomenon of
contextual neutralization seems to distinguish some presupposition triggers from others.
The question she poses is why this is so. She follows Dorit Abusch's distinction between
soft and hard triggers. Soft triggers are those presupposition triggers whose
presuppositions are relatively easily neutralized (for example change-of-state verbs like
"stop": "If you stopped smoking in 2001, you are eligible for a payment from the
Tobacco Indemnity Fund"); hard triggers are the other ones (for example the trigger
"too": "After the first meeting, John will either miss the second meeting too, or attend
the first meeting too"). The problem the author tries to clarify is what differentiates hard
triggers from soft triggers. She explores the hypothesis that neutralizability correlates
with nondetachability: if a particular presupposition or conventional implicature is easily
detachable it should not be neutralizable, but if it is nondetachable it should be
neutralizable.

In his paper "Saying less and meaning less" Michael Israel tries to blend elements
of Neo-Gricean Pragmatics and Cognitive Semantics in order to explore the phenomena
of attenuation and understatement. Attenuation involves the expression of a relatively
uninformative proposition that carries no additional meaning. An understatement, on the
other hand, is a statement which can be used to express the meaning of a more
informative statement. So we can say that an understatement carries an implication that
can be triggered by the use of an attenuated proposition and depends on the hearer's
ability to enrich the content of the given utterance. Israel gives examples of cases in
which a speaker would want to express less than she really means: it could be out of
politeness, or out of the desire not to incriminate herself. He then argues that this
phenomenon is uncontroversial from the point of view of pragmatics, but quite puzzling
from the point of view of semantics: how can the meaning of a construction consist in an
absence of meaning? Israel thinks that this paradox is more apparent than real and that it
stems from the idea that meaning can be equated with informativity; he assumes that
meaning is a matter of imagination more than of informativity. Blending neo-Gricean
pragmatics and cognitive semantics, the author proposes a compromise between those
who place a clear boundary between semantics and pragmatics and those who view
meaning construction as an entirely pragmatic process. He accepts the distinction
between what is said and what is implicated, but refuses the assumption that either one of
the two has any ontological priority over the other; neither could exist without the other.
Israel defends his position by giving examples of the way in which different polarity-
sensitive items (what Abbott calls triggers) work. After a first set of examples he turns
to the way children understand implicatures and reports the conclusions of various
studies that show that children are often insensitive to the pragmatic infelicities which
arise when a speaker says conspicuously less than she easily and truthfully could. From
the experimental literature he builds his theory on, he concludes that children's utterances
are built upon the principle that a speaker should say no more than is necessary to
achieve her goals, and not on the principle that a speaker should say enough to achieve
her communicative goals.

In their contribution "Referring expressions and conversational implicature"
Andrew Kehler and Gregory Ward explore the connection between these two
phenomena. They study the way in which speakers choose from the array of referring
expressions that are available. A speaker's choice of referring expression can lead to
conversational implicature. For example, if a speaker utters the sentence "X is meeting a
woman this evening" he would normally implicate that the woman in question is not X's
wife. Kehler and Ward present to the reader three proposals for explaining the
connection between referring expressions and conversational implicature: the Hawkins
analysis, the GHZ analysis and their own analysis based on the notion of familiarity. The
Hawkins analysis is implicature-based and is concerned with the contrast between the
definite ("the") and indefinite ("a"/"some") articles in English. For Hawkins, the
difference between these two forms hinges on whether the intended referent is unique
within a contextually-determined association set. Hawkins explains the utterance

I met a student before class. A student came to see me after class as well; in
fact it was the same student I had seen before.

arguing that the use of the second occurrence of "a student" conversationally implicates
(by Grice's Maxim of Quantity) that the student is not unique within any set that is
mutually manifest to the speaker and the hearer.
For GHZ, the implicature in examples such as the one mentioned above results
from the speaker's decision to use an expression associated with a particular
cognitive status, which licenses the inference that the referent does not satisfy the
constraints associated with stronger statuses.
Ward and Kehler propose an alternative analysis of the way implicatures can be
the result of particular choices of referring expressions, an analysis based on familiarity.
They state that in unexceptional contexts, the speaker's failure to use a referring
expression that would normally indicate hearer-familiarity conversationally implicates
that the referent is nonfamiliar. In this context the notion of familiarity means that the
addressee is able to uniquely identify the intended referent because he already has a
representation of it in memory.
After presenting these three positions to the reader, the authors compare them,
weighing the positive and negative sides of each approach. They conclude with the
idea that familiarity is not necessarily the dominant factor that determines the choice of
referential form, and that uniqueness (advocated by the Hawkins analysis),
givenness (advocated by the GHZ analysis) and familiarity are all needed for a complete
account of the relationship that holds among the various referring expressions of a
language.

Francis Jeffry Pelletier and Andrew Hartline in their paper "On a homework problem of
Larry Horn's" try to give a partial solution to a problem raised by the differences between
logical connectives and connectives in natural languages, a problem that Horn assigned
as homework to his students in various iterations of his semantics and pragmatics
classes. When a speaker employs "or" in a conversation he semantically means inclusive
or (∨), but the hearer is justified in concluding an exclusive or (@) by means of some
Grice-inspired implicatures. Horn's questions concern the mechanisms for converting the
truth-functional inclusive or to a truth-functional exclusive or, in context, especially in
the cases where there are multiple disjunctions. The first two problems, labeled by
Pelletier and Hartline as easy, ask:
1) What does ((P@Q)@R) mean?
2) Write a truth table for ((P@Q)@R).
When the students write down a truth table they will find out that the answer to the first
question is: the formula is true just in case either exactly one of P, Q and R is true, or
else all three are true. The next two questions are:
3) Given the above truth table for @(P,Q,R), what happens with four arguments to
@, i.e., to @(P,Q,R,S)?
4) What is the relevant generalization for n, i.e., for @(P1, P2, …, Pn)?
After correctly filling in the truth table the students should conclude that @(P,Q,R,S) is
true when exactly one of the four components is true and when exactly three of the four
components are true, and false otherwise. For some students the fourth question can be
misleading, and they should help themselves by writing down a truth table for
@(P,Q,R,S,T). By doing that they might correctly generalize to the claim that
@(P1, P2, …, Pn) is true if and only if an odd number of P1, P2, …, Pn are true, and
false otherwise.
The answers to these questions lead us to the "hard" questions, brought up by the fact
that in natural languages the interpretation of exclusive or with any number of disjuncts
is "exactly one of the disjuncts is true" and not "an odd number of disjuncts is true".
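The contrast between the chained binary connective and the natural-language reading is easy to verify mechanically. The following sketch (my own illustration, not from the paper) checks that the left-associated @ yields the parity reading for every n, and that the two readings already come apart at n = 3:

```python
from itertools import product

def chained_xor(*vals):
    # Left-associated exclusive or: ((P @ Q) @ R) @ ...
    result = vals[0]
    for v in vals[1:]:
        result = result != v  # boolean XOR
    return result

# Horn's homework generalization: for any n, the chained form is
# true iff an odd number of its arguments are true -- not "exactly one".
for n in (3, 4, 5):
    for vals in product([True, False], repeat=n):
        assert chained_xor(*vals) == (sum(vals) % 2 == 1)

# The two readings diverge at (True, True, True): the chained @ is
# true (odd count of truths) while "exactly one is true" is false.
assert chained_xor(True, True, True)
assert sum([True, True, True]) != 1
```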
Pelletier and Hartline present to us a new connective, a ternary exclusive or connective
⊕₃. The sentence ⊕₃(P,Q,R) says that exactly one of the three disjuncts is true. The
problem is how to generate ⊕₃ (or ⊕ₙ for higher values of n) from the @s, which were in
turn generated from ∨s.

Jerrold M. Sadock in his contribution "Motors and switches: An exercise in syntax and
pragmatics" presents to us a challenge to the validity of the classical, truth-functional
treatment of natural language connectives, namely Cooper's (1978) example: "If you
throw switch S and switch T, the motor will start. Therefore, either if you throw switch
S, the motor will start, or, if you throw switch T, the motor will start." Sadock tries to
show that a Gricean approach can account for the discrepancy between the truth-
functional treatment and our sense of what is conveyed by the words.
The problem lies in the second sentence of the example: "Either if you throw switch S,
the motor will start, or, if you throw switch T, the motor will start." From the point of
view of logic this sentence would be taken as intended to convey that throwing either
switch is sufficient to get the motor to start, (S∨T) → M; but from the point of view of
form the sentence would appear to mean that throwing switch S will get the motor to
start and throwing switch T will not, or vice versa, (S→M) ∨ (T→M). According to
Sadock, to resolve this problem we should turn to the grammatical rule of operator
spreading. Using that rule we can spread an operator expression from a form like
OP(A∨B) into each of several disjuncts, giving a form like OP(A) or OP(B). The use of
this rule will favor the reading of the sentence as (S∨T) → M. Moreover, according to
Sadock the utterance of such a sentence should be taken as a biconditional. The
communicative value of the sentence would then become ((S∨T) → M) & (¬(S∨T) → ¬M).
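Both halves of the puzzle can be checked by brute force over truth-value assignments. The sketch below (my own illustration; the paper contains no code) verifies that Cooper's inference is truth-functionally valid, and that Sadock's proposed communicative value is just the biconditional (S∨T) ↔ M:

```python
from itertools import product

def implies(p, q):
    # Material implication: false only when p is true and q is false.
    return (not p) or q

# Cooper's inference is truth-functionally valid: whenever the premise
# (S and T) -> M holds, so does (S -> M) or (T -> M).
for s, t, m in product([True, False], repeat=3):
    premise = implies(s and t, m)
    conclusion = implies(s, m) or implies(t, m)
    assert implies(premise, conclusion)

# Sadock's communicative value ((S or T) -> M) & (not (S or T) -> not M)
# is equivalent to the biconditional (S or T) <-> M.
for s, t, m in product([True, False], repeat=3):
    sadock = implies(s or t, m) and implies(not (s or t), not m)
    assert sadock == ((s or t) == m)
```

The first loop is exactly the truth-functional validity that clashes with our intuitions about the words; the second shows how the biconditional reading strengthens the weak conditional into what the utterance actually conveys.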