
AI Magazine Volume 25 Number 3 (2004) (© AAAI)


Precisiated Natural Language (PNL)

Lotfi A. Zadeh

■ This article is a sequel to an article titled "A New Direction in AI—Toward a Computational Theory of Perceptions," which appeared in the Spring 2001 issue of AI Magazine (volume 22, No. 1, 73–84). The concept of precisiated natural language (PNL) was briefly introduced in that article, and PNL was employed as a basis for computation with perceptions. In what follows, the conceptual structure of PNL is described in greater detail, and PNL's role in knowledge representation, deduction, and concept definition is outlined and illustrated by examples. What should be understood is that PNL is in its initial stages of development and that the exposition that follows is an outline of the basic ideas that underlie PNL rather than a definitive theory.

A natural language is basically a system for describing perceptions. Perceptions, such as perceptions of distance, height, weight, color, temperature, similarity, likelihood, relevance, and most other attributes of physical and mental objects, are intrinsically imprecise, reflecting the bounded ability of sensory organs, and ultimately the brain, to resolve detail and store information. In this perspective, the imprecision of natural languages is a direct consequence of the imprecision of perceptions (Zadeh 1999, 2000).

How can a natural language be precisiated—precisiated in the sense of making it possible to treat propositions drawn from a natural language as objects of computation? This is what PNL attempts to do.

In PNL, precisiation is accomplished through translation into what is termed a precisiation language. In the case of PNL, the precisiation language is the generalized-constraint language (GCL), a language whose elements are so-called generalized constraints and their combinations. What distinguishes GCL from languages such as Prolog, LISP, SQL, and, more generally, languages associated with various logical systems, for example, predicate logic, modal logic, and so on, is its much higher expressive power.

The conceptual structure of PNL mirrors two fundamental facets of human cognition: (a) partiality and (b) granularity (Zadeh 1997). Partiality relates to the fact that most human concepts are not bivalent, that is, are a matter of degree. Thus, we have partial understanding, partial truth, partial possibility, partial certainty, partial similarity, and partial relevance, to cite a few examples. Similarly, granularity and granulation relate to clumping of values of attributes, forming granules with words as labels, for example, young, middle-aged, and old as labels of granules of age.

Existing approaches to natural language processing are based on bivalent logic—a logic in which shading of truth is not allowed. PNL abandons bivalence. By so doing, PNL frees itself from limitations imposed by bivalence and categoricity, and opens the door to new approaches for dealing with long-standing problems in AI and related fields (Novak 1991).

At this juncture, PNL is in its initial stages of development. As it matures, PNL is likely to find a variety of applications, especially in the realms of world knowledge representation, concept definition, deduction, decision, search, and question answering.

Natural languages (NLs) have occupied, and continue to occupy, a position of centrality in AI. Over the years, impressive advances have been made in our understanding of how natural languages can be dealt with on processing, logical, and computational levels. A huge literature is in existence. Among the important contributions that relate to the ideas described in this article are those of Biermann and Ballard (1980), Klein (1980), Barwise and Cooper (1981), Sowa (1991, 1999), McAllester and Givan (1992), Macias and Pulman (1995), Mani and Maybury (1999), Allan (2001), Fuchs and Schwertel (2003), and Sukkarieh (2003).

When a language such as precisiated natural language (PNL) is introduced, a question that arises at the outset is: What can PNL do that cannot be done through the use of existing approaches? A simple and yet important example relates to the basic role of quantifiers such as all, some, most, many, and few in human cognition and natural languages.

In classical, bivalent logic the principal quantifiers are all and some. However, there is a literature on so-called generalized quantifiers exemplified by most, many, and few (Peterson 1979, Barwise and Cooper 1981). In this literature, such quantifiers are treated axiomatically, and logical rules are employed for deduction.

By contrast, in PNL quantifiers such as many, most, few, about 5, close to 7, much larger than 10, and so on are treated as fuzzy numbers and are manipulated through the use of fuzzy arithmetic (Zadeh 1983; Kaufmann and Gupta 1985; Hajek 1998). For the most part, inference is computational rather than logical. Following are a few simple examples.

First, let us consider the Brian example (Zadeh 1983):

Brian is much taller than most of his close friends. How tall is Brian?

At first glance it may appear that such questions are unreasonable. How can one say something about Brian's height if all that is known is that he is much taller than most of his close friends? Basically, what PNL provides is a system for precisiation of propositions expressed in a natural language through translation into the generalized-constraint language (GCL). Upon translation, the generalized constraints (GCs) are propagated through the use of rules governing generalized-constraint propagation, inducing a generalized constraint on the answer to the question. More specifically, in the Brian example, the answer is a generalized constraint on the height of Brian.

Now let us look at the balls-in-box problem:

A box contains balls of various sizes and weights. The premises are:

Most are large.
Many large balls are heavy.

What fraction of balls are large and heavy?

The PNL answer is: most × many, where most and many are fuzzy numbers defined through their membership functions, and most × many is their product in fuzzy arithmetic (Kaufmann and Gupta 1985). This answer is a consequence of the general rule

Q1 As are Bs
Q2 (A and B)s are Cs
(Q1 × Q2) As are (B and C)s
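As an illustration of the fuzzy-arithmetic step, the following Python sketch computes the product of two fuzzy quantifiers by interval arithmetic on alpha-cuts. The trapezoidal calibrations of most and many, and all function names, are assumptions chosen for the example; any other calibration could be substituted.

import numpy as np

def trapezoid(a, b, c, d):
    """Membership function of a trapezoidal fuzzy number with support [a, d] and plateau [b, c]."""
    def mu(x):
        x = np.asarray(x, dtype=float)
        left = np.clip((x - a) / (b - a), 0, 1) if b > a else (x >= a).astype(float)
        right = np.clip((d - x) / (d - c), 0, 1) if d > c else (x <= d).astype(float)
        return np.minimum(left, right)
    return mu

def alpha_cut(mu, grid, alpha):
    """Endpoints of {x in grid : mu(x) >= alpha}."""
    support = grid[mu(grid) >= alpha]
    return support.min(), support.max()

def fuzzy_product(mu_a, mu_b, grid):
    """Product of two fuzzy numbers via interval arithmetic on alpha-cuts."""
    cuts = []
    for alpha in np.linspace(0.01, 1.0, 100):
        la, ua = alpha_cut(mu_a, grid, alpha)
        lb, ub = alpha_cut(mu_b, grid, alpha)
        prods = (la * lb, la * ub, ua * lb, ua * ub)
        cuts.append((alpha, min(prods), max(prods)))
    # membership of v in the product = highest alpha whose cut contains v
    return lambda v: max((a for a, lo, hi in cuts if lo <= v <= hi), default=0.0)

grid = np.linspace(0, 1, 1001)
most = trapezoid(0.5, 0.75, 1.0, 1.0)   # assumed calibration of "most"
many = trapezoid(0.3, 0.6, 1.0, 1.0)    # assumed calibration of "many"
most_times_many = fuzzy_product(most, many, grid)
print(round(most_times_many(0.4), 2))   # degree to which a fraction of 0.4 is "most x many"

The answer to the balls-in-box question is thus itself a fuzzy number, computed from the calibrations of most and many rather than stipulated axiomatically.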
Another simple example is the tall Swedes problem (version 1).

Swedes who are more than twenty years old range in height from 140 centimeters to 220 centimeters. Most are tall. What is the average height of Swedes over twenty?

A less simple version of the problem (version 2) is the following (a* denotes "approximately a").

Swedes over twenty range in height from 140 centimeters to 220 centimeters. Over 70* percent are taller than 170* centimeters; less than 10* percent are shorter than 150* centimeters; and less than 15* percent are taller than 200* centimeters. What is the average height of Swedes over twenty?

A PNL-based answer is given in the sidebar.

There is a basic reason generalized quantifiers do not have an ability to deal with problems of this kind. The reason is that in the theory of generalized quantifiers there is no concept of the count of elements in a fuzzy set. How do you count the number of tall Swedes if tallness is a matter of degree? More generally, how do you define the probability measure of a fuzzy event (Zadeh 1968)?

What should be stressed is that the existing approaches and PNL are complementary rather than competitive. Thus, PNL is not intended to be used in applications such as text processing, summarization, syntactic analysis, discourse analysis, and related fields. The primary function of PNL is to provide a computational framework for precisiation of meaning rather than to serve as a means of meaning understanding and meaning representation. By its nature, PNL is maximally effective when the number of precisiated propositions is small rather than large and when the chains of reasoning are short rather than long. The following is intended to serve as a backdrop.

It is a deep-seated tradition in science to view the use of natural languages in scientific theories as a manifestation of mathematical immaturity. The rationale for this tradition is that natural languages are lacking in precision. However, what is not recognized to the extent that it should be is that adherence to this tradition carries a steep price. In particular, a direct consequence is that existing scientific theories do not have the capability to operate on perception-based information—information exemplified by "Most Swedes are tall," "Usually Robert returns from work at about 6 PM," "There is a strong correlation between diet and longevity," and "It is very unlikely that there will be a significant increase in the price of oil in the near future" (figure 1).

Figure 1. Modalities of Measurement-Based and Perception-Based Information. (Measurement-based, numerical information—"It is 35°C," "Monika is 28," "Probability is 0.8"—is contrasted with perception-based, linguistic information—"It is very warm," "Monika is young," "Probability is high," "It is cloudy," "Traffic is heavy," "It is hard to find parking near the campus." Measurement-based information may be viewed as a special case of perception-based information.)

Such information is usually described in a natural language and is intrinsically imprecise, reflecting a fundamental limitation on the cognitive ability of humans to resolve detail and store information. Due to their imprecision, perceptions do not lend themselves to meaning representation and inference through the use of methods based on bivalent logic. To illustrate the point, consider the following simple examples.

The balls-in-box example:

A box contains balls of various sizes. My perceptions of the contents of the box are:
• There are about twenty balls.
• Most are large.
• There are several times as many large balls as small balls.
The question is: What is the number of small balls?

The Robert example (a):

My perception is:
• Usually Robert returns from work at about 6 PM.
The question is: What is the probability that Robert is home at about 6:15 PM?

The Robert example (b):

• Most tall men wear large-sized shoes.
• Robert is tall.
• What is the probability that Robert wears large-sized shoes?

An immediate problem that arises is that of meaning precisiation. How can the meaning of the perception "There are several times as many large balls as small balls" or "Usually Robert returns from work at about 6 PM" be defined in a way that lends itself to computation and deduction? Furthermore, it is plausible, on intuitive grounds, that "Most Swedes are tall" conveys some information about the average height of Swedes. But what is the nature of this information, and what is its measure? Existing bivalent-logic-based methods of natural language processing provide no answers to such questions.

The incapability of existing methods to deal with perceptions is a direct consequence of the fact that the methods are based on bivalent logic—a logic that is intolerant of imprecision and partial truth. The existing methods are categorical in the sense that a proposition, p, in a natural language, NL, is either true or not true, with no shades of truth allowed. Similarly, p is either grammatical or ungrammatical, either ambiguous or unambiguous, either meaningful or not meaningful, either relevant or not relevant, and so on. Clearly, categoricity is in fundamental conflict with reality—a reality in which partiality is the norm rather than an exception. But, what is much more important is that bivalence is a major obstacle to the solution of such basic AI problems as commonsense reasoning and knowledge representation (McCarthy 1990, Davis 1990, Sowa 1991, 1999, Yager 1991, Sun 1994, Dubois and Prade 1996), nonstereotypical summarization (Mani and Maybury 1999), unrestricted question answering (Lehnert 1978), and natural language computation (Biermann and Ballard 1980).

PNL abandons bivalence. Thus, in PNL everything is, or is allowed to be, a matter of degree. It is somewhat paradoxical, and yet is true, that precisiation of a natural language cannot be achieved within the conceptual structure of bivalent logic.

By abandoning bivalence, PNL opens the door to a major revision of concepts and techniques for dealing with knowledge representation, concept definition, deduction, and question answering. A concept that plays a key role in this revision is that of a generalized constraint (Zadeh 1986). The basic ideas underlying this concept are discussed in the following section. It should be stressed that what follows is an outline rather than a detailed exposition.

The Concepts of Generalized Constraint and Generalized-Constraint Language

A conventional, hard constraint on a variable, X, is basically an inelastic restriction on the values that X can take. The problem is that in most realistic settings—and especially in the case of natural languages—constraints have some degree of elasticity or softness. For example, in the case of a sign in a hotel saying "Checkout time is 1 PM," it is understood that 1 PM is not a hard constraint on checkout time. The same applies to "Speed limit is 65 miles per hour" and "Monika is young." Furthermore, there are many different ways, call them modalities, in which a soft constraint restricts the values that a variable can take. These considerations suggest the following expression as the definition of generalized constraint (figure 2):

X isr R,

where X is the constrained variable; R is the constraining relation; and r is a discrete-valued modal variable whose values identify the modality of the constraint (Zadeh 1999). The constrained variable may be an n-ary variable, X = (X1, …, Xn); a conditional variable, X|Y; a structured variable, as in Location(Residence(X)); or a function of another variable, as in f(X). The principal modalities are possibilistic (r = blank), probabilistic (r = p), veristic (r = v), usuality (r = u), random set (r = rs), fuzzy graph (r = fg), bimodal (r = bm), and Pawlak set (r = ps).

Figure 2. Generalized Constraint. (A standard constraint has the form X ∈ C; a generalized constraint has the form X isr R, in which X is the constrained variable, R is the constraining relation, "isr" is the copula, and r is the modality identifier. X may have a structure, X = (X1, …, Xn); may be a structured variable, as in X = Location(Residence(Carol)); may be a function of another variable, X = f(Y); or may be conditioned, X/Y. The modality identifier r ranges over blank, v, p, u, rs, fg, bm, ps, and so on.)

More specifically, in a possibilistic constraint,

X is R,

R is a fuzzy set that plays the role of the possibility distribution of X. Thus, if U = {u} is the universe of discourse in which X takes its values, then R is a fuzzy subset of U and the grade of membership of u in R, µR(u), is the possibility that X = u:

µR(u) = Poss{X = u}.

For example, the proposition p: X is a small number is a possibilistic constraint in which "small number" may be represented as, say, a trapezoidal fuzzy number (figure 3) that represents the possibility distribution of X. In general, the meaning of "small number" is context-dependent.

Figure 3. Trapezoidal Membership Function of "Small Number." ("Small number" is context dependent.)
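A minimal Python sketch of such a possibilistic constraint, under an assumed calibration of small number on, say, the range [0, 100]:

import numpy as np

# Assumed trapezoidal calibration: full membership up to 10, falling to 0 at 25.
mu_small = lambda u: float(np.clip((25.0 - u) / 15.0, 0.0, 1.0))

# The constraint "X is a small number" induces the possibility distribution Poss{X = u} = mu_small(u).
for u in (5, 12, 20, 30):
    print(f"Poss{{X = {u}}} = {mu_small(u):.2f}")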
In a probabilistic constraint,

X isp R,

X is a random variable and R is its probability distribution. For example,

X isp N(m, σ²)

means that X is a normally distributed random variable with mean m and variance σ².

In a veristic constraint, R is a fuzzy set that plays the role of the verity (truth) distribution of X. For example, the proposition "Alan is half German, a quarter French, and a quarter Italian" would be represented as the fuzzy set

Ethnicity(Alan) isv (0.5|German + 0.25|French + 0.25|Italian),

in which Ethnicity(Alan) plays the role of the constrained variable; 0.5|German means that the verity (truth) value of "Alan is German" is 0.5; and + plays the role of a separator.

In a usuality constraint, X is a random variable, and R plays the role of the usual value of X. For example, X isu small means that usually X is small. Usuality constraints play a particularly important role in commonsense knowledge representation and perception-based reasoning.

In a random set constraint, X is a fuzzy-set-valued random variable and R is its probability distribution. For example,

X isrs (0.3\small + 0.5\medium + 0.2\large)

means that X is a random variable that takes the fuzzy sets small, medium, and large as its values with respective probabilities 0.3, 0.5, and 0.2. Random set constraints play a central role in the Dempster-Shafer theory of evidence and belief (Shafer 1976).

In a fuzzy graph constraint, the constrained variable is a function, f, and R is its fuzzy graph (figure 4). A fuzzy graph constraint is represented as

F isfg (Σi Ai × Bj(i)),

in which the fuzzy sets Ai and Bj(i), with j dependent on i, are the granules of X and Y, respectively, and Ai × Bj(i) is the Cartesian product of Ai and Bj(i). Equivalently, a fuzzy graph may be expressed as a collection of fuzzy if-then rules of the form

if X is Ai then Y is Bj(i), i = 1, …, m; j = 1, …, n

For example,

F isfg (small × small + medium × large + large × small)

may be expressed as the rule set

if X is small then Y is small
if X is medium then Y is large
if X is large then Y is small

Such a rule set may be interpreted as a description of a perception of f.

Figure 4. Fuzzy Graph of a Function. (A function f of X is granulated with fuzzy granules small, medium, and large on the X and Y axes; the fuzzy graph f*, a union of Cartesian granules, represents the perception: if X is small then Y is small; if X is medium then Y is large; if X is large then Y is small.)

A bimodal constraint involves a combination of two modalities: probabilistic and possibilistic. More specifically, in the generalized constraint

X isbm R,

X is a random variable, and R is what is referred to as a bimodal distribution, P, of X, with P expressed as

P: Σi Pj(i)\Ai,

in which the Ai are granules of X, and the Pj(i), with j dependent on i, are the granules of probability (figure 5).

For example, if X is a real-valued random variable with granules labeled small, medium, and large and probability granules labeled low, medium, and high, then

X isbm (low\small + high\medium + low\large),

which means that

Prob {X is small} is low
Prob {X is medium} is high
Prob {X is large} is low

Figure 5. Bimodal Distribution: Perception-Based Probability Distribution. (The granules A1, A2, and A3 of X are assigned granular probabilities Pj(1), Pj(2), and Pj(3), so that P(X) = Pj(1)\A1 + Pj(2)\A2 + Pj(3)\A3 and Prob {X is Ai} is Pj(i); for example, P(X) = low\small + high\medium + low\large.)

In effect, the bimodal distribution of X may be viewed as a description of a perception of the probability distribution of X. As a perception of likelihood, the concept of a bimodal distribution plays a key role in the perception-based calculus of probabilistic reasoning (Zadeh 2002).
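The following Python sketch shows how such a bimodal perception can be checked against a numeric probability distribution, using the probability of a fuzzy event, Prob{X is A} = Σ µA(u)p(u) (Zadeh 1968). The membership functions and the assumed distribution are calibrations invented for the example.

import numpy as np

u = np.linspace(0, 100, 201)                                  # universe of discourse of X

# Assumed calibrations of the granules of X and of the granules of probability.
small  = lambda x: np.clip((50.0 - x) / 25.0, 0, 1)
medium = lambda x: np.clip(np.minimum((x - 25.0) / 25.0, (75.0 - x) / 25.0), 0, 1)
large  = lambda x: np.clip((x - 50.0) / 25.0, 0, 1)
low    = lambda v: np.clip((0.4 - v) / 0.2, 0, 1)
high   = lambda v: np.clip((v - 0.4) / 0.2, 0, 1)

p = np.exp(-0.5 * ((u - 55.0) / 12.0) ** 2)                   # an assumed numeric distribution for X
p /= p.sum()                                                  # normalize to a discrete probability mass

def prob_fuzzy_event(mu):
    # Zadeh (1968): Prob{X is A} = sum of mu_A(u) p(u) over the discretized universe
    return float((mu(u) * p).sum())

for name, mu_gran, mu_prob in (("small", small, low), ("medium", medium, high), ("large", large, low)):
    pr = prob_fuzzy_event(mu_gran)
    print(f"Prob{{X is {name}}} = {pr:.2f}; degree to which this is as perceived: {mu_prob(pr):.2f}")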
The concept of a bimodal distribution is an instance of combination of different modalities. More generally, generalized constraints may be combined and propagated, generating generalized constraints that are composites of other generalized constraints. The set of all such constraints together with deduction rules—rules that are based on the rules governing generalized-constraint propagation—constitutes the generalized-constraint language (GCL). An example of a generalized constraint in GCL is

(X isp A) and ((X, Y) is B),

where A is the probability distribution of X and B is the possibility distribution of the binary variable (X, Y). Constraints of this form play an important role in the Dempster-Shafer theory of evidence (Shafer 1976).

The Concepts of Precisiability and Precisiation Language

Informally, a proposition, p, in a natural language, NL, is precisiable if its meaning can be represented in a form that lends itself to computation and deduction. More specifically, p is precisiable if it can be translated into what may be called a precisiation language, PL, with the understanding that the elements of PL can serve as objects of computation and deduction. In this sense, mathematical languages and the languages associated with propositional logic, first-order and higher-order predicate logics, modal logic, LISP, Prolog, SQL, and related languages may be viewed as precisiation languages. The existing PL languages are based on bivalent logic. As a direct consequence, the languages in question do not have sufficient expressive power to represent the meaning of propositions that are descriptors of perceptions. For example, the proposition "All men are mortal" can be precisiated by translation into the language associated with first-order logic, but "Most Swedes are tall" cannot.

The principal distinguishing feature of PNL is that the precisiation language with which it is associated is GCL. It is this feature of PNL that makes it possible to employ PNL as a meaning-precisiation language for perceptions. What should be understood, however, is that not all perceptions or, more precisely, propositions that describe perceptions, are precisiable through translation into GCL. Natural languages are basically systems for describing and reasoning with perceptions, and many perceptions are much too complex to lend themselves to precisiation.

The key idea in PNL is that the meaning of a precisiable proposition, p, in a natural language is a generalized constraint X isr R. In general, X, R, and r are implicit, rather than explicit, in p. Thus, translation of p into GCL may be viewed as explicitation of X, R, and r. The expression X isr R will be referred to as the GC form of p, written as GC(p).

In PNL, a proposition, p, is viewed as an answer to a question, q. To illustrate, the proposition p: Monika is young may be viewed as the answer to the question q: How old is Monika? More concretely:

p: Monika is young → p*: Age(Monika) is young
q: How old is Monika? → q*: Age(Monika) is ?R

where p* and q* are abbreviations for GC(p) and GC(q), respectively.

In general, the question to which p is an answer is not unique. For example, p: Monika is young could be viewed as an answer to the question q: Who is young? In most cases, however, among the possible questions there is one that is most likely. Such a question plays the role of a default question. The GC form of q is, in effect, the translation of the question to which p is an answer. The following simple examples are intended to clarify the process of translation from NL to GCL.

p: Tandy is much older than Dana → (Age(Tandy), Age(Dana)) is much.older,

where much.older is a binary fuzzy relation that has to be calibrated as a whole rather than through composition of much and older.

p: Most Swedes are tall

To deal with the example, it is necessary to have a means of counting the number of elements in a fuzzy set. There are several ways in which this can be done, with the simplest way relating to the concept of ΣCount (sigma count). More specifically, if A and B are fuzzy sets in a space U = {u1, …, un}, with membership functions µA and µB, respectively, then

ΣCount(A) = Σi µA(ui),

and the relative ΣCount, that is, the relative count of elements of A that are in B, is defined as

ΣCount(A/B) = ΣCount(A∩B) / ΣCount(B),

in which the membership function of the intersection A∩B is defined as

µA∩B(u) = µA(u) ∧ µB(u),

where ∧ is min or, more generally, a t-norm (Pedrycz and Gomide 1998; Klement, Mesiar, and Pap 2000).

Using the concept of sigma count, the translation in question may be expressed as

Most Swedes are tall → ΣCount(tall.Swedes/Swedes) is most,

where most is a fuzzy number that defines most as a fuzzy quantifier (Zadeh 1984, Mesiar and Thiele 2000) (figure 6).

Figure 6. Calibration of Most and Usually Represented as Trapezoidal Fuzzy Numbers. (Both are fuzzy numbers on the [0, 1] scale of proportions.)
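The sigma-count machinery is easy to exercise numerically. In the Python sketch below, the calibrations of tall and most and the sample heights are assumptions; the code computes the relative ΣCount and the degree to which "Most Swedes are tall" holds for that sample.

import numpy as np

# Assumed calibrations: "tall" on heights in centimeters, "most" on proportions in [0, 1].
mu_tall = lambda h: np.clip((h - 160.0) / 20.0, 0.0, 1.0)   # 0 below 160 cm, 1 above 180 cm
mu_most = lambda r: np.clip((r - 0.5) / 0.25, 0.0, 1.0)     # 0 below 0.5, 1 above 0.75

heights = np.array([162, 171, 168, 183, 190, 174, 158, 186, 179, 181])   # a small sample population

sigma_count_tall = mu_tall(heights).sum()          # SigmaCount(tall.Swedes)
relative = sigma_count_tall / len(heights)         # SigmaCount(tall.Swedes/Swedes)
truth = mu_most(relative)                          # degree to which "Most Swedes are tall" holds

print(f"relative SigmaCount = {relative:.2f}; truth of 'Most Swedes are tall' = {truth:.2f}")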
p: Usually Robert returns from work at about 6 PM
q: When does Robert return from work?
X: Time of return of Robert from work, Time(Return)
R: about 6 PM (6* PM)
r: u (usuality)
p*: Prob {Time(Return) is 6* PM} is usually.

A less simple example is:

p: It is very unlikely that there will be a significant increase in the price of oil in the near future.

In this example, it is expedient to start with the semantic network representation (Sowa 1991) of p that is shown in figure 7. In this representation, E is the main event, and E* is a subevent of E:

E: significant increase in the price of oil in the near future
E*: significant increase in the price of oil

Thus, near future is the epoch of E*.

The GC form of p may be expressed as

Prob(E) is R,

where R is the fuzzy probability, very unlikely, whose membership function is related to that of likely as shown in figure 8:

µvery.unlikely(v) = (µlikely(1 – v))²,

where it is assumed for simplicity that very acts as an intensifier that squares the membership function of its operand, and that the membership function of unlikely is the mirror image (antonym) of that of likely, µunlikely(v) = µlikely(1 – v).
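Under these simplifying assumptions, the precisiation of very unlikely is a two-line computation. In the Python sketch below the ramp used to calibrate likely is an assumption; the antonym and the squaring intensifier follow the relation above.

import numpy as np

mu_likely = lambda v: float(np.clip((v - 0.4) / 0.3, 0.0, 1.0))   # assumed calibration of "likely"

mu_unlikely      = lambda v: mu_likely(1.0 - v)                    # antonym: mirror image of likely
mu_very_unlikely = lambda v: mu_unlikely(v) ** 2                   # "very" squares its operand

for v in (0.05, 0.2, 0.5, 0.8):
    print(f"v = {v:.2f}: likely {mu_likely(v):.2f}, unlikely {mu_unlikely(v):.2f}, very unlikely {mu_very_unlikely(v):.2f}")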

Figure 7. Semantic Network of p. (It is very unlikely that there will be a significant increase in the price of oil in the near future. The main event E has the subevent E*: significant increase in the price of oil, with attribute price, object oil, variation increase, and modifier significant; the epoch of E* is the future, with modifier near.)

Figure 8. Precisiation of Very Unlikely. (unlikely = ant(likely); very unlikely = ²ant(likely); µvery.unlikely(v) = (µlikely(1 – v))².)

Given the membership functions of significant increase and near future (figure 9), we can compute the degree to which a specified time function that represents a variation in the price of oil satisfies the conjunction of the constraints significant increase and near future. This degree may be employed to compute the truth value of p as a function of the probability distribution of the variation in the price of oil. In this instance, the use of PNL may be viewed as an extension of truth-conditional semantics (Cresswell 1973, Allan 2001).

What should be noted is that precisiation and meaning representation are not coextensive. More specifically, precisiation of a proposition, p, assumes that the meaning of p is understood and that what is involved is a precisiation of the meaning of p.

The Tall Swedes Problem (Version 2)

In the following, a* denotes "approximately a." Swedes more than twenty years of age range in height from 140 centimeters to 220 centimeters. Over 70* percent are taller than 170* centimeters; less than 10* percent are shorter than 150* centimeters; and less than 15* percent are taller than 200* centimeters. What is the average height of Swedes over twenty?

Fuzzy Logic Solution

Consider a population of Swedes over twenty, S = {Swede1, Swede2, …, SwedeN}, with hi, i = 1, …, N, being the height of Swedei.

The datum "Over 70* percent of S are taller than 170* centimeters" constrains the hi in h = (h1, …, hN). The constraint is precisiated through translation into GCL. More specifically, let X denote a variable taking values in S, and let X|(h(X) is ≥ 170*) denote the fuzzy subset of S induced by the constraint h(X) is ≥ 170*. Then

Over 70* percent of S are taller than 170* →
(GCL): (1/N) ΣCount(X | h(X) is ≥ 170*) is ≥ 0.7*,

where ΣCount is the sigma count of the Xs that satisfy the fuzzy constraint h(X) is ≥ 170*. Similarly,

Less than 10* percent of S are shorter than 150* →
(GCL): (1/N) ΣCount(X | h(X) is ≤ 150*) is ≤ 0.1*

and

Less than 15* percent of S are taller than 200* →
(GCL): (1/N) ΣCount(X | h(X) is ≥ 200*) is ≤ 0.15*

A general deduction rule in fuzzy logic is the following. In this rule, X is a variable that takes values in a finite set U = {u1, u2, …, uN}, and a(X) is a real-valued attribute of X, with ai = a(ui) and a = (a1, …, aN):

(1/N) ΣCount(X | a(X) is C) is B
Av(X) is ?D

where Av(X) is the average value of a(X) over U. Thus, computation of the average value, D, reduces to the solution of the nonlinear programming problem

µD(v) = maxa µB((1/N) Σi µC(ai))

subject to

v = (1/N) Σi ai  (average value),

where µD, µB, and µC are the membership functions of D, B, and C, respectively. To apply this rule to the constraints in question, it is necessary to form their conjunction. Then the fuzzy logic solution of the problem may be reduced to the solution of the nonlinear programming problem

µD(v) = maxh [ µ≥0.7*((1/N) Σi µ≥170*(hi)) ∧ µ≤0.1*((1/N) Σi µ≤150*(hi)) ∧ µ≤0.15*((1/N) Σi µ≥200*(hi)) ]

subject to

v = (1/N) Σi hi.

Note that computation of D requires calibration of the membership functions of ≥ 170*, ≥ 0.7*, ≤ 150*, ≤ 0.1*, ≥ 200*, and ≤ 0.15*. Note also that the fuzzy logic solution is a solution in the sense that
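The nonlinear programming problem above has to be solved numerically once the membership functions are calibrated. The Python sketch below restricts the search to a simple three-point family of height profiles (a short group, a tall group, and a middle group), which is an assumption made to keep the example small; the ramp calibrations of the constraints are likewise assumptions.

import numpy as np
from itertools import product

# Assumed ramp calibrations of the fuzzy constraints ("a*" means approximately a).
ge_170 = lambda h: np.clip((h - 165.0) / 10.0, 0, 1)     # taller than 170*
le_150 = lambda h: np.clip((155.0 - h) / 10.0, 0, 1)     # shorter than 150*
ge_200 = lambda h: np.clip((h - 195.0) / 10.0, 0, 1)     # taller than 200*
ge_07  = lambda r: np.clip((r - 0.65) / 0.10, 0, 1)      # proportion is >= 0.7*
le_01  = lambda r: np.clip((0.15 - r) / 0.10, 0, 1)      # proportion is <= 0.1*
le_015 = lambda r: np.clip((0.20 - r) / 0.10, 0, 1)      # proportion is <= 0.15*

def degree(heights, weights):
    """Degree to which a weighted height profile satisfies the conjunction of the three constraints."""
    prop = lambda mu: float(np.dot(weights, mu(heights)))           # relative SigmaCount
    return min(ge_07(prop(ge_170)), le_01(prop(le_150)), le_015(prop(ge_200)))

# mu_D(v): best degree attainable by a profile whose average height is v, searched over a
# three-point family: fraction p_short at 145 cm, fraction p_tall at 205 cm, the rest at x.
mu_D = {}
for p_short, p_tall, x in product(np.arange(0, 0.31, 0.02), np.arange(0, 0.31, 0.02), np.arange(160, 201, 2.0)):
    weights = np.array([p_short, p_tall, 1.0 - p_short - p_tall])
    heights = np.array([145.0, 205.0, x])
    v = round(float(np.dot(weights, heights)))
    mu_D[v] = max(mu_D.get(v, 0.0), degree(heights, weights))

plausible = sorted(v for v, d in mu_D.items() if d >= 0.5)
print(f"average heights compatible to degree >= 0.5: roughly {plausible[0]}-{plausible[-1]} cm")

The printed interval is itself only a crude rendering of the fuzzy number D; a fuller treatment would search over all admissible height profiles rather than the restricted family used here.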

The Concept of a Protoform and the Structure of PNL

A concept that plays a key role in PNL is that of a protoform—an abbreviation of prototypical form. Informally, a protoform is an abstracted summary of an object that may be a proposition, command, question, scenario, concept, decision problem, or, more generally, a system of such objects. The importance of the concept of a protoform derives from the fact that it places in evidence the deep semantic structure of the object to which it applies. For example, the protoform of the proposition p: Monika is young is PF(p): A(B) is C, where A is abstraction of the attribute Age, B is abstraction of Monika, and C is abstraction of young. Conversely, Age is instantiation of A, Monika is instantiation of B, and young is instantiation of C. Abstraction may be annotated, for example, A/Attribute, B/Name, and C/Attribute.value. A few examples are shown in figure 10. Basically, abstraction is a means of generalization. Abstraction has levels, just as summarization does. For example, successive abstractions of p: Monika is young are A(Monika) is young, A(B) is young, and A(B) is C, with the last abstraction resulting in the terminal protoform, or simply the protoform. With this understanding, the protoform of p: Most Swedes are tall is QAs are Bs or, equivalently, Count(B/A) is Q, and the protoform of p: Usually Robert returns from work at about 6 PM is Prob(X is A) is B, where X, A, and B are abstractions of "Time(Robert.returns.from.work)," "About 6 PM," and "Usually." For simplicity, the protoform of p may be written as p**.

Abstraction is a familiar concept in programming languages and programming systems. As will be seen in the following, the role of abstraction in PNL is significantly different and more essential because PNL abandons bivalence. The concept of a protoform has some links to other basic concepts such as ontology (Sowa 1999; Smith and Welty 2002; Smith, Welty, and McGuinness 2003; Corcho, Fernandez-Lopez, and Gomez-Perez 2003), conceptual graph (Sowa 1984), and Montague grammar (Partee 1976). However, what should be stressed is that the concept of a protoform is not limited—as it is in the case of related concepts—to propositions whose meaning can be represented within the conceptual structure of bivalent logic.

As an illustration, consider a proposition, p, which was dealt with earlier:

p: It is very unlikely that there will be a significant increase in the price of oil in the near future.

With reference to the semantic network of p (figure 9), the protoform of p may be expressed as:

Prob(E) is A (A: very unlikely)
E: B(E*) is C (B: epoch; C: near.future)
E*: F(D) (F: significant increase; D: price of oil)
D: G(H) (G: price; H: oil)

Figure 9. Computation of Degree of Compatibility. (E*: Epoch(Variation(Price(oil)) is significant.increase) is near.future. A given time function for the price of oil is matched against the calibrations of significant increase and near future to yield a degree of compatibility.)

Using the protoform of p and calibrations of significant increase, near future, and likely (figure 9), we can compute, in principle, the degree to which any given probability distribution of time functions representing the price of oil satisfies the generalized constraint, Prob(E) is A. As was pointed out earlier, if the degree of compatibility is interpreted as the truth value of p, computation of the truth value of p may be viewed as a PNL-based extension of truth-conditional semantics.

By serving as a means of defining the deep semantic structure of an object, the concept of a protoform provides a platform for a fundamental mode of classification of knowledge based on protoform equivalence, or PF equivalence for short. More specifically, two objects are protoform equivalent at a specified level of summarization and abstraction if at that level they have identical protoforms. For example, the propositions p: Most Swedes are tall and q: Few professors are rich are PF equivalent since their common protoform is QAs are Bs or, equivalently, Count(B/A) is Q. The same applies to the propositions p: Oakland is near San Francisco and q: Rome is much older than Boston. A simple example of PF equivalent concepts is: cluster and mountain.
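PF equivalence can be made concrete with a toy encoding. The Python sketch below represents only one family of GC forms, quantified propositions of the form Q As are Bs, and is an assumption introduced purely to illustrate how two differently worded propositions share a protoform.

from dataclasses import dataclass

@dataclass(frozen=True)
class QuantifiedGC:
    """GC form 'Q As are Bs', equivalently Count(B/A) is Q."""
    quantifier: str   # e.g. "most", "few"
    A: str            # reference class
    B: str            # fuzzy property

    def protoform(self) -> str:
        # Abstraction replaces the instantiated parts by placeholders.
        return "Q As are Bs"

p = QuantifiedGC("most", "Swedes", "tall")       # Most Swedes are tall
q = QuantifiedGC("few", "professors", "rich")    # Few professors are rich

print(p.protoform() == q.protoform())            # True: p and q are PF equivalent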
A less simple example involving PF equivalence of scenarios of decision problems is the following. Consider the scenarios of two decision problems, A and B.

Figure 10. Examples of Translation from NL to PFL. (Most Swedes are tall → Count(A/B) is Q, with A: tall Swedes, B: Swedes, Q: most. Eva is much younger than Pat → (A(B), A(C)) is R, with A: Age, B: Eva, C: Pat, R: much.younger. Distance between New York and Boston is 200 miles → A(B, C) is R. Usually Robert returns from work at about 6 PM → Prob {A is B} is C, with A: Time(Robert.returns.from.work), B: about 6 PM, C: usually. Carol lives in a small city near San Francisco → A(B(C)) is D and E, with A: Location, B: Residence, C: Carol, D: city near San Francisco, E: small city.)

Scenario A: Alan has severe back pain. He goes to see a doctor. The doctor tells him that there are two options: (1) do nothing and (2) do surgery. In the case of surgery, there are two possibilities: (a) surgery is successful, in which case Alan will be pain-free; and (b) surgery is not successful, in which case Alan will be paralyzed from the neck down. Question: Should Alan elect surgery?

Scenario B: Alan needs to fly from San Francisco to St. Louis and has to get there as soon as possible. One option is to fly to St. Louis via Chicago, and the other is to go through Denver. The flight via Denver is scheduled to arrive in St. Louis at time a. The flight via Chicago is scheduled to arrive in St. Louis at time b, with a < b. However, the connection time in Denver is short. If the connection flight is missed, then the time of arrival in St. Louis will be c, with c > b. Question: Which option is best?

The common protoform of A and B is shown in figure 11. What this protoform means is that there are two options, one that is associated with a certain gain or loss and another that has two possible outcomes whose probabilities may not be known precisely.

Figure 11. Protoform Equivalence of Scenarios A and B. (Two options: one is associated with a certain gain or loss; the other has two possible outcomes whose probabilities may not be known precisely.)

The protoform language, PFL, is the set of protoforms of the elements of the generalized-constraint language, GCL. A consequence of the concept of PF equivalence is that the cardinality of PFL is orders of magnitude lower than that of GCL or, equivalently, the set of precisiable propositions in NL. As will be seen in the sequel, the low cardinality of PFL plays an essential role in deduction.

The principal components of the structure of PNL (figure 12) are (1) a dictionary from NL to GCL; (2) a dictionary from GCL to PFL (figure 13); (3) a multiagent, modular deduction database, DDB; and (4) a world knowledge database, WKDB. The constituents of DDB are modules, with a module consisting of a group of protoformal rules of deduction, expressed in PFL (figure 14), that are drawn from a particular domain, for example, probability, possibility, usuality, fuzzy arithmetic (Kaufmann and Gupta 1985), fuzzy logic, search, and so on.

Figure 12. Basic Structure of PNL. (A proposition p in NL is precisiated into its GC form p* = GC(p) in GCL by dictionary 1 and abstracted into its protoform p** = PF(p) in PFL by dictionary 2; deduction draws on the world knowledge database, WKDB, and the deduction database, DDB.)

For example, a rule drawn from fuzzy logic is the compositional rule of inference, expressed in figure 14, where A°B is the composition of A and B, defined in the computational part, in which µA, µB, and µA°B are the membership functions of A, B, and A°B, respectively. Similarly, a rule drawn from probability is shown in figure 15, where D is defined in the computational part.

The rules of deduction in DDB are, basically, the rules that govern propagation of generalized constraints. Each module is associated with an agent whose function is that of controlling execution of rules and performing embedded computations. The top-level agent controls the passing of results of computation from a module to other modules. The structure of protoformal, that is, protoform-based, deduction is shown in figure 16. A simple example of protoformal deduction is shown in figure 17.

The world knowledge database, WKDB, consists of propositions that describe world knowledge, for example, Parking near the campus is hard to find on weekdays between 9 and 4; Big cars are safer than small cars; If A/person works in B/city then it is likely that A lives in or near B; If A/person is at home at time t then A has returned from work at t or earlier, on the understanding that A stayed home after returning from work. Much, perhaps most, of the information in WKDB is perception based.

Figure 13. Structure of PNL Dictionaries. (Dictionary 1 maps a proposition p in NL to its precisiation p*, the GC form; for example, most Swedes are tall → ΣCount(tall.Swedes/Swedes) is most. Dictionary 2 maps the precisiation p* to its protoform PF(p*); for example, ΣCount(tall.Swedes/Swedes) is most → Q As are Bs.)

Figure 14. Compositional Rule of Inference. (Symbolic part: from X is A and (X, Y) is B, infer Y is A°B. Computational part: µA°B(v) = maxu(µA(u) ∧ µB(u, v)).)
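For a discretized universe the computational part of figure 14 is a max-min matrix operation. In the Python sketch below the calibration of A ("approximately 3") and of the relation B ("Y is approximately equal to X") are assumptions chosen for illustration.

import numpy as np

U = np.linspace(0, 10, 101)    # universe of X
V = np.linspace(0, 10, 101)    # universe of Y

mu_A = np.clip(1.0 - np.abs(U - 3.0) / 2.0, 0, 1)                    # X is A: "X is approximately 3"
mu_B = np.clip(1.0 - np.abs(U[:, None] - V[None, :]) / 2.0, 0, 1)    # (X, Y) is B: "Y is approximately equal to X"

# Compositional rule of inference: mu_{A o B}(v) = max_u min(mu_A(u), mu_B(u, v))
mu_AoB = np.max(np.minimum(mu_A[:, None], mu_B), axis=0)

v_star = V[np.argmax(mu_AoB)]
print(f"induced constraint on Y peaks at v = {v_star:.1f} with membership {mu_AoB.max():.2f}")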

World knowledge—and especially world knowledge about probabilities—plays an essential role in almost all search processes, including searching the Web. The Semantic Web and related approaches have contributed to a significant improvement in the performance of search engines. However, for further progress it may be necessary to add to existing search engines the capability to operate on perception-based information. It will be a real challenge to employ PNL to add this capability to sophisticated knowledge-management systems such as the Web Ontology Language (OWL) (Smith, Welty, and McGuinness 2003), Cyc (Lenat 1995), WordNet (Fellbaum 1998), and ConceptNet (Liu and Singh 2004).

An example of PFL-based deduction in which world knowledge is used is the so-called Robert example. A simplified version of the example is the following.

The initial data set is the proposition (perception) p: Usually Robert returns from work at about 6 PM. The question is q: What is the probability that Robert is home at 6:15 PM?

The first step in the deduction process is to use the NL to GCL dictionary for deriving the generalized-constraint forms, GC(p) and GC(q), of p and q, respectively.

Figure 15. Rule Drawn from Probability. (Symbolic part: from Prob(X is A) is B, infer Prob(X is C) is D. Computational part: µD(v) = maxg µB(∫U µA(u)g(u)du), subject to v = ∫U µC(u)g(u)du and ∫U g(u)du = 1, where g is the probability density of X.)

Figure 16. Structure of Protoform-Based Deduction. (An antecedent proposition p is precisiated into GC(p) and abstracted into its protoform PF(p); the deduction database operates on protoforms; the resulting consequent protoform PF(q) is instantiated and retranslated into the consequent proposition q.)

The second step is to use the GCL to PFL dictionary to derive the protoforms of p and q. The forms are:

p*: Prob(Time(Robert.returns.from.work) is about 6 PM) is usually
q*: Prob(Time(Robert is home) is 6:15 PM) is ?E

and

p**: Prob(X is A) is B
q**: Prob(Y is C) is ?D

The third step is to refer the problem to the top-level agent with the query: Is there a rule or a chain of rules in DDB that leads from p** to q**? The top-level agent reports a failure to find such a chain but success in finding a proximate rule of the form

Prob(X is A) is B
Prob(X is C) is D

The fourth step is to search the world knowledge database, WKDB, for a proposition or a chain of propositions that allow Y to be replaced by X.

Figure 17. Example of Protoformal Reasoning. (p: Dana is young → GC(p): Age(Dana) is young → PF(p): X is A. p: Tandy is a few years older than Dana → GC(p): Age(Tandy) is (Age(Dana) + few) → PF(p): Y is (X + B). Applying the rule "from X is A and Y is (X + B), infer Y is A + B," with µA+B(v) = supu(µA(u) ∧ µB(v – u)), yields Age(Tandy) is (young + few).)
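The computational part of this rule is a sup-min convolution, which is easy to carry out on a grid. In the Python sketch below the calibrations of young and few are assumptions; the code computes the induced constraint young + few on Age(Tandy).

import numpy as np

age = np.linspace(0, 100, 1001)     # universe of ages, in years

# Assumed calibrations.
mu_young = np.clip((35.0 - age) / 10.0, 0, 1)                                   # 1 up to 25, 0 beyond 35
mu_few   = np.clip(np.minimum((age - 1.0) / 2.0, (7.0 - age) / 2.0), 0, 1)      # "a few years"

def fuzzy_sum(mu_a, mu_b, grid):
    """Sup-min convolution: mu_{A+B}(v) = sup_u min(mu_A(u), mu_B(v - u))."""
    out = np.zeros_like(grid)
    for k, v in enumerate(grid):
        shifted = np.interp(v - grid, grid, mu_b, left=0.0, right=0.0)   # mu_B(v - u) for each u
        out[k] = np.max(np.minimum(mu_a, shifted))
    return out

mu_tandy = fuzzy_sum(mu_young, mu_few, age)            # Age(Tandy) is (young + few)
support = age[mu_tandy > 0.5]
print(f"membership of Age(Tandy) exceeds 0.5 roughly on [{support.min():.0f}, {support.max():.0f}] years")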

Figure 18. PNL-Based Definition of Statistical Independence. (The ranges of X and Y are granulated into fuzzy intervals S, M, and L. The contingency table has entries such as ΣCount(M/L), the relative ΣCount of occurrences of Y that are M relative to occurrences of X that are L, computed as ΣCount(M/L) = ΣCount(M ∩ L)/ΣCount(L). The degree of independence of Y from X is the degree to which the columns of the table are identical.)

A proposition that makes this possible is: (A/person is in B/location) at T/time if A arrives at B before T, with the understanding that A stays at B after arrival.

The last step involves the use of the modified form of q**: Prob(X is E) is ?D, in which E is "before 6:15 PM." The answer to the initial query is given by the solution of the variational problem associated with the rule that was described earlier (figure 15):

Prob(X is A) is B
Prob(X is C) is D

The value of D is the desired probability.
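A crude numerical rendering of this last step is sketched below in Python. The calibrations of about 6 PM and usually are assumptions, the candidate probability distributions are restricted to a small parametric family (another assumption; the variational problem itself ranges over all densities), and returning before 6:15 PM is identified with being home at 6:15 PM, per the world-knowledge proposition above.

import numpy as np

t = np.linspace(17.0, 20.0, 301)                          # return time, hours after midnight

# Assumed calibrations.
mu_about6 = np.clip(1.0 - np.abs(t - 18.0) / 0.5, 0, 1)   # "about 6 PM"
before_615 = (t <= 18.25).astype(float)                   # returned (hence home) by 6:15 PM
mu_usually = lambda r: np.clip((r - 0.6) / 0.3, 0, 1)     # "usually"

compatible = []
# Search a small parametric family of candidate return-time distributions (truncated normals).
for mean in np.arange(17.5, 19.01, 0.05):
    for sd in np.arange(0.1, 1.01, 0.05):
        g = np.exp(-0.5 * ((t - mean) / sd) ** 2)
        g /= g.sum()                                      # discrete probability mass function on t
        fits_premise = mu_usually(float(np.dot(mu_about6, g)))      # degree of "usually about 6 PM"
        if fits_premise >= 0.5:
            compatible.append(float(np.dot(before_615, g)))         # Prob{home at 6:15 PM}

print(f"Prob(Robert is home at 6:15 PM) lies roughly in [{min(compatible):.2f}, {max(compatible):.2f}]")

The value of D induced by the rule is itself a fuzzy probability; the interval printed above is only the part of it that is compatible to degree 0.5 or more under the assumed calibrations.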
What is important to observe is that there is a tacit assumption that underlies the deduction process, namely, that the chains of deduction are short. This assumption is a consequence of the intrinsic imprecision of perception-based information. Its further implication is that PNL is likely to be effective, in the main, in the realm of domain-restricted systems associated with small universes of discourse.

PNL as a Definition Language

As we move further into the age of machine intelligence and automated reasoning, a problem that is certain to grow in visibility and importance is that of definability—that is, the problem of defining the meaning of a concept or a proposition in a way that can be understood by a machine.

It is a deeply entrenched tradition in science to define a concept in a language that is based on bivalent logic (Gamat 1996, Gerla 2000, Hajek 2000). Thus defined, a concept, C, is bivalent in the sense that every object, X, is either an instance of C or it is not, with no degrees of truth allowed. For example, a system is either stable or unstable, a time series is either stationary or nonstationary, a sentence is either grammatical or ungrammatical, and events A and B are either independent or not independent.

The problem is that bivalence of concepts is in conflict with reality. In most settings, stability, stationarity, grammaticality, independence, relevance, causality, and most other concepts are not bivalent. When a concept that is not bivalent is defined as if it were bivalent, the ancient Greek sorites (heap) paradox comes into play. As an illustration, consider the standard bivalent definition of independence of events, say A and B. Let P(A), P(B), and PA(B) be the probabilities of A, B, and B given A, respectively. Then A and B are independent if and only if PA(B) = P(B).

Now assume that the equality is not satisfied exactly, with the difference between the two sides being ε. As ε increases, at which point will A and B cease to be independent? Clearly, independence is a matter of degree, and furthermore the degree is context dependent. For this reason, we do not have a universally accepted definition of degree of independence (Klir 2000).

One of the important functions of PNL is that of serving as a definition language. More specifically, PNL may be employed as a definition language for two different purposes: first, to define concepts for which no general definitions exist, for example, causality, summary, relevance, and smoothness; and second, to redefine concepts for which universally accepted definitions exist, for example, linearity, stability, independence, and so on. In what follows, the concept of independence of random variables will be used as an illustration.

For simplicity, assume that X and Y are random variables that take values in the interval [a, b]. The interval is granulated as shown in figure 18, with S, M, and L denoting the fuzzy intervals small, medium, and large.

Using the definition of relative ΣCount, we construct a contingency table, C, of the form shown in figure 18, in which an entry such as ΣCount(S/L) is a granulated fuzzy number that represents the relative ΣCount of occurrences of Y that are small relative to occurrences of X that are large.

Based on the contingency table, the degree of independence of Y from X may be equated to the degree to which the columns of the contingency table are identical. One way of computing this degree is, first, to compute the distance between two columns and then to aggregate the distances between all pairs of columns. PNL would be used for this purpose.
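One concrete instantiation of this recipe, with the granule calibrations, the data-generating process, and the aggregation of column distances all assumed for the sake of the example, is sketched below in Python.

import numpy as np

# Assumed fuzzy granules S, M, L on [0, 1], used for both X and Y.
S = lambda z: np.clip((0.4 - z) / 0.2, 0, 1)
M = lambda z: np.clip(np.minimum((z - 0.2) / 0.2, (0.8 - z) / 0.2), 0, 1)
L = lambda z: np.clip((z - 0.6) / 0.2, 0, 1)
granules = {"S": S, "M": M, "L": L}

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 1000)
y = np.clip(0.7 * x + 0.3 * rng.uniform(0, 1, 1000), 0, 1)    # Y depends partially on X

# Contingency table of relative SigmaCounts: entry (i, j) = SigmaCount(Y is i / X is j), min as t-norm.
table = {j: {i: float(np.minimum(granules[i](y), granules[j](x)).sum() / granules[j](x).sum())
             for i in granules} for j in granules}

# Degree of independence: one simple choice is 1 minus the largest gap between columns.
columns = [np.array([table[j][i] for i in granules]) for j in granules]
max_gap = max(float(np.abs(c1 - c2).max()) for c1 in columns for c2 in columns)
print(f"degree of independence of Y from X: {1.0 - max_gap:.2f}")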
An important point that this example illustrates is that, typically, a PNL-based definition involves a general framework with a flexible choice of details governed by the context or a particular application. In this sense, the use of PNL implies an abandonment of the quest for universality or, to put it more graphically, of the one-size-fits-all modes of definition that are associated with the use of bivalent logic.

Another important point is that PNL suggests an unconventional approach to the definition of complex concepts. The basic idea is to define a complex concept in a natural language and then employ PNL to precisiate the definition. More specifically, let U be a universe of discourse and let C be a concept that I wish to define, with C relating to elements of U. For example, U is a set of buildings, and C is the concept of tall building. Let p(C) and d(C) be, respectively, my perception and my definition of C.
Let I(p(C)) and I(d(C)) be the intensions of p(C) and d(C), respectively, with intension used in its logical sense (Cresswell 1973, Gamat 1996), that is, as a criterion or procedure that identifies those elements of U that fit p(C) or d(C). For example, in the case of tall buildings, the criterion may involve the height of a building.

Informally, a definition, d(C), is a good fit or, more precisely, is cointensive, if its intension coincides with the intension of p(C). A measure of goodness of fit is the degree to which the intension of d(C) coincides with that of p(C). In this sense, cointension is a fuzzy concept. As a high-level definition language, PNL makes it possible to formulate definitions whose degree of cointensiveness is higher than that of definitions formulated through the use of languages based on bivalent logic.

Concluding Remarks

Existing theories of natural languages are based, anachronistically, on Aristotelian logic—a logical system whose centerpiece is the principle of the excluded middle: truth is bivalent, meaning that every proposition is either true or not true, with no shades of truth allowed.

The problem is that bivalence is in conflict with reality—the reality of pervasive imprecision of natural languages. The underlying facts are (a) a natural language, NL, is, in essence, a system for describing perceptions; and (b) perceptions are intrinsically imprecise, reflecting the bounded ability of sensory organs, and ultimately the brain, to resolve detail and store information.

PNL abandons bivalence. What this means is that PNL is based on fuzzy logic—a logical system in which everything is, or is allowed to be, a matter of degree.

Abandonment of bivalence opens the door to exploration of new directions in theories of natural languages. One such direction is that of precisiation. A key concept underlying precisiation is the concept of a generalized constraint. It is this concept that makes it possible to represent the meaning of a proposition drawn from a natural language as a generalized constraint. Conventional, bivalent constraints cannot be used for this purpose. The concept of a generalized constraint provides a basis for construction of GCL—a language whose elements are generalized constraints and their combinations. Within the structure of PNL, GCL serves as a precisiation language for NL. Thus, a proposition in NL is precisiated through translation into GCL. Not every proposition in NL is precisiable. In effect, the elements of PNL are precisiable propositions in NL.

What should be underscored is that in its role as a high-level definition language, PNL provides a basis for a significant enlargement of the role of natural languages in scientific theories.

Dedication

This article is dedicated to Noam Chomsky.
precisiable propositions in NL. ative Adjectives. Linguistics and Philosophy 4(1): 1–45.

References

Allan, K. 2001. Natural Language Semantics. Oxford: Blackwell Publishers.

Barwise, J.; and Cooper, R. 1981. Generalized Quantifiers and Natural Language. Linguistics and Philosophy 4(1): 159–209.

Biermann, A. W.; and Ballard, B. W. 1980. Toward Natural Language Computation. American Journal of Computational Linguistics 6(2): 71–86.

Corcho, O.; Fernandez-Lopez, M.; and Gomez-Perez, A. 2003. Methodologies, Tools and Languages for Building Ontologies. Where Is Their Meeting Point? Data and Knowledge Engineering 46(1): 41–64.

Cresswell, M. J. 1973. Logic and Languages. London: Methuen.

Davis, E. 1990. Representations of Commonsense Knowledge. San Francisco: Morgan Kaufmann.

Dubois, D.; and Prade, H. 1996. Approximate and Commonsense Reasoning: From Theory to Practice. In Proceedings of the Foundations of Intelligent Systems, Ninth International Symposium, 19–33. Berlin: Springer-Verlag.

Fellbaum, C., ed. 1998. WordNet: An Electronic Lexical Database. Cambridge, Mass.: The MIT Press.

Fuchs, N. E.; and Schwertel, U. 2003. Reasoning in Attempto Controlled English. In Proceedings of the Workshop on Principles and Practice of Semantic Web Reasoning (PPSWR 2003), 174–188. Lecture Notes in Computer Science. Berlin: Springer-Verlag.

Gamat, T. F. 1996. Language, Logic and Linguistics. Chicago: University of Chicago Press.

Gerla, G. 2000. Fuzzy Metalogic for Crisp Logics. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 175–187. Heidelberg: Physica-Verlag.

Haack, S. 1996. Deviant Logic, Fuzzy Logic: Beyond the Formalism. Chicago: University of Chicago Press.

Haack, S. 1974. Deviant Logic. Cambridge: Cambridge University Press.

Hajek, P. 2000. Many. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 302–304. Heidelberg: Physica-Verlag.

Hajek, P. 1998. Metamathematics of Fuzzy Logic: Trends in Logic (4). Dordrecht, The Netherlands: Kluwer Academic Publishers.

Kaufmann, A.; and Gupta, M. M. 1985. Introduction to Fuzzy Arithmetic: Theory and Applications. New York: Van Nostrand.

Klein, E. 1980. A Semantics for Positive and Comparative Adjectives. Linguistics and Philosophy 4(1): 1–45.

Klement, P.; Mesiar, R.; and Pap, E. 2000. Triangular Norms—Basic Properties and Representation Theorems. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 63–80. Heidelberg: Physica-Verlag.

Klir, G. J. 2000. Uncertainty-Based Information: A Critical Review. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 29–50. Heidelberg: Physica-Verlag.

Lehmke, S. 2000. Degrees of Truth and Degrees of Validity. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 192–232. Heidelberg: Physica-Verlag.

Lehnert, W. G. 1978. The Process of Question Answering—A Computer Simulation of Cognition. Hillsdale, New Jersey: Lawrence Erlbaum Associates.

Lenat, D. B. 1995. CYC: A Large-Scale Investment in Knowledge Infrastructure. Communications of the ACM 38(11): 32–38.

Liu, H.; and Singh, P. 2004. Commonsense Reasoning in and Over Natural Language. In Proceedings of the Eighth International Conference on Knowledge-Based Intelligent Information and Engineering Systems. Brighton, U.K.: KES Secretariat, Knowledge Transfer Partnership Centre.

Macias, B.; and Pulman, S. G. 1995. A Method for Controlling the Production of Specifications in Natural Language. The Computing Journal 38(4): 310–318.

Mani, I.; and Maybury, M. T., eds. 1999. Advances in Automatic Text Summarization. Cambridge, Mass.: The MIT Press.

McAllester, D. A.; and Givan, R. 1992. Natural Language Syntax and First-Order Inference. Artificial Intelligence 56(1): 1–20.

McCarthy, J. 1990. Formalizing Common Sense, ed. V. Lifschitz and J. McCarthy. Norwood, New Jersey: Ablex Publishers.

Mesiar, R.; and Thiele, H. 2000. On T-Quantifiers and S-Quantifiers. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 310–318. Heidelberg: Physica-Verlag.

Novak, V.; and Perfilieva, I., eds. 2000. Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing. Heidelberg: Physica-Verlag.

Novak, V. 1991. Fuzzy Logic, Fuzzy Sets, and Natural Languages. International Journal of General Systems 20(1): 83–97.

Partee, B. 1976. Montague Grammar. New York: Academic Press.

Pedrycz, W.; and Gomide, F. 1998. Introduction to Fuzzy Sets. Cambridge, Mass.: The MIT Press.

Peterson, P. 1979. On the Logic of Few, Many and Most. Journal of Formal Logic 20(1–2): 155–179.

Shafer, G. 1976. A Mathematical Theory of Evidence. Princeton, New Jersey: Princeton University Press.

Smith, B.; and Welty, C. 2002. What Is Ontology? Ontology: Towards a New Synthesis. In Proceedings of the Second International Conference on Formal Ontology in Information Systems. New York: Association for Computing Machinery.

Smith, M. K.; Welty, C.; and McGuinness, D., eds. 2003. OWL Web Ontology Language Guide. W3C Working Draft 31. Cambridge, Mass.: World Wide Web Consortium (W3C).

Sowa, J. F. 1999. Ontological Categories. In Shapes of Forms: From Gestalt Psychology and Phenomenology to Ontology and Mathematics, ed. L. Albertazzi, 307–340. Dordrecht, The Netherlands: Kluwer Academic Publishers.

Sowa, J. F. 1991. Principles of Semantic Networks: Explorations in the Representation of Knowledge. San Francisco: Morgan Kaufmann Publishers.

Sowa, J. F. 1984. Conceptual Structures: Information Processing in Mind and Machine. Reading, Mass.: Addison-Wesley.

Sukkarieh, J. 2003. Mind Your Language! Controlled Language for Inference Purposes. Paper presented at the Joint Conference of the Eighth International Workshop of the European Association for Machine Translation and the Fourth Controlled Language Applications Workshop, Dublin, Ireland, 15–17 May.

Sun, R. 1994. Integrating Rules and Connectionism for Robust Commonsense Reasoning. New York: John Wiley.

Yager, R. R. 1991. Deductive Approximate Reasoning Systems. IEEE Transactions on Knowledge and Data Engineering 3(4): 399–414.

Zadeh, L. A. 2002. Toward a Perception-Based Theory of Probabilistic Reasoning with Imprecise Probabilities. Journal of Statistical Planning and Inference 105(1): 233–264.

Zadeh, L. A. 2001. A New Direction in AI—Toward a Computational Theory of Perceptions. AI Magazine 22(1): 73–84.

Zadeh, L. A. 2000. Toward a Logic of Perceptions Based on Fuzzy Logic. In Discovering the World with Fuzzy Logic: Studies in Fuzziness and Soft Computing (57), ed. V. Novak and I. Perfilieva, 4–25. Heidelberg: Physica-Verlag.

Zadeh, L. A. 1999. From Computing with Numbers to Computing with Words—From Manipulation of Measurements to Manipulation of Perceptions. IEEE Transactions on Circuits and Systems 45(1): 105–119.

Zadeh, L. A. 1997. Toward a Theory of Fuzzy Information Granulation and Its Centrality in Human Reasoning and Fuzzy Logic. Fuzzy Sets and Systems 90(2): 111–127.

Zadeh, L. A. 1986. Outline of a Computational Approach to Meaning and Knowledge Representation Based on the Concept of a Generalized Assignment Statement. In Proceedings of the International Seminar on Artificial Intelligence and Man-Machine Systems, ed. M. Thoma and A. Wyner, 198–211. Heidelberg: Springer-Verlag.

Zadeh, L. A. 1984. Syllogistic Reasoning in Fuzzy Logic and Its Application to Reasoning with Dispositions. In Proceedings, International Symposium on Multiple-Valued Logic, 148–153. Los Alamitos, Calif.: IEEE Computer Society.

Zadeh, L. A. 1983. A Computational Approach to Fuzzy Quantifiers in Natural Languages. Computers and Mathematics 9: 149–184.

Zadeh, L. A. 1979. A Theory of Approximate Reasoning. In Machine Intelligence 9, ed. J. Hayes, D. Michie, and L. I. Mikulich, 149–194. New York: Halstead Press.

Zadeh, L. A. 1968. Probability Measures of Fuzzy Events. Journal of Mathematical Analysis and Applications 23: 421–427.

Lotfi A. Zadeh is a professor in the graduate school and a member of the Department of Electrical Engineering and Computer Science at the University of California, Berkeley, and serves as director of the Berkeley Initiative in Soft Computing. He received his Ph.D. from Columbia University in 1949 and was promoted to the rank of professor in 1957. In 1959, he moved to the University of California, Berkeley, and served as chair of the Department of Electrical Engineering and Computer Sciences from 1963 to 1968. His first paper on fuzzy sets was published in 1965, and since then his research has focused on the development of fuzzy logic and its applications. Zadeh is a fellow of the American Association for Artificial Intelligence, the Association for Computing Machinery, the Institute of Electrical and Electronics Engineers, and the International Fuzzy Systems Association. He is a member of the National Academy of Engineering and a foreign member of the Russian and Finnish Academies of Sciences. Zadeh has received numerous awards in recognition of his development of fuzzy logic, among them the IEEE Medal of Honor, the ACM 2000 Allen Newell Award, and twenty honorary doctorates. His e-mail address is zadeh@cs.berkeley.edu.
