
A Guide to this Volume

Gerhard Lakemeyer (1) and Bernhard Nebel (2)
(1) Institute of Computer Science, University of Bonn, Römerstr. 164, D-53117 Bonn, Germany, gerhard@cs.uni-bonn.de
(2) Department of Computer Science, University of Ulm, D-89069 Ulm, Germany, nebel@informatik.uni-ulm.de

1 Introduction

Knowledge representation (KR) is the area of Artificial Intelligence that deals with the problem of representing, maintaining, and manipulating knowledge about an application domain. Since virtually all Artificial Intelligence systems have to address this problem, KR is one of the central subfields of Artificial Intelligence. Main research endeavors in KR are

- representing knowledge about application areas (e.g., medical knowledge, knowledge about time, knowledge about physical systems),
- developing appropriate representation languages,
- specifying and analyzing reasoning over represented knowledge, and
- implementing systems that support the representation of knowledge and reasoning over the represented knowledge.

While knowledge about an application domain may be represented in a variety of forms, e.g., procedurally in the form of program code or implicitly as patterns of activation in a neural network, research in the area of knowledge representation assumes an explicit and declarative representation, an assumption that distinguishes KR from research in, e.g., programming languages and neural networks. Explicitness of representation means that the represented knowledge is stored in a knowledge base consisting of a set of formal entities that describe the knowledge in a direct and unambiguous way. Declarativeness means that the (formal) meaning of the representation can be specified without reference to how the knowledge is applied procedurally, implying some sort of logical methodology behind it.

Although the above two points seem to be almost universally accepted by researchers working in KR, this consensus has been achieved only recently. Brachman and Levesque mentioned in the Introduction to a collection of papers in 1985 that the "research area of Knowledge Representation has a long, complex, and as yet non-convergent history" [Brachman and Levesque, 1985, p. xiii], an impression that is indeed confirmed by the papers in this collection. A large portion of the papers contain meta-level discussions arguing about the right methods for representing knowledge, or they present approaches that are completely incompatible with a logical, declarative point of view. Nowadays, however, the picture has completely changed. Logical methods predominate and methodological problems are hardly discussed any longer [Brachman et al., 1989, Allen et al., 1991, Nebel et al., 1992, Brachman, 1990]. Instead, research papers focus on particular technical representation and reasoning problems and address these problems using methods from logic and computer science. While this development indicates that KR has become a mature scientific discipline, it also leads to the situation that research results in KR appear to be less accessible to the rest of the Artificial Intelligence community. As a matter of fact, it is often argued that the foundational results that are achieved in the KR field are not relevant to Artificial Intelligence at all. We concede that a large amount of KR research probably does not have any immediate impact on building Artificial Intelligence systems. However, this is probably asking for too much. Foundational KR research aims at providing the theoretical foundations on which we can build systems that are useful, comprehensible, and reliable, i.e., it aims at providing the logical and computational foundations of knowledge representation formalisms and reasoning processes.
Results in foundational KR often "only" provide explanations why a particular approach works or how an approach can be interpreted logically. Additionally, the borderlines of what can be represented are explored, and it is analyzed how efficient a reasoning process can be. While this may not be of central concern when building Artificial Intelligence systems, such results are nevertheless important when we want to understand such systems, and when we want to guarantee their reliability. Perhaps the main motivation and driving force behind most research in KR has been the desire to equip artifacts with commonsense. This is literally true of a paper by John McCarthy, first published in 1958 and republished as [McCarthy, 1968], which started the whole KR enterprise, and it is still true, if only implicitly, of the papers in this book. In fact, work on the foundations of KR can largely be identified with work on the foundations of commonsense reasoning, a point of view which we will follow throughout this brief survey. In the following sections, we touch on some basic logical and computational aspects of commonsense reasoning. The reader is warned that this is not a comprehensive overview of the field, which would be far outside the scope of this book. Instead we confine ourselves mainly to those areas that are actually covered by papers in this book.

2 Logical Foundations of Commonsense Reasoning

As mentioned already in the beginning, the main assumption that distinguishes knowledge-based systems from other approaches is that knowledge is represented declaratively in some logic-like language. This is one part of what Brian Smith has called the knowledge representation hypothesis [Smith, 1982]. The other part postulates that these representations play a causal role in engendering the behavior of the system. While this causal connection is present in one form or another in every knowledge-based system, it is fair to say that so far there are very few, if any, theoretical results that explain this connection. Hence most foundational research in KR, including the work reported in this book, deals with problems that arise from the first part of the KR hypothesis and which can be dealt with independently from the second part. In this context, one can identify three fundamental questions:

1. What is the right representation language?
2. What inferences should be drawn from a knowledge base?
3. How do we incorporate new knowledge?

In the rest of this section, we will address each question in turn with an emphasis on the relevant papers in this book.

2.1 The Right Representation Language

While there is little disagreement any more about the assumption that a representation language is a language of logic, where the sentences can be interpreted as propositions about the world, designing an adequate language is not an easy task, since the various desirable features are often incompatible. (Until the late seventies, many so-called representation languages actually violated this fundamental assumption and led to vivid discussions such as [Hayes, 1977, McDermott, 1978].) In particular, very expressive languages usually have poor computational properties, an issue that has drawn considerable interest since a seminal paper by Brachman and Levesque [1984] and which is discussed in more detail in the next section. At this point we only mention that computational considerations have led to the development of languages that are far less expressive than full first-order logic, most notably the so-called concept languages or terminological logics. Four of the papers in this collection are devoted to this topic [Baader and Hollunder, 1994, Bettini, 1994, Allgayer and Franconi, 1994, Donini et al., 1994]. From the point of view of expressiveness, it often seems useful to have special epistemological or ontological primitives built into the language. Shoham and Cousins [1994] survey work in AI on a whole range of mental attitudes like beliefs, desires, goals, or intentions. The need for making such notions explicit is probably most convincing in multi-agent settings, where agents need to reason about each other's mental attitudes in order to communicate and cooperate successfully. [Gottlob, 1994, Kalinski, 1994, Niemelä and Rintanen, 1994] consider the specific case of belief, which, together with knowledge, is probably the best understood among the attitudes. In these papers a very specific aspect of belief is considered, namely the ability to model certain forms of defeasible reasoning by referring explicitly to the system's own epistemic state (see Section 2.2 below). Bettini and Lin [Bettini, 1994, Lin, 1994], on the other hand, are concerned with adding explicit notions of time to the language. While Bettini considers incorporating an existing interval-based concept of time into a temporal logic, Lin proposes a new axiomatization of time, where time instances are defined on the basis of events.

2.2 The Right Inferences

Having explicit representations of knowledge alone is not very useful in general. Instead one wants to reason about these representations to uncover what is implied by them. After all, we use the term commonsense reasoning and not commonsense representation. Until the early seventies, deduction was the main focus of attention as far as inference mechanisms are concerned. It became clear, however, that a lot of commonsense reasoning is not deductive in nature. In particular, many inferences humans draw all the time are uncertain in some sense and may therefore be defeasible if new information becomes available. The prototypical example is the assumption that birds normally fly: if someone tells me about a bird called Tweety, then, knowing nothing else, I conclude that Tweety flies. Later on, if I find out that Tweety is indeed a penguin, I withdraw my earlier conclusion without hesitation. There are essentially two main research fields that try to formalize such reasoning, one which is based on probability theory (see, for example, [Pearl, 1988]) and another which directly models nonmonotonic reasoning by modifying classical logic in one way or another (see, for example, [Brewka, 1991]). While probabilistic methods are not dealt with at all in this volume, nonmonotonic reasoning receives a fairly broad coverage [Baader and Hollunder, 1994, Gottlob, 1994, Kakas, 1994, Kalinski, 1994, Niemelä and Rintanen, 1994, Weydert, 1994]. Except for McCarthy's [1980] Circumscription, the main formalisms of nonmonotonic reasoning are represented in this volume. Baader and Hollunder [1994] discuss extending terminological logics using Reiter's [1980] Default Logic (DL). Kakas extends DL by applying ideas from abductive logic programming to it. Gottlob [1994] relates DL and Moore's [1985] Autoepistemic Logic (AEL) by showing how to faithfully translate DL theories into AEL theories.
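The Tweety pattern above can be sketched in a few lines of code. This is only a toy illustration of nonmonotonicity, not Default Logic or AEL: a default conclusion is drawn only while its justification remains consistent with what is known, so adding a fact can retract an earlier conclusion. All names here are invented for the example.

```python
# Toy sketch of the default "birds normally fly": the conclusion holds
# only while the justification (not being a penguin) is consistent with
# the current knowledge base.

def flies(facts):
    """Apply the default to the individual Tweety given a set of facts."""
    if "bird(tweety)" in facts and "penguin(tweety)" not in facts:
        return True   # justification is consistent, default fires
    return False      # default blocked, or Tweety is not known to be a bird

kb = {"bird(tweety)"}
print(flies(kb))              # True: knowing nothing else, Tweety flies
kb.add("penguin(tweety)")
print(flies(kb))              # False: the earlier conclusion is withdrawn
```

Note how the set of conclusions shrinks as the set of premises grows, which is exactly what classical (monotonic) logic forbids.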
Both Kalinski [1994] and Niemelä and Rintanen [1994] are concerned with complexity issues, the former by considering a weaker form of AEL and the latter by considering only AEL theories of a special form (with applications to other nonmonotonic formalisms as well). Finally, Weydert [1994] presents results on nested conditionals. This work is in the tradition of modeling nonmonotonic inferences on the basis of conditional logics such as [Lewis, 1973, Adams, 1975]. Apart from probabilistic and nonmonotonic reasoning, there are many other forms such as fuzzy, inductive, abductive or analogical reasoning. Of those, the latter two are represented here with one paper each. Console and Dupré [1994] address abduction, which is concerned with finding plausible explanations for a given observation. In particular, they address the problem of finding explanations at different levels of abstraction. Myers and Konolige [1994] discuss reasoning with analogical representations such as maps. They are particularly concerned with integrating both analogical and symbolic (sentential) representations.

2.3 Evolving Knowledge

Since knowledge bases are hardly ever static, devising methods for incorporating new information into a knowledge base is of great importance in KR research. This problem, often referred to as belief revision, is particularly challenging if the new information conflicts with the contents of the old knowledge base. Over the past decade, substantial progress has been made on the topic of belief revision, particularly since the ground-breaking work by Alchourrón, Gärdenfors, and Makinson [1985], who propose postulates which any rational revision operator should obey (now referred to as AGM-postulates). Later, Katsuno and Mendelzon [1991] introduce an important distinction between revising a knowledge base, which refers to incorporating new information about a static world, and updating it, where the new information reflects changes in the world. They also propose a set of rationality postulates for update operators. In this volume, Boutilier [1994] and Nejdl and Banagl [1994] present new results following this line of research. Nejdl and Banagl define subjunctive queries for knowledge bases in the case of both update and revision. In particular, they show that their query semantics for revision and update satisfies precisely the AGM-postulates and the Katsuno-Mendelzon-postulates, respectively. Boutilier shows that, in the context of conditional logic, belief revision and nonmonotonic reasoning have precisely the same properties, further substantiating the claim that the two areas are closely related. Witteveen and Jonker [1994] address revision from a somewhat different angle. Here the emphasis is on finding plausible expansions of logic programs, which are incoherent under the well-founded semantics, such that the revised programs are no longer incoherent.

3 Commonsense Reasoning as Computation
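The revision/update distinction can be made concrete with a small model-based sketch. Assume worlds are represented as sets of true atoms and a belief state is a set of worlds. The revision operator below is one concrete distance-based instance (Dalal-style): it keeps the models of the new information that are globally closest to the old belief state. The update operator works pointwise, replacing each old world by the closest new worlds. These particular operators are illustrative choices, not the AGM or Katsuno-Mendelzon postulate systems themselves, which constrain whole classes of operators.

```python
# Worlds are frozensets of true atoms; a belief state is a set of worlds.

def dist(w1, w2):
    """Hamming distance: number of atoms on which two worlds differ."""
    return len(w1 ^ w2)

def revise(old, new):
    """Keep the new-information worlds globally closest to the old state."""
    d = min(min(dist(w, v) for v in old) for w in new)
    return {w for w in new if min(dist(w, v) for v in old) == d}

def update(old, new):
    """Replace each old world by the new worlds closest to that world."""
    result = set()
    for v in old:
        d = min(dist(w, v) for w in new)
        result |= {w for w in new if dist(w, v) == d}
    return result

old = {frozenset({"a"}), frozenset({"b"})}       # believe: exactly one of a, b
new = {frozenset({"a"}), frozenset({"a", "b"})}  # learn: a holds
print(revise(old, new))   # only the world {a} survives
print(update(old, new))   # both {a} and {a, b} survive
```

The example shows the characteristic divergence: revision about a static world settles on the single closest model, while update, treating the new information as a change, keeps one closest successor for each previously possible world.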

Once a knowledge representation scheme together with its associated commonsense reasoning task has been formalized logically, we can immediately make use of the computational machinery associated with logic. For instance, once we have identified that a particular representation formalism is "simply" a subset of standard first-order logic, we know that resolution (or any other complete proof method) is a method to compute all the valid consequences of a knowledge base. In other words, in such a case, commonsense reasoning could be reduced to a well-known computation technique. However, this point of view is over-simplifying. First of all, often one deals with non-standard logics, e.g., non-monotonic or modal logics, for which standard techniques do not work. Secondly, even in the case that one only has a subset of

standard first-order logic, usually we do not want an arbitrary method, but an algorithm that is as efficient as possible, a problem that is addressed by most of the papers in this volume. Efficiency is indeed one of the major problems when we turn logical formalization into computation. As is well-known, even propositional logic requires already significant computational resources, i.e., reasoning in propositional logic is NP-hard (consult, for instance, [Garey and Johnson, 1979] for an introduction to computational complexity theory). On the other hand, commonsense reasoning appears to be quite fast when humans perform it and, moreover, should work reasonably fast on computers if the system is required to be of any use [Levesque, 1988]. In particular, if it is required that the reasoning process is computationally tractable, we are often forced to restrict the expressiveness of the representation language or to give up on the accuracy of the answer [Levesque and Brachman, 1987]. Research questions coming up in this context are:

1. Can we specify an inference algorithm for the reasoning task?
2. What is the computational complexity of the reasoning task?
3. How can we achieve tractability?

3.1 Inference Algorithms

As is evident from most papers, the formalization of a commonsense reasoning task as a form of logical inference is usually not overwhelmingly difficult, provided appropriate formal techniques and tools are employed. For instance, the semantics of a terminological logic extended by operators to express collective entities and relations [Allgayer and Franconi, 1994] can be specified on less than half a page. What appears to be much more involved is the specification of an appropriate reasoning technique, i.e., inference algorithms. As pointed out above, one could employ standard proof techniques if the formalism under consideration is (a notational variant of) a subset of standard first-order logic. However, it does not make sense to use general proof methods if specialized reasoning techniques, tailored to the restricted language, turn out to be much more efficient. Moreover, one might be able to specify methods that always terminate, providing us with a sound, complete, and terminating method for reasoning in this language. Allgayer and Franconi [1994], for instance, showed in their paper that it is possible to extend the tableau-based technique introduced by Schmidt-Schauß and Smolka [Schmidt-Schauß and Smolka, 1991] in an almost straightforward way to terminological logics containing operators for collective entities. Baader and Hollunder [1994] also start with terminological logics, but extend these by incorporating default logic [Reiter, 1980]. However, in this case it is not possible to use standard first-order logic methods. Nevertheless, as they are able to show, it is possible to combine the tableau-based reasoning techniques for terminological logics with reasoning techniques developed for default logics [Junker and Konolige, 1990, Schwind and Risch, 1991], leading to an inference algorithm for the combined formalism.
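The payoff of specialized techniques for restricted languages can be seen in a classic case: entailment of an atom from a propositional Horn theory (clauses with at most one positive literal) is decidable by simple forward chaining in polynomial time, whereas reasoning in unrestricted propositional logic is NP-hard. The following sketch uses invented names and a deliberately naive fixpoint loop; it is an illustration of the idea, not an algorithm from any of the cited papers.

```python
# Forward chaining for propositional Horn theories: repeatedly fire rules
# whose bodies are already derived until a fixpoint is reached. Runs in
# polynomial time in the size of the theory.

def horn_entails(clauses, goal):
    """clauses: list of (body_atoms, head_atom); facts have an empty body."""
    derived = set()
    changed = True
    while changed:
        changed = False
        for body, head in clauses:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return goal in derived

rules = [((), "bird"),
         (("bird",), "has_wings"),
         (("has_wings",), "can_flap")]
print(horn_entails(rules, "can_flap"))  # True
```

This tractability is precisely what makes Horn theories attractive as approximation targets, as in the knowledge compilation work discussed in Section 3.4.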

It should be noted that in order to guarantee decidability, it is necessary to use a somewhat nonstandard interpretation of open defaults. However, since, as shown by Baader and Hollunder, the standard interpretation of open defaults not only leads to undecidability but also to counter-intuitive results, giving up this interpretation does not seem to be much of a sacrifice.

3.2 Computational Complexity of Reasoning

An inference algorithm for a particular commonsense reasoning task demonstrates that there is one way to turn this task into computation. However, it does not answer the question whether this is the most efficient way. In order to answer this question, computational complexity theory can be used for analyzing the inherent difficulty of the problem. Such an analysis can guide the search for more efficient algorithms or for a reformulation of the reasoning problem in a way that renders reasoning more efficient. For instance, Donini et al. [1994] study the extension of terminological logics by an epistemic operator and show that this operator does not increase the computational complexity of reasoning in one of the standard terminological logics (the so-called ALC language [Schmidt-Schauß and Smolka, 1991]). Moreover, Donini et al. [1994] are able to show that in some relevant special cases the complexity goes even down from co-NP-hardness to polynomial time. Furthermore, a computational complexity analysis can be used to compare and contrast different reasoning problems. Kautz and Selman [1994] analyze the computational problems arising when approximating arbitrary propositional theories by Horn theories. They show that such a Horn theory may sometimes be of exponential size and that it is unlikely that a dense representation can be found in all cases. A final example for the use of computational complexity theory is the paper by Gottlob [1994]. Although this paper is not by itself a paper on computational complexity analysis of commonsense reasoning, it makes use of computational complexity results [Gottlob, 1992] that show that the three main forms of nonmonotonic reasoning all have the same complexity, which implies that there must exist (polynomial) translations between these formalisms. Based on this observation, Gottlob develops a translation from default logic to autoepistemic logic that is quite interesting.

3.3 The Expressiveness vs. Efficiency Tradeoff

If a reasoning problem can be shown to require time that is not polynomial in the size of the problem description (under the assumption that NP ≠ P), this implies that in the worst case we will not get an answer in tolerable time when the problem description grows beyond a certain (usually moderate) size. Of course, if the problem descriptions are almost always small, such computational complexity results are irrelevant. Usually, though, we want to deal with more

than 20 concepts or 10 default rules. So we should consider the possibility of worst cases for moderately sized problem descriptions. One way to exclude worst cases is to restrict the expressiveness of the representation language the reasoning task has to deal with. For instance, Brachman and Levesque showed that excluding a particular operator from a terminological logic reduces the complexity of reasoning from NP-hardness to polynomiality [Brachman and Levesque, 1984, Levesque and Brachman, 1987]. Subsequent investigations along this line [Donini et al., 1991] have shown that requiring polynomiality of the inference algorithm leads to a severe restriction on the possible constructs one can use. Although there have been strong arguments about the usefulness of achieving efficiency by restricting the expressiveness [Doyle and Patil, 1991], there seems to be nevertheless a consensus that it is useful to analyze special cases of general reasoning patterns that can be solved more easily than the general problem, provided the special cases are relevant. Moreover, restricting the expressiveness can mean a number of things that are quite different from, for example, excluding a particular operator from a representation language. For instance, instead of considering a representation language with less constructs, it makes sometimes sense to use a language with more constructs but with restrictions on the structure of allowed expressions. Donini et al. [1994] show that enlarging a terminological logic with an epistemic operator for building concepts that are used as queries and restricting the forms of the query can indeed lead to a more natural reasoning task which is also more efficient. The paper by Niemelä and Rintanen [1994] aims again at guaranteeing polynomial runtime in all cases by restricting expressive power. However, they do not restrict the expressive power by disallowing logical operators in AEL theories, but they consider restrictions on the form of the theories. In particular, they show that reasoning in stratified AEL Horn theories can be done in polynomial time. The work by Myers and Konolige [1994] also extends the representational framework (first-order logic) in order to achieve efficiency. In this case, however, the aim is not to guarantee worst-case efficiency in all cases, but to provide special means for representing knowledge about one particular domain, spatial knowledge, that can be more naturally represented and more efficiently reasoned about using analogical representations, which are also much more restricted than general propositional representations. The main problem Myers and Konolige identify and solve is the integration of analogical reasoning with the general framework of reasoning in first-order logic.

3.4 The Accuracy vs. Efficiency Tradeoff

If the expressiveness of a representation cannot be restricted, other means for getting timely answers are called for, for example, by restricting the processing time or by employing incomplete reasoning methods. Usually, as in the cases above, one gives up on the quality or accuracy of an answer. While this may lead to the desired runtime behavior, it raises the question as to how far we can still trust answers

from a representation and reasoning system. In other words, we are seeking a principled description of the reasoning capabilities of an incomplete reasoner. Kautz and Selman [1991] addressed this problem by a "knowledge compilation" technique. They propose to compute (off-line) Horn theories that approximate the logical contents of a given arbitrary theory. Kautz and Selman show that the approximating theory can become very large and, as mentioned above, that it is very unlikely that dense representations of an approximating Horn theory exist in all cases. Although there are sometimes ways around this problem, this approximation can lead to computational problems in itself [Kautz and Selman, 1994]. Nevertheless, their approximation scheme appears to be interesting, since instead of general Horn theories one may aim for more restricted forms of such theories which can be polynomially bounded in size.

Greiner and Schuurmans [1994] address the multiple extension problem of default reasoning [Reiter, 1987], which is known to be one source of computational complexity in default reasoning [Gottlob, 1992, Nebel, 1991]. They propose to approximate default reasoning by ordering the defaults linearly, where the particular order chosen is intended to be "optimally correct." As they show, it is not possible to compute such an ordering in polynomial time, but they approximate such an ordering by computing a locally optimal ordering.

The paper by Witteveen and Jonker [1994] applies a similar method to achieve tractability for revising logic programs. They show that a globally minimal revision cannot be computed in polynomial time, but a locally minimal revision can well be computed in polynomial time.

4 Outlook

The collection of papers in this book does certainly not give a complete overview of the research going on at providing foundations for knowledge representation and reasoning. For instance, probabilistic approaches are not represented at all. Nevertheless, the set of papers in this book covers a wide range of topics in the area of foundational KR&R research and highlights the common research methodology, namely, to analyze representation and reasoning tasks from a logical and computational perspective. As already mentioned in the Introduction, this research methodology does most probably not lead to any immediate benefit in the sense that we can build faster or better reasoning systems. However, by providing the theoretical underpinning for KR&R systems, this research will help us understand where and what the limits of representation and reasoning are and how we can guarantee a reasonable behavior of KR&R systems.

References

[AAAI-90, 1990] Proceedings of the 8th National Conference of the American Association for Artificial Intelligence, Boston, MA, August 1990. MIT Press.
[Adams, 1975] E. W. Adams. The Logic of Conditionals. Reidel, Dordrecht, Holland, 1975.

[Alchourrón et al., 1985] Carlos E. Alchourrón, Peter Gärdenfors, and David Makinson. On the logic of theory change: Partial meet contraction and revision functions. Journal of Symbolic Logic, 50(2):510-530, June 1985.
[Allen et al., 1991] J. Allen, R. Fikes, and E. Sandewall, editors. Principles of Knowledge Representation and Reasoning: Proceedings of the 2nd International Conference, Cambridge, MA, April 1991. Morgan Kaufmann.
[Allgayer and Franconi, 1994] Jürgen Allgayer and Enrico Franconi. Collective entities and relations in concept languages. In Lakemeyer and Nebel [1994].
[Baader and Hollunder, 1994] Franz Baader and Bernhard Hollunder. Computing extensions of terminological default theories. In Lakemeyer and Nebel [1994].
[Bettini, 1994] C. Bettini. A formalization of interval-based temporal subsumption in first order logic. In Lakemeyer and Nebel [1994].
[Boutilier, 1994] Craig Boutilier. Normative, subjunctive and autoepistemic defaults. In Lakemeyer and Nebel [1994].
[Brachman and Levesque, 1984] Ronald J. Brachman and Hector J. Levesque. The tractability of subsumption in frame-based description languages. In Proceedings of the 4th National Conference of the American Association for Artificial Intelligence, pages 34-37, Austin, TX, 1984.
[Brachman and Levesque, 1985] Ronald J. Brachman and Hector J. Levesque, editors. Readings in Knowledge Representation. Morgan Kaufmann, Los Altos, CA, 1985.
[Brachman et al., 1989] R. J. Brachman, H. J. Levesque, and R. Reiter, editors. Principles of Knowledge Representation and Reasoning: Proceedings of the 1st International Conference, Toronto, ON, May 1989. Morgan Kaufmann.
[Brachman, 1990] Ronald J. Brachman. The future of knowledge representation. In AAAI-90 [1990], pages 1082-1092.
[Brewka, 1991] Gerhard Brewka. Nonmonotonic Reasoning: Logical Foundations of Commonsense. Cambridge University Press, Cambridge, UK, 1991.
[Console and Dupré, 1994] Luca Console and Daniele Dupré. Abductive reasoning with abstraction axioms. In Lakemeyer and Nebel [1994].
[Donini et al., 1991] Francesco M. Donini, Maurizio Lenzerini, Daniele Nardi, and Werner Nutt. Tractable concept languages. In Proceedings of the 12th International Joint Conference on Artificial Intelligence, pages 458-465, Sydney, Australia, August 1991.
[Donini et al., 1994] Francesco Donini, Maurizio Lenzerini, Daniele Nardi, Andrea Schaerf, and Werner Nutt. Queries, rules and definitions as epistemic sentences in concept languages. In Lakemeyer and Nebel [1994].
[Doyle and Patil, 1991] Jon Doyle and Ramesh S. Patil. Two theses of knowledge representation: Language restrictions, taxonomic classification, and the utility of representation services. Artificial Intelligence, 48(3):261-298, April 1991.
[Garey and Johnson, 1979] Michael R. Garey and David S. Johnson. Computers and Intractability--A Guide to the Theory of NP-Completeness. Freeman, San Francisco, CA, 1979.
[Gottlob, 1992] Georg Gottlob. Complexity results for nonmonotonic logics. Journal of Logic and Computation, 2(3), 1992.
[Gottlob, 1994] Georg Gottlob. The power of beliefs or translating default logic into standard autoepistemic logic. In Lakemeyer and Nebel [1994].
[Greiner and Schuurmans, 1994] Russell Greiner and Dale Schuurmans. Learning an optimally accurate representation system. In Lakemeyer and Nebel [1994].
[Hayes, 1977] Patrick J. Hayes. In defence of logic. In Proceedings of the 5th International Joint Conference on Artificial Intelligence, pages 559-565, Cambridge, MA, August 1977.
[Junker and Konolige, 1990] Ulrich Junker and Kurt Konolige. Computing extensions of autoepistemic and default logics with a truth maintenance system. In AAAI-90 [1990], pages 278-283.
[Kakas, 1994] A. C. Kakas. Default reasoning via negation as failure. In Lakemeyer and Nebel [1994].
[Kalinski, 1994] Jürgen Kalinski. Weak autoepistemic reasoning and well-founded semantics. In Lakemeyer and Nebel [1994].
[Katsuno and Mendelzon, 1991] Hirofumi Katsuno and Alberto O. Mendelzon. On the difference between updating a knowledge base and revising it. In Allen et al. [1991], pages 387-394.
[Kautz and Selman, 1994] Henry Kautz and Bart Selman. Forming concepts for fast inference. In Lakemeyer and Nebel [1994].
[Lakemeyer and Nebel, 1994] Gerhard Lakemeyer and Bernhard Nebel, editors. Foundations of Knowledge Representation and Reasoning. Springer-Verlag, Berlin, Heidelberg, 1994.
[Levesque and Brachman, 1987] Hector J. Levesque and Ronald J. Brachman. Expressiveness and tractability in knowledge representation and reasoning. Computational Intelligence, 3:78-93, 1987.
[Levesque, 1988] Hector J. Levesque. Logic and the complexity of reasoning. Journal of Philosophical Logic, 17:355-389, 1988.
[Lewis, 1973] David K. Lewis. Counterfactuals. Harvard University Press, Cambridge, MA, 1973.
[Lin, 1994] Yuen Q. Lin. A common-sense theory of time. In Lakemeyer and Nebel [1994].
[McCarthy, 1968] John McCarthy. Programs with common sense. In M. Minsky, editor, Semantic Information Processing, pages 403-418. MIT Press, Cambridge, MA, 1968.
[McCarthy, 1980] John McCarthy. Circumscription--a form of non-monotonic reasoning. Artificial Intelligence, 13(1-2):27-39, 1980.
[McDermott, 1978] Drew V. McDermott. Tarskian semantics, or no notation without denotation! Cognitive Science, 2(3):277-282, July 1978.
[Moore, 1985] Robert C. Moore. Semantical considerations on nonmonotonic logic. Artificial Intelligence, 25:75-94, 1985.
[Myers and Konolige, 1994] Karen Myers and Kurt Konolige. Reasoning with analogical representations. In Lakemeyer and Nebel [1994].
[Nebel et al., 1992] B. Nebel, C. Rich, and W. Swartout, editors. Principles of Knowledge Representation and Reasoning: Proceedings of the 3rd International Conference, Cambridge, MA, October 1992. Morgan Kaufmann.
[Nebel, 1991] Bernhard Nebel. Belief revision and default reasoning: Syntax-based approaches. In Allen et al. [1991], pages 417-428.
[Nejdl and Banagl, 1994] Wolfgang Nejdl and Markus Banagl. Asking about possibilities - revision and update semantics for subjunctive queries (extended report). In Lakemeyer and Nebel [1994].
[Niemelä and Rintanen, 1994] Ilkka Niemelä and Jussi Rintanen. On the impact of stratification on the complexity of nonmonotonic reasoning. In Lakemeyer and Nebel [1994].
[Pearl, 1988] Judea Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, San Mateo, CA, 1988.
[Reiter, 1980] Raymond Reiter. A logic for default reasoning. Artificial Intelligence, 13(1):81-132, April 1980.
[Reiter, 1987] Raymond Reiter. Nonmonotonic reasoning. Annual Review of Computing Sciences, 2, 1987.
[Schmidt-Schauß and Smolka, 1991] Manfred Schmidt-Schauß and Gert Smolka. Attributive concept descriptions with complements. Artificial Intelligence, 48:1-26, 1991.
[Schwind and Risch, 1991] C. Schwind and V. Risch. A tableau-based characterisation for default logic. In R. Kruse and P. Siegel, editors, Symbolic and Quantitative Approaches to Uncertainty, Proceedings of the European Conference ECSQAU, pages 310-317, Marseilles, France, 1991. Springer-Verlag.
[Selman and Kautz, 1991] Bart Selman and Henry Kautz. Knowledge compilation using Horn approximations. In Proceedings of the 9th National Conference of the American Association for Artificial Intelligence, pages 904-909, Anaheim, CA, July 1991.
[Shoham and Cousins, 1994] Yoav Shoham and Steve B. Cousins. Logics of mental attitudes in AI - a very preliminary survey. In Lakemeyer and Nebel [1994].
[Smith, 1982] Brian C. Smith. Reflection and Semantics in a Procedural Language. PhD thesis, Massachusetts Institute of Technology, Cambridge, MA, 1982. Report MIT/LCS/TR-272.
[Weydert, 1994] Emil Weydert. Hyperrational conditionals - monotonic reasoning about nested default conditionals. In Lakemeyer and Nebel [1994].
[Witteveen and Jonker, 1994] Cees Witteveen and Catholijn M. Jonker. Revision by expansion in logic programs. In Lakemeyer and Nebel [1994].

Collective Entities and Relations in Concept Languages*

Jürgen Allgayer¹ and Enrico Franconi²

¹ IBM Germany, Software Architectures and Technologies, Schloß-Str. 70, D-7000 Stuttgart 1, Germany (allgayer@vnet.ibm.com)
² Istituto per la Ricerca Scientifica e Tecnologica (IRST), I-38050 Povo TN, Italy (franconi@irst.it)

Abstract. In this paper a way of including collections and collective relations within a concept language, chosen as the formalism for representing the semantics of sentences, is presented. A twofold extension of the ALC concept language is investigated: (1) special relations introduce collective entities either out of their components or out of other collective entities; (2) plural quantifiers on collective relations specify their possible reading. The formal syntax and semantics of the concept language are given, together with a sound and complete algorithm to compute satisfiability and subsumption of concepts, and to compute recognition of individuals. An advantage of this formalism is the possibility of reasoning and of stepwise refinement in the presence of scoping ambiguities. Moreover, many phenomena covered by the Generalized Quantifiers Theory are easily captured within this framework. In the final part a way to include a theory of parts (mereology) is suggested.

1 Introduction

Collective entities and collective relations play an important role in natural language. In order to capture the full meaning of sentences like "The Beatles sing 'Yesterday'", a knowledge representation language should be able to express and reason about plural entities, like "the Beatles", and their relationships, like "sing", with any possible reading (cumulative, distributive or collective). In this paper it is shown how a concept language, a knowledge representation language of the KL-ONE family, also called Frame-Based Description

* This paper is a reduced version of a paper to appear in Minds and Machines, special issue on Knowledge Representation for Natural Language Processing. This work has been partially supported by the Italian National Research Council (CNR), project "Sistemi Informatici e Calcolo Parallelo", and by the IRST MAIA project. We would like to thank also Alessandro Artale, Werner Nutt and Achille Varzi for the helpful and incisive discussions we had with them.

Languages, Terminological Logics, Term Subsumption Languages, Taxonomic Logics or Description Logics [30], can be extended in order to represent and reason about collective entities or collections [1, 9]. Although this work has been conceived for concept languages, it can also be applied to other knowledge representation formalisms, such as Conceptual Graphs [26] and Intensional Propositional Semantic Networks (SNePS) [24].

An analysis of plurals in natural language leads us to distinguish between two different categories of plural entities: classes and collections. Classes are involved in sentences like "Men are persons", where the NP "men" is represented by means of the class predicate MAN:

∀x. MAN(x) → PERSON(x).

Collections, on the other hand, are contingent aggregates of objects; they are interpreted as entities of the domain, at the same level as the objects they are composed of, and they should be represented as terms instead of predicates. For example, the plural entity Beatles is interpreted as a collection, i.e. as an element of the domain at the same level as single individuals, and in the logical form it does not appear as a predicate, but as a term. Thus the logical form of the sentences "The Beatles are John, Paul, George and Ringo" and "John is the leader of the Beatles" is the following:

∋(beatles, john), ∋(beatles, paul), ∋(beatles, george), ∋(beatles, ringo),
LED-BY(beatles, john).

In order to give a meaning to the terms denoting collections, a weakened form of Set Theory, called Collection Theory, is adopted. Within the collection theory a lattice-theoretical approach for the treatment of plurals, as in [18], is possible; other approaches in the literature include [3, 17, 22, 26, 27]. It turns out that the collection theory is more adequate than set theory to represent plurals, because the extensionality principle does not hold, and the dangerous leap into a second-order theory is avoided. Moreover, plural quantifiers are introduced in order to capture the different readings of a relation when applied to a collection. This approach allows for the representation of ambiguous readings, so that in the presence of incomplete information a complete reasoning can still be carried on. Interesting connections with the Generalized Quantifiers Theory [5] can be drawn; for a deep analysis of the relations between collection theory and the Generalized Quantifiers Theory please refer to the full paper [10]. A more radical departure from set theory for representing collections, introducing a non-extensional mereology [25], is proposed in the last part of this paper.

The enriched concept language proposed here is intended to form the semantical and computational means for the representation of plurals and plural quantifiers in natural language. In the full paper the use of this formalism within a natural language dialogue system is extensively discussed.

The paper is organized as follows. At the beginning the Collection Theory and the Plural Quantifiers are introduced in a generic logical framework. Then it is presented how these theories can be merged into the propositionally complete concept language ALC, obtaining the language ALCS; many examples will clarify the expressive power of the newly obtained language. An account of the computational properties of ALCS is given, and a sound and complete decision procedure for a weaker form of the language is devised. Finally, the collection theory is abandoned in favour of a Mereology, i.e. a theory of the part-whole relation.

2 The Collection Theory

In this section a simple formal way to model collections of objects is introduced. A collection is formed by selecting certain objects, called members or elements of the collection. Like in the standard set theory, a primitive membership binary relation (denoted by ∋) is introduced to relate the collective entities with their elements; both objects and collections of objects are denoted by terms. For example, the formulas

∋(beatles, john), ∋(beatles, paul), ∋(beatles, george), ∋(beatles, ringo)

are intended to mean that the entities john, paul, george and ringo are elements of the collective entity named beatles; thus, their conjunction represents the meaning of the sentence "The Beatles are John, Paul, George and Ringo".

A collection can be related with other collections: it can share some components with them, or it can include all the components of other collections. The sub-collection and overlapping relations between collections are defined by means of the primitive relation:

Definition 1. (Sub-collection and Overlapping)
⊆(a, b) iff ∀x. ∋(a, x) → ∋(b, x)
∩(a, b) iff ∃x. ∋(a, x) ∧ ∋(b, x)

The ⊆ relation is defined as the usual subset relation of ordinary set theory, whereas the ∩ relation is defined to be true for all the non-empty set-theoretic intersections, i.e. ∩(a, b) if and only if a ∩ b ≠ ∅. For example, the formulas

⊆(beatles, appleCharterMembers), ∩(beatles, mostPopularSingers)

express that each of the beatles is also a founder of the Apple Records company, and that some of them are among the most popular singers. Since no other axiom for such relations is introduced, the ⊆ and ∩ relations have weaker properties than their counterparts in set theory: it follows from the definition that the ⊆ relation is reflexive and transitive but not anti-symmetric, since the extensionality axiom does not hold; i.e. it is a quasi-ordering. In set theory, the extensionality axiom says that
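As an illustration (not part of the original formalism), Definition 1 can be exercised over a small finite model. The following Python sketch is our own encoding: the primitive ∋ relation is represented as a set of (collection, element) pairs, and the ⊆ and ∩ relations are derived exactly as in the definition.

```python
# A minimal finite-model sketch of the Collection Theory (an illustration,
# not code from the paper): "∋" is a set of (collection, element) pairs,
# and Definition 1 derives sub-collection (⊆) and overlapping (∩).

def elements(member_of, a):
    """All x such that a ∋ x."""
    return {x for (c, x) in member_of if c == a}

def subcollection(member_of, a, b):
    """⊆(a, b): every element of a is also an element of b."""
    return elements(member_of, a) <= elements(member_of, b)

def overlaps(member_of, a, b):
    """∩(a, b): a and b share at least one element."""
    return bool(elements(member_of, a) & elements(member_of, b))

member_of = {
    ("beatles", "john"), ("beatles", "paul"),
    ("beatles", "george"), ("beatles", "ringo"),
    ("appleCharterMembers", "john"), ("appleCharterMembers", "paul"),
    ("appleCharterMembers", "george"), ("appleCharterMembers", "ringo"),
}

# ⊆ holds in both directions, yet the two terms remain distinct entities:
# extensionality is not an axiom of the theory, so ⊆ is only a quasi-ordering.
assert subcollection(member_of, "beatles", "appleCharterMembers")
assert subcollection(member_of, "appleCharterMembers", "beatles")
assert overlaps(member_of, "beatles", "appleCharterMembers")
```

Note how the two mutually included collections are not identified: the model simply contains two different terms with the same elements, which is exactly the non-extensional behaviour described in the text.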

two sets are equal if and only if they have the same elements. In our framework, from

⊆(appleCharterMembers, beatles), ⊆(beatles, appleCharterMembers)

it does not follow that appleCharterMembers = beatles, because the entity beatles could have different attributes from the entity representing the group of people who founded the Apple Records company, like, say, legal liability or taxes to pay. As a trivial consequence of this choice, two collections having the same elements are not necessarily equal. A richer theory would include also negated relations, such as "∌", "∦" (disjointness) and "⊈"; they are needed to properly cover many natural language phenomena, see [28]. However, it is argued (though not yet proved) that, in the context of concept languages, negated relations may lead to undecidability. Finally, it is worth noting that paradoxes, like the Russell paradox, are avoided and well-foundedness of the Collection Theory is guaranteed if the negation of the ∋ relation can not be expressed. This simple framework will be referred to as the Collection Theory.

3 Plural Quantifiers

Generic relations which apply to collective entities can be quantified in different ways. A relation holds not only directly between the objects of predications, but may be distributed between the elements of such objects, if they are collections. Thus, relationships between collections have a more structured semantics than the standard one, and representational means are introduced to capture the semantics of the (possibly underdetermined) reading variants of NL expressions. The ◁ and ▷ (resp. ⊴ and ⊵) operators, called plural quantifiers, introduce the left (resp. right) distributive and cumulative readings for generic binary relations having a collection as left (resp. right) argument.

As an example, take the possible readings involving the plural subject of the sentence "John, Paul, George and Ringo sing 'Yesterday'". In the case each man sings separately, we speak of the distributive reading. For the collective reading, all the men together sing the song. Finally, the cumulative reading can describe the mixed situations in which, for instance, one sings alone and separately the others sing together, with the proviso that all of them are involved in some action of singing. In the case of underdetermined reading, the choice among these variants is left open, allowing for the representation of the scope ambiguities in the logical form. This approach helps in delaying the decision for any of those variants of a sentence: a possible incremental growth of the knowledge base might rule out one or another reading. The economy of representation makes it unnecessary for the system to compute all the disambiguated interpretations of the sentence before storing its meaning in the knowledge base [21]. In order to introduce the formalism, consider the following example:

"John is the leader of the Beatles"
LED-BY(beatles, john).

The relation LED-BY has a "collective" reading, i.e. john is the leader of the whole collection beatles.

"The Beatles were born in Liverpool"
◁BORN-IN(beatles, liverpool).

The relation BORN-IN is "left distributive" over the components of the beatles, i.e. each member was born in Liverpool:

BORN-IN(john, liverpool), BORN-IN(paul, liverpool),
BORN-IN(george, liverpool), BORN-IN(ringo, liverpool).

"The Beatles sing 'Yesterday'"
⊴SING(beatles, yesterday).

The relation SING has a "left cumulative" reading with respect to the beatles, i.e. it is possible that any collection of components of beatles sings 'Yesterday', with the proviso that the union of such collections should include at least all the beatles members. For example, a possible valid interpretation for the cumulative reading is the following:

∋(C1, john), ∋(C1, paul), ∋(C2, paul), ∋(C2, george), ∋(C2, ringo), ∋(C2, elvis),
SING(C1, yesterday), SING(C2, yesterday).

In this interpretation, the relation SING holds between the collective entities C1, C2 and yesterday. The 'inclusion' condition of the cumulative reading is satisfied: each member of the beatles belongs to at least one of the collections participating in the relation, ranging from single individuals to the entire beatles collection. Note that the cumulative reading covers both the collective interpretation SING(beatles, yesterday) and the distributive interpretation SING(john, yesterday), SING(paul, yesterday), SING(george, yesterday), SING(ringo, yesterday).

The plural quantifiers are formally defined as follows.

Definition 2. (Plural Quantifiers)
◁R(a, b) iff ∀x. ∋(a, x) → R(x, b)
▷R(a, b) iff ∀x. ∋(b, x) → R(a, x)
⊴R(a, b) iff ∀x. ∋(a, x) → (R(x, b) ∨ (∃s. ∋(s, x) ∧ R(s, b)))
⊵R(a, b) iff ∀x. ∋(b, x) → (R(a, x) ∨ (∃s. ∋(s, x) ∧ R(a, s)))

It is easy to check that the cumulative plural quantified expressions subsume
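Definition 2 can be checked mechanically over a finite model. The following Python sketch is our own illustration (relations are encoded as sets of pairs, as in the earlier sketch; the encoding is an assumption, not the paper's):

```python
# Definition 2 over a finite model: ◁ (left distributive) and ⊴ (left
# cumulative) plural quantifiers, with ∋ and R given as sets of pairs.

def left_distributive(member_of, R, a, b):      # ◁R(a, b)
    return all((x, b) in R for (c, x) in member_of if c == a)

def right_distributive(member_of, R, a, b):     # ▷R(a, b)
    return all((a, x) in R for (c, x) in member_of if c == b)

def left_cumulative(member_of, R, a, b):        # ⊴R(a, b)
    def covered(x):
        # either x itself does R to b, or some collection s containing x does
        return (x, b) in R or any(
            (s, b) in R and (s, x) in member_of for (s, _) in member_of)
    return all(covered(x) for (c, x) in member_of if c == a)

member_of = {("beatles", n) for n in ("john", "paul", "george", "ringo")}
born_in = {(n, "liverpool") for n in ("john", "paul", "george", "ringo")}
assert left_distributive(member_of, born_in, "beatles", "liverpool")

# A mixed (cumulative) situation: john sings alone, the others sing as a trio.
member_of |= {("trio", "paul"), ("trio", "george"), ("trio", "ringo")}
sing = {("john", "yesterday"), ("trio", "yesterday")}
assert left_cumulative(member_of, sing, "beatles", "yesterday")
assert not left_distributive(member_of, sing, "beatles", "yesterday")
```

The last two assertions reproduce the behaviour discussed in the text: the mixed situation validates the cumulative reading but not the distributive one.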

the non-quantified (collective) expressions and the distributive plural quantified ones:

∀x, y. ◁R(x, y) → ⊴R(x, y),    ∀x, y. R(x, y) → ⊴R(x, y),
∀x, y. ▷R(x, y) → ⊵R(x, y),    ∀x, y. R(x, y) → ⊵R(x, y).

In this way, the non-disambiguated logical form of a sentence with multiple interpretations can be represented:

"The Beatles sing 'Yesterday'"
⊴SING(beatles, y), SONG(y), y = yesterday.

Information which comes later in the discourse can monotonically refine the knowledge about the quantifiers scoping. From the sentence "They sing the song all together" the system is able to conclude that they is anaphoric to the Beatles and the song to 'Yesterday', and will produce an interpretation which specializes the preceding underdetermined one:

SING(b, y), SONG(y), b = beatles, y = yesterday.

On the other hand, if later in the discourse the sentence "Each one of them sings the song" appears, the system will add the formulas

◁SING(b, y), SONG(y),

which are also a refinement with respect to the preceding cumulative interpretation.

4 The ALCS concept language

The collection theory is now merged into a larger logic, in order to obtain an expressive, but still decidable, description language. In the following, we will consider the language ALCS, which extends the propositionally complete concept language ALC [23]. With respect to the formal apparatus, we will strictly follow the concept language formalism introduced by [23] and further elaborated, for example, by [6, 8, 7].

Basic types of a concept language are concepts, roles and individuals. A concept is a description gathering the common properties among a collection of individuals; logically, it is a unary predicate ranging over the domain of individuals. Properties are represented by means of roles, which are logically binary relations. According to the syntax rules of figure 1, ALCS concepts (denoted by the letters C and D) are built out of primitive concepts (denoted by the letter A) and roles (denoted by the letter R). Roles are built out of primitive roles (denoted by the letter P), allowing a richer expressivity: the ∋, ⊆ and ∩ relations are the basic roles of the collection theory, while the ◁, ▷, ⊴ and ⊵ plural quantifiers introduce the distributive and cumulative readings for generic roles.

C, D → A        (primitive concept)
     | ⊤        (top)
     | ⊥        (bottom)
     | ¬C       (general complement)
     | C ⊓ D    (conjunction)
     | C ⊔ D    (disjunction)
     | ∀R.C     (universal quantifier)
     | ∃R.C     (existential quantifier)
R → P           (primitive role)
  | ∋           (has element relation)
  | ⊆           (sub-collection relation)
  | ∩           (overlaps relation)
  | ◁R          (left distributive)
  | ▷R          (right distributive)
  | ⊴R          (left cumulative)
  | ⊵R          (right cumulative)

Fig. 1. Syntax rules for the ALCS concept language.

An interpretation I = (Δ^I, ·^I) consists of a set Δ^I of individuals (the domain of I) and a function ·^I (the interpretation function of I) that maps every concept to a subset of Δ^I and every role to a subset of Δ^I × Δ^I, such that the equations of figure 2 are satisfied. An interpretation I is a model for a concept C if C^I ≠ ∅. If a concept has a model, then it is satisfiable, otherwise it is unsatisfiable. A concept C is subsumed by a concept D (written C ⊑ D) if C^I ⊆ D^I for every interpretation I. Subsumption can be reduced to satisfiability, since C is subsumed by D if and only if C ⊓ ¬D is not satisfiable. Usually, concept and role expressions are called TBox terms.

According to the given semantics, the concept non-empty collection, denoting any collection having at least one element, and the concept empty collection, denoting any entity having no elements, can be defined as follows:

COLL ≐ ∃∋.⊤,        ECOLL ≐ ¬COLL.

It is easy to verify the validity of the following statements:

∀x, y. ECOLL(x) → ⊆(x, y),
∀x, y. (COLL(x) ∧ COLL(y)) → (⊆(x, y) → ∩(x, y)).

⊤^I = Δ^I
⊥^I = ∅
(C ⊓ D)^I = C^I ∩ D^I
(C ⊔ D)^I = C^I ∪ D^I
(¬C)^I = Δ^I \ C^I
(∀R.C)^I = {a ∈ Δ^I | ∀b. (a, b) ∈ R^I → b ∈ C^I}
(∃R.C)^I = {a ∈ Δ^I | ∃b. (a, b) ∈ R^I ∧ b ∈ C^I}
⊆^I = {(a, b) ∈ Δ^I × Δ^I | ∀x. (a, x) ∈ ∋^I → (b, x) ∈ ∋^I}
∩^I = {(a, b) ∈ Δ^I × Δ^I | ∃x. (a, x) ∈ ∋^I ∧ (b, x) ∈ ∋^I}
(◁R)^I = {(a, b) ∈ Δ^I × Δ^I | ∀x. (a, x) ∈ ∋^I → (x, b) ∈ R^I}
(▷R)^I = {(a, b) ∈ Δ^I × Δ^I | ∀x. (b, x) ∈ ∋^I → (a, x) ∈ R^I}
(⊴R)^I = {(a, b) ∈ Δ^I × Δ^I | ∀x. (a, x) ∈ ∋^I → ((x, b) ∈ R^I ∨ ∃s. ((s, x) ∈ ∋^I ∧ (s, b) ∈ R^I))}
(⊵R)^I = {(a, b) ∈ Δ^I × Δ^I | ∀x. (b, x) ∈ ∋^I → ((a, x) ∈ R^I ∨ ∃s. ((s, x) ∈ ∋^I ∧ (a, s) ∈ R^I))}

Fig. 2. The semantic interpretation for concepts and roles in ALCS.

These equations are consistent with the intended meaning of ECOLL and COLL, where ECOLL is interpreted as the empty set and COLL is interpreted as any non-empty set. The validity statements above match our intuitions about collections, and reflect the corresponding valid axioms in set theory.

Let us now introduce some more complex concept definitions, in order to understand better the expressive power of ALCS. The concept (∃⊆.COLL) denotes any entity which possibly contains fewer elements than some non-empty collection; the concept (∃∩.COLL) denotes any entity which possibly has an overlapping with some non-empty collection; the concept (∃∩.⊤) denotes any collection having at least a common element with something else. It can be shown that the first concept is equivalent to the top concept, while the latter two denote only non-empty collections. This is somehow counterintuitive. It is worth noting, moreover, that expressions containing the ◁ plural quantifier cannot be reformulated in terms of the ∋ role only, and surprisingly it can be proved that:
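The semantics of figure 2 can be evaluated directly over an explicit finite interpretation. The following Python sketch is our own illustration (the encoding is an assumption, not the paper's): roles are sets of pairs, concepts are sets of domain elements, and the ◁ role constructor is computed extensionally.

```python
# A toy extension evaluator for a fragment of the ALCS semantics (Fig. 2),
# over an explicit finite interpretation.

def role_ldist(member, R, domain):
    """(◁R)^I: pairs (a, b) such that every element x of a has (x, b) in R.
    Note that entities with no elements are related to everything, vacuously,
    exactly as in the semantic equation."""
    return {(a, b) for a in domain for b in domain
            if all((x, b) in R for (c, x) in member if c == a)}

def concept_exists(R, C):
    """(∃R.C)^I."""
    return {a for (a, b) in R if b in C}

domain = {"beatles", "john", "paul", "george", "ringo", "liverpool"}
member = {("beatles", n) for n in ("john", "paul", "george", "ringo")}
born_in = {(n, "liverpool") for n in ("john", "paul", "george", "ringo")}

# COLL ≐ ∃∋.⊤ picks out exactly the non-empty collections of the model.
COLL = concept_exists(member, domain)
assert COLL == {"beatles"}

# beatles stands in the ◁BORN-IN relation to liverpool, since every member
# of beatles was born there.
lborn = role_ldist(member, born_in, domain)
assert ("beatles", "liverpool") in lborn
```

This kind of model checking is only a sanity check on the semantics; the paper's actual reasoning services (satisfiability, subsumption) work over all interpretations, via the calculus of Section 5.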

∃(◁R).C ≢ ∀∋.(∃R.C);

the best that can be proved is the inclusion relation

∃(◁R).C ⊑ ∀∋.(∃R.C).

This can be understood by considering that the concept in the right hand side introduces, for each element, a possibly new relation, whereas the concept in the left hand side introduces the same relation for each element; the right hand side concept, therefore, has a larger denotation.

Let us consider now assertions, i.e. predications on individual objects; usually, they are referred to as ABox statements. Let O be the alphabet of symbols denoting individuals. An assertion is a statement of the form C(a) or R(a, b), where C is a concept, R is a role, and a and b denote individuals in O. In order to assign a meaning to the assertions, the interpretation function ·^I is extended to individuals, so that a^I ∈ Δ^I for each individual a ∈ O and a^I ≠ b^I if a ≠ b (Unique Name Assumption). The semantics of the assertions is the following: C(a) is satisfied by an interpretation I iff a^I ∈ C^I, and R(a, b) is satisfied by I iff (a^I, b^I) ∈ R^I. A set Σ of assertions is called a knowledge base. An interpretation I is a model of Σ iff every assertion of Σ is satisfied by I. If Σ has a model, then it is satisfiable. Σ logically implies an assertion α (written Σ ⊨ α) if α is satisfied by every model of Σ. Given a knowledge base Σ, an individual a in O is said to be an instance of a concept C if Σ ⊨ C(a). The instance recognition problem, i.e. checking whether Σ ⊨ C(a), can be reduced to satisfiability, since a is an instance of C with respect to a knowledge base Σ if and only if Σ ∪ {¬C(a)} is unsatisfiable [13].

Coming back to our example regarding the Beatles, let us see how the concept representing any pop group can be defined using the ALCS language:

POP-GROUP ≐ COLL ⊓ ∀∋.PERSON ⊓ ∀LED-BY.PERSON ⊓ ∀(◁BORN-IN).CITY ⊓ ∀(⊴SING).POP-SONG.

The definition states that a pop group is composed by persons, that the relation "born in a city" inherently distributes to the single persons composing the group, that the relation "led by a person" is inherently collective with respect to the group, and that the relation "sing a pop song" has a cumulative reading for the group. From the definition and from the knowledge acquired during the discourse, the system is able to recognize the individual beatles as an instance of POP-GROUP, and is able to classify POP-GROUP as a collection:

POP-GROUP(beatles),    POP-GROUP ⊑ COLL.
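A fragment of a pop-group-like definition can be evaluated on a single finite interpretation, again as a sanity check of the intended semantics. This Python sketch is our own illustration (the concept names and the simplified definition are assumptions; the paper's procedure instead reduces Σ ⊨ C(a) to the unsatisfiability of Σ ∪ {¬C(a)}):

```python
# Model checking a simplified "pop group" concept, covering only the
# COLL ⊓ ∀∋.PERSON ⊓ ∀(◁BORN-IN).CITY part of the definition.

domain = {"beatles", "john", "paul", "george", "ringo", "liverpool", "yesterday"}
member = {("beatles", n) for n in ("john", "paul", "george", "ringo")}
person = {"john", "paul", "george", "ringo"}
city = {"liverpool"}
born_in = {(n, "liverpool") for n in person}

def is_pop_group_like(a):
    elems = {x for (c, x) in member if c == a}
    if not elems:                 # COLL: at least one element
        return False
    if not elems <= person:       # ∀∋.PERSON: all elements are persons
        return False
    # ∀(◁BORN-IN).CITY: anything reached distributively by BORN-IN is a city
    for b in domain:
        if all((x, b) in born_in for x in elems) and b not in city:
            return False
    return True

assert is_pop_group_like("beatles")
assert not is_pop_group_like("john")   # an individual is not a collection
```

Note that "john" fails the check only because it has no elements, which mirrors the classification POP-GROUP ⊑ COLL discussed in the text.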

5 The calculus for ALCS

In this section an algorithm for deciding the satisfiability of ALCS concepts is proposed. The rule-based calculus operates on constraints [14]. This algorithm can be used also to decide subsumption between two concepts and instance recognition between an individual and a concept. The algorithm is sound but not complete for ALCS. The soundness of the algorithm, and its completeness with respect to ALCS⁻, a weaker variant of ALCS, will be proved. In order to obtain ALCS⁻, the Collection Theory is changed by relaxing the semantics of the ⊆, ∩, (◁R), (▷R), (⊴R) and (⊵R) collective roles: the 'iff' has been replaced with a simple implication '⇒'. In this way, the collective roles become primitive; nonetheless, they still induce a structure between the elements.

Definition 3. (Collective Roles for ALCS⁻)
⊆(a, b) ⇒ ∀x. ∋(a, x) → ∋(b, x)
∩(a, b) ⇒ ∃x. ∋(a, x) ∧ ∋(b, x)
◁R(a, b) ⇒ ∀x. ∋(a, x) → R(x, b)
▷R(a, b) ⇒ ∀x. ∋(b, x) → R(a, x)
⊴R(a, b) ⇒ ∀x. ∋(a, x) → (R(x, b) ∨ (∃s. ∋(s, x) ∧ R(s, b)))
⊵R(a, b) ⇒ ∀x. ∋(b, x) → (R(a, x) ∨ (∃s. ∋(s, x) ∧ R(a, s)))

Even if at first sight this simplification may give the impression that the obtained theory is too weak, we claim that this is not actually the case. There are two reasons: first, in an open world semantics, which is usually adopted for NLP semantics, the lost deductions that can be performed in ALCS from the elements to the collective roles are very few; second, from the natural language point of view, such deductions are not the intuitive ones. In the following we will refer to the language ALCS⁻.

Let I be an interpretation of the concept language. A constraint can be of the type x : C or xRy, where C is a concept, R is a role, and x, y are variables belonging to a predefined alphabet of variable symbols. A constraint system S is a finite, non-empty set of constraints. The interpretation of constraints is defined as follows. An I-assignment α is a function that maps every variable to an element of Δ^I. We say that α satisfies x : C if and only if α(x) ∈ C^I, and that α satisfies xRy if and only if (α(x), α(y)) ∈ R^I. An I-assignment α satisfies a constraint system S if α satisfies every constraint in S. A constraint system S is satisfiable if there is an interpretation I and an I-assignment α such that α satisfies S. A clash is a system having one of the forms: {x : ⊥}, {x : A, x : ¬A}, where A is a primitive concept.

Proposition 4. (Reduction to a constraint system) An ALCS⁻ concept C is satisfiable if and only if the constraint system {x : C} is satisfiable.

Proof. Follows from the definitions. □

For the calculus, we consider only simple ALCS⁻ concepts. A concept is called simple if it contains only complements of the form ¬A, where A is a primitive concept. An arbitrary ALCS⁻ concept can be transformed in linear time into an equivalent simple concept by means of the following rewriting rules:

¬⊤ → ⊥
¬⊥ → ⊤
¬¬C → C
¬(C ⊓ D) → ¬C ⊔ ¬D
¬(C ⊔ D) → ¬C ⊓ ¬D
¬(∀R.C) → ∃R.¬C
¬(∃R.C) → ∀R.¬C

Starting from the system S = {x : C}, the propagation rules are applied until a contradiction is generated or a model of C is explicitly obtained: the propagation rules preserve satisfiability. It is important to remark that they are introduced in order to prove satisfiability, and that they are not intended to be used as deduction rules. We have the following rules:

S →⊓ {x : C1, x : C2} ∪ S  if x : C1 ⊓ C2 in S, and both x : C1 and x : C2 not in S
S →⊔ {x : D} ∪ S  if x : C1 ⊔ C2 in S, neither x : C1 nor x : C2 in S, and D = C1 or D = C2
S →∃ {xRy, y : C} ∪ S  if x : ∃R.C in S, y is a new variable, and there is no z s.t. both xRz and z : C in S
S →∀ {y : C} ∪ S  if x : ∀R.C and xRy in S, and y : C not in S
S →⊆ {y∋z} ∪ S  if x⊆y and x∋z in S, and y∋z not in S
S →∩ {x∋k, y∋k} ∪ S  if x∩y in S, there is no z s.t. both x∋z and y∋z in S, and k is a new variable
S →◁ {zRy} ∪ S  if x(◁R)y and x∋z in S, and zRy not in S
S →▷ {xRz} ∪ S  if x(▷R)y and y∋z in S, and xRz not in S
S →⊴ T ∪ S  if x(⊴R)y and x∋z in S, zRy not in S, and there is no t s.t. both t∋z and tRy in S, where T = {s∋z, sRy} and s is a new variable, or T = {zRy}
S →⊵ T ∪ S  if x(⊵R)y and y∋z in S, xRz not in S, and there is no t s.t. both t∋z and xRt in S, where T = {s∋z, xRs} and s is a new variable, or T = {xRz}

Propagation rules are either deterministic (they yield a uniquely determined constraint system) or nondeterministic (→⊔, →⊴, →⊵), yielding several possible constraint systems.

Proposition 5. (Local soundness and completeness) Let S be a constraint system of ALCS⁻. If S′ is obtained from S by applying a deterministic propagation rule, then S is satisfiable if and only if S′ is satisfiable. If S′ is obtained from S by applying a nondeterministic propagation rule, then S is satisfiable if S′ is satisfiable; moreover, there is a way to apply the rule to S such that the obtained system is satisfiable if and only if S is satisfiable.

Proof. Easy, by translating the constraint systems into logical formulas. □

A constraint system is said to be complete if no propagation rule is applicable to it. Because of the presence of nondeterministic rules, several complete systems can be derived from {x : C}. Such systems are, up to variable renaming, finitely many.

Proposition 6. (Termination) Let C be a simple ALCS⁻ concept. Given the constraint system {x : C}, after a finite number of applications of the propagation rules one obtains a finite set of complete constraint systems.

Proof. (sketch) One should prove that the size of each obtained complete constraint system is finite, in particular by checking the number of the newly introduced variables, and that the number of complete constraint systems generated by the propagation rules is finite. This follows by observing that the application of the rules decreases the complexity of the constraint system: rules add to the system only simpler constraints, for which no rule directly applies again. □

Proposition 7. (Satisfiability) A constraint system S = {x : C} is satisfiable if and only if there exists a clash-free complete constraint system which can be derived from S by applying the propagation rules.

Proof. (sketch) First prove that a complete system is satisfiable if and only if it contains no clash; this follows from the independence of the meanings of the basic role expressions. Then the proposition follows from local soundness and completeness and from the termination of the propagation rules. □

Now it is straightforward to put together the blocks and build up a sound and complete decision procedure to check the satisfiability of ALCS⁻ concepts. One should collect all the complete constraint systems derivable from {x : C} by applying the propagation rules. If at least one of these systems is clash free, then C is satisfiable, otherwise it is unsatisfiable. So, the following holds:

Theorem 8. (Decidability) Satisfiability, subsumption and instance recognition problems in ALCS⁻ are decidable.
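The overall structure of this constraint-propagation procedure can be sketched in a few dozen lines for the plain ALC fragment. The following Python implementation is our own sketch: it covers only the →⊓, →⊔, →∃ and →∀ rules and clash detection (the collective-role rules of ALCS⁻ are omitted), and assumes concepts in negation normal form encoded as tuples.

```python
# Tableau-style satisfiability for the ALC fragment of the calculus.
# Concepts in negation normal form, encoded as tuples:
#   ("atom", A), ("not", A), ("and", C, D), ("or", C, D),
#   ("exists", R, C), ("all", R, C)
import itertools

def satisfiable(concept):
    fresh = itertools.count(1)

    def expand(constraints, edges):
        # clash check: {x : A, x : ¬A}
        for (x, c) in constraints:
            if c[0] == "not" and (x, ("atom", c[1])) in constraints:
                return False
        for (x, c) in constraints:
            if c[0] == "and":                       # rule →⊓ (deterministic)
                new = {(x, c[1]), (x, c[2])} - constraints
                if new:
                    return expand(constraints | new, edges)
            elif c[0] == "or":                      # rule →⊔ (nondeterministic)
                if constraints.isdisjoint({(x, c[1]), (x, c[2])}):
                    return (expand(constraints | {(x, c[1])}, edges) or
                            expand(constraints | {(x, c[2])}, edges))
            elif c[0] == "exists":                  # rule →∃
                if not any(s == x and r == c[1] and (t, c[2]) in constraints
                           for (s, r, t) in edges):
                    y = next(fresh)
                    return expand(constraints | {(y, c[2])},
                                  edges | {(x, c[1], y)})
            elif c[0] == "all":                     # rule →∀
                new = {(t, c[2]) for (s, r, t) in edges
                       if s == x and r == c[1]} - constraints
                if new:
                    return expand(constraints | new, edges)
        return True  # complete and clash free: a model has been built

    return expand({(0, concept)}, set())

# Subsumption via unsatisfiability of C ⊓ ¬D: here ∃R.A ⊑ ¬(∀R.¬A).
assert not satisfiable(("and", ("exists", "R", ("atom", "A")),
                               ("all", "R", ("not", "A"))))
```

As in the text, the nondeterministic →⊔ rule is realized as backtracking over the two possible successor systems; the procedure answers "satisfiable" as soon as one clash-free complete system is found.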

Corollary 9. (Algorithm for ALCS⁻) The proposed algorithm is a sound and complete decision procedure to check satisfiability, subsumption and instance recognition in ALCS⁻.

Corollary 10. (Algorithm for ALCS) The proposed algorithm is a sound decision procedure to check satisfiability, subsumption and instance recognition in ALCS.

From the computational point of view, it is easy to see that the problem is PSPACE-hard; such a lower bound comes from ALC. However, it is not known whether a PSPACE algorithm, as in [15, 16], for checking the satisfiability of ALCS⁻ concepts exists; the precise complexity of the satisfiability problem still needs to be found. Moreover, a less incomplete algorithm for ALCS is under study.

6 A Mereology

In this section the switching of the basic collection-forming operator from a quasi-ordering relation ⊆, founded on a membership relation ∋, to a primitive part-whole partial-ordering relation "≤" [25] is proposed. The basic collection-forming relation is the partial-ordering binary relation ≤ (to be read HAS-PART) on the Δ^I set; it is a reflexive, anti-symmetric and transitive relation:

Definition 11. (Part-whole relation)
∀x. ≤(x, x)
∀x, y. ≤(x, y) ∧ ≤(y, x) → x = y
∀x, y, z. ≤(x, y) ∧ ≤(y, z) → ≤(x, z)

Let us now have the roles defined according to the syntax rule:

R → P | ≤ | ◁C.R | ▷C.R | ⊴C.R | ⊵C.R | ◁̃C.R | ▷̃C.R

Plural quantifiers are here qualified, in the sense that the elements of actual predications are selected by a qualification predicate C. The ◁ and ▷ quantifiers specify that the relation necessarily holds for all the parts of a certain type C; they express the left and right distributive readings. The ⊴ and ⊵ quantifiers specify that the relation necessarily holds for some parts including all the parts of a certain type C; they express the left and right cumulative readings. Finally, a new class of plural quantifiers can be easily introduced in the mereological framework: the ◁̃ and ▷̃ quantifiers specify that the relation possibly holds for some part of a certain type C. They allow us to represent the group reading of a relation. The semantics of the plural quantifiers is given by the following definition:

Definition 12. (Qualified Plural Quantifiers)
⊲C.R(a, b) iff ∀x. (⪯(x, a) ∧ C(x)) → R(x, b)
⊳C.R(a, b) iff ∀x. (⪯(x, b) ∧ C(x)) → R(a, x)
⊴C.R(a, b) iff ∀x. (⪯(x, a) ∧ C(x)) → ∃y. (⪯(y, b) ∧ R(x, y))
⊵C.R(a, b) iff ∀x. (⪯(x, b) ∧ C(x)) → ∃y. (⪯(y, a) ∧ R(y, x))
⊴̃C.R(a, b) iff ∃x. (⪯(x, a) ∧ C(x) ∧ R(x, b))
⊵̃C.R(a, b) iff ∃x. (⪯(x, b) ∧ C(x) ∧ R(a, x))

As is expected, the cumulative plural quantified expressions are still more general than the unquantified (collective) expressions and the distributive plural quantified ones:

∀x, y. R(x, y) → ⊴C.R(x, y)
∀x, y. ⊲C.R(x, y) → ⊴C.R(x, y)

The difference with respect to the Theory of Collections is evident if we look at the semantics of the cumulative plural quantifier: the Collection Theory distinguishes an individual x from a singleton collection s whose only element is x, i.e. such that ∈(x, s); this is why an explicit disjunction was introduced in the semantics there. Within mereology the disjunction disappears, since the ⪯ relation is reflexive: ⪯(john, john). Elements of a collection are parts in the partial order, e.g. ⪯(john, beatles). The qualification predicate acts like a filter to select the correct level in the mereological partonomy for any subpart x of the plural entity; this is why qualification on plural quantifiers is needed. On the other hand, the main advantage of this approach is its uniformity in treating elements and collections, and, in this way, ambiguities in the readings can be preserved by using the cumulative operator.

In order to recover the lost distinction between a collection and its elements, a plural operator '⋆' is defined as follows:

Definition 13. (Plural Operator)
⋆P(a) iff ∀x. (⪯(x, a) → ∃y. (P(y) ∧ (⪯(x, y) ∨ ⪯(y, x))))

The purpose of the plural operator is to allow the construction of plural collective entities having singular objects of a certain type as their parts. We can define, for example, the plural predicate PEOPLE from the singular predicate PERSON: PEOPLE ≐ ⋆PERSON. In this way, a non-extensional Mereology is reconstructed, which is a generalization of the simple Theory of Collections presented above. It is called non-extensional for the same reasons as in collection theory: the extensionality axiom is not valid within this mereology. As observed in [26],
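On a finite partonomy, the quantifier semantics of Definition 12 can be evaluated directly. The sketch below is my own finite-model illustration (the names john, paul, beatles, liverpool echo the examples in the text); the part-of relation leq is given extensionally and assumed reflexive.

```python
# Finite-model evaluation of two qualified plural quantifiers (a sketch).

def parts(x, leq):
    """All parts p of x, i.e. all p with p part-of x."""
    return {p for (p, w) in leq if w == x}

def left_distributive(C, R, a, b, leq):
    # left distributive: R holds of *every* C-part of a and b itself
    return all((x, b) in R for x in parts(a, leq) if x in C)

def left_cumulative(C, R, a, b, leq):
    # left cumulative: every C-part of a bears R to *some* part of b
    return all(any((x, y) in R for y in parts(b, leq))
               for x in parts(a, leq) if x in C)

# The Beatles example: john and paul are PERSON-parts of the plural entity.
people = {'john', 'paul'}
leq = {('john', 'john'), ('paul', 'paul'), ('beatles', 'beatles'),
       ('liverpool', 'liverpool'),
       ('john', 'beatles'), ('paul', 'beatles')}
born_in = {('john', 'liverpool'), ('paul', 'liverpool')}

print(left_distributive(people, born_in, 'beatles', 'liverpool', leq))  # True
print(left_cumulative(people, born_in, 'beatles', 'liverpool', leq))    # True
```

Removing one of the BORN-IN pairs falsifies the distributive reading while the group and cumulative readings may survive, which is exactly the gradation the quantifier family is designed to capture.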

problems with multi-level plural entities [19] can be solved. This reformulation of the collection theory can be adopted for the logical analysis of plurals according to the lattice-theoretical approach pursued by [18].

The examples for the distributive, cumulative and group readings are expressed in the mereological framework, using qualified plural quantifiers, in the following way:

"The Beatles are born in Liverpool": (⊲PERSON.BORN-IN)(beatles, liverpool), i.e. the relation BORN-IN holds for every part of beatles of type PERSON.

"The Beatles played in London": (⊴PERSON.PLAY-IN)(beatles, london), i.e. the relation PLAY-IN holds for some collection of persons which should be a part of the collective entity beatles; this cumulative reading represents the case in which it is unknown whether the group which played in London was actually composed by all the members of the Beatles.

"The Beatles sing 'Yesterday'": (⊴̃PERSON.SING)(beatles, yesterday). The group reading is a sort of weakened collective reading: the relation SING holds for some part of beatles which is itself a plural entity, an element of the class PEOPLE.

The qualification predicates for the plural quantifiers are taken from the lexical definition of the involved relations: in this example, the qualification PERSON comes from the lexical semantics entries of the relations BORN-IN and SING.

A mereological version of the collection theory has also been applied to model the structure of events and processes [4] in the domain of tense and aspect in natural language, in order to properly account for perfective and imperfective sentences and for habituals by means of plural quantifiers ranging on collections of events [11, 12]. The basic assumption taken into consideration is that verbal morphology plays a crucial role in specifying the temporal meaning of a sentence.

7 Conclusions

In this paper a Collection Theory has been presented, together with a formalism which is intended to give semantical and computational means to plurals and plural quantifications: the concept language ALCS, which embeds the collection theory in a uniform and compositional way, has been studied, and a sound and complete algorithm to decide satisfiability, subsumption and instantiation for a slightly weaker variant of the language has been devised. This approach allows for complete reasoning even in the presence of scoping ambiguities.

Several issues still need to be considered within the mereological approach: the existence of atoms (entities which have no parts), the introduction of a mass dissective predicate as in [18], and, last but not least, the possible inclusion of several different ⪯ᵢ relations [29].

Finally, a mereological framework has been suggested, and it has been argued that a theory of the part-whole relation is more expressive and more adequate than an element-based collection theory. However, more work is needed to refine this mereological framework.

References

1. J. Allgayer. SB-ONE+ - dealing with sets efficiently. In Proc. of the 9th ECAI, pages 13-18, Stockholm, Sweden, 1990.
2. Dialog-oriented ABoxing. In Proc. of the 5th International Symposium on Methodologies for Intelligent Systems, Knoxville, TN, 1990.
3. J. Barwise and R. Cooper. Generalized quantifiers and natural language. Linguistics and Philosophy, 4:159-219, 1981.
4. E. Bach. The algebra of events. Linguistics and Philosophy, 9:5-16, 1986.
5. F. M. Donini, M. Lenzerini, D. Nardi, and W. Nutt. The complexity of concept languages. In Proc. of the 2nd International Conference on Principles of Knowledge Representation and Reasoning, pages 151-162, 1991.
6. F. M. Donini, M. Lenzerini, D. Nardi, and W. Nutt. Tractable concept languages. In Proc. of the 12th IJCAI, pages 458-465, Sydney, Australia, 1991.
7. F. M. Donini, B. Hollunder, M. Lenzerini, A. Marchetti Spaccamela, D. Nardi, and W. Nutt. The complexity of existential quantification in concept languages. Artificial Intelligence, 53:309-327, 1992.
8. E. Franconi. A treatment of plurals and plural quantifications based on a theory of collections (abstract). In Proc. of the International Conference on Mathematical Linguistics, ICML-93, Barcelona, Spain, April 1993.
9. E. Franconi. A treatment of plurals and plural quantifications based on a theory of collections. Minds and Machines, special issue on Knowledge Representation for Natural Language Processing, 1993 (to appear).
10. E. Franconi. A semantic account of plural entities within a hybrid representation system. In Proc. of the 5th International Symposium on Knowledge Engineering, Seville, Spain, October 1992.
11. E. Franconi, A. Giorgi, and F. Pianesi. Tense and aspect: a mereological approach. In Proc. of the 13th IJCAI, Chambery, France, August 1993.
12. E. Franconi, A. Giorgi, and F. Pianesi. A mereological approach to tense and aspect. Abstract 9205-02, IRST, Povo TN, Italy, May 1992. A preliminary version appears in the Preprints of the International Workshop on Formal Ontology (N. Guarino and R. Poli, eds.), Padova, Italy, March 1993.
13. B. Hollunder. Hybrid inferences in KL-ONE-based knowledge representation systems. In Proc. of the 14th German Workshop on Artificial Intelligence, Springer-Verlag, 1990.
14. Adding constraints inference to ABox reasoning.
15. B. Hollunder, W. Nutt, and M. Schmidt-Schauss. Subsumption algorithms for concept description languages. In Proc. of the 9th ECAI, pages 348-353, Stockholm, Sweden, 1990.
16. F. Landman. Groups, I. Linguistics and Philosophy, 12:559-605, 1989.
17. F. Landman. Groups, II. Linguistics and Philosophy, 12:723-744, 1989.
18. G. Link. The logical analysis of plurals and mass terms: a lattice-theoretical approach. In R. Baeuerle, C. Schwarze, and A. von Stechow, editors, Meaning, Use and Interpretation of Language, pages 302-323. Walter de Gruyter, 1983.
19. G. Link. Algebraic semantics for natural language: some philosophy, some applications.
20. L. Lesmo and P. Terenziani. A network formalism for representing natural language quantifiers. In Proc. of the 8th ECAI, pages 473-478, Munich, Germany, 1988.
21. M. Poesio. Relational semantics and scope ambiguity. In J. Barwise, J. M. Gawron, G. Plotkin, and S. Tutiya, editors, Situation Theory and its Applications, vol. 2, chapter 20, pages 469-497. CSLI, Stanford, CA, 1991.
22. J. Quantz. How to fit generalized quantifiers into terminological logics. In Proc. of the 10th ECAI, pages 543-547, Vienna, Austria, 1992.
23. M. Schmidt-Schauss and G. Smolka. Attributive concept descriptions with complements. Artificial Intelligence, 48(1):1-26, 1991.
24. S. C. Shapiro and W. J. Rapaport. The SNePS family. Computers and Mathematics with Applications, 23(2-5):243-275, March-May 1992. Special issue: Semantic Networks in Artificial Intelligence.
25. P. Simons. Parts: A Study in Ontology. Clarendon Press, Oxford, 1987.
26. J. F. Sowa. Toward the expressive power of natural language. In J. F. Sowa, editor, Principles of Semantic Networks. Morgan Kaufmann, 1991.
27. B. S. Tjan, D. A. Gardiner, and J. R. Slagle. Representing and reasoning with set referents and numerical quantifiers. In T. Nagle, J. Nagle, L. Gerholz, and P. Eklund, editors, Conceptual Structures: Current Research and Practice, pages 53-66. Ellis Horwood, 1992.
28. M. P. Wellman and R. G. Simmons. Mechanisms for reasoning about sets. In Proc. of AAAI-88, pages 398-402, St. Paul, MN, 1988.
29. M. E. Winston, R. Chaffin, and D. Herrmann. A taxonomy of part-whole relations. Cognitive Science, 11:417-444, 1987.
30. W. A. Woods and J. G. Schmolze. The KL-ONE family. Computers and Mathematics with Applications, 23(2-5):133-177, March-May 1992. Special issue: Semantic Networks in Artificial Intelligence.

Computing Extensions of Terminological Default Theories

Franz Baader and Bernhard Hollunder
German Research Center for Artificial Intelligence (DFKI)
Stuhlsatzenhausweg 3, D-66123 Saarbruecken, Germany
e-mail: (last name)@dfki.uni-sb.de

Abstract. We consider the problem of integrating Reiter's default logic into terminological representation systems. It turns out that such an integration is less straightforward than we expected, considering the fact that the terminological language is a decidable sublanguage of first-order logic. Semantically, one has the unpleasant effect that the consequences of a terminological default theory may be rather unintuitive, and may even vary with the syntactic structure of equivalent concept expressions. This is due to the unsatisfactory treatment of open defaults via Skolemization in Reiter's semantics. On the algorithmic side, this treatment may lead to an undecidable default consequence relation, even though our base language is decidable and we have only finitely many (open) defaults. Because of these problems, we then consider a restricted semantics for open defaults in our terminological default theories: default rules are only applied to individuals that are explicitly present in the knowledge base. In this semantics it is possible to compute all extensions of a finite terminological default theory, which means that this type of default reasoning is decidable.

1 Introduction

Terminological representation systems are used to represent the taxonomic and conceptual knowledge of a problem domain in a structured and well-formed way. To describe this kind of knowledge, one starts with atomic concepts (unary predicates) and roles (binary predicates), and defines more complex concepts using the operations provided by the concept language of the particular formalism. For example, one could define the concept Mammal as an Animal that feeds its young with Milk, where feeds-young-with is used as a role. Semantically, the concept descriptions are interpreted as universal statements, which means that, unlike frame languages, they do not allow for exceptions. In addition to this concept description formalism, most of these systems also have an assertional component. One can, for example, state that an individual is an instance of a concept, or that two individuals are connected by a role. The system can use the concept descriptions to automatically insert concepts at the proper place in the taxonomy (classification), and it can use the facts stated about individuals to deduce to which concepts they must belong (realization).

One might want to assume by default that Mammals reproduce Viviparously; only if it is known that a specific mammal reproduces with Eggs should this assumption be cancelled. If the concept Platypus¹ is defined as an Animal that lives-in the Water, feeds its young with Milk, and reproduces with Eggs, then the system will recognize that Platypus is a subconcept of Mammal. This treatment of defaults in terminological systems has already been proposed by Brachman and Schmolze [8]. More generally, commonsense reasoning is often based on assumptions that may ultimately be shown to be false. If one wants to use terminological systems for this kind of commonsense reasoning, one needs a formalism that can handle such default assumptions, but one that does not destroy the definitional character of concept descriptions, because otherwise the advantage of automatic concept classification would be lost (see [6]).

¹ We are taking this as our exceptional animal, in view of the fact that the last IJCAI was in Australia, and not in the Antarctic.

Besides the general arguments for the importance of reasoning with defaults, which can be found in the nonmonotonic reasoning literature, the need for embedding defaults into terminological representation formalisms is also substantiated by the fact that this is an important item on the wish list of users of terminological representation systems (see e.g. [13]). Several existing terminological systems, such as BACK [19], LOOM [18], CLASSIC [7], K-Rep [15], or SB-ONE [14], have been or will be extended to provide the user with some kind of default reasoning facilities. However, as the designers of these systems themselves point out, these approaches usually have an ad hoc character and are not equipped with a formal semantics. In CLASSIC, "a limited form of defaults can be represented with the aid of rules and test functions," but the user is warned to "use this trick with extreme caution" ([7], p.9). Similarly, defaults in the FAME system, which is built using K-Rep, "will not be complete (or even consistent)" ([15], pp.45-46) unless the user is very careful when using them.

Most nonmonotonic reasoning formalisms (e.g., Reiter's default logic [21], Circumscription [16]) use full first-order predicate logic as their base language. In this general form, the formalisms are usually highly undecidable (see e.g. [20]). However, work on decision procedures for decidable subcases was mostly restricted to propositional logic, thus leaving the wide gap between propositional logic and full first-order logic almost unexplored.

Our arguments for the importance of default extensions for terminological representation languages so far were given from the viewpoint of the terminological systems community. However, these investigations may also be of interest for research in nonmonotonic reasoning itself: since most terminological representation languages can be viewed as decidable subclasses of first-order logic, but are nevertheless much more expressive than propositional logic, they can serve as interesting test cases for nonmonotonic reasoning formalisms. We shall see that this not only applies to algorithmic, but also to semantic considerations. We shall here consider the problem of integrating Reiter's default logic into a terminological representation formalism.

At first sight, one might think that, from a semantic point of view, the proposed integration should be unproblematic: the terminological representation language we shall consider (see Section 2) is a sublanguage of first-order logic, and Reiter's semantics has been formulated for full first-order logic. Moreover, Reiter's default rule approach seems to fit well into the philosophy of terminological systems, because most of them already provide their users with a form of "monotonic" rules; these rules can be considered as special default rules where the justifications, which make the behaviour of default rules nonmonotonic, are absent. However, in [3] we have shown that one runs into severe problems if the open defaults are treated as proposed by Reiter ([21], Section 7.1), due to the unsatisfactory treatment of open defaults by Skolemization (see also Section 3).

A similar problem arises when considering the integration from the algorithmic point of view. In the abstract of their paper on how to compute extensions for default logic, Junker and Konolige [12] write that their method is applicable if the default theory "consists of a finite number of defaults and premises and classical derivability for the base language is decidable." A related formulation can be found in the abstract of Schwind and Risch's paper on the same topic [25]. Since our base language is decidable, and we certainly do not want to have infinitely many default rules, these methods seem to apply in our case. However, a closer look at the papers reveals that by "a finite number of defaults" it is meant "a finite number of closed defaults," whereas the default rules one wants to consider in terminological default theories are open defaults; as already pointed out by Reiter ([21], p.115), "the genuinely interesting cases involve open defaults." In fact, in [3] we have shown that a finite set of premises and open defaults may lead to an undecidable default consequence problem, even with a (decidable) terminological language as base language.

Because of the semantic as well as algorithmic problems posed by Reiter's treatment of open defaults, we shall consider a restricted semantics for open defaults in our integration: default rules are only applied to individuals that are explicitly present in the assertional part (ABox) of the knowledge base. With this restricted semantics, a finite set of open defaults stands for a set of closed defaults that is finite as well. Though one may thus lose some intuitive default inferences, this treatment of default rules is akin to the treatment of the monotonic rules in terminological systems such as CLASSIC; to the best of our knowledge, this proposal was never followed up before. Thus the above-mentioned methods of Schwind and Risch and of Junker and Konolige can be applied to compute extensions (see Section 4). In order to make these methods applicable and efficient, one has to solve certain algorithmic problems for the terminological language: for Junker and Konolige's method one has to find minimal proofs for assertional facts, which can be seen as an abduction problem for ABoxes, and for Schwind and Risch's method one must find maximal consistent sets of assertional facts. In Section 5 we shall point out how the tableaux-based methods for assertional reasoning developed in our group ([11, 2]) can be modified to solve these problems.

2 The Representation Formalisms

First we shall briefly review the terminological language ALC [24] and Reiter's default logic. Then terminological default logic is defined as the specialization of default logic to ALC.

2.1 The terminological language ALC

Terminological knowledge representation formalisms can be used to define the relevant concepts of a problem domain (terminological knowledge), and to describe objects of this domain with respect to their relation to concepts and their interrelation with each other (assertional knowledge). Depending on which constructs are allowed for building concept descriptions we get different terminological languages. In the present paper we restrict our attention to the language ALC.

Definition 1. The terminological part of the language ALC consists of the following concept description formalism. The concept terms of this formalism are built from concept and role names using the constructors conjunction (C ⊓ D), disjunction (C ⊔ D), negation (¬C), exists-restriction (∃R.C), and value-restriction (∀R.C). Here C, D stand for concept terms and R for a role name.

The semantics of concept terms can either be given directly by defining interpretations and models, or by a translation into first-order logic. In order to make the fact explicit that we are dealing with a sublanguage of first-order logic, we choose the second option. Concept names are considered as symbols for unary predicates, and role names as symbols for binary predicates. Consequently, concept names A are translated into (atomic) formulae A(x) with one free variable, and role names R into (atomic) formulae R(x, y) with two free variables. Concept terms are also translated into formulae with one free variable. The semantics of conjunction, disjunction, and negation are defined in the obvious way:

(C ⊓ D)(x) := C(x) ∧ D(x),  (C ⊔ D)(x) := C(x) ∨ D(x),  (¬C)(x) := ¬C(x).

The semantics of exists-restrictions is given by

(∃R.C)(x) := ∃y: (R(x, y) ∧ C(y)),

and for value-restrictions we define

(∀R.C)(x) := ∀y: (R(x, y) → C(y)).

The assertional part of our language allows us to assert facts concerning particular objects. These objects are referred to by individual names. Here a, b stand for individual names, C for a concept term, and R for a role name. We can state that an object belongs to a concept (written C(a)), or that two objects are related by a role (written R(a, b)). A finite set of such facts is called an ABox. The semantics of an ABox is again given by a translation into first-order logic. The individual names of the ABox are considered as constant symbols, and the formula corresponding to the assertional fact C(a) (resp. R(a, b)) is obtained by replacing the free variable(s) in the formula corresponding to C (resp. R) by a (resp. a, b). In terminological systems one usually has a unique name assumption, which can be expressed by the formulae a ≠ b for all distinct individual names a, b. To sum up, an ABox is translated into a set of first-order formulae consisting of
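The translation of concept terms into first-order formulae with one free variable is directly recursive. The following sketch (my own encoding of concept terms as nested tuples; formulas are built as plain strings) mirrors the defining equations above.

```python
from itertools import count

# Translating ALC concept terms into first-order formulas with one free
# variable (a sketch; the tuple tags are my own encoding).

def translate(concept, var, fresh):
    tag = concept[0]
    if tag == 'name':                       # concept name A  ->  A(x)
        return f"{concept[1]}({var})"
    if tag == 'not':
        return f"¬({translate(concept[1], var, fresh)})"
    if tag == 'and':
        return f"({translate(concept[1], var, fresh)} ∧ {translate(concept[2], var, fresh)})"
    if tag == 'or':
        return f"({translate(concept[1], var, fresh)} ∨ {translate(concept[2], var, fresh)})"
    y = f"y{next(fresh)}"                   # fresh bound variable
    _, role, c = concept
    if tag == 'some':                       # (exists R.C)(x) := Ey: (R(x,y) and C(y))
        return f"∃{y}: ({role}({var},{y}) ∧ {translate(c, y, fresh)})"
    if tag == 'all':                        # (forall R.C)(x) := Ay: (R(x,y) -> C(y))
        return f"∀{y}: ({role}({var},{y}) → {translate(c, y, fresh)})"

# The Mammal example: Animal conjoined with exists feeds-young-with.Milk
mammal = ('and', ('name', 'Animal'),
          ('some', 'feeds-young-with', ('name', 'Milk')))
print(translate(mammal, 'x', count()))
# (Animal(x) ∧ ∃y0: (feeds-young-with(x,y0) ∧ Milk(y0)))
```

Replacing the free variable x by an individual name then yields exactly the closed formula corresponding to an assertional fact, as described in the text.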

the translations of the ABox facts together with the formulae expressing the unique name assumption.

The basic inference service for ABoxes is called instantiation. It answers the question of whether (the translation of) a given ABox fact C(a) is a (logical) consequence of (the translation of) a given ABox A (written A ⊨ C(a)). If the answer is yes, we say that a is an instance of C with respect to A. Algorithms which solve this inference problem have been described, for example, in [11, 2]; we shall use them in a slightly modified way.

2.2 Reiter's default logic

Reiter [21] deals with the problem of how to formalize nonmonotonic reasoning by introducing nonstandard, nonmonotonic inference rules, which he calls default rules. A default rule is any expression of the form

α : β1, ..., βn / γ

where α, β1, ..., βn, γ are first-order formulae. Here α is called the prerequisite of the rule, β1, ..., βn are its justifications, and γ is its consequent. A default rule is closed iff α, β1, ..., βn, γ do not contain free variables. A default theory is a pair (W, D) where W is a set of closed first-order formulae (the world description) and D is a set of default rules. A default theory is closed iff all its default rules are closed.

Intuitively, a closed default rule can be applied, i.e., its consequent is added to the current set of beliefs, if its prerequisite is already believed and all its justifications are consistent with the set of beliefs. Formally, the consequences of a closed default theory are defined with reference to the notion of an extension, which is a set of deductively closed first-order formulae defined by a fixed point construction (see [21]). In general, a default theory may have more than one extension, or even no extension. Depending on whether one wants to employ skeptical or credulous reasoning, a closed formula is a consequence of a closed default theory iff it is in all extensions, or iff it is in at least one extension of the theory. In general, this consequence relation is not even recursively enumerable (see [21]). Here and in the following, Th(F) stands for the deductive closure of a set of formulae F. Reiter also gives an alternative characterization of an extension, which we shall use as the definition of extension.

Definition 2. Let (W, D) be a closed default theory and let E be a set of closed formulae. We define E0 := W and, for all i ≥ 0,

E_{i+1} := E_i ∪ { γ | α : β1, ..., βn / γ ∈ D, α ∈ Th(E_i), and ¬β1, ..., ¬βn ∉ Th(E) }.

Then Th(E) is an extension of (W, D) iff Th(E) = ∪_{i≥0} Th(E_i).

For a set of default rules D, we denote the sets of formulae occurring as prerequisites, justifications, and consequents in D by Pre(D), Jus(D), and Con(D), respectively.
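For a finite propositional case, Definition 2 yields a simple guess-and-check procedure: guess a candidate set of generating defaults, run the iteration with consistency tested against the guessed fixed point, and compare the results. The sketch below is illustrative only (formulas are represented as Python predicates over truth assignments, and entailment is decided by truth-table enumeration); it is not the method developed later in the paper.

```python
from itertools import product

def entails(premises, goal, atoms):
    """premises |= goal, decided by enumerating all truth assignments."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) for p in premises) and not goal(v):
            return False
    return True

def neg(f):
    return lambda v: not f(v)

def is_extension(W, defaults, candidate, atoms):
    """Is Th(W + Con(candidate)) an extension of (W, defaults)?

    Following Definition 2: E_0 := W, and E_{i+1} adds the consequent of any
    default whose prerequisite follows from E_i and whose justifications are
    consistent with the *guessed* final set E."""
    E = list(W) + [d["con"] for d in candidate]   # the guessed fixed point
    Ei, changed = list(W), True
    while changed:
        changed = False
        for d in defaults:
            if d["con"] in Ei:
                continue
            if entails(Ei, d["pre"], atoms) and \
               all(not entails(E, neg(b), atoms) for b in d["jus"]):
                Ei.append(d["con"])
                changed = True
    # Th(Ei) must coincide with Th(E): check mutual entailment
    return all(entails(Ei, f, atoms) for f in E) and \
           all(entails(E, f, atoms) for f in Ei)

# Example: W = {bird}, one default  bird : flies / flies
atoms = ["bird", "flies"]
bird = lambda v: v["bird"]
flies = lambda v: v["flies"]
d1 = {"pre": bird, "jus": [flies], "con": flies}

print(is_extension([bird], [d1], [d1], atoms))  # True
print(is_extension([bird], [d1], [], atoms))    # False
```

The second call fails because the default is applicable with respect to the guessed set, so Th({bird}) does not reproduce itself under the iteration: exactly the self-referential flavour of the fixed point construction.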

Note that the extension Th(E) to be constructed by this iteration process occurs in the definition of each iteration step. Since we are only adding consequents of defaults during the iteration, any extension Th(E) of (W, D) is of the form Th(W ∪ Con(D')) for a subset D' of D. Reiter shows ([21], Theorem 2.5) that the set

{ α : β1, ..., βn / γ ∈ D | α ∈ Th(E) and ¬β1, ..., ¬βn ∉ Th(E) }

always satisfies this property. For this reason it is called the set of generating defaults for the extension Th(E). Another easy consequence of Definition 2 is that (W, D) has an inconsistent extension iff W is inconsistent.

Reiter defines extensions of arbitrary default theories (W, D), i.e., default theories with open defaults, as follows. First, the formulae of W and the consequents of the defaults are Skolemized (see [21], Section 7). Second, a set D' of closed default rules is generated by taking all ground instances (over the initial signature together with the newly introduced Skolem functions) of the defaults of D. Now E is an extension of (W, D) iff E is an extension of the closed default theory (W', D'), where W' is the Skolemized form of W. The reason for Skolemizing before building ground instances will be explained by an example in the next section.

2.3 Terminological default theories

A terminological default theory is a pair (A, D) where A is an ABox and D is a finite set of default rules whose prerequisites, justifications, and consequents are concept terms.² Since ABoxes can be seen as sets of closed formulae, and since concept terms can be seen as formulae with one free variable, terminological default theories are subsumed by Reiter's notion of an open default theory. However, as for ABox reasoning without defaults, we are not interested in arbitrary formulae as consequences of a terminological default theory (A, D), but only in assertional facts of the form C(a), where a is an individual name occurring in the original ABox A.

² The concept terms occurring in one rule are assumed to have identical free variables.

3 Reasons for and Problems Caused by Skolemization

First, we illustrate by an example why Reiter uses Skolemization in his semantics for open default theories. Then we shall argue that this treatment of open defaults is problematic both from a semantic and an algorithmic point of view. The first example shows that intuitively valid consequences would get lost if one did not Skolemize. Suppose that our ABox consists of the fact that Tom has some child who is a doctor, i.e., A = {(∃child.doctor)(Tom)}. By default we

want to conclude that doctors usually are rich persons, and usually have children who are doctors. Thus D consists of the default rules

doctor : rich-person / rich-person  and  doctor : ∃child.doctor / ∃child.doctor.

Skolemization of the world description A yields A' = {child(Tom, Bill), doctor(Bill)}, where Bill is a new Skolem constant, whereas Skolemization of the consequent of the second default yields a unary Skolem function, say child-of. It is easy to see that the corresponding closed default theory has exactly one extension, and that this extension contains the assertional facts (∃child.rich-person)(Tom) and (∃child.∃child.doctor)(Tom), i.e., that Tom has a rich child and a grandchild who is a doctor. Intuitively, this comes from the fact that the closed defaults obtained by instantiating our open defaults with the Skolem constant Bill are applicable. To deduce by default that the grandchild of Tom is not only a doctor, but also a rich one, the first default has to be instantiated by the term child-of(Bill). Without these ground instances, the above facts could not have been deduced by default.

Although the treatment of open defaults via Skolemization yields an appropriate behaviour in this example, it is in general problematic. Beside the fact that Skolemization of the world description may lead to counterintuitive consequences of default theories (see Section 3 of [3]), the following example demonstrates that the consequences of a default theory may depend on the syntactic form of the world description, i.e., that for identical sets of open defaults, logically equivalent world descriptions may lead to different results.

Consider the concept terms C1 := ∃R.(A ⊓ B) and C2 := ∃R.A, where R is a role name and A, B are concept names. Obviously, if we assert that an individual a is in the first term this implies that it is in the second one as well, i.e., the ABoxes A1 := {C1(a)} and A2 := {C1(a), C2(a)} are logically equivalent. When Skolemizing the first ABox, we get a single new Skolem constant b which is R-related to a and lies in A ⊓ B, whereas when Skolemizing the second ABox we get two Skolem constants c and d, both R-related to a, but where c lies in A ⊓ B and d lies in A. Now consider the (open) default A : ¬B / ¬B. For the Skolemized version of A1, this default is instantiated with a and b, whereas for the Skolemized version of A2 it is instantiated with a, c, and d. The default rule cannot fire for b and c, because their being in A ⊓ B is inconsistent with its justification. On the other hand, this default rule can be applied to d, because being in A is consistent with being in ¬B. For this reason, d is put into ¬B, which shows that the Skolemized version of A2 has (∃R.(A ⊓ ¬B))(a) as a default consequence, whereas this fact cannot be deduced by default from the Skolemized version of A1. Intuitively, the reason for this behaviour is that, before the application of the default, the individuals c and d might be identical (which is the reason why the two ABoxes are logically equivalent), whereas this is no longer possible after the default has been applied.
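The dependence on the syntactic form can be replayed mechanically. The sketch below Skolemizes the two example ABoxes (my own tuple encoding, handling only the constructors needed here) and then lists the Skolem individuals to which the default A : ¬B / ¬B is applicable, i.e., those asserted to be in A but not in B.

```python
from itertools import count

# Skolemizing ABox facts with existential restrictions (a sketch).

def skolemize_fact(ind, concept, out, fresh):
    tag = concept[0]
    if tag == 'name':
        out.add((concept[1], ind))               # A(a)
    elif tag == 'and':
        skolemize_fact(ind, concept[1], out, fresh)
        skolemize_fact(ind, concept[2], out, fresh)
    elif tag == 'some':                          # exists R.C: new Skolem constant
        b = f"sk{next(fresh)}"
        out.add((concept[1], ind, b))            # R(a, b)
        skolemize_fact(b, concept[2], out, fresh)

def skolemize(abox):
    fresh, out = count(), set()
    for (ind, c) in abox:
        skolemize_fact(ind, c, out, fresh)
    return out

def default_candidates(facts):
    """Individuals to which the default A : not-B / not-B can be applied."""
    in_A = {f[1] for f in facts if f[0] == 'A'}
    in_B = {f[1] for f in facts if f[0] == 'B'}
    return in_A - in_B

A_, B_ = ('name', 'A'), ('name', 'B')
C1 = ('some', 'R', ('and', A_, B_))
C2 = ('some', 'R', A_)

print(default_candidates(skolemize([('a', C1)])))             # set()
print(default_candidates(skolemize([('a', C1), ('a', C2)])))  # {'sk1'}
```

Although {C1(a)} and {C1(a), C2(a)} are logically equivalent, only the second Skolemized ABox offers an individual (the witness of C2) on which the default can fire, reproducing the asymmetry discussed above.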

Intuitively, before the default is applied, d can be interpreted as denoting the same individual as c (the reason why the two ABoxes are logically equivalent), whereas this is no longer possible after the default has been applied.

In addition to this semantic problem caused by Skolemization, we have shown that this treatment of open defaults can also lead to an undecidable default consequence relation, even though one employs a decidable base language and a finite set of defaults. In fact, the consequence problem for an open terminological default theory over a language which extends ALC by attributes, i.e., functional roles, and so-called agreements on attribute chains is in general undecidable (cf. Section 4 of [3]).

4 Computing Extensions

Because of the problems caused by Skolemization in Reiter's treatment of open defaults, we now propose a restricted semantics for open default theories: default rules are only applied to individuals that are explicitly mentioned in the ABox.

Definition 3. In the restricted semantics for terminological default theories, an open default of a theory (A, D) is interpreted as representing the closed defaults obtained by instantiating the free variable by all individual names occurring in A.

Because the ABox A and the set of open defaults D are assumed to be finite, we end up with a finite set of closed defaults. Since our terminological language is decidable, the methods of Junker and Konolige, and of Schwind and Risch, can be applied to compute all extensions (according to our restricted semantics). In fact, both methods depend on the fact that any extension of a closed default theory (A, D) is of the form Th(A ∪ Con(D')) for a subset D' of D. If D is finite, there are only finitely many such subsets, and the only problem is to decide which of these subsets generate an extension. In principle, one could even use for this purpose the iteration process described in the definition of an extension, provided that the base language is decidable: decidability of the base language makes each iteration step effective, and the iteration process terminates because there are only finitely many consequents to be added. However, with this method one has to consider all the (exponentially many) subsets of D. The two methods which we shall describe below try to avoid considering all subsets, thus making the search for (the sets of generating defaults of) all extensions more efficient.

4.1 Junker and Konolige's method

Junker and Konolige [12] translate a closed default theory (A, D) into a Truth Maintenance Network (TMN) à la Doyle [9]. The nodes of the TMN are the consequents C_D and the prerequisites and negated justifications L_D of the defaults. A default α : β1, ..., βn / γ of D is translated into a nonmonotonic justification (in(α), out(¬β1, ..., ¬βn) → γ) of the TMN. In order to supply the truth maintenance system with enough information about first-order derivability in the base language, the TMN additionally contains monotonic justifications.
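The translation just described can be sketched as follows. This is a minimal sketch under the assumption that assertional facts are represented as plain strings; all class and function names are ours, and the monotonic justifications encoding derivability are omitted here.

```python
# Sketch (our own naming): translating closed defaults into TMN-style
# nonmonotonic justifications, as in Junker and Konolige's method.
# A closed default alpha : beta_1, ..., beta_n / gamma becomes the
# justification (in(alpha), out(not beta_1, ..., not beta_n) -> gamma).
from dataclasses import dataclass

@dataclass(frozen=True)
class Default:
    prerequisite: str      # alpha
    justifications: tuple  # (beta_1, ..., beta_n)
    consequent: str        # gamma

@dataclass(frozen=True)
class NonmonotonicJustification:
    in_node: str           # node that must be labelled "in"
    out_nodes: tuple       # nodes that must be labelled "out"
    supports: str          # node supported by this justification

def negate(fact: str) -> str:
    # purely syntactic negation; the concept language is closed under negation
    return fact[4:-1] if fact.startswith("not(") else f"not({fact})"

def translate(defaults):
    """Return the TMN nodes and nonmonotonic justifications of a default set."""
    nodes, justifications = set(), []
    for d in defaults:
        nodes.add(d.prerequisite)                           # prerequisites ...
        nodes.add(d.consequent)                             # ... consequents ...
        nodes.update(negate(b) for b in d.justifications)   # ... negated justifications
        justifications.append(NonmonotonicJustification(
            in_node=d.prerequisite,
            out_nodes=tuple(negate(b) for b in d.justifications),
            supports=d.consequent))
    return nodes, justifications

d = Default("doctor(Bill)", ("rich(Bill)",), "rich(Bill)")
nodes, js = translate([d])
```

A full implementation would additionally compute, for each node in L_D, the monotonic justifications described in the text.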

These justifications are of the form (in(Q) → q), where q ∈ L_D and Q is a minimal subset of C_D such that A ∪ Q entails q, i.e., A ∪ Q ⊨ q but A ∪ Q' ⊭ q for every proper subset Q' of Q. Junker and Konolige show that there is a 1-1-correspondence between admissible labellings of the TMN thus obtained and extensions of the default theory, and they describe an algorithm which computes all admissible labellings of a TMN. Given such an admissible labelling, the set of generating defaults of the corresponding extension consists of the defaults whose consequents are labelled "in." They use this characterization for computing extensions of propositional default theories.

In order to make the translation of terminological default theories into TMNs effective, one has to show how to compute the above mentioned monotonic justifications of the TMN. First note that the elements of L_D ∪ C_D are admissible assertional facts. This is obvious for the prerequisites and the consequents of our instantiated defaults, and for the negated justifications it follows from the fact that the concept language has negation as an operator. Because A ∪ Q for a subset Q of C_D is an admissible ABox of our language, deciding whether A ∪ Q ⊨ q for q ∈ L_D is an ordinary instantiation problem. As mentioned in Section 2, the instantiation problem is decidable for our language. In fact, A ∪ Q entails an assertional fact C(a) iff A ∪ Q ∪ {¬C(a)} is inconsistent. For this reason, we need a solution of the following problem: Let A, B be ABoxes. Find all minimal subsets Q of B such that A ∪ Q is inconsistent. To obtain all monotonic justifications, one has to compute the corresponding minimal sets for all elements q in L_D. A brute force algorithm could just compute all subsets Q of C_D such that A ∪ Q ⊨ q, and then eliminate the ones which are not minimal. Of course, this simple algorithm is very inefficient, and thus not appropriate for actual implementations. Since a similar algorithmic problem has to be solved for the method obtained from Schwind and Risch's characterization of an extension, we defer the description of a more efficient solution of this problem to a separate section.

A characteristic feature of Junker and Konolige's method is that, after the computation of the minimal sets Q, it completely abstracts from derivability in the base language. This may be advantageous from a conceptual point of view, but it can be problematic from the algorithmic point of view: each prerequisite and negated justification of a default gives rise to several monotonic justifications of the TMN, even though this information may not contribute to the computation of an extension.

4.2 A method based on a theorem by Schwind and Risch

Schwind and Risch [25] give a theorem which characterizes those subsets D̂ of D which are sets of generating defaults of an extension of a closed default theory (W, D). In this subsection, we shall show how to apply the theorem to computing extensions of terminological default theories. Before we can formulate the theorem we need one more piece of notation.

Definition 4. Let W be a set of closed formulae, and D be a set of closed

defaults. We define D0 = ∅ and, for i ≥ 0,

Di+1 = Di ∪ { d = α : β1, ..., βn / γ ∈ D | W ∪ Con(Di) ⊨ α }.

Then D is called grounded in W iff D = ⋃_{i≥0} Di. This definition of groundedness differs from the one given in [25], but it is easy to see that both formulations are equivalent. The advantage of our formulation is that it can directly be used as a procedure for deciding groundedness: even if D is not grounded in W, the set ⋃_{i≥0} Di is the largest subset of D that is grounded in W, and if D is finite this set can effectively be computed. The iteration process described above corresponds to the iteration in the definition of extensions, with the main difference that it disregards the justifications. The second condition given in the following theorem makes up for this neglect.

Theorem 5 (Schwind and Risch). Let (W, D) be a closed default theory. A subset D̂ of D is a set of generating defaults of an extension of (W, D) iff the following two conditions hold:
1. D̂ is grounded in W.
2. For all d ∈ D with d = α : β1, ..., βn / γ we have d ∈ D̂ iff W ∪ Con(D̂) ⊨ α and, for all i, 1 ≤ i ≤ n, W ∪ Con(D̂) ⊭ ¬βi.

If D is finite, and the entailment problem in the base language is decidable, this theorem provides us with an effective test of whether a subset D̂ of D is a set of generating defaults of an extension of (W, D). We shall now describe a method based on this theorem which allows us to compute (the sets of generating defaults of) all extensions without having to consider all subsets of D.

If W is inconsistent then there is only one extension, namely the set of all formulae. In the following, we shall thus without loss of generality assume that W is consistent. Since W is assumed to be consistent, extensions are consistent as well. Now let D0 be the largest subset of D that is grounded in W, and let D1, ..., Dk be all maximal subsets of D0 such that W ∪ Con(Di) is consistent. It follows that a set of generating defaults of an extension is a subset of one of the Di. The idea underlying our method is to start with these maximal sets Di, and successively eliminate defaults violating the first condition of the theorem or the "if" part of the second condition. If no more defaults can be eliminated, the "only if" part of the second condition is tested. Figure 1 describes the procedure for computing all extensions of a closed default theory.

To show soundness and completeness of the procedure (Theorem 9) we need three lemmas.

Lemma 6. Let (W, D) be a closed default theory, and let D' ⊆ D be such that W ∪ Con(D') is consistent. Suppose the call Remove-Defaults(W, D, D') returns the list L of sets of defaults. If D0 ∈ L then D0 is a set of generating defaults for an extension of (W, D).

Compute-All-Extensions(W, D)
begin
(1) if W is inconsistent
(2) then print "Inconsistent world description"
(3) else for all maximal subsets D' of D such that W ∪ Con(D') is consistent
(4)   do Remove-Defaults(W, D, D')
end

Remove-Defaults(W, D, D')
begin
(1) let D0 be the largest subset of D' that is grounded in W;
(2) if W ∪ Con(D0) ⊨ ¬βi for some justification βi ∈ Jus(D0)
(3) then let d = α : β1, ..., βn / γ be the corresponding default;
(4)   Remove-Defaults(W, D, D0 \ {d});
(5)   for all maximal subsets D'' of D0 such that d ∈ D'' and W ∪ Con(D'') ⊭ ¬βi
(6)   do Remove-Defaults(W, D, D'')
(7) else if for each α : β1, ..., βn / γ ∈ D \ D0 either W ∪ Con(D0) ⊭ α
(8)   or W ∪ Con(D0) ⊨ ¬βi for some i
(9) then add D0 to the list of sets of generating defaults
end

Fig. 1. Procedure for computing the sets of generating defaults of all extensions of the closed default theory (W, D). Proviso: D is finite and entailment in the base language is decidable.

Proof (of Lemma 6). We prove this lemma by showing that a set D0 of defaults contained in L satisfies Conditions 1 and 2 of Theorem 5. Suppose that D0 is contained in L. It is easy to see that D0 is a subset of D' that is grounded in W (because of line (1)), which shows that Condition 1 of Theorem 5 holds for D0.

To show that D0 satisfies the second condition of Theorem 5, first assume that d = α : β1, ..., βn / γ ∈ D0. Recall that D0 is grounded in W, which implies that W ∪ Con(D0) ⊨ α. Furthermore, observe that, for all i, 1 ≤ i ≤ n, W ∪ Con(D0) ⊭ ¬βi (because the condition in line (2) does not hold for D0). Both facts together show that the "only if" part of Condition 2 holds. Now assume that d = α : β1, ..., βn / γ ∈ D \ D0. Then either W ∪ Con(D0) ⊭ α, or W ∪ Con(D0) ⊨ ¬βi for some i (because the condition in lines (7) and (8) holds for D0). This shows that the "if" part of Condition 2 is also satisfied. □
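The procedure of Figure 1 can be turned into a runnable sketch. The following Python code (all names are ours) instantiates it for a deliberately trivial propositional stand-in for the base language, in which formulae are literals such as "p" or "-p", a set is consistent iff it contains no complementary pair, and entailment is membership; a real implementation would plug in the terminological decision procedures instead.

```python
# Runnable sketch of Compute-All-Extensions / Remove-Defaults (Fig. 1)
# over a trivial propositional "base logic" (our own simplification).
from itertools import combinations

def neg(l): return l[1:] if l.startswith("-") else "-" + l
def consistent(fs): return not any(neg(l) in fs for l in fs)
def entails(fs, l): return l in fs          # deliberately trivial entailment

class Default:
    def __init__(self, pre, jus, con):
        self.pre, self.jus, self.con = pre, tuple(jus), con

def con_of(ds): return {d.con for d in ds}

def grounded_part(W, ds):
    """Largest subset of ds grounded in W (iteration of Definition 4)."""
    done, rest, changed = [], list(ds), True
    while changed:
        changed = False
        for d in list(rest):
            if entails(W | con_of(done), d.pre):
                done.append(d); rest.remove(d); changed = True
    return done

def maximal_subsets(ds, pred):
    """All subset-maximal sublists of ds satisfying pred (brute force)."""
    result = []
    for k in range(len(ds), -1, -1):
        for c in combinations(ds, k):
            if pred(set(c)) and not any(set(c) < set(r) for r in result):
                result.append(list(c))
    return result

def remove_defaults(W, D, Dprime, out):
    D0 = grounded_part(W, Dprime)                      # line (1)
    facts = W | con_of(D0)
    bad = [(d, b) for d in D0 for b in d.jus if entails(facts, neg(b))]
    if bad:                                            # line (2): contradicted justification
        d, b = bad[0]
        remove_defaults(W, D, [x for x in D0 if x is not d], out)      # line (4)
        for D2 in maximal_subsets(D0, lambda s: d in s and
                                  not entails(W | con_of(s), neg(b))):
            remove_defaults(W, D, D2, out)             # lines (5)-(6)
    else:                                              # lines (7)-(9)
        if all(not entails(facts, d.pre) or
               any(entails(facts, neg(b)) for b in d.jus)
               for d in D if d not in D0):
            if set(con_of(D0)) not in [set(con_of(x)) for x in out]:
                out.append(D0)

def compute_all_extensions(W, D):
    if not consistent(W):
        return "inconsistent world description"
    out = []
    for Dp in maximal_subsets(D, lambda s: consistent(W | con_of(s))):
        remove_defaults(W, D, Dp, out)
    return out
```

On the classical "Nixon diamond" (W = {"q", "r"} with defaults q : p / p and r : -p / -p) the sketch produces two sets of generating defaults, one per extension.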

Lemma 7. Let D0 be a set of generating defaults for an extension of a closed default theory (W, D), and let D' be a subset of D such that D0 ⊆ D' and W ∪ Con(D') is consistent. If Remove-Defaults(W, D, D') recursively calls Remove-Defaults, then there is such a call with arguments W, D, D'' where D0 ⊆ D'' ⊂ D'.

Proof. Suppose Remove-Defaults is called with arguments W, D, D', and let D0' be the largest subset of D' that is grounded in W. Then D0 ⊆ D0', because every set of generating defaults for an extension of (W, D) is grounded in W. If the condition in line (2) does not hold for D0', Remove-Defaults is obviously not called recursively, and nothing has to be shown. Thus assume that the condition in line (2) holds for D0', i.e., there is a default d = α : β1, ..., βn / γ ∈ D0' such that W ∪ Con(D0') ⊨ ¬βi for some i, 1 ≤ i ≤ n. If d ∉ D0 we have D0 ⊆ D0' \ {d} ⊂ D', and the call of Remove-Defaults with input (W, D, D0' \ {d}) (cf. line (4)) satisfies the required property. Now assume that d ∈ D0. Since D0 is a set of generating defaults for an extension we know that W ∪ Con(D0) ⊭ ¬βi. Thus there is a maximal subset D'' of D0' with d ∈ D'' and W ∪ Con(D'') ⊭ ¬βi that contains D0, and the corresponding call Remove-Defaults(W, D, D'') (cf. lines (5) and (6)) has the required property. □

Lemma 8. Let D0 be a set of generating defaults for an extension of a closed default theory (W, D), and let D' be a subset of D such that D0 ⊆ D' and W ∪ Con(D') is consistent. Suppose the call Remove-Defaults(W, D, D') does not recursively call Remove-Defaults. Then D0 is added to the list of sets of generating defaults.

Proof. Let D0' be the largest subset of D' that is grounded in W. We show that D0' = D0. As in the proof of Lemma 7 we have D0 ⊆ D0', and thus we only have to show D0' ⊆ D0. Assume to the contrary that D0' \ D0 ≠ ∅. Since D0' is grounded in W, there is a sequence d1, d2, ... of defaults in which every element of D0' occurs and such that W ∪ Con({d1, ..., d_{k-1}}) ⊨ αk, where αk is the prerequisite of the k-th default. Let l be the smallest number such that dl ∈ D0' \ D0. Thus dj ∈ D0 for all j, 1 ≤ j < l, which shows that W ∪ Con(D0) ⊨ αl. Furthermore, since the call does not recursively call Remove-Defaults, the condition in line (2) does not hold for D0', i.e., we have W ∪ Con(D0') ⊭ ¬βi for all justifications βi ∈ Jus(D0'). Since Con(D0) ⊆ Con(D0'), we especially know that W ∪ Con(D0) ⊭ ¬βi for all justifications βi of dl. Thus, we have shown that there is some default dl = α : β1, ..., βn / γ ∈ D \ D0 such that W ∪ Con(D0) ⊨ α and W ∪ Con(D0) ⊭ ¬βi for all i. Because of Theorem 5 this is a contradiction to our assumption that D0 is a

set of generating defaults. Therefore the assumption D0' \ D0 ≠ ∅ is falsified, and we can conclude that D0' = D0. Recall that W ∪ Con(D0) is consistent. Since the call does not recursively call Remove-Defaults, the else branch in line (7) is entered, and since D0 is a set of generating defaults, the condition in lines (7) and (8) holds for D0 (cf. Condition 2 of Theorem 5). Thus D0 is added to the list of sets of generating defaults. □

Now we are ready to prove soundness and completeness of our algorithm.

Theorem 9. The call Compute-All-Extensions(W, D) computes the sets of generating defaults for all extensions of the closed default theory (W, D).

Proof. First we observe that every set of defaults computed by the algorithm is in fact a set of generating defaults for an extension of (W, D) (cf. Lemma 6). Now assume that D0 is a set of generating defaults for an extension of (W, D). Thus there is a maximal subset D' of D such that W ∪ Con(D') is consistent and D' contains D0, and the call Compute-All-Extensions(W, D) generates a call of Remove-Defaults with arguments W, D, D' (cf. lines (3) and (4) in the function Compute-All-Extensions). Lemma 7 shows that there is a sequence of calls of Remove-Defaults such that W, D, Ci are the arguments of the i-th call, where C1 = D', Ci+1 ⊂ Ci, and D0 ⊆ Ci for all i. Since D is assumed to be finite and the Ci's are decreasing, there is some m ≥ 1 such that Remove-Defaults(W, D, Cm) does not generate a recursive call of Remove-Defaults. In this case D0 is added to the list of sets of defaults (Lemma 8). This shows that Compute-All-Extensions(W, D) computes the sets of generating defaults for all extensions. □

The functions Compute-All-Extensions and Remove-Defaults use the following subprocedures which have not explicitly been described:
1. Decide whether W is consistent.
2. Compute all maximal subsets D' of D such that W ∪ Con(D') is consistent.
3. Compute the largest subset D0 of D' that is grounded in W.
4. Compute all maximal subsets D'' of D0 such that W ∪ Con(D'') ⊭ ¬βi.

The first subprocedure is a direct application of the decision algorithm for entailment in the base language, and the third subprocedure is simply obtained by implementing the definition of groundedness. The other two subprocedures depend on an algorithm for the following problem, which will be considered in the next section: Let A, B be ABoxes. Compute all maximal subsets Q of B such that A ∪ Q is consistent. In fact, the second subprocedure is a direct application of such an algorithm. For the fourth subprocedure, note that W ∪ Con(D'') ⊭ ¬βi iff W ∪ Con(D'') ∪ {βi} is consistent.
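The reductions used for the entailment tests above can be sketched with the same trivial propositional stand-in for the base language (all names are ours; a real implementation would call the terminological consistency procedure):

```python
# Sketch (our names): reducing entailment and non-entailment tests to
# consistency tests of the base language. The "base logic" here is a toy:
# formulae are literals, and a set is consistent iff it contains no
# complementary pair.
def neg(l): return l[1:] if l.startswith("-") else "-" + l
def consistent(fs): return not any(neg(l) in fs for l in fs)

def entails(fs, l):
    # fs entails l  iff  fs together with the negation of l is inconsistent
    return not consistent(set(fs) | {neg(l)})

def justification_ok(W, con_D2, beta):
    # W + Con(D'') does not entail the negation of beta
    # iff  W + Con(D'') + {beta} is consistent
    return consistent(set(W) | set(con_D2) | {beta})
```

Note that, unlike the membership test used in the earlier sketch, this consistency-based `entails` correctly makes an inconsistent set entail everything.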

5 Computing Minimal Inconsistent and Maximal Consistent ABoxes

This section is concerned with the following algorithmic problems: Given two ABoxes A and B, find all minimal (resp. maximal) subsets Q of B such that A ∪ Q is inconsistent (resp. consistent). Since consistency of ABoxes in ALC is decidable, there is the obvious "brute-force" solution which tests consistency of A ∪ Q for all subsets Q of B, and then takes the minimal inconsistent (maximal consistent) ones. In the following we shall describe a more efficient method of finding these minimal (maximal) sets. The method is an extension of the tableaux-based consistency algorithms for ABoxes described in [1, 11]. The idea of employing tableaux-based methods for such purposes was already used in [17, 25], but these papers restricted themselves to propositional logic, which is a much easier case.

In order to decide whether an ABox A is consistent, the tableaux-based consistency algorithm tries to generate a finite model of A. In principle, it starts with A, and adds new assertional facts with the help of certain rules until the obtained ABox is "complete," i.e., until one can apply no more rules. Because of the presence of disjunction in our language, a given ABox must sometimes be transformed into two different new ABoxes, with the intended meaning that the original ABox is consistent iff one of the new ABoxes is consistent. Formally, this means that one is working with sets of ABoxes instead of a single ABox.

Without loss of generality we assume that the concept terms occurring in A are in negation normal form, i.e., negation occurs only directly in front of concept names. Negation normal forms can be generated using the fact that the following pairs of concept terms are equivalent: ¬¬C and C, ¬(C ⊓ D) and ¬C ⊔ ¬D, ¬(C ⊔ D) and ¬C ⊓ ¬D, ¬(∃R.C) and ∀R.¬C, as well as ¬(∀R.C) and ∃R.¬C. An obvious contradiction consisting of facts A(b), ¬A(b) for an individual name b and a concept name A will also be called a "clash" in the following.

Figure 2 describes the transformation rules of the tableaux-based consistency algorithm for ALC. The following facts make clear why the rules of Figure 2 provide us with a decision procedure for consistency of ABoxes of ALC (see [11, 1] for a proof).

Proposition 10.
1. If A1 is obtained from A0 by application of the conjunction, exists-restriction, or value-restriction rule then A0 is consistent iff A1 is consistent.
2. If A1, A2 are obtained from A0 by application of the disjunction rule then A0 is consistent iff A1 or A2 is consistent.
3. The transformation process always terminates.
4. A complete ABox, i.e., an ABox to which no more rules apply, is consistent iff it does not contain a clash.
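The equivalences above give a straightforward recursive procedure for computing negation normal forms. The following sketch uses our own tuple representation for concept terms; the representation and all names are assumptions of this sketch, not the paper's notation.

```python
# Sketch: pushing negation inwards until it occurs only in front of
# concept names. Concept terms are nested tuples ("not", C),
# ("and", C, D), ("or", C, D), ("exists", R, C), ("forall", R, C),
# or a concept name given as a plain string.
def nnf(c):
    if isinstance(c, str):
        return c
    op = c[0]
    if op != "not":
        # recurse below a non-negation operator
        return (op, c[1], nnf(c[2])) if op in ("exists", "forall") \
               else (op, nnf(c[1]), nnf(c[2]))
    d = c[1]                      # we are looking at ("not", d)
    if isinstance(d, str):
        return ("not", d)         # negation directly in front of a concept name
    if d[0] == "not":             # double negation elimination
        return nnf(d[1])
    if d[0] == "and":             # negated conjunction becomes disjunction
        return ("or", nnf(("not", d[1])), nnf(("not", d[2])))
    if d[0] == "or":              # negated disjunction becomes conjunction
        return ("and", nnf(("not", d[1])), nnf(("not", d[2])))
    if d[0] == "exists":          # negated exists-restriction
        return ("forall", d[1], nnf(("not", d[2])))
    if d[0] == "forall":          # negated value-restriction
        return ("exists", d[1], nnf(("not", d[2])))
```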

Let M be a finite set of ABoxes, and let A0 be an element of M. The following rules replace A0 by an ABox A1 or by two ABoxes A1 and A2.

The conjunction rule. Assume that (C ⊓ D)(a) is in A0, and that A0 does not contain both assertions C(a) and D(a). The ABox A1 is obtained from A0 by adding C(a) and D(a).

The disjunction rule. Assume that (C ⊔ D)(a) is in A0, and that A0 contains neither C(a) nor D(a). The ABox A1 is obtained from A0 by adding C(a), and the ABox A2 is obtained from A0 by adding D(a).

The exists-restriction rule. Assume that (∃R.C)(a) is in A0, and that A0 does not contain assertions R(a, c) and C(c) for some individual c. One generates a new individual name b, and obtains A1 from A0 by adding R(a, b) and C(b).

The value-restriction rule. Assume that (∀R.C)(a) and R(a, b) are in A0, and that A0 does not contain the assertion C(b). The ABox A1 is obtained from A0 by adding C(b).

Fig. 2. Transformation rules of the consistency algorithm for ALC.

To check whether a given ABox A is consistent one thus starts with {A}, and applies transformation rules (in arbitrary order) as long as possible. Eventually, this yields a finite set M of complete ABoxes with the property that A is consistent iff one of the ABoxes in M is consistent. Since the elements of M are complete, their consistency can simply be decided by looking for an obvious contradiction.

Now assume that A, B are ABoxes, and that we want to find all minimal (resp. maximal) subsets Q of B such that A ∪ Q is inconsistent (resp. consistent). We start with applying the tableaux-based consistency algorithm to A ∪ B. Let A1, ..., Am be the complete ABoxes obtained this way. If one of these is not obviously contradictory, then A ∪ B is consistent; in this case there are no minimal inconsistent sets to compute (resp. B itself is the maximal consistent set). Otherwise, it is important to know which facts in B contribute to a particular obvious contradiction. Intuitively, we want to know which elements of B can be dispensed with without destroying the property that all complete ABoxes contain an obvious contradiction (resp. which elements of B have to be removed to get at least one complete ABox without obvious contradiction). To this purpose we introduce a propositional variable for each element of B, and label assertional facts with "monotonic" boolean formulae built from these variables, i.e., propositional formulae built from the variables by using conjunction and disjunction only. In the original ABox A ∪ B, the elements of A are labelled with "true," and the elements of B are labelled with the corresponding propositional variable.
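The unlabelled algorithm of Figure 2 can be sketched as follows, assuming concept terms are already in negation normal form. The tuple representation of concepts and assertions and all names are our own assumptions.

```python
# Runnable sketch of the tableaux-based consistency test (Figure 2).
# Concepts: nested tuples ("and", C, D), ("or", C, D), ("exists", R, C),
# ("forall", R, C), ("not", A), or a concept name "A". An ABox is a
# frozenset of assertions ("concept", C, a) and ("role", R, a, b).
from itertools import count

fresh = count()  # supply of new individual names

def clash(abox):
    return any(f[0] == "concept" and isinstance(f[1], str)
               and ("concept", ("not", f[1]), f[2]) in abox for f in abox)

def step(abox):
    """Apply one rule; return successor ABoxes, or None if complete."""
    for f in abox:
        if f[0] != "concept" or isinstance(f[1], str) or f[1][0] == "not":
            continue
        op, a = f[1][0], f[2]
        if op == "and":                           # conjunction rule
            add = {("concept", f[1][1], a), ("concept", f[1][2], a)}
            if not add <= abox:
                return [abox | add]
        elif op == "or":                          # disjunction rule
            c1, c2 = ("concept", f[1][1], a), ("concept", f[1][2], a)
            if c1 not in abox and c2 not in abox:
                return [abox | {c1}, abox | {c2}]
        elif op == "exists":                      # exists-restriction rule
            R, C = f[1][1], f[1][2]
            if not any(g[0] == "role" and g[1] == R and g[2] == a
                       and ("concept", C, g[3]) in abox for g in abox):
                b = f"x{next(fresh)}"
                return [abox | {("role", R, a, b), ("concept", C, b)}]
        elif op == "forall":                      # value-restriction rule
            R, C = f[1][1], f[1][2]
            for g in abox:
                if g[0] == "role" and g[1] == R and g[2] == a \
                        and ("concept", C, g[3]) not in abox:
                    return [abox | {("concept", C, g[3])}]
    return None

def consistent(abox):
    worklist = [frozenset(abox)]
    while worklist:
        current = worklist.pop()
        succ = step(current)
        if succ is None:              # complete: consistent iff clash-free
            if not clash(current):
                return True
        else:
            worklist.extend(frozenset(s) for s in succ)
    return False
```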

More formally, we shall now describe a labelled consistency algorithm for ABoxes A ∪ B consisting of "hard" facts A and of "refutable" facts B. Without loss of generality we assume that the concept terms occurring in A ∪ B are in negation normal form. Initially, the elements of A ∪ B are labelled with monotonic boolean formulae as described above. Starting with the singleton set {A ∪ B}, the transformation rules of Figure 3 are applied as long as possible. If, during the consistency test, assertional facts with labels φ1, ..., φn give rise to a new fact, the new one is labelled by φ1 ∧ ... ∧ φn. Since the same assertional fact may arise in more than one way, we also get disjunctions in labels.

As for the unlabelled consistency algorithm, there cannot be an infinite chain of rule applications. This can, for example, be shown by a straightforward adaptation to the labelled case of the termination ordering used in [1]. Thus the labelled consistency algorithm also terminates, and we again end up with complete ABoxes A1, ..., Am, i.e., labelled ABoxes to which no rules apply; but now all assertional facts occurring in these ABoxes have labels. The labels occurring in these ABoxes can be used to describe which of the original facts in B are responsible for the obvious contradictions. We shall refer to the label of an assertional fact α by ind(α).

A particular clash A(a), ¬A(a) ∈ Ai is expressed by the propositional formula ind(A(a)) ∧ ind(¬A(a)). We have used conjunction when expressing a single clash because both assertional facts are necessary for the contradiction. Now recall that we need at least one clash in each of the complete ABoxes to have inconsistency. This explains why disjunction is used to combine the formulae expressing the clashes of one complete ABox, and why the formulae corresponding to the different complete ABoxes are combined with the help of conjunction.

Definition 11 (Clash formula). Let A1, ..., Am be the complete ABoxes obtained by applying the labelled consistency algorithm to A ∪ B, and let φ_i1, ..., φ_ik_i be the formulae expressing all the clashes in Ai. The clash formula associated with A ∪ B is

(φ_11 ∨ ... ∨ φ_1k_1) ∧ ... ∧ (φ_m1 ∨ ... ∨ φ_mk_m).

Proposition 12. Let φ be the clash formula associated with A ∪ B, let Q ⊆ B, and let w be the valuation which replaces the propositional variables corresponding to elements of Q by "true" and the others by "false." Then A ∪ Q is inconsistent iff φ evaluates to "true" under w.

Before proving this proposition we point out how the clash formula can be used to find minimal (resp. maximal) subsets Q of B such that A ∪ Q is inconsistent (resp. consistent). By Proposition 12, such minimal (resp. maximal) sets directly correspond to minimal (resp. maximal) valuations making the clash formula "true" (resp. "false").

Let M be a finite set of labelled ABoxes, and let A0 be an element of M. The following rules replace A0 by an ABox A1 or by two ABoxes A1 and A2. These new ABoxes either contain additional assertional facts, or the indices of existing assertional facts are changed. In order to avoid having to distinguish between these two cases in the formulation of the rules, we introduce a new notation: that an ABox is extended by an assertional fact with index φ means the following. If this fact is already present with index ψ, we just change its index to ψ ∨ φ. Otherwise, it is added to the ABox and gets index φ.

The conjunction rule. Assume that (C ⊓ D)(a) is in A0, and that A0 does not contain assertions C(a) and D(a) whose indices are both implied by ind((C ⊓ D)(a)). The ABox A1 is obtained by extending A0 by C(a) with index ind((C ⊓ D)(a)) and by D(a) with index ind((C ⊓ D)(a)).

The disjunction rule. Assume that (C ⊔ D)(a) is in A0, and that A0 does not contain C(a) or D(a) whose index is implied by ind((C ⊔ D)(a)). The ABox A1 is obtained by extending A0 by C(a) with index ind((C ⊔ D)(a)), and the ABox A2 is obtained by extending A0 by D(a) with index ind((C ⊔ D)(a)).

The exists-restriction rule. Assume that (∃R.C)(a) is in A0, and that A0 does not contain assertions R(a, c) and C(c) whose indices are both implied by ind((∃R.C)(a)). One generates a new individual name b, and obtains A1 by extending A0 by R(a, b) and C(b), both with index ind((∃R.C)(a)).

The value-restriction rule. Assume that (∀R.C)(a) and R(a, b) are in A0, and that A0 does not contain an assertion C(b) whose index is implied by ind((∀R.C)(a)) ∧ ind(R(a, b)). The ABox A1 is obtained by extending A0 by C(b) with index ind((∀R.C)(a)) ∧ ind(R(a, b)).

Fig. 3. Transformation rules of the labelled consistency algorithm for ALC.

Here "minimal" and "maximal" for valuations is meant with respect to the partial ordering w1 ≤ w2 iff w1(pi) ≤ w2(pi) for all propositional variables pi, where we assume that "false" is smaller than "true."

It is easy to see that the problem of finding the maximal valuations making a monotonic boolean formula "false" can be reduced to the problem of finding the minimal valuations making a monotonic boolean formula "true." In fact, for a given monotonic boolean formula φ, let φ^d denote the formula obtained from φ by replacing conjunction by disjunction and vice versa, and for a valuation w, let w^d denote the valuation obtained from w by replacing "true" by "false" and vice versa. Then w is a maximal valuation making φ "false" iff w^d is a minimal valuation making φ^d "true."
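To illustrate how the clash formula and the duality above are used, here is a small sketch with our own representation of monotonic boolean formulae. Minimal valuations are found by brute force here, whereas an actual implementation would use the optimized method of [23].

```python
# Sketch (our own naming): monotonic boolean formulae are nested tuples
# ("and", f, g), ("or", f, g), or a propositional variable given as a
# string; a valuation is the set of variables mapped to "true."
from itertools import combinations

def ev(f, true_vars):
    if isinstance(f, str):
        return f in true_vars
    op, g, h = f
    return (ev(g, true_vars) and ev(h, true_vars)) if op == "and" \
        else (ev(g, true_vars) or ev(h, true_vars))

def variables(f):
    return {f} if isinstance(f, str) else variables(f[1]) | variables(f[2])

def minimal_true_valuations(f):
    """Brute force: all minimal sets of variables making f evaluate to true."""
    vs, found = sorted(variables(f)), []
    for k in range(len(vs) + 1):
        for c in combinations(vs, k):
            if ev(f, set(c)) and not any(set(m) <= set(c) for m in found):
                found.append(set(c))
    return found

def dual(f):
    """Replace conjunction by disjunction and vice versa."""
    if isinstance(f, str):
        return f
    op, g, h = f
    return ("or" if op == "and" else "and", dual(g), dual(h))

# Example clash formula: one complete ABox with a single clash needing the
# refutable facts labelled p1 and p2. The minimal inconsistent subsets are
# the minimal valuations making phi true; the maximal consistent subsets
# are the complements of the minimal valuations making the dual true.
phi = ("and", "p1", "p2")
minimal_inconsistent = minimal_true_valuations(phi)
maximal_consistent = [variables(phi) - m
                      for m in minimal_true_valuations(dual(phi))]
```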

It should be noted that the problem of finding the minimal valuations that make a monotonic boolean formula φ "true" is NP-complete. In fact, if φ is in conjunctive normal form, this is just the well-known problem of finding minimal hitting sets [22, 10], which is NP-hard. On the other hand, if φ is in disjunctive normal form, the minimal valuations can be found in polynomial time. However, transforming a given monotonic boolean formula into disjunctive normal form may cause an exponential blow-up. To optimize the search for minimal valuations one can use the method described in [23].

The rules of the labelled consistency algorithm as described above have the unpleasant property that deciding whether or not a rule is applicable is an NP-hard problem, since the preconditions of the rules include an entailment test for monotonic boolean formulae. On the other hand, one can weaken the preconditions by testing only a necessary condition for entailment (e.g., occurrence of the index in the top-level disjunction) without destroying termination and the property stated in Proposition 12. In this case, the rules will in general produce longer formulae occurring as indices, but the test whether a rule applies becomes tractable.

Proof of Proposition 12. First we shall explain the connection between application of rules of the labelled consistency algorithm, starting with A ∪ B, and application of rules of the unlabelled algorithm, starting with A ∪ Q for Q ⊆ B. To get this correspondence, the conditions on applicability of the disjunction and the exists-restriction rules have to be weakened for the unlabelled algorithm:

The modified disjunction rule. Assume that (C ⊔ D)(a) is in A0, and that A0 does not contain both C(a) and D(a). The ABox A1 is obtained from A0 by adding C(a), and the ABox A2 is obtained from A0 by adding D(a).

The modified exists-restriction rule. Assume that (∃R.C)(a) is in A0. One generates a new individual name b, and obtains A1 from A0 by adding R(a, b) and C(b).

Since the modified exists-restriction rule can be applied infinitely often to the same fact (∃R.C)(a), the modified set of rules need no longer terminate. But it is easy to see that the first two properties stated in Proposition 10 still hold, and this will be sufficient for our purposes.

Definition 13. Let A0 be a labelled ABox, and let w be a valuation. The w-projection of A0 (for short, w(A0)) is obtained from A0 by removing all facts whose labels evaluate to "false."

In the following, the valuation w is assumed to be such that it replaces the variables corresponding to elements of Q by "true" and the others by "false." Obviously, this means that w(A ∪ B) = A ∪ Q. Now we shall show how application of a rule of the labelled consistency algorithm to a labelled ABox A0 corresponds to application of a rule of the unlabelled algorithm to w(A0).
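The w-projection of Definition 13 can be sketched as follows; the dictionary representation of a labelled ABox and all names are our own assumptions.

```python
# Sketch (our names): a labelled ABox is a dict mapping assertional facts
# to monotonic boolean labels (True, a variable string, or nested
# ("and", ...) / ("or", ...) tuples); w is given as the set of
# propositional variables evaluated to "true."
def ev(label, true_vars):
    if isinstance(label, bool):
        return label
    if isinstance(label, str):
        return label in true_vars
    op, g, h = label
    return (ev(g, true_vars) and ev(h, true_vars)) if op == "and" \
        else (ev(g, true_vars) or ev(h, true_vars))

def projection(labelled_abox, true_vars):
    """Remove all facts whose labels evaluate to 'false' under w."""
    return {fact for fact, label in labelled_abox.items()
            if ev(label, true_vars)}

abox = {"C(a)": True,                  # hard fact from A, labelled "true"
        "D(a)": "p1",                  # refutable fact from B
        "E(a)": ("and", True, "p2")}   # derived fact labelled by a conjunction
```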

Lemma 14. Let A0, A1 be labelled ABoxes such that A1 is obtained from A0 by application of the conjunction (resp. modified exists-restriction, value-restriction) rule. Then we either have w(A1) = w(A0), or w(A1) is obtained from w(A0) by application of the (unlabelled) conjunction (resp. modified exists-restriction, value-restriction) rule.

Proof. (1) Assume that the conjunction rule is applied to the assertional fact (C ⊓ D)(a), and that this fact has index φ in A0. First, consider the case where w(φ) = false. If C(a) (resp. D(a)) is an element of A0 with index ψ, then C(a) (resp. D(a)) has index ψ ∨ φ in A1. Since w(φ) = false we have w(ψ ∨ φ) = w(ψ), which shows that C(a) (resp. D(a)) is an element of w(A1) iff it is an element of w(A0). If C(a) (resp. D(a)) is not in A0, then it gets index φ in A1, and since w(φ) = false, C(a) (resp. D(a)) is not in w(A1). For this reason, we have w(A1) = w(A0). Thus assume that w(φ) = true. Then (C ⊓ D)(a) is an element of w(A0). Since A1 is obtained by extending A0 by C(a) and D(a), both with index φ, we also know that C(a) and D(a) are contained in w(A1). If both facts are already present in w(A0) we have w(A1) = w(A0). Otherwise, w(A1) can be obtained from w(A0) by applying the conjunction rule to (C ⊓ D)(a).

(2) Assume that the value-restriction rule is applied to the assertional facts (∀R.C)(a) and R(a, b), and that these facts respectively have index φ and ψ in A0. As for the conjunction rule, w(φ ∧ ψ) = false implies w(A1) = w(A0). Thus assume that w(φ ∧ ψ) = true. Then (∀R.C)(a) and R(a, b) are contained in w(A0). Since A1 is obtained by extending A0 by C(b) with index φ ∧ ψ, we know that C(b) is an element of w(A1). If this assertional fact is already present in w(A0) then w(A1) = w(A0). Otherwise, w(A1) can be obtained from w(A0) by applying the value-restriction rule to (∀R.C)(a) and R(a, b).

(3) Assume that the exists-restriction rule is applied to the assertional fact (∃R.C)(a) (without loss of generality we may assume that the newly generated individual is called b), and that this fact has index φ in A0. The case where w(φ) = false is again trivial. Thus assume that w(φ) = true. Then (∃R.C)(a) is an element of w(A0). Since A1 is obtained by extending A0 by R(a, b) and C(b), both with index φ, C(b) and R(a, b) are contained in w(A1). We can obtain w(A1) from w(A0) by applying the modified exists-restriction rule to (∃R.C)(a). It should be noted that the (unmodified) exists-restriction rule need not be applicable, since w(A0) may well contain an individual c and assertions C(c) and R(a, c). □

For the disjunction rule, we have a similar lemma.

Lemma 15. Let A0, A1, A2 be labelled ABoxes such that A1, A2 are obtained from A0 by application of the disjunction rule. Then we either have w(A1) = w(A0) = w(A2), or w(A1), w(A2) are obtained from w(A0) by application of the (unlabelled) modified disjunction rule.

(The other cases can be treated similarly.w(A2) from w(`40) by applying the modified disjunction rule to (C u D)(a).4") is consistent.49 Proof. [] Since ` 4 1 . Obviously. all the ABoxes w ( A 1 ) . In addition.40).w(. . . This can shown as in the corresponding cases in the proof of L e m m a 14. We consider an assertional fact (C r-1D)(a) in w(. Proof. .. but not both. . . This concludes the proof of Proposition 12. L a m i n a 16. Then (C U D)(a) is an element of w(`40).A(a) E Ai is still present in w(.40. and applying the rules of the labelled consistency algorithm as long as possible. we can obtain w(A1)..dl) = w(`4o) = w(`42).. If both C(a) and D(a) are already present in w(Ao) then w(`41) = w(`4o) = w(`42).-. By L e m m a 14 and 15...For this reason. and that this fact has index r in . . we know that C(a) is contained in r and that D(a} is contained in w(`42). Assume that the disjunction rule is applied to the assertionM fact (C U D)(a). [] Now assume that we have obtained the complete ABoxes `41. w(`41) contains a clash iffw evaluates vk'-_l r to "true. we know that w(`4 U B) = `4 U Q is consistent iff one of w(. and their indices (say r r are implied by r But then w(r = true implies w(r = true = w(r Thus C(a) and D(a) are contained in w(A0). . which shows that the conjunction rule is not applicable to (CR D)(a) in w(`4o).) iff w evaluates ind(A(a)) A ind(".w(M.-. Am are complete we thus know that w ( . ~o(`4~) are complete as well. ." Now let r r be the formulae expressing all the clashes in .40 contains the assertional facts C(a) and D(a). .Ai. If ~o(r = false then w(. . and since the (modified) rules of the unlabelled consistency algorithm preserve solvability. .) contain a clash iff w evaluates to true the clash formula i=1 j = l AV computed by the labelled consistency algorithm. . . 4 1 ) . Thus assume that w(r = true. and show that the conjunction rule cannot be applied to this fact in w(`40). 
Then none of the (unmodified) rules of the unlabelled consistency algorithm applies to w(Mo).. It should be noted that the (unmodified) disjunction rule need not be applicable since ~o(`40) m a y well contain one of C(a) and D(a). Now Proposition 10 implies that w (`4i) is inconsistent iff it contains a clash.A1). The next lemma implies that these projected ABoxes are also complete." For this reason.4 U B.. Let `40 be a labelled ABox to which none of the rules of the la- belled consistency algorithm applies.) Since (C n D)(a) is present in w(A0) its index r in -40 satisfies w(r = true.4. A particular clash A(a). Otherwise.-Am by starting with .A(a)) to "true. Completeness of `40 implies that the (labelled) conjunction rule is not applicable to (CV1D)(a) in -4o. .
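The projection mechanism behind these proofs can be made concrete with a small sketch. The following Python fragment is an illustration only: the DNF encoding of index formulas (a set of conjunctions, each a set of propositional variables) and all concrete names are our own assumptions, not the algorithm as implemented for the paper.

```python
# Sketch: projecting a labelled ABox under a valuation and testing the
# projected (unlabelled) ABox for a clash. Index formulas are assumed to be
# in DNF: a frozenset of conjunctions, each a frozenset of variable names.

TRUE = frozenset({frozenset()})  # the empty conjunction is always true


def evaluate(index, valuation):
    """A DNF index is true under a valuation (the set of variables assigned
    'true') iff at least one of its conjunctions is satisfied."""
    return any(conj <= valuation for conj in index)


def project(labelled_abox, valuation):
    """omega(A): the unlabelled ABox containing exactly those assertional
    facts whose index evaluates to true under the valuation."""
    return {fact for fact, index in labelled_abox.items()
            if evaluate(index, valuation)}


def has_clash(abox):
    """An unlabelled ABox contains a clash iff it asserts both A(a) and
    (not A)(a) for some concept name A and individual a."""
    return any(("not " + c, a) in abox
               for (c, a) in abox if not c.startswith("not "))


# Facts of the fixed ABox carry the index 'true'; facts stemming from
# default consequents carry the propositional variables p1, p2 as indices.
abox = {
    ("Bird", "tweety"): TRUE,
    ("Flies", "tweety"): frozenset({frozenset({"p1"})}),
    ("not Flies", "tweety"): frozenset({frozenset({"p2"})}),
}

print(has_clash(project(abox, {"p1"})))        # False
print(has_clash(project(abox, {"p1", "p2"})))  # True
```

The only clash pairs Flies(tweety) with (not Flies)(tweety), so the clash formula of this (already complete) example ABox is p1 ∧ p2; checking `has_clash` on each projection agrees with evaluating that formula directly, which is the content of the proposition for this one-ABox case.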

To sum up, we thus have presented a solution of the two algorithmic problems described at the beginning of this section. Hence the methods of Junker and Konolige and of Schwind and Risch for computing all extensions of a default theory can be applied. Together with the methods of Section 4 this gives us effective procedures to compute all extensions of terminological default theories.

6 Conclusion

We have investigated the integration of Reiter's default logic into a terminological representation formalism. We have shown that the treatment of open defaults by Skolemization is problematic. For this reason, we have considered a restricted semantics where default rules are only applied to individuals explicitly present in the knowledge base. This treatment of default rules is similar to the treatment of monotonic rules in many terminological systems, which means that users of such systems are already familiar with the effects this restriction to explicit individuals has. However, because of the nonmonotonic character of default rules, this restriction may sometimes lead to more consequences than would have been obtained without it.

With respect to the restricted semantics, we have shown how the algorithmic requirements for Junker and Konolige's method (i.e., the computation of minimal inconsistent sets of assertional facts) and for an optimized algorithm based on a theorem of Schwind and Risch (i.e., the computation of maximal consistent sets of assertional facts) can be solved by an extension of the tableaux-based algorithm for assertional reasoning, and we have proved the correctness of this solution, both from a semantic and an algorithmic point of view (see also [3]).

According to Reiter's semantics, the specificity of prerequisites of rules has no influence on the order in which default rules are supposed to fire. In [4] we describe a modification of terminological default logic in which more specific defaults are preferred.

As an alternative to the pragmatic solution described in the present paper, [5] proposes a new semantics for open defaults, in which defaults are also applied to implicit individuals. To make this possible without encountering the problems pointed out in Section 3, open defaults are not viewed as schemata for certain instantiated defaults. Instead, they are used to define a preference relation on models, which is then treated with a modified preferential approach.

Acknowledgements

We should like to thank Bernhard Nebel and Peter Patel-Schneider for helpful comments, and Thomas Trentz for implementing the procedure for computing extensions described in Subsection 4.2. This work has been supported by the German Ministry for Research and Technology (BMFT) under research contract ITW 92 01.
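For a flavour of what computing all extensions of a default theory amounts to in the simplest case, the following sketch enumerates the extensions of a tiny normal default theory over propositional literals. The literal-only "logic", the brute-force enumeration over firing orders, and all concrete names are simplifying assumptions of ours; the methods of Junker and Konolige and of Schwind and Risch discussed above are far more refined.

```python
from itertools import permutations


def consistent(lits):
    """A set of literals is consistent iff it contains no complementary
    pair l, 'not l'."""
    return not any(("not " + l) in lits
                   for l in lits if not l.startswith("not "))


def extensions(facts, defaults):
    """Enumerate the extensions of a normal default theory over literals.
    Each default is a pair (prerequisite, consequent); we simply try every
    firing order, which suffices in this toy literal setting."""
    exts = set()
    for order in permutations(defaults):
        e = set(facts)
        changed = True
        while changed:
            changed = False
            for pre, cons in order:
                # Fire a default if its prerequisite holds and its
                # consequent can be added consistently.
                if pre in e and cons not in e and consistent(e | {cons}):
                    e.add(cons)
                    changed = True
        exts.add(frozenset(e))
    return exts


# Classic example: birds normally fly, penguins normally do not.
facts = {"bird", "penguin"}
defaults = [("bird", "flies"), ("penguin", "not flies")]
for ext in sorted(extensions(facts, defaults), key=sorted):
    print(sorted(ext))
# ['bird', 'flies', 'penguin']
# ['bird', 'not flies', 'penguin']
```

The two firing orders yield the two extensions of the theory, one in which the bird default fires first and one in which the penguin default does, mirroring the multiple-extension behaviour of Reiter's default logic.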

References

1. F. Baader and P. Hanschke. A scheme for integrating concrete domains into concept languages. In Proceedings of the 12th International Joint Conference on Artificial Intelligence, Sydney, Australia, 1991.
2. F. Baader and P. Hanschke. A Scheme for Integrating Concrete Domains into Concept Languages. Research Report RR-91-10, DFKI Kaiserslautern, 1991.
3. F. Baader and B. Hollunder. Embedding defaults into terminological knowledge representation formalisms. In Proceedings of the 3rd International Conference on Knowledge Representation and Reasoning, Cambridge, Mass., 1992.
4. F. Baader and B. Hollunder. How to prefer more specific defaults in terminological default logic. Research Report RR-92-58, DFKI Saarbrücken, 1992. Also to appear in Proceedings of the 13th International Joint Conference on Artificial Intelligence, Chambery, France, 1993.
5. F. Baader and K. Schlechta. A semantics for open normal defaults via a modified preferential approach. Research Report RR-93-13, DFKI Saarbrücken, 1993.
6. R. J. Brachman. 'I lied about the trees' or, defaults and definitions in knowledge representation. The AI Magazine, 6(3):80-93, 1985.
7. R. J. Brachman and J. G. Schmolze. An overview of the KL-ONE knowledge representation system. Cognitive Science, 9(2):171-216, 1985.
8. R. J. Brachman, D. L. McGuinness, P. F. Patel-Schneider, L. A. Resnick, and A. Borgida. Living with CLASSIC: When and how to use a KL-ONE-like language. In J. Sowa, editor, Principles of Semantic Networks, pages 401-456. Morgan Kaufmann, San Mateo, Calif., 1991.
9. J. Doyle. A truth maintenance system. Artificial Intelligence, 12:231-272, 1979.
10. M. R. Garey and D. S. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. Freeman, San Francisco, Calif., 1979.
11. B. Hollunder. Hybrid inferences in KL-ONE-based knowledge representation systems. In Proceedings of the 14th German Workshop on Artificial Intelligence, Eringerfeld, Germany, volume 251 of Informatik-Fachberichte, pages 38-47. Springer, 1990.
12. U. Junker and K. Konolige. Computing the extensions of autoepistemic and default logics with a truth maintenance system. In Proceedings of the 8th National Conference on Artificial Intelligence, pages 278-283, Boston, Mass., 1990.
13. H. Kautz and B. Selman. Hard problems for simple defaults. In Proceedings of the 1st International Conference on Principles of Knowledge Representation and Reasoning, pages 189-197, Toronto, Ont., 1989.
14. A. Kobsa. The SB-ONE knowledge representation workbench. In Preprints of the Workshop on Formal Aspects of Semantic Networks, Two Harbours, Cal., 1989.
15. E. Mays and R. Dionne. Making KR systems useful. Statement of interest for the 2nd International Workshop on Terminological Logics, 1991.
16. J. McCarthy. Circumscription - a form of non-monotonic reasoning. Artificial Intelligence, 13:27-39, 1980.
17. D. McDermott and J. Doyle. Non-monotonic logic I. Artificial Intelligence, 13:41-72, 1980.
18. R. McGregor. Statement of interest. In International Workshop on Terminological Logics, page 186, 1991.
19. B. Nebel, C. Peltason, and K. von Luck, editors. International Workshop on Terminological Logics. Document D-91-13, DFKI Kaiserslautern, 1991.
20. C. Peltason, K. v. Luck, and C. Kindermann (Org.). Terminological Logic Users Workshop - Proceedings. KIT-Report 95, TU Berlin, 1991.
21. R. Reiter. A logic for default reasoning. Artificial Intelligence, 13(1-2):81-132, 1980.
22. R. Reiter. A theory of diagnosis from first principles. Artificial Intelligence, 32:57-95, 1987.
23. R. Rymon. Search through systematic set enumeration. In Proceedings of the 3rd International Conference on Knowledge Representation and Reasoning, Cambridge, Mass., 1992.
24. M. Schmidt-Schauß and G. Smolka. Attributive concept descriptions with complements. Artificial Intelligence, 48:1-26, 1991.
25. C. Schwind and V. Risch. A tableau-based characterisation for default logic. In Proceedings of the 1st European Conference on Symbolic and Quantitative Approaches for Uncertainty, pages 310-317, Marseille, France, 1991.
