

CONTENTS

1. Abstract
2. Definition and Scope
3. What is an ontology (Conceptualization)?
4. What is a biological ontology?
5. Components of Ontology
   Individuals
   Classes
   Attributes
   Relationships
6. Classification: Traditional ontology languages
7. Principles for the Design of Ontologies Used for Knowledge Sharing
8. Design criteria for ontologies
9. Ontologies Advantages
10. Ontology Example
11. Use of ontologies
12. Conclusion
13. References


Abstract
The current interest in ontologies is the latest version of AI's alternation of focus between content theories and mechanism theories. Sometimes, the AI community gets excited by some mechanism such as rule systems, frame languages, neural nets, fuzzy logic, constraint propagation, or unification. The mechanisms are proposed as the secret of making intelligent machines. At other times, we realize that, however wonderful the mechanism, it cannot do much without a good content theory of the domain on which it is to work. Moreover, we often recognize that once a good content theory is available, many different mechanisms might be used equally well to implement effective systems, all using essentially the same content.

Definition and Scope


The subject of ontology is the study of the categories of things that exist or may exist in some domain. The product of such a study, called an ontology, is a catalog of the types of things that are assumed to exist in a domain of interest D from the perspective of a person who uses a language L for the purpose of talking about D. The types in the ontology represent the predicates, word senses, or concept and relation types of the language L when used to discuss topics in the domain D. An uninterpreted logic, such as predicate calculus, conceptual graphs, or KIF, is ontologically neutral: it imposes no constraints on the subject matter or the way the subject may be characterized. By itself, logic says nothing about anything, but the combination of logic with an ontology provides a language that can express relationships about the entities in the domain of interest.

1. An informal ontology may be specified by a catalog of types that are either undefined or defined only by statements in a natural language.

2. A formal ontology is specified by a collection of names for concept and relation types organized in a partial ordering by the type-subtype relation. Formal ontologies are further distinguished by the way the subtypes are distinguished from their supertypes: an axiomatized ontology distinguishes subtypes by axioms and definitions stated in a formal language, such as logic or some computer-oriented notation that can be translated to logic; a prototype-based ontology distinguishes subtypes by comparison with a typical member or prototype for each subtype. Large ontologies often use a mixture of definitional methods: formal axioms and definitions are used for the terms in mathematics, physics, and engineering, while prototypes are used for plants, animals, and common household items.
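The type-subtype partial ordering at the heart of a formal ontology can be made concrete with a short sketch. The type names below are invented for illustration, not taken from any standard ontology:

```python
# Minimal sketch of a formal ontology's type-subtype partial ordering.
# Each type maps to its immediate supertype (None for a root type).
SUPERTYPES = {
    "Car": "Vehicle",
    "Vehicle": "Entity",
    "Animal": "Entity",
}

def is_subtype(t, candidate_supertype):
    """Walk the supertype chain; every type is a subtype of itself."""
    while t is not None:
        if t == candidate_supertype:
            return True
        t = SUPERTYPES.get(t)
    return False
```

Because the ordering is partial, some pairs of types (e.g. "Car" and "Animal") are simply incomparable; neither is a subtype of the other.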

What is an ontology(Conceptualization)?
An ontology is a specification of a conceptualization. A conceptualization is a system of categories accounting for a particular view of the world. Gruber defines a conceptualization as a structure <D, R>, where D is a domain and R is a set of relations on D (each relation being a subset of D^n). Guarino instead gives an intensional account of conceptualization.
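Gruber's structure <D, R> can be illustrated directly: a domain D together with relations over D. The toy domain and relation below are invented purely for illustration:

```python
# A conceptualization <D, R> in Gruber's sense: a domain D and
# relations over D. Toy data invented for illustration.
D = {"alice", "bob", "carol"}

# A binary relation, a subset of D x D, here read as "knows".
knows = {("alice", "bob"), ("bob", "carol")}

def is_relation_over(relation, domain):
    """Check R is a subset of D^n: every element of every tuple is in D."""
    return all(all(x in domain for x in tup) for tup in relation)
```

The check captures the condition R ⊆ D^n stated in the text: a tuple mentioning anything outside D disqualifies the relation.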


An ontology (in artificial intelligence) is based on a vocabulary L. An ontological commitment relates the vocabulary to a conceptualization: a language L commits to a conceptualization C by means of an ontological commitment K. How does a term t relate to the domain?


What is a biological ontology?


Biological ontologies are controlled, structured vocabularies that account for the shared conceptualization of scientific communities and are intended for database interoperability. Recently, such ontologies have been built out of terms which are intended to refer exclusively to universals. For an information system, an ontology is a representation of some pre-existing domain of reality which:

1) reflects the properties of the objects within its domain in such a way that there obtains a systematic correlation between reality and the representation itself;
2) is intelligible to a domain expert;
3) is formalized in a way that allows it to support automatic information processing.

On this view, there is no (explicit) conceptualization.

Components of Ontology
Common components of ontologies include:

Individuals: instances or objects (the basic or "ground level" objects)


Classes: sets, collections, concepts, types of objects, or kinds of things [1]

Attributes: aspects, properties, features, characteristics, or parameters that objects (and classes) can have

Relations: ways in which classes and individuals can be related to one another

Function terms: complex structures formed from certain relations that can be used in place of an individual term in a statement

Restrictions: formally stated descriptions of what must be true in order for some assertion to be accepted as input

Rules: statements in the form of an if-then (antecedent-consequent) sentence that describe the logical inferences that can be drawn from an assertion in a particular form

Axioms: assertions (including rules) in a logical form that together comprise the overall theory that the ontology describes in its domain of application. This definition differs from that of "axioms" in generative grammar and formal logic: in those disciplines, axioms include only statements asserted as a priori knowledge, whereas here "axioms" also include the theory derived from axiomatic statements.

Events: the changing of attributes or relations
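The core components (classes, individuals, attributes, relations) can be sketched as a small data model. The structures and the example data below are invented for illustration; real ontology toolkits use richer representations:

```python
# Sketch of common ontology components as plain Python data structures.
from dataclasses import dataclass, field

@dataclass
class OntologyClass:
    name: str
    superclasses: list = field(default_factory=list)

@dataclass
class Individual:
    name: str
    member_of: list = field(default_factory=list)   # classes it instantiates
    attributes: dict = field(default_factory=dict)  # attribute name -> value

@dataclass
class Relation:
    rel_type: str   # e.g. "is defined as a successor of"
    subject: str
    obj: str

# Illustrative instances.
person = OntologyClass("Person")
alice = Individual("alice", member_of=[person], attributes={"age": 30})
rel = Relation("knows", "alice", "bob")
```

Each component named in the list above maps to one construct here; restrictions, rules, and axioms would sit on top of these as logical constraints.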

Individuals
Individuals (instances) are the basic, "ground level" components of an ontology. The individuals in an ontology may include concrete objects such as people, animals, tables, automobiles, molecules, and planets, as well as abstract individuals such as numbers and words (although there are differences of opinion as to whether numbers and words are classes or individuals). Strictly speaking, an ontology need not include any individuals, but one of the general purposes of an ontology is to provide a means of classifying individuals, even if those individuals are not explicitly part of the ontology. In formal extensional ontologies, only the utterances of words and numbers are considered individuals; the numbers and names themselves are classes. In a 4D ontology, an individual is identified by its spatio-temporal extent. Examples of formal extensional ontologies are ISO 15926 and the model in development by the IDEAS Group.

Classes
Classes (also called concepts, types, sorts, categories, or kinds) can be defined by extension or by intension. According to an extensional definition, they are abstract groups, sets, or collections of objects. According to an intensional definition, they are abstract objects that are defined by values of aspects that are constraints for being a member of the class. The first definition of class results in ontologies in which a class is a subclass of collection. The second definition results in ontologies in which collections and classes are more fundamentally different. Classes may classify individuals, other classes, or a combination of both. Some examples of classes:

Person, the class of all people, or the abstract object that can be described by the criteria for being a person.

Vehicle, the class of all vehicles, or the abstract object that can be described by the criteria for being a vehicle.

Car, the class of all cars, or the abstract object that can be described by the criteria for being a car.

Class, representing the class of all classes, or the abstract object that can be described by the criteria for being a class.

Thing, representing the class of all things, or the abstract object that can be described by the criteria for being a thing (and not nothing).

Ontologies vary on whether classes can contain other classes, whether a class can belong to itself, whether there is a universal class (that is, a class containing everything), and so on. Sometimes restrictions along these lines are made in order to avoid certain well-known paradoxes, such as Russell's paradox.

The classes of an ontology may be extensional or intensional in nature. A class is extensional if and only if it is characterized solely by its membership. More precisely, a class C is extensional if and only if, for any class C', if C' has exactly the same members as C, then C and C' are identical. If a class does not satisfy this condition, then it is intensional. While extensional classes are better behaved and better understood mathematically, as well as less problematic philosophically, they do not permit the fine-grained distinctions that ontologies often need to make. For example, an ontology may want to distinguish between the class of all creatures with a kidney and the class of all creatures with a heart, even if these classes happen to have exactly the same members.

In most upper ontologies, the classes are defined intensionally. Intensionally defined classes usually have necessary conditions associated with membership in each class. Some classes may also have sufficient conditions, and in those cases the combination of necessary and sufficient conditions makes that class a fully defined class.

Importantly, a class can subsume or be subsumed by other classes; a class subsumed by another is called a subclass (or subtype) of the subsuming class (or supertype). For example, Vehicle subsumes Car, since (necessarily) anything that is a member of the latter class is a member of the former.
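Both the extensionality criterion and extensional subsumption reduce to set operations, which a short sketch can show. The class memberships below are invented toy data:

```python
# Extensional view: a class is identified solely by its members.
creatures_with_kidney = frozenset({"dog", "cat", "human"})
creatures_with_heart = frozenset({"dog", "cat", "human"})

# Extensionally the two classes are identical, even though intensionally
# "has a kidney" and "has a heart" are different criteria.
extensionally_identical = creatures_with_kidney == creatures_with_heart

# Subsumption under the extensional view: C subsumes C' iff C' is a
# subset of C.
vehicles = frozenset({"car", "truck", "bicycle"})
cars = frozenset({"car"})

def subsumes(superclass, subclass):
    return subclass <= superclass
```

This is exactly the fine-grainedness problem the text describes: an extensional treatment cannot tell the kidney class from the heart class, whereas an intensional treatment keeps their defining conditions distinct.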

Attributes
Objects in an ontology can be described by relating them to other things, typically aspects or parts. These related things are often called attributes, although they may be independent things. Each attribute can be a class or an individual. The kind of object and the kind of attribute determine the kind of relation between them. A relation between an object and an attribute expresses a fact that is specific to the object to which it is related. For example, the Ford Explorer object has attributes such as:

<has as name> Ford Explorer
<has by definition as part> door (with a minimum and maximum cardinality of 4)
<has by definition as part one of> {4.0L engine, 4.6L engine}
<has by definition as part> 6-speed transmission

The value of an attribute can be a complex data type; in this example, the related engine can only be one of a list of subtypes of engines, not just a single thing. Ontologies are only true ontologies if concepts are related to other concepts (i.e., the concepts have attributes). If that is not the case, then you would have either a taxonomy (if hyponym relationships exist between concepts) or a controlled vocabulary. These are useful, but are not considered true ontologies.
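The Ford Explorer attribute example above can be sketched with its cardinality and one-of constraints made executable. The dictionary keys are paraphrases of the text's relation phrases, chosen here for illustration:

```python
# Sketch: an object with attributes, including a cardinality constraint
# on the 'door' part (minimum and maximum 4, per the text's example) and
# a one-of constraint on the engine.
ford_explorer = {
    "name": "Ford Explorer",
    "doors": ["door"] * 4,
    "engine": "4.0L engine",
    "transmission": "6-speed transmission",
}

ALLOWED_ENGINES = {"4.0L engine", "4.6L engine"}

def satisfies_constraints(obj):
    """Check the door cardinality and the engine one-of restriction."""
    return len(obj["doors"]) == 4 and obj["engine"] in ALLOWED_ENGINES
```

An object with, say, a hypothetical "V12 engine" value would fail the check, mirroring how a restriction in an ontology rejects assertions that violate it.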


Relationships
Relationships (also known as relations) between objects in an ontology specify how objects are related to other objects. Typically, a relation is of a particular type (or class) that specifies in what sense the object is related to the other object in the ontology. For example, in an ontology that contains the concepts Ford Explorer and Ford Bronco, the two concepts might be related by a relation of type <is defined as a successor of>. The full expression of that fact then becomes:

Ford Explorer is defined as a successor of : Ford Bronco

This tells us that the Explorer is the model that replaced the Bronco. This example also illustrates that the relation has a direction of expression. The inverse expression expresses the same fact, but with a reverse phrase in natural language. Much of the power of ontologies comes from the ability to describe relations. Together, the set of relations describes the semantics of the domain. The set of used relation types (classes of relations) and their subsumption hierarchy describe the expression power of the language in which the ontology is expressed.

An important type of relation is the subsumption relation (is-a-superclass-of, the converse of is-a, is-a-subtype-of, or is-a-subclass-of). This defines which objects are classified by which class. For example, the class Ford Explorer is-a-subclass-of 4-Wheel Drive Car, which in turn is-a-subclass-of Car. The addition of is-a-subclass-of relationships creates a taxonomy: a tree-like structure (or, more generally, a partially ordered set) that clearly depicts how objects relate to one another. In such a structure, each object is the 'child' of a 'parent class' (some languages restrict the is-a-subclass-of relationship to one parent for all nodes, but many do not).

Another common type of relation is the mereology relation, written as part-of, which represents how objects combine to form composite objects. For example, if we extended our example ontology to include concepts like Steering Wheel, we would say that a "Steering Wheel is-by-definition-a-part-of-a Ford Explorer", since a steering wheel is always one of the components of a Ford Explorer. If we introduce meronymy relationships into our ontology, the hierarchy that emerges can no longer be held in a simple tree-like structure, since members can now appear under more than one parent or branch. Instead, this new structure is known as a directed acyclic graph.
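The shift from tree to directed acyclic graph can be made concrete: once part-of edges join the subclass edges, a node can acquire more than one parent. The edge data below reuses the text's vehicle examples, with the second part-of edge invented for illustration:

```python
# Parent edges combining is-a-subclass-of and part-of relations.
# With only subclass edges each node has one parent (a tree); the
# part-of edges give Steering Wheel two parents, so the structure
# becomes a directed acyclic graph.
edges = {
    "Ford Explorer": ["4-Wheel Drive Car"],       # subclass-of
    "4-Wheel Drive Car": ["Car"],                 # subclass-of
    "Steering Wheel": ["Ford Explorer", "Car"],   # part-of (two parents)
}

def parents(node):
    return edges.get(node, [])

def is_tree(edge_map):
    """A tree allows at most one parent per node."""
    return all(len(p) <= 1 for p in edge_map.values())
```

Removing the Steering Wheel entry restores the single-parent property, which is why a pure subclass taxonomy can always be drawn as a tree.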

As well as the standard is-a-subclass-of and is-by-definition-a-part-of-a relations, ontologies often include additional types of relations that further refine the semantics they model. Ontologies might distinguish between different categories of relation types. For example:

relation types for relations between classes
relation types for relations between individuals
relation types for relations between an individual and a class
relation types for relations between a single object and a collection
relation types for relations between collections

Classification: Traditional ontology languages


Common Logic (and its dialects)
CycL
DOGMA (Developing Ontology-Grounded Methods and Applications)
F-Logic (Frame Logic)
KIF (Knowledge Interchange Format)
  Ontolingua, based on KIF
KL-ONE
KM programming language
LOOM
OCML (Operational Conceptual Modelling Language)
OKBC (Open Knowledge Base Connectivity)
PLIB (Parts LIBrary)
RACER

By syntax

Markup ontology languages
These languages use a markup scheme to encode knowledge, most commonly with XML.

DAML+OIL
Ontology Inference Layer (OIL)
Web Ontology Language (OWL)
Resource Description Framework (RDF)
RDF Schema
SHOE

By structure

Frame-based
F-Logic, OKBC, and KM are completely or partially frame-based languages.

Description logic-based
Description logic provides an extension of frame languages, without going so far as to take the leap to first-order logic and support for arbitrary predicates. Examples include KL-ONE, RACER, and OWL.

Gellish is an example of a combined ontology language and ontology that is description logic based. Among other semantic differences, it distinguishes between:

relation types for relations between concepts (classes)
relation types for relations between individuals
relation types for relations between individuals and classes

It also contains constructs to express queries and communicative intent.

First-order logic-based
Common Logic, CycL and KIF are examples of languages that support expressions in first-order logic and, in particular, allow general predicates.

Principles for the Design of Ontologies Used for Knowledge Sharing


Several technical problems stand in the way of shared, reusable knowledge-based software. Like conventional applications, knowledge-based systems are based on heterogeneous hardware platforms, programming languages, and network protocols. However, knowledge-based systems pose special requirements for interoperability. Such systems operate on and communicate using statements in a formal knowledge representation. They ask queries and give answers. They take background knowledge as an input. And as agents in a distributed AI environment, they negotiate and exchange knowledge. For such knowledge-level communication, we need conventions at three levels: representation language format, agent communication protocol, and specification of the content of shared knowledge.

Design criteria for ontologies


Formal ontologies are designed. When we choose how to represent something in an ontology, we are making design decisions. To guide and evaluate our designs, we need objective criteria that are founded on the purpose of the resulting artifact, rather than based on a priori notions of naturalness or Truth. Here we propose a preliminary set of design criteria for ontologies whose purpose is knowledge sharing and interoperation among programs based on a shared conceptualization.

1. Clarity: An ontology should effectively communicate the intended meaning of defined terms. Definitions should be objective. While the motivation for defining a concept might arise from social situations or computational requirements, the definition should be independent of social or computational context. Formalism is a means to this end. When a definition can be stated in logical axioms, it should be. Where possible, a complete definition (a predicate defined by necessary and sufficient conditions) is preferred over a partial definition (defined by only necessary or sufficient conditions). All definitions should be documented with natural language.

2. Coherence: An ontology should be coherent: that is, it should sanction inferences that are consistent with the definitions. At the least, the defining axioms should be logically consistent. Coherence should also apply to the concepts that are defined informally, such as those described in natural language documentation and examples. If a sentence that can be inferred from the axioms contradicts a definition or example given informally, then the ontology is incoherent.

3. Extendibility: An ontology should be designed to anticipate the uses of the shared vocabulary. It should offer a conceptual foundation for a range of anticipated tasks, and the representation should be crafted so that one can extend and specialize the ontology monotonically. In other words, one should be able to define new terms for special uses based on the existing vocabulary, in a way that does not require the revision of the existing definitions.

4. Minimal encoding bias: The conceptualization should be specified at the knowledge level without depending on a particular symbol-level encoding. An encoding bias results when representation choices are made purely for the convenience of notation or implementation. Encoding bias should be minimized, because knowledge-sharing agents may be implemented in different representation systems and styles of representation.

5. Minimal ontological commitment: An ontology should require the minimal ontological commitment sufficient to support the intended knowledge-sharing activities. An ontology should make as few claims as possible about the world being modeled, allowing the parties committed to the ontology freedom to specialize and instantiate the ontology as needed. Since ontological commitment is based on consistent use of vocabulary, ontological commitment can be minimized by specifying the weakest theory (allowing the most models) and defining only those terms that are essential to the communication of knowledge consistent with that theory.
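The extendibility criterion, monotonic extension without revising existing definitions, lends itself to a small sketch. The ontology entries below are invented toy data:

```python
# Monotonic extension: a new, more specialized term is defined on top of
# the existing vocabulary; nothing already defined is revised.
ontology = {
    "Vehicle": {"superclass": None},
    "Car": {"superclass": "Vehicle"},
}

def extend(onto, new_term, superclass):
    """Return a new ontology with the term added; existing entries are
    neither modified nor redefined (that would be non-monotonic)."""
    if new_term in onto:
        raise ValueError("extension must not redefine an existing term")
    extended = dict(onto)
    extended[new_term] = {"superclass": superclass}
    return extended

v2 = extend(ontology, "ElectricCar", "Car")
```

Everything provable from the original ontology (e.g. that Car is a subclass of Vehicle) remains provable in the extension, which is what "extend and specialize monotonically" asks for.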

Ontologies Advantages
The advantage of an ontology is that it represents real-world information in a manner that is machine-processable. This leads to a variety of interesting applications for the benefit of the target user groups. For example, using an ontology of university courses such as the one illustrated in the Ontology Example section, we can ask the following questions for information discovery purposes:

Give me all courses of the computer science department
Give me all courses in year 4
Give me all courses of a specific subject area
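Once each course carries the relevant attributes, the three queries above can be answered mechanically. The field names and course records below are invented purely for illustration:

```python
# Toy course records for illustrating the three discovery queries.
courses = [
    {"code": "CS101", "department": "computer science",
     "year": 1, "subject": "programming"},
    {"code": "CS402", "department": "computer science",
     "year": 4, "subject": "artificial intelligence"},
    {"code": "BI205", "department": "biology",
     "year": 2, "subject": "genetics"},
]

def by_department(dept):
    return [c["code"] for c in courses if c["department"] == dept]

def by_year(year):
    return [c["code"] for c in courses if c["year"] == year]

def by_subject(subject):
    return [c["code"] for c in courses if c["subject"] == subject]
```

Each function corresponds to one of the listed questions; a real ontology-backed system would answer them by traversing class and attribute definitions rather than filtering flat records, but the user-visible result is the same.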

The above queries are machine-implementable and yield user-friendly results. The reason ontologies are becoming popular is largely due to what they promise: a shared and common understanding of a domain that can be communicated between people and application systems. Specifically, ontologies offer the following benefits:

They assist in communication between humans. Here, an unambiguous but informal ontology may be sufficient.

They achieve interoperability among computer systems by translating between different modelling methods, paradigms, languages and software tools. Here, the ontology is used as an interchange format.

They improve the process and/or quality of engineering software systems.

With respect to computer-based modelling, ontologies have the following advantages:



Re-usability: the ontology is the basis for a formal encoding of the important entities, attributes, processes and their inter-relationships in the domain of interest. This formal representation may be a reusable and/or shared component in a software system.

Search: an ontology may be used as metadata, serving as an index into a repository of information.

Knowledge acquisition: using an existing ontology as the starting point and basis for guiding knowledge acquisition when building knowledge-based systems may increase speed and reliability.

The development of an ontology is an interdisciplinary research process that involves computer scientists as well as experts for the specific area that is addressed. Once an ontology has been created it may be used for information discovery in the following ways:

To pose queries for either metadata, concepts or properties of the ontology
As a conceptual framework to help the user think about the information repository and formulate queries
As a guide to understand the ontology-driven metadata
To drive the user interface for creating and refining queries

Ontology Example
The following is an example of the use of attributes and entities for the representation of the real world. Consider the address of a person: is it an entity, a relationship, or an attribute?

Consider address in a telephone company database, which has to keep track of how many and what type of phones are available in any one household, who lives there (there may be several phone bills going to the same address), etc. For this case, address is probably best treated as an entity.

Or, consider an employee database, where for each employee you maintain personal information, such as her address. Here, address is best represented as an attribute.

Or, consider a police database where we want to keep track of a person's whereabouts, including her address (i.e., address from Date1 to Date2, address from Date2 to Date3, etc.). Here, address is best treated as a relationship.
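The three modelling choices for address can be sketched side by side. All field names and data below are invented for illustration:

```python
# Address as an attribute: a plain field on the employee record.
employee = {"name": "Jane", "address": "12 Elm St"}

# Address as an entity: its own record with its own properties,
# referenced by id from a household record.
addresses = {1: {"street": "12 Elm St", "phone_lines": 2}}
household = {"address_id": 1, "residents": ["Jane", "John"]}

# Address as a relationship: a person linked to an address over a
# time period (whereabouts from Date1 to Date2, etc.).
residencies = [
    ("Jane", "12 Elm St", "2019-01", "2021-06"),
    ("Jane", "34 Oak Ave", "2021-06", None),  # None: current address
]
```

The underlying information is similar in all three cases; what differs is which questions each shape answers cheaply, which is exactly the design decision the example illustrates.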

Ontologies are used in our everyday lives, although most of us do not realize it. For example, a university course catalogue classifies each course by:

The department name
The year of academic studies in which the course is taught
The subject area
The course number



Figure: Ontology for cancer diagnosis

Use of ontologies
In AI, knowledge in computer systems is thought of as something that is explicitly represented and operated on by inference processes. However, that is an overly narrow view. All information systems traffic in knowledge. Any software that does anything useful cannot be written without a commitment to a model of the relevant world: to entities, properties, and relations in that world. Data structures and procedures implicitly or explicitly make commitments to a domain ontology. It is common to ask whether a payroll system knows about the new tax law, or whether a database system knows about employee salaries.

Information-retrieval systems, digital libraries, integration of heterogeneous information sources, and Internet search engines need domain ontologies to organize information and direct the search processes. For example, a search engine has categories and subcategories that help organize the search. The search-engine community commonly refers to these categories and subcategories as ontologies.

Object-oriented design of software systems similarly depends on an appropriate domain ontology. Objects, their attributes, and their procedures more or less mirror aspects of the domain that are relevant to the application. Object systems representing a useful analysis of a domain can often be reused for a different application program. Object systems and ontologies emphasize different aspects, but we anticipate that over time convergence between these technologies will increase. As information systems model large knowledge domains, domain ontologies will become as important in general software systems as in many areas of AI.

In AI, while knowledge representation pervades the entire field, two application areas in particular have depended on a rich body of knowledge. One of them is natural-language understanding. Ontologies are useful in NLU in two ways. First, domain knowledge often plays a crucial role in disambiguation.
A well-designed domain ontology provides the basis for domain knowledge representation. In addition, the ontology of a domain helps identify the semantic categories that are involved in understanding discourse in that domain. For this use, the ontology plays the role of a concept dictionary. In general, for NLU, we need goals of similar types. These reasoning strategies were also characterized by their need for specific types of domain factual knowledge. It soon became clear that strategic knowledge could be abstracted and reused. With few exceptions, the domain factual knowledge dimension drives the focus of most of the AI investigations on ontologies. This is because applications to language understanding motivate much of the work on ontologies. Even CYC, which was originally motivated by the need for knowledge systems to have world knowledge, has been tested more in natural-language than in knowledge-systems applications.

Conclusion
Ontology is a branch of philosophy concerned with the study of what exists. Formal ontologies have been proposed since the 18th century, including recent ones such as those by Carnap (1968) and Bunge (1977). From a computational perspective, a major benefit of such formalizations has been the development of algorithms that support the generation of inferences from a given set of facts about the world, or that check for consistency. Such computational aids are clearly useful for knowledge management, especially when one is dealing with large amounts of knowledge.

Various methods have been devised to support knowledge organization and interchange. Controlled vocabularies provide a standardized dictionary of terms for use during, for example, indexing or retrieval. Dictionaries can be organized according to specific relations to form taxonomies. Ontologies further specify the semantics of a domain in terms of conceptual relationships and logical theories. Ontologies may be constructed for different purposes, for example to enable information sharing or to support specification. When we want to enable sharing and reuse, we define an ontology as a specification used for making ontological commitments (Gruber, 1993). Ontological commitment is an agreement to consistently use a vocabulary with respect to a theory specified by an ontology. In order to support a specification, we define an ontology as a conceptualization, i.e., the ontology defines entities and the relationships among them. Every information base is based on either an implicit or an explicit conceptualization.

Research within artificial intelligence has formalized many interesting ontologies and has developed techniques for analyzing knowledge that has been represented in terms of these. Along a very different path, Wand (1989; 1990) studied the adequacy of information systems to describe applications based on a general ontology, such as that proposed by Bunge (1977).

References
1. D.B. Lenat and R.V. Guha, Building Large Knowledge-Based Systems: Representation and Inference in the CYC Project, Addison-Wesley, Reading, Mass., 1990.
2. B. Chandrasekaran, "AI, Knowledge, and the Quest for Smart Systems," IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 2–6.
3. J. McCarthy and P.J. Hayes, "Some Philosophical Problems from the Standpoint of Artificial Intelligence," Machine Intelligence, Vol. 4, B. Meltzer and D. Michie, eds., Edinburgh University Press, Edinburgh, 1969, pp. 463–502.
4. D. Marr, Vision: A Computational Investigation into the Human Representation and Processing of Visual Information, W.H. Freeman, San Francisco, 1982.
5. A. Newell, "The Knowledge Level," Artificial Intelligence, Vol. 18, 1982, pp. 87–127.
