Applied Artificial Intelligence

Topics to Cover
* Advanced Knowledge Representation Techniques: conceptual dependency theory, script structure, CYC theory, case grammars, semantic web.
* Natural Language Processing: sentence analysis phases, grammars and parsers, types of parsers, semantic analysis, universal networking language, dictionary.

Conceptual Dependency (CD)
* CD theory was developed by Schank between 1973 and 1975 to represent the meaning of natural-language sentences.
  - It helps in drawing inferences.
  - It is independent of the language.
* The CD representation of a sentence is built not from the words of the sentence but from conceptual primitives that give the intended meanings of the words.
* CD provides structures and a specific set of primitives from which representations can be built.

Primitive Acts of CD theory
* ATRANS - Transfer of an abstract relationship (e.g. give)
* PTRANS - Transfer of the physical location of an object (e.g. go)
* PROPEL - Application of physical force to an object (e.g. push)
* MOVE - Movement of a body part by its owner (e.g. kick)
* GRASP - Grasping of an object by an actor (e.g. throw)
* INGEST - Ingesting of an object by an animal (e.g. eat)
* EXPEL - Expulsion of something from the body of an animal (e.g. cry)
* MTRANS - Transfer of mental information (e.g. tell)
* MBUILD - Building new information out of old (e.g. decide)
* SPEAK - Producing of sounds (e.g. say)
* ATTEND - Focusing of a sense organ toward a stimulus (e.g. listen)

Conceptual categories
There are four conceptual categories:
* ACT - Actions {one of the CD primitives}
* PP - Objects {picture producers}
* AA - Modifiers of actions {action aiders}
* PA - Modifiers of PPs {picture aiders}

Example
* "I gave a book to the man." Its CD representation is as follows:

    I <=>(p) ATRANS <--O-- book --R--> (to) man, (from) I

* Note that this representation is the same for different ways of saying the same thing, for example:
  - I gave the man a book.
  - The man got a book from me.
  - The book was given to the man by me.

Few conventions
* Arrows indicate the direction of dependency.
* A double arrow indicates the two-way link between actor and action.
* O - the object case relation
* R - the recipient case relation
* P - past tense
* D - destination
* The use of tense and mood in describing events is extremely important, and Schank introduced the following modifiers:
  - p: past
  - f: future
  - t: transition
  - ts: start transition
  - tf: finished transition
  - k: continuing
  - ?: interrogative
  - /: negative
  - nil: present
  - delta: timeless
  - c: conditional
* The absence of any modifier implies the present tense.
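As an informal illustration (not part of Schank's notation), the ATRANS example above can be sketched as a small data structure; the slot names below are chosen for readability and are not standard CD terminology.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Conceptualization:
    """One CD conceptualization: an actor performing a primitive ACT."""
    actor: str                       # PP filling the actor slot
    act: str                         # one of the eleven primitive ACTs
    obj: str                         # O: object case relation
    source: Optional[str] = None     # R: the "from" side of the recipient relation
    recipient: Optional[str] = None  # R: the "to" side of the recipient relation
    tense: str = "p"                 # modifier: p = past, f = future, nil = present

# "I gave a book to the man" and "The man got a book from me"
# map to the same conceptualization, which is the point of CD:
s1 = Conceptualization(actor="I", act="ATRANS", obj="book", source="I", recipient="man")
s2 = Conceptualization(actor="I", act="ATRANS", obj="book", source="I", recipient="man")
assert s1 == s2
```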
Rule 1: PP <=> ACT
* It describes the relationship between an actor and the event he or she causes.
  - This is a two-way dependency, since neither actor nor event can be considered primary.
  - The letter p on the dependency link indicates past tense.
* Example: John ran
* CD representation: John <=>(p) PTRANS

Rule 2: ACT <-- PP
* It describes the relationship between an ACT and the PP (object) of the ACT.
  - The direction of the arrow is toward the ACT, since the context of the specific ACT determines the meaning of the object relation.
* Example: John pushed the bike
* CD representation: John <=>(p) PROPEL <--O-- bike

Rule 3: PP <-> PP
* It describes the relationship between two PPs, one of which belongs to the set defined by the other.
* Example: John is a doctor
* CD representation: John <-> doctor

Rule 4: PP --> PP
* It describes the relationship between two PPs, one of which provides a particular kind of information about the other.
  - The three most common types of information provided in this way are possession (shown as POSS-BY), location (shown as LOC), and physical containment (shown as CONT).
  - The direction of the arrow is again toward the concept being described.
* Example: John's dog
* CD representation: dog --POSS-BY--> John

Rule 5: PP <-> PA
* It describes the relationship between a PP and a PA that is asserted to describe it.
  - A PA represents a state of the PP, such as height or health.
* Example: John is fat
* CD representation: John <-> weight (> 80)

Rule 6: PP <-- PA
* It describes the relationship between a PP and an attribute that has already been predicated of it.
  - The direction is toward the PP being described.
* Example: Smart John
* CD representation: John <-- smart

Rule 7: ACT --R--> PP (to) / PP (from)
* It describes the relationship between an ACT and the source and the recipient of the ACT.
* Example: John took the book from Mary
* CD representation: John <=>(p) ATRANS <--O-- book --R--> (to) John, (from) Mary

Rule 8: PP --> PA (change of state)
* It describes the relationship that describes a change in state.
* Example: Tree grows
* CD representation: Tree --> size > C (the tree's size changes to a greater value)

Rule 9: {x} --> {y} (causation)
* It describes the relationship between one conceptualization and another that it causes.
  - Here {x} causes {y}, i.e. if x then y.
* Example: Bill shot Bob
  - {x}: Bill shot Bob
  - {y}: Bob's health is poor

Rule 10: {x} <--> {y} (simultaneity)
* It describes the relationship between one conceptualization and another that is happening at the time of the first.
  - Here {y} is happening while {x} is in progress.
* Example: While going home I saw a snake
  - {x}: I am going home
  - {y}: I saw a snake

Generation of CD representations
* Jenny cried: Jenny <=>(p) EXPEL <--O-- tears (POSS-BY Jenny)
* Further example sentences: Mike went to India; Mary read a novel.

Primitive states
* Primitive states are used to describe state descriptions such as height, health, mental state, and physical state.
* There are many more primitive states than primitive actions. They use a numeric scale.
* Examples:
  - John height(+10): John is the tallest
  - John height(< average): John is short
  - Frank Zappa health(-10): Frank Zappa is dead
  - Dave mental_state(-10): Dave is sad
  - Vase physical_state(-10): The vase is broken

Problems with CD representation
* It is difficult to reconstruct the original sentence from its CD representation.
* CD representation can be used as a general model for knowledge representation, because the theory is based on the representation of events together with all the information related to those events.
* Rules must be carefully designed for each primitive action in order to obtain a semantically correct interpretation.

Advantages of CD
* Using these primitives involves fewer inference rules.
* Many inference rules are already represented in the CD structure itself.
* The holes in the initial structure help to focus on the points still to be established.

Disadvantages of CD
* Knowledge must be decomposed into fairly low-level primitives.
* It is impossible or difficult to find the correct set of primitives.
* A lot of inference may still be required.
* Representations can be complex even for relatively simple actions. Consider: "Dave bet Frank five pounds that Wales would win the Rugby World Cup."
* Complex representations require a lot of storage.

Applications of CD
* MARGIE (Meaning Analysis, Response Generation and Inference on English): a model of natural language understanding.
* SAM (Script Applier Mechanism): uses scripts to understand stories.
* PAM (Plan Applier Mechanism): uses plans to understand stories.
Script Structure
* A script is a structured representation describing a stereotyped sequence of events in a particular context.
* Scripts are used in natural language understanding systems to organize a knowledge base in terms of the situations that the system should understand.
* Scripts use a frame-like structure to represent commonly occurring experiences such as going to the movies, eating in a restaurant, shopping in a supermarket, or visiting an ophthalmologist.
* Thus, a script is a structure that prescribes a set of circumstances that could be expected to follow on from one another.
* Scripts are beneficial because:
  - Events tend to occur in known runs or patterns.
  - A causal relationship between events exists.
  - Entry conditions exist which allow an event to take place.
  - Prerequisites exist for events taking place.

Components of a script
* Entry conditions: basic conditions which must be fulfilled before the events in the script can occur.
* Results: conditions that will be true after the events in the script have occurred.
* Props: slots representing objects involved in the events.
* Roles: the actions that the individual participants perform.
* Track: a variation on the script; different tracks may share components of the same script.
* Scenes: the sequence of events that occur.

Example: script for going to the bank to withdraw money
In a script, the special symbols for the primitive actions (PTRANS, ATRANS, MTRANS, PROPEL, ATTEND, MOVE, ...) are used.
* SCRIPT: withdraw money
* TRACK: bank
* PROPS: money, counter, form, token
* ROLES: P = customer, E = employee, C = cashier
* ENTRY CONDITIONS: P has no or little money; the bank is open.
* RESULTS: P has more money.
* SCENE 1: Entering
  - P PTRANS P into the bank
  - P ATTEND eyes to counter
  - P MOVE P to counter
* SCENE 2: Filling the form
  - P MTRANS signal to E
  - E ATRANS form to P
  - P PROPEL pen for writing
  - P ATRANS form to E
* SCENE 3: Withdrawing money
  - P ATTEND eyes to counter
  - P PTRANS P to the queue at the counter
  - P ATRANS token to C
  - C ATRANS money to P
* SCENE 4: Exiting the bank
  - P PTRANS P out of the bank

Advantages of scripts
* Ability to predict events.
* A single coherent interpretation may be built up from a collection of observations.

Disadvantages of scripts
* Less general than frames.
* May not be suitable for representing all kinds of knowledge.
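As a rough sketch (the slot names are illustrative, not a standard notation), the bank-withdrawal script above could be stored as a frame-like structure, and a recognised scene then lets the system predict the events that should follow.

```python
# Hypothetical frame-like encoding of the "withdraw money" script described above.
bank_script = {
    "name": "withdraw money",
    "track": "bank",
    "props": ["money", "counter", "form", "token"],
    "roles": {"P": "customer", "E": "employee", "C": "cashier"},
    "entry_conditions": ["P has little or no money", "the bank is open"],
    "results": ["P has more money"],
    "scenes": [
        ("entering",     ["P PTRANS P into bank", "P ATTEND eyes to counter", "P MOVE P to counter"]),
        ("filling form", ["P MTRANS signal to E", "E ATRANS form to P", "P ATRANS form to E"]),
        ("withdrawing",  ["P PTRANS P to queue", "P ATRANS token to C", "C ATRANS money to P"]),
        ("exiting",      ["P PTRANS P out of bank"]),
    ],
}

def predict_remaining_scenes(script, current_scene):
    """Once a scene has been recognised, the script predicts what should follow."""
    names = [name for name, _ in script["scenes"]]
    return names[names.index(current_scene) + 1:]

print(predict_remaining_scenes(bank_script, "filling form"))  # ['withdrawing', 'exiting']
```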
CYC Theory
* Cyc is an AI project that attempts to assemble a comprehensive ontology and knowledge base of everyday common-sense knowledge.
* Its goal is to enable AI applications to perform human-like reasoning.
* The project was started by Cycorp, a Texas-based company.
* All the aforementioned features were incorporated in Cyc.
* Cyc has a huge knowledge base which it uses for reasoning. It contains:
  - 15,000 predicates
  - 300,000 concepts
  - 3,200,000 assertions
* All these predicates, concepts and assertions are arranged in numerous ontologies.

Uncertain results
* Query: "Who had the motive for the assassination of Rafik Hariri?"
* Since the case is still an unsolved political mystery, there is no way we can ever get the answer.
* In cases like these, Cyc returns the various viewpoints, quoting the sources from which it built its inferences.
* For the above query, it gives two viewpoints:
  - "USA and Israel", as quoted from an editorial in Al Jazeera
  - "Syria", as quoted from a news report on CNN
* It uses Google as the search engine in the background.
* It filters results according to the context of the query. For example, if we search for the assassination of Rafik Hariri, it omits results with a timestamp earlier than the assassination date.

Qualitative queries
* Query: "Was Bill Clinton a good President of the United States?"
* In cases like these, Cyc returns the results as pros and cons and leaves it to the user to draw a conclusion.

Queries with no answer
* Query: "At this instant of time, is Alice inhaling or exhaling?"
* The Cyc system is intelligent enough to figure out which queries can never be answered correctly.

Goal of Cyc
* The ultimate goal is to build enough common sense into the Cyc system that it can understand natural language.
* Once it understands natural language, all the system has to do is crawl through online material, learn new common-sense rules, and evolve.
* This two-step process of building common sense and using machine learning techniques to learn new things would make the Cyc system an effectively unlimited source of knowledge.

Limitations of Cyc
* There is no single ontology that works in all cases.
* Although Cyc is able to simulate common sense, it cannot distinguish between fact and fiction.
* In natural language processing, there is no way the Cyc system can figure out whether a particular word is used in its normal sense or in a sarcastic sense.
* Adding knowledge is a very tedious process.

Semantic Web
* The development of the Semantic Web is well underway, with the goal that it should be possible for machines to understand the information on the web rather than simply display it.
* The major obstacle to this goal is the fact that most information on the web is designed solely for human consumption. This information should be structured in a way that machines can understand and process.
* The concept of machine-understandable documents does not imply "Artificial Intelligence". It only indicates a machine's ability to solve well-defined problems by performing well-defined operations on well-defined data.
* The key technological threads currently employed in the development of the Semantic Web are eXtensible Markup Language (XML), Resource Description Framework (RDF), and DAML (DARPA Agent Markup Language).
* Most of the web's content today is designed for humans to read, not for computer programs to process meaningfully.
* Computers can:
  - parse web pages
  - perform routine processing (here a header, there a link, etc.)
* In general, however, they have no reliable way to understand and process the semantics.
* The Semantic Web will bring structure to the meaningful content of web pages, creating an environment where software agents roaming from page to page can carry out sophisticated tasks for users.
* The Semantic Web is not a separate web.

Knowledge Representation for the Semantic Web
* For the Semantic Web to function, computers should have access to structured collections of information and to sets of inference rules that can be used to conduct automated reasoning.

Technological threads for developing the Semantic Web: XML
* XML lets everyone create their own tags.
* These tags can be used by scripts in various ways to perform tasks, but the script writer has to know what the page writer uses each tag for.
* In short, XML allows you to add arbitrary structure to documents but says nothing about what the structures mean.
* It has no built-in mechanism to convey the meaning of the user's new tags to other users.
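A tiny illustration of that last point (the document and element names below are invented for this example): a program can pull values out of an XML document, but nothing in XML itself says what the tags mean.

```python
# A program can extract structure from XML, but XML carries no semantics of its own.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<person><name>Edd Dumbill</name><role>Managing Director</role></person>"
)
print(doc.find("name").text)   # "Edd Dumbill" is easy to extract...
print(doc.find("role").text)   # ...but nothing here defines what <role> actually means.
```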
RDF
* RDF is a scheme for defining information on the web. It provides the technology for expressing the meaning of terms and concepts in a form that computers can readily process.
* RDF encodes this information in sets of triples embedded in the XML page.
* A triple is a piece of information on the web about related things.
* Each triple is a combination of Subject, Verb and Object, similar to an elementary sentence.
* Subjects, verbs and objects are each identified by a URI, which enables anyone to define a new concept or verb simply by publishing a URI for it somewhere on the web.

RDF (contd.)
* These triples can be written using XML tags. For example, a document describing doc.xml#dumbill might carry triples such as:
  - Subject: doc.xml#dumbill, Verb: http://example.org/name, Object: "Edd Dumbill"
  - Subject: doc.xml#dumbill, Verb: http://example.org/role, Object: "Managing Director"
  - together with an rdf:type triple giving its class and a further triple giving the organization.
* An RDF document can make assertions that particular things (people, web pages or whatever) have properties (such as "is a sister of" or "is the author of") with values (another person, another web page, etc.).
* RDF uses a different URI for each specific concept. This solves the problem of the same tag name being used for different concepts, e.g. address tags in different XML pages.

Ontologies
* Ontologies are collections of statements, written in a language such as RDF, that define the relations between concepts and specify logical rules for reasoning about them.
* Computers, agents and services will understand the meaning of semantic data on a web page by following links to the specified ontologies.
* Ontologies can express a large number of relationships among entities (objects) by assigning properties to classes and allowing subclasses to inherit such properties.
* An ontology may express a rule such as: if CityCode implies StateCode and Address implies CityCode, then Address implies StateCode.
* Ontologies enhance the functioning of the Semantic Web:
  - They improve the accuracy of web searches.
  - They ease the development of programs that can tackle complicated queries.

[Figure: Evolution of the Semantic Web]
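To make the triple model concrete, here is a minimal sketch using the Python rdflib library, mirroring the example triples above; the namespace URI and property names are illustrative rather than taken from the source.

```python
# Building and serializing the example triples about doc.xml#dumbill with rdflib.
from rdflib import Graph, Literal, Namespace, RDF, URIRef

EX = Namespace("http://example.org/")                   # illustrative namespace
subject = URIRef("http://example.org/doc.xml#dumbill")  # the thing being described

g = Graph()
g.add((subject, RDF.type, EX.Person))                   # class membership (illustrative)
g.add((subject, EX.name, Literal("Edd Dumbill")))
g.add((subject, EX.role, Literal("Managing Director")))

# Turtle is one of several machine-readable serializations of the same triples.
print(g.serialize(format="turtle"))
```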
Case Grammars
* Case grammars use the functional relationships between noun phrases and verbs to uncover the deeper case structure of a sentence.
* In ordinary English, the difference between different surface forms of the same sentence is often quite small.
* In the early 1970s, Fillmore introduced the idea of the different cases of an English sentence.
* He extended Chomsky's transformational grammars by focusing more on the semantic aspects of a sentence.
* In case grammar, a sentence is defined as being composed of a proposition P and a modality constituent M, where M is composed of mood, tense, aspect, negation and so on. Thus a sentence can be represented as
  S -> M + P
  where P is the set of relationships among the verb and the noun phrases (the cases C1, C2, ..., Cn) and M is the modality constituent.
* For example, consider the sentence "Ram did not eat the apple."

  [Figure: case grammar tree. The modality node holds (Declaration, Negation, Past); the verb is "eat"; "Ram" and "apple" fill the cases C1 and C2.]

* The tree representation for a case grammar identifies the words by their modality and case. The cases may relate to the actions performed by agents and to the location and direction of actions. Cases may also be instrumental or objective. For example, in "Ram cuts the apple with a knife", knife is an instrumental case.
* In the tree above, the modality constituent carries the negation and the past tense, "eat" is the verb, and "Ram" and "apple" are the nouns filling the cases C1 and C2 respectively.
* Case frames are provided for verbs to identify the allowable cases. They give the relationships which are required and those which are optional.

Natural Language Processing (NLP)
* Natural Language Processing refers to the AI methods for communicating with an intelligent system using a natural language such as English.
* Processing of natural language is required when you want an intelligent system like a robot to act on your instructions, when you want to hear a decision from a dialogue-based clinical expert system, and so on.
* The field of NLP involves making computers perform useful tasks with the natural languages humans use. The input and output of an NLP system can be speech or written text.

Components of NLP
* There are two components of NLP:
* Natural Language Understanding (NLU). Understanding involves the following tasks:
  - Mapping the given input in natural language into useful representations.
  - Analysing different aspects of the language.
* Natural Language Generation (NLG). It is the process of producing meaningful phrases and sentences in the form of natural language from some internal representation. It involves:
  - Text planning: retrieving the relevant content from the knowledge base.
  - Sentence planning: choosing the required words, forming meaningful phrases, and setting the tone of the sentence.
  - Text realization: mapping the sentence plan into sentence structure.

NLP Terminology
* Phonology: the study of organizing sound systematically.
* Morphology: the study of the construction of words from primitive meaningful units.
* Syntax: arranging words to make a sentence; it also involves determining the structural role of words in the sentence and in phrases.
* Semantics: concerned with the meaning of words and how to combine words into meaningful phrases and sentences.
* Pragmatics: deals with using and understanding sentences in different situations and with how the interpretation of a sentence is affected.
* Discourse: deals with how the immediately preceding sentence can affect the interpretation of the next sentence.
* World knowledge: general knowledge about the world.

Sentence Analysis Phases
* Lexical analysis, syntactic analysis, semantic analysis, discourse integration, pragmatic analysis.
* Lexical Analysis: identifying and analyzing the structure of words. The lexicon of a language is the collection of words and phrases in that language. Lexical analysis divides the whole chunk of text into paragraphs, sentences, and words.
* Syntactic Analysis (Parsing): analysing the words in the sentence for grammar and arranging them in a manner that shows the relationships among the words. A sentence such as "The school goes to boy" is rejected by an English syntactic analyzer.
* Semantic Analysis: draws the exact, dictionary meaning from the text; the text is checked for meaningfulness. It is done by mapping syntactic structures onto objects in the task domain. The semantic analyzer rejects a sentence such as "hot ice-cream".
* Discourse Integration: the meaning of any sentence depends on the meaning of the sentence just before it; it also contributes to the meaning of the immediately succeeding sentence.
* Pragmatic Analysis: what was said is re-interpreted in terms of what it actually meant; it involves deriving those aspects of language which require real-world knowledge.
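As a small illustration of the first two phases, a toolkit such as NLTK can tokenize a sentence (lexical analysis) and assign part-of-speech tags as a first step toward syntactic analysis. This is only a sketch of the pipeline described above and assumes NLTK and its tokenizer and tagger models are installed.

```python
# Lexical analysis and the first step of syntactic analysis with NLTK
# (assumes nltk is installed and its tokenizer/tagger models have been downloaded).
import nltk

sentence = "The bird pecks the grains"
tokens = nltk.word_tokenize(sentence)   # lexical analysis: split text into words
tagged = nltk.pos_tag(tokens)           # label each word with its part of speech
print(tagged)                           # e.g. [('The', 'DT'), ('bird', 'NN'), ...]
```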
Grammars and Parsers

Context-Free Grammar
* A context-free grammar consists of rules with a single symbol on the left-hand side of each rewrite rule.
* Let us create a grammar to parse the sentence "The bird pecks the grains":
  - Articles (determiner, DET): a | an | the
  - Nouns (N): bird | birds | grain | grains
  - Noun phrase (NP): Article + Noun | Article + Adjective + Noun, i.e. NP -> DET N | DET ADJ N
  - Verbs (V): pecks | pecking | pecked
  - Verb phrase (VP): NP V | V NP
  - Adjectives (ADJ): beautiful | small | chirping
* The parse tree breaks the sentence down into structured parts so that the computer can easily understand and process it. For the parsing algorithm to construct this parse tree, a set of rewrite rules, which describe what tree structures are legal, needs to be constructed.
* These rules say that a certain symbol may be expanded in the tree by a sequence of other symbols. If there are two strings, a noun phrase (NP) and a verb phrase (VP), then the string formed by NP followed by VP is a sentence. The rewrite rules for the sentence are as follows:
  - S -> NP VP
  - NP -> DET N | DET ADJ N
  - VP -> V NP
* Lexicon:
  - DET -> a | the
  - ADJ -> beautiful | perching
  - N -> bird | birds | grain | grains
  - V -> peck | pecks | pecking
* The parse tree can then be created for the sentence.

  [Figure: parse tree for "The bird pecks the grains"]

Parsing Process
* Parsing is the term used to describe the process of automatically building the syntactic analysis of a sentence in terms of a given grammar and lexicon.
* The resulting syntactic analysis may be used as input to a process of semantic interpretation.
* Occasionally, parsing is also used to include both syntactic and semantic analysis.
* The parsing process is carried out by the parser.
* Parsing groups and labels the parts of a sentence in a way that displays their relationships to each other properly.
* The parser is a computer program which accepts a natural language sentence as input and generates an output structure suitable for analysis.

Types of Parsing
* Parsing techniques can be categorized into two types:
  - Top-down parsing
  - Bottom-up parsing

Top-down Parsing
* Top-down parsing starts with the start symbol and proceeds towards the goal: it constructs the parse tree starting at the root and proceeding towards the leaves.
* It is a strategy for analyzing unknown data relationships by hypothesizing general parse tree structures and then considering whether the known fundamental structures are compatible with the hypothesis.
* In top-down parsing, the words of the sentence are replaced by their categories, such as verb phrase (VP), noun phrase (NP), preposition phrase (PP), etc.
* Both a symbolic and a graphical representation can be used: we take the words of the sentence and work toward the complete sentence, using symbols such as PP, NP, VP, ART, N, V and so on.
* Examples of top-down parsers are LL (Left-to-right, Leftmost derivation) parsers and recursive descent parsers.

  [Figure: example of top-down parsing]

Bottom-up Parsing
* In this parsing technique the process begins with the sentence, and the words of the sentence are replaced by their relevant symbols.
* It is also called shift-reduce parsing.
* In bottom-up parsing, construction of the parse tree starts at the leaves and proceeds towards the root.
* Bottom-up parsing is a strategy for analyzing unknown data relationships that attempts to identify the most fundamental units first and then to infer higher-order structures from them.
* This process occurs in the analysis of both natural languages and computer languages.
* It is common for bottom-up parsers to take the form of general parsing engines that can either parse, or generate a parser for, a specific programming language given a specification of its grammar.

  [Figure: example of bottom-up parsing]
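The grammar above can be tried out directly. The sketch below uses NLTK's demonstration parsers (assuming NLTK is installed) to parse the example sentence with both a top-down and a bottom-up strategy.

```python
# Parsing "the bird pecks the grains" with the grammar defined above,
# once top-down and once bottom-up (assumes nltk is installed).
import nltk

grammar = nltk.CFG.fromstring("""
    S   -> NP VP
    NP  -> DET N | DET ADJ N
    VP  -> V NP
    DET -> 'a' | 'the'
    ADJ -> 'beautiful' | 'perching'
    N   -> 'bird' | 'birds' | 'grain' | 'grains'
    V   -> 'peck' | 'pecks' | 'pecking'
""")

tokens = "the bird pecks the grains".split()

# Top-down strategy: start from S and expand rules until the words are reached.
for tree in nltk.RecursiveDescentParser(grammar).parse(tokens):
    tree.pretty_print()

# Bottom-up strategy (shift-reduce): start from the words and reduce them to S.
for tree in nltk.ShiftReduceParser(grammar).parse(tokens):
    print(tree)
```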
Semantic Analysis
* Semantic analysis is the process of drawing meaning from text.
* It allows computers to understand and interpret sentences, paragraphs, or whole documents by analysing their grammatical structure and identifying relationships between individual words in a particular context.
* It is an essential sub-task of Natural Language Processing (NLP) and the driving force behind machine-learning tools like chatbots, search engines, and text analysis.
* Semantic-analysis-driven tools can help companies automatically extract meaningful information from unstructured data such as emails, support tickets, and customer feedback.

How Semantic Analysis Works
* Lexical semantics plays an important role in semantic analysis, allowing machines to understand relationships between lexical items (words, phrasal verbs, etc.):
  - Hyponyms: specific lexical items of a generic lexical item (the hypernym), e.g. orange is a hyponym of fruit (the hypernym).
  - Meronymy: a logical arrangement of text and words that denotes a constituent part of or member of something, e.g. a segment of an orange.
  - Polysemy: a relationship between the meanings of words or phrases that, although slightly different, share a common core meaning, e.g. "I read a paper" and "I wrote a paper".
  - Synonyms: words that have the same or nearly the same meaning as another, e.g. happy, content, ecstatic, overjoyed.
  - Antonyms: words that have close to opposite meanings, e.g. happy, sad.
  - Homonyms: words that sound the same and are spelled alike but have different meanings, e.g. orange (the colour) and orange (the fruit).
* Semantic analysis also takes into account signs and symbols (semiotics) and collocations (words that often go together).
* Automated semantic analysis works with the help of machine-learning algorithms. By feeding semantically enhanced machine-learning algorithms with samples of text, you can train machines to make accurate predictions based on past observations.
* There are various sub-tasks involved in a semantic-based approach to machine learning, including word sense disambiguation and relationship extraction.

Word Sense Disambiguation
* Word sense disambiguation is the automated process of identifying in which sense a word is used according to its context.
* Natural language is ambiguous and polysemic; sometimes the same word can have different meanings depending on how it is used.
* The word "orange", for example, can refer to a colour, a fruit, or even a city in Florida.
* The same happens with the word "date", which can mean a particular day of the month, a fruit, or a meeting.
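As one concrete (and simplified) approach to word sense disambiguation, NLTK ships a classic Lesk-style disambiguator. The sketch below assumes NLTK and its WordNet corpus are installed; the sense it returns is driven by overlap with dictionary definitions, not by deeper understanding, so treat the output as illustrative.

```python
# Word sense disambiguation with NLTK's implementation of the Lesk algorithm
# (assumes nltk is installed and the 'wordnet' corpus has been downloaded).
from nltk.wsd import lesk

context_1 = "we went out on a date to the cinema last night".split()
context_2 = "the date palm bears sweet fruit in hot climates".split()

sense_1 = lesk(context_1, "date")  # WordNet synset chosen for the first context
sense_2 = lesk(context_2, "date")  # may pick a different synset for the second

print(sense_1, sense_1.definition() if sense_1 else None)
print(sense_2, sense_2.definition() if sense_2 else None)
```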
Relationship Extraction
* This task consists of detecting the semantic relationships present in a text. Relationships usually involve two or more entities (which can be names of people, places, company names, etc.).
* These entities are connected through a semantic category, such as "works at", "lives in", "is the CEO of", or "headquartered at".
* For example, the phrase "Steve Jobs is one of the founders of Apple, which is headquartered in California" contains two different relationships:
  - founder of: Steve Jobs [Person] and Apple [Company]
  - headquartered in: Apple [Company] and California [Place]

What is UNL?
* Universal Networking Language (UNL) is a computer language that enables computers to process information and knowledge across language barriers.
* It is an artificial language that replicates, for computers, the functions natural languages perform in human communication.
* It expresses information or knowledge in the form of semantic networks.
* Unlike natural languages, UNL expressions are unambiguous.
* Although UNL is a language for computers, it has all the components of a natural language.
* It is composed of UNL expressions, Universal Words (UWs), relations, and attributes.

Overview of the UNL System
It consists of:
* Language resources:
  - UNLKB (Knowledge Base): linguistic knowledge about the concepts used in UNL.
  - Universal Word dictionaries, analysis and generation rules.
* Language servers:
  - DeConverter: automatically deconverts UNL into native languages.
  - EnConverter: automatically or interactively enconverts natural languages into UNL.
* Software tools:
  - UNL Editors: used to make UNL documents.
  - UNL Explorers: used to view and manage UNL documents by accessing UNL language servers, the UNLKB and UNL documents.
  - UNL Verifiers: verify UNL expressions for correctness.
  - UNL Proxy Servers: provide communication with language servers.
* Concept definitions: define concepts in connection with other concepts.
* UNL documents: the documents in which UNL expressions are described.

UNL Graph
* Each sentence is converted into a hyper-graph with concepts as nodes and relations as directed arcs.
* Concepts are called Universal Words (UWs).
* Word knowledge is represented by Universal Words, which are language independent.
* Conceptual knowledge is captured by relating UWs through relations.
* Example: "John eats rice with a spoon". The entry node is the eating event, written roughly as eat(icl>do).@entry.@present, and it is linked to John, rice and spoon by semantic relations.

Types of Universal Words
* A Universal Word is the syntactic and semantic unit of UNL; it represents a concept and a node in the graph of a UNL expression.
* There are two classes:
  - Unit concepts: Basic UWs, Restricted UWs, Extra UWs.
  - Compound concepts: scopes.
* Basic UWs
  - Bare headwords with no constraint list.
  - E.g.: house, drink.
* Restricted UWs
  - Headwords with a constraint list.
  - They represent a more specific concept, or a subset of concepts.
  - The constraint list restricts the range of the concept that a Basic UW represents.
  - E.g.: state(icl>country), state(icl>abstract thing).
* Extra UWs
  - A special type of Restricted UW.
  - They denote concepts that are not present in English; foreign-language words are used as headwords.
  - E.g.: Bharatnatyam(icl>dance).

UNL Knowledge Base
* Defines every possible relation between concepts.
* It has two important roles:
  - It defines the semantics of Universal Words.
  - It gives linguistic knowledge of concepts.
* E.g. for "The anchor wrote the script":
  - Linguistic knowledge tells us that the anchor is a person.
  - Semantics tells us that only a person can write a script (an anchor in the sense of a ship's anchor cannot do so).
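A minimal sketch of the hyper-graph for the example sentence above. The relation labels (agt, obj, ins) are the usual UNL agent, object and instrument relations, and the exact constraint lists on each UW are illustrative.

```python
# "John eats rice with a spoon" as a small UNL-style graph:
# Universal Words as nodes, directed semantic relations as labelled arcs.
unl_graph = {
    "nodes": {
        "eat":   "eat(icl>do).@entry.@present",  # entry node of the expression
        "John":  "John(iof>person)",             # constraint lists are illustrative
        "rice":  "rice(icl>food)",
        "spoon": "spoon(icl>artifact)",
    },
    "arcs": [
        ("agt", "eat", "John"),    # agent: who eats
        ("obj", "eat", "rice"),    # object: what is eaten
        ("ins", "eat", "spoon"),   # instrument: what is used
    ],
}

for relation, head, tail in unl_graph["arcs"]:
    print(f"{relation}({unl_graph['nodes'][head]}, {unl_graph['nodes'][tail]})")
```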
Dictionary
* Also known as the UNL Dictionary.
* It stores concepts, represented by the words of a language.
* It stores Universal Words for identifying concepts, the word headings that can express those concepts, and information on their syntactic behaviour.
* Each entry consists of a correspondence between a concept and a word, along with information concerning the word's syntactic properties.
* The grammar for defining the words of the language in the dictionary is shown below.

  [Figure: grammar for UNL dictionary entries]
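Purely as an illustration of what such an entry holds (the format below is invented for this sketch and is not the actual UNL dictionary grammar), an entry can be thought of as a headword mapped to one or more Universal Words plus syntactic attributes.

```python
# Hypothetical sketch of UNL dictionary entries: each natural-language headword
# maps to one or more Universal Words together with syntactic information.
# The attribute names and values below are invented for illustration.
uw_dictionary = {
    "state": [
        {"uw": "state(icl>country)",        "pos": "noun"},
        {"uw": "state(icl>abstract thing)", "pos": "noun"},
    ],
    "drink": [
        {"uw": "drink(icl>do)",             "pos": "verb"},
    ],
}

def lookup(headword):
    """Return the candidate Universal Words (concepts) a word may express."""
    return [entry["uw"] for entry in uw_dictionary.get(headword, [])]

print(lookup("state"))  # ['state(icl>country)', 'state(icl>abstract thing)']
```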
