
Logic in AI

Professor Abdelwadood MESLEH

Based on slides of Russell (AIMA Chapters 2, 7, 8, and 9)
Knowledge Representation
 Represent knowledge about the world in a manner that facilitates
inference (i.e., drawing conclusions) from that knowledge.
 In AI: typically based on:
 Logic
 Probability
 Logic and Probability
 KR Languages
 Propositional Logic
 Predicate Calculus
 Frame Systems
 Rules with Certainty Factors
 Bayesian Belief Networks
 Influence Diagrams
 Ontologies
 Semantic Networks
 Concept Description Languages
 Non-monotonic Logic
Basic Idea of Logic
 By starting with true assumptions, you can deduce true conclusions.
 Components of Knowledge Representation
 Syntax: defines the sentences in the language
 Semantics: defines the “meaning” of sentences
 Inference Procedure – algorithm
 Knowledge base - KB
 Knowledge Base= set of sentences in a formal language
 Declarative approach to building an agent (or other system):
– Tell it what it needs to know
 Then it can Ask itself what to do - answers should follow from the KB
 Agents can be viewed at the knowledge level
– i.e., what they know, regardless of how implemented
 Or at the implementation level
– i.e., data structures in KB and algorithms that manipulate them



Introduction to logic
 Humans solve problems:
 by reusing knowledge, and by deriving new facts through reasoning over that
knowledge, even in a partially observable environment
 Reasoning is especially important in worlds where the results of
actions are unknown
 Why do we need logic?
 Natural language?
the expressive power – high but ambiguous
can it be used as a representation language in AI? NO.
serves as a medium of communication
– rather than pure representation
– since one syntax maps to too many (hidden) semantics
 The central component of a knowledge-based (KB) agent
 its memory: the knowledge base, or KB
 A KB is a set of representations of facts about the world
 Each such representation is called a sentence



knowledge-based agents
 The sentences are expressed in a language called knowledge representation
language:
 Many languages: Propositional Logic, Predicate Calculus, Frame
Systems, Rules with Certainty Factors, Bayesian Belief Networks,
Influence Diagrams, Semantic Networks, Concept Description
Languages.
 All popular knowledge representation systems are equivalent to (or a
subset of): Logic (Propositional Logic or Predicate Calculus) and
Probability Theory
 The design of a KB agent includes:
 A formal language to express its knowledge
 A means of carrying out reasoning in this language
 These two elements together constitute a logic
 The inference engine is another main component of a KB agent: it
answers queries posed by ASK, based on the sentences TELLed to the KB
 TELL and ASK
 the way to add new sentences
 and the way to query what sentences the KB contains
 The query may not be answered directly from the sentences but usually
indirectly
 At the initial stage, before anything is TELLed, the KB
 may contain some background knowledge about the domain
 Example: location(desk, kitchen), …



[Figure: a knowledge-based agent, i.e. a knowledge base plus an inference engine, accessed through TELL and ASK]
How a KB agent performs
 Each time the agent program is called, it does two
things:
1. it TELLs the KB what it perceives
2. it ASKs the KB what action to perform
 Reasoning
 The process of concluding the action from the sentences in
the KB
 This reasoning is done logically
 The action found is proved to be better than all others
 (given that an agent knows its goals)
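
To make the TELL/ASK cycle concrete, here is a minimal Python sketch of the agent loop described above; the KnowledgeBase class and the percept/action sentence formats are illustrative assumptions, not part of the slides:

# Minimal sketch of a knowledge-based agent (illustrative names, not from the slides).
class KnowledgeBase:
    def __init__(self):
        self.sentences = []

    def tell(self, sentence):
        self.sentences.append(sentence)

    def ask(self, query):
        # Placeholder: a real KB would run logical inference over self.sentences here.
        return query in self.sentences

def kb_agent_step(kb, percept, t):
    kb.tell(("percept", percept, t))      # 1. TELL the KB what the agent perceives
    action = kb.ask(("action-query", t))  # 2. ASK the KB what action to perform
    kb.tell(("did", action, t))           # record the action that was taken
    return action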



Propositional logic



Propositional logic
 The symbols of propositional logic are
 the logical constants True and False
 propositional symbols such as P and Q
 the logical connectives ¬, ∧, ∨, ⇒, ⇔, and parentheses ( )
 Sentences
 True and False are themselves sentences
 A propositional symbol such as P or Q
is a sentence
 Wrapping “( )” around a sentence
 yields a sentence



 A sentence
 formed by combining sentences with one of the five logical
connectives:
 ¬ : negation
 ∧ : conjunction
 ∨ : disjunction
 ⇒ : implication. P ⇒ Q : if P then Q (P is the antecedent/premise, Q is the conclusion/consequent)
 ⇔ : biconditional
 An atomic sentence
 contains only one symbol



 A literal
 Either an atomic sentence
 Or a negated one
 A complex sentence
 sentences constructed from simpler sentences using logical connectives



Semantics
 The semantics of propositional logic
 is simply the assignment of truth values to symbols:
assign True or False to each propositional symbol
The easiest way to summarize the models is with a truth table

Note: (P ⇒ Q) is equivalent to ¬P ∨ Q (checked in the small sketch below)
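
As a quick check of this note, a small Python sketch (an illustration, not from the slides) can enumerate all models and confirm that P ⇒ Q and ¬P ∨ Q have identical truth tables:

from itertools import product

# Explicit truth table for =>, written out row by row.
IMPLIES = {(True, True): True, (True, False): False,
           (False, True): True, (False, False): True}

# Verify that P => Q and (not P) or Q agree in every model.
for P, Q in product([False, True], repeat=2):
    assert IMPLIES[(P, Q)] == ((not P) or Q)
    print(f"P={P!s:5} Q={Q!s:5}  P=>Q is {IMPLIES[(P, Q)]}")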



Rules



Inference
 Based on the existing KB
 a set of facts or rules
 we ASK a query.
 Inference is performed to give the answer and CAN be expressed by a
truth table
 For a large KB
 a large amount of memory is necessary
to maintain the truth table
 if there are n symbols in the KB
there are 2^n rows (models) in the truth table
which is O(2^n), prohibitive (the complexity is high)
 the time required is also long (time consuming)
 so, we use inference rules
instead of enumerating truth tables (a small enumeration sketch follows)
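
The enumeration the slide warns about can be sketched directly; in the following minimal Python fragment the representation of sentences (each sentence as a function of a model) is an assumption made for illustration, and entailment is checked by enumerating all 2^n models:

from itertools import product

def tt_entails(kb, query, symbols):
    """Return True if kb entails query, by enumerating all 2^n models."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):   # a model of the KB where the query fails
            return False
    return True

# Example KB: (P => Q) and P.  Query: Q.
kb = lambda m: ((not m["P"]) or m["Q"]) and m["P"]
query = lambda m: m["Q"]
print(tt_entails(kb, query, ["P", "Q"]))   # True: Q follows by Modus Ponens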



Reasoning patterns propositional logic
 An inference rule
 a rule capturing a certain pattern of inference
 To say β is derived from α
 we write α ⊢ β, or write α above the inference bar and β below it
 List of inference rules: [table omitted in this transcription]
The preceding derivation is called a proof

a proof is a sequence of applications of inference rules
similar to searching for a solution path
every node is viewed as a sentence
we are trying to reach the goal sentence
Inference: Backward Chaining (Goal Reduction)
 Based on the rule of Modus Ponens
 If we know P1 ∧ ... ∧ Pn and we know (P1 ∧ ... ∧ Pn) ⇒ Q
 Then we can conclude Q
 Example (a small sketch follows below):

1. A patient with a cold has a fever (Cold ⇒ Fever)
2. A patient with a fever has a headache (Fever ⇒ Headache)
3. John has a cold (Cold)
Prove that John has a headache
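
A minimal Python sketch of this example (symbol names Cold, Fever, Headache as above), repeatedly applying Modus Ponens until nothing new can be concluded:

# Rules as (premises, conclusion); facts as a set of known symbols.
rules = [({"Cold"}, "Fever"), ({"Fever"}, "Headache")]
facts = {"Cold"}                       # John has a cold

# Repeatedly apply Modus Ponens until no rule adds anything new.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("Headache" in facts)             # True: John has a headache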



Pros and cons of propositional logic
 Propositional logic is declarative
 Propositional logic allows partial/disjunctive/negated information
 (unlike most data structures and databases)
 Propositional logic is compositional:
 meaning of B ∧ A is derived from the meanings of B and of A
 Meaning in propositional logic is context-independent
 (unlike natural language, where meaning depends on context)
 Propositional logic has very limited expressive power (unlike natural
language)
 PL is not expressive enough:
 it needs a huge number of rules
 even for simple worlds
 it has no way of handling groups of objects
 every object must be specified individually



First-order logic- FOL



FOL
 Whereas propositional logic assumes the world contains facts, First-
order logic (like natural language) assumes the world contains
 Objects: people, houses, numbers, colors, baseball games, …
 Relations: red, round, prime, brother of, bigger than, part of, comes
between, …
 Functions: father of, best friend, one more than, plus, …
 FOL: Basic elements
 Constants: name a specific object. KingJohn, 2, NUS, ...
 Predicates: name relations. Brother, >, ...
 Functions: mappings from objects to objects. Sqrt, LeftLegOf, ...
 Variables: refer to an object without naming it. x, y, a, b, ...
 Connectives ¬, ∧, ∨, ⇒, ⇔
 Equality =
 Quantifiers ∀, ∃
 Terms: father-of(father-of(dog33))
 Atomic Sentences: in(father-of(dog33), food6)
Can be true or false
Correspond to propositional symbols P, Q



FOL
 First-order logic (FOL)
 introduces the concepts of
objects and properties
 Why do we need logic?
 Natural language?
the expressive power – high but ambiguous
can it be used as a representation language
– in AI? NO.
serves as a medium of communication
– rather than pure representation
– since one syntax maps to too many (hidden) semantics
 Relations
 are the links among objects
 can be functions – only one output for a given input
 Examples:
 Objects: people, houses, number, colors, baseball, centuries
 Relations: either unary (property) or n-ary
 unary: tall, large, small, red, round, prime, boring
 n-ary: brother of, greater than, part of, inside, after, is
 Functions: father of, best friend, one more than
 Almost any fact (assertion) can be thought of a combination of:
 objects
 and properties or relations
 Example: “One plus two equals three”
 Objects: one, two, three, one plus two
 Relation: equals
 Function: plus
 “one plus two” is a name of the object obtained by applying the
function plus to the objects one and two
 First-order logic advantages
 it can express anything that can be programmed
 e.g., Prolog



First Order Logic vs. Propositional logic
 Propositional logic
 consider the facts: True or False
 use compositionality to see whether the sentence is true or false
Compositionality: the meaning of a sentence in a compositional
language is a function of the meanings of its parts.
 Note: PL has sufficient power to deal with partial information.
 First Order Logic
 consider the RELATIONS with OBJECTS:
True or False
 also use compositionality and it can express facts about ‘some’ and ‘all’
of the objects in the universe.
 The main difference between PL and FOL lies in the ontological
commitment made by each language, that is, in what each assumes about the
nature of reality.



Models for FOL
 What is a model?
 Model: a mathematical abstraction used to fix the truth or falsehood of every
relevant sentence.
 in PL: A ∧ B ⇒ C
 then how many possible combinations?
this set of truth values = a model for this world
 only True or False exists
 in FOL, a model contains more.
 objects
 domain of “first order logic” model
= set of objects it contains (domain elements)
 Sentences are true with respect to a model and an interpretation
 Model contains objects (domain elements) and relations among them
 Interpretation specifies referents for
constant symbols → objects
predicate symbols → relations
function symbols → functional relations
o Complex sentences are made from atomic sentences using connectives
o ¬S, S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, S1 ⇔ S2
o E.g. Sibling(KingJohn, Richard) ⇒ Sibling(Richard, KingJohn)
Example: HW

 There are five objects in this model


 two binary relations
 three unary relations
 one unary function: left-leg



 Formally speaking, a relation is
 a set of tuples of objects that are related.
 Tuple is a collection of objects arranged in a fixed order and is written
with angle brackets surrounding the objects
 e.g., brother relation, it is the set
{ < Richard, King John>, <King John, Richard>}
 E.g.: the crown is on king john’s head
‘on head’: relation contains one tuple, <the crown, king john>
 The “brother” and “on head” are binary relations.
 Binary relation: relate pairs of objects.
{ < Richard, King John>, <King John, Richard>}
 Strictly speaking,
 the functions used in FOL must be total functions
 i.e., a function returns a value for every input tuple
constituted by the domain elements
 e.g., brother: X → Y, must everybody have a brother?
 left_leg: X → Y, must everything have a left leg?
left_leg(Leftleg1) = Leftleg2?



Syntax of FOL - summary



 Terms
 Logical expression of objects
 Terms =
 constant symbols (1, a, b, peter): stand for objects
 variables (X, Y, Human)
and function symbols (fatherof(peter), plus(1, 2))
 Atomic sentences
 the facts in Prolog
= Predicate symbol + Terms
= Relation + Objects
 e.g., brother(Richard, John).
 married(Father(Richard), Mother(John))
 Symbols:
Constant symbols stand for objects.
Predicate symbols stand for relations.
Function symbols stand for functions.
Each predicate and function symbol comes with an arity that fixes
the number of arguments.
An interpretation is needed to specify which objects, relations,
and functions are referred to by the constant, predicate, and
function symbols.
 Complex sentences
 multiple atomic sentences combined with logical connectives
 Example:
Brother(Richard, John) ∧ Brother(John, Richard)
Older(John, 30) ∨ Younger(John, 30)
 Quantifiers and 
 For expressing properties of
entire collection of objects – projection of a subset
 Universal quantification ()
 Meaning all, •called a variable,
reading as “For all” •lowercase
 Example: “All Kings are Persons” •if it’s a constant
x King(x)  Person(x) •ground term:
If X is a King, then X is a Person term with no
variables.



 Here x King(x)  Person(x) is true
 if x = any domain element,
 and the sentence is still true.
x  Richard
x  King John
x  Richard’s left leg
x  John’s left leg
x  the crown
 the above list is called the extended interpretation
 By applying this list, what we have?
x King(x)  Person(x) is true if
King(x)  Person(x) is true in each of the five extended interpretations.



 Are they true in the model?
 Yes!!!! (but only under this interpretation)
 from the truth table of implication (⇒)
 whenever the premise is false
the result is true, regardless of the conclusion
 Universal quantification ∀
 asserts (declares, affirms) a whole list of sentences (it reduces our work!)



 Existential quantification (∃)
 Meaning some
read as “There exists” or “For some”
 Example
 ∃x Crown(x) ∧ OnHead(x, John)



 Here, we can see
 ⇒ is the natural connective with ∀
 ∧ is the natural connective with ∃
 NOTE:
 using ∧ with ∀ is too strong,
 using ⇒ with ∃ is too weak



Nested Quantifiers
 When using multiple quantifiers, e.g.,
 ∀x ∀y Brother(x, y) ⇒ Sibling(x, y)
 we can write ∀x, y instead of separate quantifiers
 ∀x ∃y Loves(x, y)
 Everybody x loves somebody y.
 BUT ∃y ∀x Loves(x, y):
 [there is someone who is loved by everyone]

Quantifier’s Problems
 Problem?
 Quantifiers are not commutative
 The order cannot be interchanged
 Similar to a query
 If we want to specify the precedence,
 we should use ( )
 e.g., ∃y (∀x Loves(x, y))



Connections between ∀ and ∃
 ∀ is a conjunction over the universe
 and ∃ is a disjunction.
 The De Morgan rules apply to them:
 ∀x ¬P ≡ ¬∃x P        ¬∀x P ≡ ∃x ¬P
 ∀x P ≡ ¬∃x ¬P        ∃x P ≡ ¬∀x ¬P
 So, only one of ∀ or ∃ is strictly necessary
 we don’t really need both of them --- but we keep both for readability.
 E.g.: everybody likes ice cream = there is no one who does not like ice cream:
∀x Likes(x, IceCream) ≡ ¬∃x ¬Likes(x, IceCream)
 Properties of quantifiers:
 ∀x ∀y is the same as ∀y ∀x
 ∃x ∃y is the same as ∃y ∃x
 ∃x ∀y is not the same as ∀y ∃x
 ∃x ∀y Loves(x, y): “There is a person who loves everyone in the world”
 ∀y ∃x Loves(x, y): “Everyone in the world is loved by at least one person”
 Quantifier duality: each can be expressed using the other (a small finite-domain check follows below)
 ∀x Likes(x, IceCream) ≡ ¬∃x ¬Likes(x, IceCream)
 ∃x Likes(x, Broccoli) ≡ ¬∀x ¬Likes(x, Broccoli)
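
The duality can be checked mechanically on any finite domain; the following Python sketch uses a small, hypothetical Likes(x, IceCream) relation:

# A small finite domain and a hypothetical Likes(x, IceCream) relation.
domain = ["Alice", "Bob", "Carol"]
likes_ice_cream = {"Alice": True, "Bob": True, "Carol": True}

forall = all(likes_ice_cream[x] for x in domain)                   # "for all x, Likes(x, IceCream)"
not_exists_not = not any(not likes_ice_cream[x] for x in domain)   # "there is no x with not Likes(x, IceCream)"
print(forall == not_exists_not)   # True: the two forms agree on this model

# The equivalence also holds when someone dislikes ice cream.
likes_ice_cream["Bob"] = False
forall = all(likes_ice_cream[x] for x in domain)
not_exists_not = not any(not likes_ice_cream[x] for x in domain)
print(forall == not_exists_not)   # True again (both sides are now False)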
Equality
 Equality is represented as
 “=”
 instead of a predicate
 Example: FatherOf (John) = Henry
 To ensure two objects are not the same
 negation with equality is used
 E.g., x,y Sister(Felix, x)  Sister(Felix, y)  (x = y)
 This means that Felix has at least two sisters.



The uniqueness quantifier ∃!
 ∃ just specifies
 one or more objects, possibly different ones
 ∃! is used to specify
 a unique object
 Example: “There is only one king”
 ∃!x King(x)
 Or ∃x King(x) ∧ ∀y (King(y) ⇒ (x = y))
 If x is a King and y is a King, then x must be y



Common Mistakes
 A common mistake to avoid:
 Typically, ⇒ is the main connective with ∀
 Common mistake: using ∧ as the main connective with ∀:
∀x At(x, NUS) ∧ Smart(x)
means “Everyone is at NUS and everyone is smart”
 Another common mistake to avoid
 Typically, ∧ is the main connective with ∃
 Common mistake: using ⇒ as the main connective with ∃:
∃x At(x, NUS) ⇒ Smart(x)
is true if there is anyone who is not at NUS!



Using first-order logic
 Domain
 Some part of the world about which we wish to express some
knowledge.
 Assertions: sentences added to KB using TELL.
 The domain of numbers, sets and lists
 Example:
TELL(KB, King(John))
TELL(KB, ∀x King(x) ⇒ Person(x))
ASK(KB, King(John)) will return TRUE.
 Queries (goals): questions that are asked via ASK.
Answers are given by substitution (a binding list), e.g., {x/John}



Example: The kinship domain- family relationship
 Includes
 facts, e.g., ‘Elizabeth is the mother of Charles’, ‘Charles is the father of William’.
 rules, e.g., ‘one’s grandmother is the mother of one’s parent’
 Objects: are people.
 Unary predicates: male, female.
 relations: parent, sibling, brother, sister, child, daughter, son, spouse, wife,
husband, grandparent, grandchild, cousin, aunt and uncle.
 Functions for mother and father
 The properties of the objects include
 Gender
 Age
 Height, …
 The relations are:
 parenthood, brotherhood, marriage, …

See examples next slide…



 One’s mother is one’s female parent:
 ∀m, c Mother(c) = m ⇔ Female(m) ∧ Parent(m, c)
 One’s husband is one’s male spouse:
 ∀w, h Husband(h, w) ⇔ Male(h) ∧ Spouse(h, w)
 Male and female are disjoint categories:
 ∀x Male(x) ⇔ ¬Female(x)
 Parent and child are inverse relations:
 ∀p, c Parent(p, c) ⇔ Child(c, p)
 A grandparent is a parent of one’s parent:
 ∀g, c Grandparent(g, c) ⇔ ∃p Parent(g, p) ∧ Parent(p, c)
 A sibling is another child of one’s parents:
 ∀x, y Sibling(x, y) ⇔ x ≠ y ∧ ∃p Parent(p, x) ∧ Parent(p, y)
 Each of the above sentences can be viewed as an axiom of the kinship domain.
 Axiom: in logic and mathematics, a statement or proposition that needs no proof because its
truth is obvious, or one that is accepted as true without proof.
 Not all logical sentences about a domain are axioms. Some are theorems, that is,
they are entailed by the axioms, e.g., ∀x, y Sibling(x, y) ⇔ Sibling(y, x).



The domain of numbers
 Natural numbers = the nonnegative integers
 We need a predicate to check whether a number is natural
NatNum
 We need a constant symbol (the basis)
0
 and a function symbol S, meaning successor
 So the natural numbers are defined as:
NatNum(0).
∀n NatNum(n) ⇒ NatNum(S(n)). [0 is a natural number, and for
every object n, if n is a natural number then S(n) is a natural
number]
 After that, we define constraints on the function S
∀n 0 ≠ S(n)
∀m, n m ≠ n ⇒ S(m) ≠ S(n)
 Addition of natural numbers (a small sketch follows below)
∀m NatNum(m) ⇒ (+(m, 0) = m) [adding 0 to any natural
number m gives m itself]
∀m, n NatNum(m) ∧ NatNum(n) ⇒ +(S(m), n) = S(+(m, n))
‘+’ is a binary function symbol.
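
A small Python sketch of this successor encoding (the tuple representation of S(n) is an assumption made for illustration), with addition following the two axioms above:

# Natural numbers in the successor encoding: 0 is ZERO, S(n) is the pair ("S", n).
ZERO = 0

def S(n):
    return ("S", n)

def plus(m, n):
    # Axiom 1: +(m, 0) = m
    if n == ZERO:
        return m
    # Axiom 2: +(S(m), n) = S(+(m, n))
    if isinstance(m, tuple):
        return S(plus(m[1], n))
    # m is 0 here; +(0, n) = n holds in the intended model of the naturals
    return n

one, two = S(ZERO), S(S(ZERO))
print(plus(two, one))   # ('S', ('S', ('S', 0))), i.e. S(S(S(0))) = 3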



The domain of sets
 We want to represent individual sets, including empty set.
 What we need?
 a way of ‘building up’ a set
by adding an element to a set (adjoining)
take union of two sets
take intersection of two sets
 Checking?
membership of an element
whether it is a set or other objects
 Hence we want to define the following
 Constant symbol:
{ }: empty set
 Predicates:
 Set, Member, Subset
 Functions:
 Adjoining, Union, Intersection
{x|s}: the set resulting from adjoining element x to set s
The domain of lists
Lists are similar to sets
 difference?
Lists are ordered.
An element can appear more than once
 Here is a summary of the correspondence (a small Cons-list sketch follows below)

Ø = { }                          [] = Nil : the constant list with NO elements
{x} = {x | { }}                  [x] = Cons(x, Nil)
{x, y} = {x | {y | { }}}         [x, y] = Cons(x, Cons(y, Nil))
{x, y | s} = {x | {y | s}}, s is a set      [x, y | l] = Cons(x, Cons(y, l))
r ∪ s = Union(r, s)
r ∩ s = Intersection(r, s)
x ∈ s = Member(x, s)
r ⊆ s = Subset(r, s)
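
As an illustration of the list vocabulary, here is a minimal Python sketch in which Nil and Cons are encoded as None and pairs (an assumed encoding, not part of the slides); the member function mirrors the Member predicate:

# Lists built from Nil and Cons, as in the right-hand column above.
Nil = None

def Cons(x, l):
    return (x, l)

def member(x, l):
    # Member(x, Cons(h, t)) holds if x equals h or x is a member of t.
    if l is Nil:
        return False
    head, tail = l
    return x == head or member(x, tail)

xs = Cons(1, Cons(2, Nil))              # [1, 2] in the slide's notation
print(member(2, xs), member(3, xs))     # True False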



Knowledge Engineering in FOL
KE in FOL: Capturing domain knowledge in logical form.
Knowledge engineering process steps:
1. Assemble the relevant knowledge
2. Decide on a vocabulary of predicates, functions, and constants
3. Encode general knowledge about the domain
4. Encode a description of the specific problem instance
5. Pose queries to the inference procedure and get answers
6. Debug the knowledge base.
Notes:
• Knowledge engineering: the general process of knowledge base
construction.
• Knowledge engineer: investigates a particular domain, learns what
concepts are important in that domain, and creates a formal
representation of the objects and relations in the domain.
HW: check some applications / a solved example



Inference in first-order logic



Inference in first-order logic
 Inference in FOL,
 removing the quantifiers
 i.e., converting KB to PL
 then use Propositional inference
which is easy to do
 Inference rules for quantifiers
 In KB, we have



 A rule called Universal Instantiation (UI)
 Instantiation means ‘to represent by a concrete example; an instance’
 substitute a ground term for the variable
 SUBST(θ, α) denotes the result of applying the substitution θ to the sentence α
 g: ground term
 v: variable

        ∀v α
  -----------------
  SUBST({v/g}, α)

 Example substitutions (a small SUBST sketch follows below):
 {x / John}, {x / Richard}, {x / Father(John)}
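
A minimal Python sketch of SUBST over a simple term representation (nested tuples for compound terms, lowercase strings for variables; this representation is an assumption, not from the slides):

def is_variable(t):
    # Convention assumed here: variables are lowercase strings like "x", "y".
    return isinstance(t, str) and t[:1].islower()

def subst(theta, sentence):
    """Apply substitution theta (a dict like {"x": "John"}) to a term/sentence."""
    if is_variable(sentence):
        return theta.get(sentence, sentence)
    if isinstance(sentence, tuple):                  # compound: (functor, arg1, arg2, ...)
        return tuple(subst(theta, part) for part in sentence)
    return sentence                                  # constant symbol

# Universal Instantiation of "forall x King(x) => Person(x)" with {x / John}:
rule = ("=>", ("King", "x"), ("Person", "x"))
print(subst({"x": "John"}, rule))   # ('=>', ('King', 'John'), ('Person', 'John'))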



 Existential Instantiation:
 For any sentence α, variable v, and constant symbol k that does not
appear elsewhere in the KB:

        ∃v α
  -----------------
  SUBST({v/k}, α)

 ∃x Crown(x) ∧ OnHead(x, John)
 We can infer:
Crown(C1) ∧ OnHead(C1, John)
if C1 does not appear elsewhere in the KB

Basically the existential sentence says there is some object
satisfying a condition, and the instantiation process just
gives a name to that object.



Reduction to Propositional Inference
 Main idea
 for existential quantifiers
find a ground term to replace the variable
remove the quantifier
add this new sentence to the KB
 for universal quantifiers
find all possible ground terms to replace the variable
add the set of new sentences to the KB

In other words:
the ∃-quantified sentence can be replaced by ONE instantiation, and
the ∀-quantified sentence can be replaced by the set of all possible
instantiations.



 apply Universal Instantiation (UI) to the first sentence
 using ground terms from the vocabulary of the KB
 {x / John}, {x / Richard}, i.e., two objects
 we then obtain the instantiated sentences

 view the other facts as propositional variables

 use propositional inference to derive Evil(John)



Propositionalization
 Applying this technique to every quantified sentence in the
KB
 we can obtain a KB consisting of propositional sentences only
 however, this technique makes inference very inefficient:
it generates many useless instantiated sentences



Unification and Lifting
 Unification
 a substitution θ such that applying it to two sentences makes
them look the same
 e.g., θ = {x / John} is a unifier:
applying it to ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
gives King(John) ∧ Greedy(John) ⇒ Evil(John)
 and we can then conclude Evil(John) from the implication
 using King(John) and Greedy(John)



Generalized Modus Ponens (GMP)
 The process capturing the previous steps
 A generalization of the Modus Ponens
 also called the lifted version of Modus Ponens M.P.
 For atomic sentences pi, pi′, and q,
 where there is a substitution θ such that
 SUBST(θ, pi′) = SUBST(θ, pi), for all i:

   p1′, p2′, ..., pn′,  (p1 ∧ p2 ∧ ... ∧ pn ⇒ q)
  ---------------------------------------------
                 SUBST(θ, q)

 n+1 premises (n atomic sentences and one implication)

 Example: from King(John), Greedy(John), and ∀x King(x) ∧ Greedy(x) ⇒ Evil(x),
with θ = {x/John}, GMP concludes Evil(John)


Unification
 UNIFY is a routine which
 takes two atomic sentences p and q
 returns a substitution θ
 that would make p and q look the same, i.e., identical
 it returns fail if no such substitution θ exists
 Formally, UNIFY(p, q) = θ
 where SUBST(θ, p) = SUBST(θ, q)
 θ is called the unifier of the two sentences
 [Table of UNIFY examples omitted; its last entry is
UNIFY(Knows(John, x), Knows(x, Elizabeth))]
 Why does the last unification fail? Because x cannot take the values ‘John’ and
‘Elizabeth’ at the same time.
 How to avoid this problem?



Standardizing apart
 UNIFY failed on the last sentence in finding a unifier
 Reason?
 The two sentences use the same variable name, x,
even though the occurrences have different meanings
 The problem can be avoided by standardizing apart one of the two
sentences being unified, which means renaming its variables to
avoid name clashes.
Rename x in Knows (x, Elizabeth) to Z17 (a new variable
name) without changing its meaning



Most General Unifier (MGU)
 Most General Unifier (MGU)
 Recall that UNIFY should return a substitution that makes the two
arguments look the same, BUT there could be more than one such
unifier.
 e.g., UNIFY(Knows(John, x), Knows(y, z)) could return
{y/John, x/z} ----- John knows z.
{y/John, x/John, z/John} ------ John knows John.
 the best unifier is the one with fewer constraints (restrictions).
(the 1st unifier is more general than the 2nd)
 It turns out that, for every unifiable pair of expressions, there is a
single MGU that is unique up to renaming of variables --- here {y/John,
x/z} (a minimal UNIFY sketch follows below)
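
For reference, here is a minimal Python sketch of UNIFY over the same tuple-based term representation used in the earlier SUBST sketch (the occurs check is omitted); it returns the most general unifier above and reproduces the failure discussed on the previous slides:

def is_variable(t):
    # Convention assumed here: variables are lowercase strings like "x", "y", "z".
    return isinstance(t, str) and t[:1].islower()

def unify(x, y, theta):
    """Return a substitution (dict) making x and y identical, or None on failure."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_variable(x):
        return unify_var(x, y, theta)
    if is_variable(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):           # unify argument lists element by element
            theta = unify(xi, yi, theta)
        return theta
    return None                             # mismatched constants or arities

def unify_var(var, val, theta):
    if var in theta:
        return unify(theta[var], val, theta)
    if is_variable(val) and val in theta:
        return unify(var, theta[val], theta)
    return {**theta, var: val}              # (occurs check omitted in this sketch)

# UNIFY(Knows(John, x), Knows(y, z)) -> a most general unifier such as {y: John, x: z}
print(unify(("Knows", "John", "x"), ("Knows", "y", "z"), {}))
# UNIFY(Knows(John, x), Knows(x, Elizabeth)) fails: x cannot be both John and Elizabeth
print(unify(("Knows", "John", "x"), ("Knows", "x", "Elizabeth"), {}))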



Forward and backward chaining
 Forward chaining
 Start with the atomic sentences in KB and apply Modus Ponens in
the forward direction, adding new atomic sentences, until no further
inferences can be made.
 usually used when a new fact is added to the KB and we want to
generate its consequences
 First-order Definite clauses*
 Recall:
Literal: u or ¬u, where u is a propositional symbol (atom)
Clause: a disjunction of literals
Formula: a conjunction of clauses
 A definite clause either
is an atomic sentence, or
is an implication whose
– antecedent is a conjunction of positive literals
– consequent is a single positive literal
 Not every KB can be converted into a set of definite clauses - but many
can - Because of the restriction on single-positive-literal



 Examples of First-order Definite clauses

 Consider the following problem:

 Prove that ‘West’ is a Criminal


 First step
 translate these facts into first-order definite clauses
 the next figure shows the details



Simple Forward-Chaining algorithm FOL-FC
 Starts from known facts.
 Triggers all the rules whose premises are satisfied, adding their
conclusions to the known facts.
 The process repeats until the query is answered.
 Notes:
 A fact is not new if it is merely a renaming of a known fact; one sentence is a
renaming of another if they are identical except for the names of the variables.
 Example: Likes(x, IceCream) and Likes(y, IceCream)
 Example on FC (a propositionalized sketch follows below):
 The implication sentences are 9.3, 9.6, 9.7 and 9.8.
 Two iterations are required!
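
A small Python sketch of this run, using the crime KB in its standard textbook form (American West sells missile M1 to the hostile nation Nono), already propositionalized to ground sentences; the exact rule set is assumed here, since the slide cites the rules only by equation number:

# Known ground facts and propositionalized definite clauses (premises, conclusion).
facts = {"American(West)", "Missile(M1)", "Owns(Nono,M1)", "Enemy(Nono,America)"}
rules = [
    ({"Missile(M1)"}, "Weapon(M1)"),
    ({"Missile(M1)", "Owns(Nono,M1)"}, "Sells(West,M1,Nono)"),
    ({"Enemy(Nono,America)"}, "Hostile(Nono)"),
    ({"American(West)", "Weapon(M1)", "Sells(West,M1,Nono)", "Hostile(Nono)"},
     "Criminal(West)"),
]

def forward_chain(facts, rules, query):
    known = set(facts)
    iteration = 0
    while True:
        # Fire every rule whose premises hold in the facts known at the start
        # of this iteration (the batch behaviour of FOL-FC).
        new = {c for prem, c in rules if prem <= known and c not in known}
        if not new:
            return query in known          # fixed point reached
        known |= new
        iteration += 1
        print(f"iteration {iteration}: added {sorted(new)}")

print(forward_chain(facts, rules, "Criminal(West)"))
# iteration 1 adds Weapon(M1), Sells(West,M1,Nono), Hostile(Nono);
# iteration 2 adds Criminal(West), so the result is True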



 For forward chaining, we will have two iterations

[Proof tree generated by forward chaining: the facts 9.4, 9.5, 9.9 and 9.10 at the
leaves; rules 9.6, 9.7 and 9.8 applied in the first iteration; rule 9.3 applied at
the root in the second iteration]

 No new inferences can be made at this point using the current KB;
such a KB is called a fixed point of the inference process.
 FC characteristics:
 Sound: every inference is just an application of Generalized Modus
Ponens.
 Complete for definite clause KB: it answers every query whose answers
are entailed by any KB of definite clauses
 FC algorithm complexity-efficiency:
 Pattern matching: Finding all possible unifiers such that the premise of a
rule unifies with suitable set of facts in KB.
 FC algorithm checks every rule to see whether its premises are satisfied.
 Matching rules against known facts
 Match the premise of a rule against the facts in the KB.
 E.g., Missile(x) ⇒ Weapon(x) is a rule. Then find all the facts that unify
with Missile(x).
 Incremental FC:
 Every new fact inferred on iteration t must be derived from at least one
new fact inferred on iteration t-1.
 Irrelevant facts:
 FC makes all allowable inferences based on the known facts, even if they
are irrelevant to the goal at hand.
 Solutions:
Backward chaining can be used.
Restrict FC to a selected subset of rules.
Rewrite the rule set, using information from the goal, so that only
relevant variable bindings – those belonging to a so called magic set -
are considered during forward inference.
– Magic set – hybrid of FC and BC.
Backward chaining - BC
 Start with something we want to prove (goal/query)
 Find implication sentences that would allow to conclude.
 Attempt to establish their premises in turn.
 Normally used when there is a goal to prove (query)
 Remark
 BC algorithm uses composition of substitutions;
SUBST(COMPOSE(θ1, θ2), p) = SUBST(θ2, SUBST(θ1, p))
 SUBST(COMPOSE(θ1, θ2) is the substitution whose effect is
identical to the effect of applying each substitution in turn.
 Different unifications are found for different goals,
 so we have to combine them (a small COMPOSE sketch follows below).
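
A tiny Python sketch of COMPOSE for substitutions represented as dicts (the term representation follows the earlier sketches), checking the identity quoted above:

def subst(theta, t):
    # Apply a substitution to a term (variables are lowercase strings).
    if isinstance(t, str) and t[:1].islower():
        return theta.get(t, t)
    if isinstance(t, tuple):
        return tuple(subst(theta, part) for part in t)
    return t

def compose(theta1, theta2):
    """SUBST(COMPOSE(theta1, theta2), p) == SUBST(theta2, SUBST(theta1, p))."""
    composed = {v: subst(theta2, t) for v, t in theta1.items()}   # push theta2 through theta1
    for v, t in theta2.items():
        composed.setdefault(v, t)                                  # keep bindings only in theta2
    return composed

theta1, theta2 = {"x": "John"}, {"y": ("Mother", "x")}
p = ("Knows", "x", "y")
print(subst(compose(theta1, theta2), p) == subst(theta2, subst(theta1, p)))  # True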



Backward chaining



Forward vs. backward chaining
 FC is data-driven, BC is goal driven.
 Complexity of BC can be much less than linear in size of
KB.



Resolution
 Modus Ponens rule
 only allows us to derive atomic conclusions
 {A, A ⇒ B} ⊢ B
 However, it is more natural
 to also allow deriving new implications
 {A ⇒ B, B ⇒ C} ⊢ A ⇒ C, i.e., transitivity
 a more powerful tool: the resolution rule



CNF for FOL
 Conjunctive Normal Form
 a conjunction (AND) of clauses
each clause is a disjunction (OR) of literals
the literals can contain variables
 e.g.,

Recall that (P ⇒ Q) is equivalent to ¬P ∨ Q



 Conversion to CNF (6 steps)
 The following example illustrates the CNF conversion procedure:
 (everyone who loves all animals is loved by someone)
∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [∃y Loves(y, x)]

1. Eliminate implications (replace α ⇒ β with ¬α ∨ β)

2. Move ¬ inwards (using De Morgan and the quantifier-duality rules)

3. Standardize variables: the sentence uses the same variable name twice, so
change the name of one of the variables; our example then becomes a sentence
in which each quantifier has its own variable name



4. Skolemize
 The process of removing ∃ by elimination.
 i.e., translate ∃x P(x) into P(A), where A is a new constant
 Applying this rule naively to our example (with new constants A and B)
gives the wrong meaning, since A would be one particular animal:

– It would say that “Everyone either fails to love a particular animal A or is
loved by some particular entity B”.
 To overcome this mistaken meaning, use Skolem functions F and
G to make the Skolem entities depend on x.

 The general rule is that the arguments of the skolem function are
all the universally quantified variables in whose scope the
existential quantifier appears.
 As with existential instantiation, the skolemized sentence is
satisfiable exactly when the original sentence is satisfiable.



5. Drop universal quantifiers (every remaining variable is implicitly universally quantified)

6. Distribute ∨ over ∧ (the result is a conjunction of clauses)

Notes:
1- Fortunately, humans seldom need to look at a CNF sentence; the whole
translation process is easily automated.
2- F(x) refers to an animal potentially unloved by x.
3- G(x) refers to someone who may love x.
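
For reference, the full conversion of the example sentence can be written out in LaTeX; this is the standard textbook derivation, reconstructed here because the slide images with the intermediate formulas are not reproduced (the final two clauses match A1 and A2 on the later "Solved HW" slide):

% "Everyone who loves all animals is loved by someone", converted to CNF
\begin{align*}
&\forall x\; [\forall y\; \mathit{Animal}(y) \Rightarrow \mathit{Loves}(x,y)] \Rightarrow [\exists y\; \mathit{Loves}(y,x)] \\
\text{1. eliminate implications: }\;
&\forall x\; \neg[\forall y\; \neg \mathit{Animal}(y) \lor \mathit{Loves}(x,y)] \lor [\exists y\; \mathit{Loves}(y,x)] \\
\text{2. move $\neg$ inwards: }\;
&\forall x\; [\exists y\; \mathit{Animal}(y) \land \neg \mathit{Loves}(x,y)] \lor [\exists y\; \mathit{Loves}(y,x)] \\
\text{3. standardize variables: }\;
&\forall x\; [\exists y\; \mathit{Animal}(y) \land \neg \mathit{Loves}(x,y)] \lor [\exists z\; \mathit{Loves}(z,x)] \\
\text{4. Skolemize: }\;
&\forall x\; [\mathit{Animal}(F(x)) \land \neg \mathit{Loves}(x,F(x))] \lor \mathit{Loves}(G(x),x) \\
\text{5. drop $\forall$: }\;
&[\mathit{Animal}(F(x)) \land \neg \mathit{Loves}(x,F(x))] \lor \mathit{Loves}(G(x),x) \\
\text{6. distribute $\lor$ over $\land$: }\;
&[\mathit{Animal}(F(x)) \lor \mathit{Loves}(G(x),x)] \land [\neg \mathit{Loves}(x,F(x)) \lor \mathit{Loves}(G(x),x)]
\end{align*}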



Resolution inference rule
Two clauses, which are assumed to be standardized apart so that
they share no variables, can be resolved if they contain complementary
literals.
Propositional literals are complementary if one is the negation of the
other;
first-order literals are complementary if one unifies with the
negation of the other.
Binary resolution rule: the rule that resolves exactly TWO literals



Example proofs
 Resolution proves that KB ⊨ α by proving KB ∧ ¬α
unsatisfiable, i.e., by deriving the empty clause
 This is a proof by CONTRADICTION.
 First convert the sentences into CNF, then follow the resolution
proof procedure on the next slide (the resolution leads to the empty clause;
a small propositional sketch follows below)

 To prove the query, assume its negation


 and derive a contradiction
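
A minimal propositional Python sketch of refutation by resolution (clauses as frozensets of string literals, with negation written as a leading '-'); the first-order version additionally unifies complementary literals:

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (sets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def refutes(clauses):
    """Return True if the empty clause is derivable (KB and negated query unsatisfiable)."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True          # derived the empty clause
                new.add(r)
        if new <= clauses:
            return False                 # nothing new: not refutable
        clauses |= new

# KB: P => Q, P.  Negated query: -Q.  Resolution derives the empty clause.
kb_and_negated_query = [frozenset({"-P", "Q"}), frozenset({"P"}), frozenset({"-Q"})]
print(refutes(kb_and_negated_query))     # True: the KB entails Q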



So, we include the negated goal ¬Criminal(West)



Solved HW
 This example involves
 skolemization
 non-definite clauses
 hence making inference more complex
 The problem description:

 Everyone who loves all animals is loved by someone.


 Anyone who kills an animal is loved by no one.
 Jack loves all animals.
 Either Jack or Curiosity killed the cat, who is named Tuna.
 Did Curiosity kill the cat?



 First: express the original sentences, some background
knowledge, and the negated goal G by FOL:
 A: ∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [∃y Loves(y, x)]
 B: ∀x [∃y Animal(y) ∧ Kills(x, y)] ⇒ [∀z ¬Loves(z, x)]
 C: ∀x Animal(x) ⇒ Loves(Jack, x)
 D: Kills(Jack, Tuna) ∨ Kills(Curiosity, Tuna)
 E: Cat(Tuna)
 F: ∀x Cat(x) ⇒ Animal(x)
 ¬G: ¬Kills(Curiosity, Tuna)
 Then apply the conversion procedure to convert each
sentence to CNF:
 A1: Animal(F(x)) ∨ Loves(G(x), x)
 A2: ¬Loves(x, F(x)) ∨ Loves(G(x), x)
 B: ¬Animal(y) ∨ ¬Kills(x, y) ∨ ¬Loves(z, x)
 C: ¬Animal(x) ∨ Loves(Jack, x)
 D: Kills(Jack, Tuna) ∨ Kills(Curiosity, Tuna)
 E: Cat(Tuna)
 F: ¬Cat(x) ∨ Animal(x)
 ¬G: ¬Kills(Curiosity, Tuna)



The following resolution proof answers the question: Did Curiosity kill the cat?
First assume Curiosity didn’t kill the cat
and add this assumption to the KB

Empty clause, so the assumption is false



For each resolution step,
at least one of the resolved sentences comes from the query or the KB



Other Logics

