
Artificial Intelligence

Lecture 2: Using Propositional Logic to Solve Problems


Problem I - A Story
❑ Your roommate comes home; he/she is completely wet
❑ You know the following things:
○ Your roommate is wet
○ If your roommate is wet, it is because of rain, sprinklers, or both
○ If your roommate is wet because of sprinklers, the sprinklers must be on
○ If your roommate is wet because of rain, your roommate must not be carrying
the umbrella
○ The umbrella is not in the umbrella holder
○ If the umbrella is not in the umbrella holder, either you must be carrying the
umbrella, or your roommate must be carrying it
○ You are not carrying the umbrella
❑ Can you conclude that the sprinklers are on?
❑ Can AI conclude that the sprinklers are on?
Problem II - Are you ‘Sherlock Holmes’?
❑ There was a robbery in which a lot of goods were stolen.
The robber(s) left in a truck. It is known that :
(1) Nobody else could have been involved other than A, B and C.
(2) C never commits a crime without A's participation.
(3) B does not know how to drive.
❑ Is A innocent or guilty?
Problem III- Knights and Knaves
❑ A very special island is inhabited only by knights and knaves.
Knights always tell the truth, and knaves always lie.
❑ You meet two inhabitants: Zoey and Mel.
○ Zoey tells you that Mel is a knave.
○ Mel says ‘Neither Zoey nor I are knaves’.
❑ Can you determine what they are?
❑ Who is a knight and who is a knave?
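Since a knight's statement is always true and a knave's always false, "X says S" becomes the constraint X == S, and the puzzle can be brute-forced over the four possible assignments. A minimal Python sketch (the encoding and names are ours, not part of the slides):

```python
from itertools import product

def knights_and_knaves():
    """Enumerate all knight/knave assignments (True = knight, False = knave)."""
    solutions = []
    for zoey, mel in product([True, False], repeat=2):
        # "X says S" holds in a world iff X's type matches S's truth value.
        zoey_claim = (zoey == (not mel))        # Zoey: "Mel is a knave"
        mel_claim = (mel == (zoey and mel))     # Mel: "Neither of us is a knave"
        if zoey_claim and mel_claim:
            solutions.append((zoey, mel))
    return solutions

print(knights_and_knaves())  # [(True, False)]: Zoey is a knight, Mel is a knave
```

The same pattern (encode each statement as a biconditional, then enumerate worlds) solves any puzzle of this family.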
Logics
❑ Logics are formal languages for representing information such that
conclusions can be drawn.
❑ Syntax defines the sentences in the language.
❑ Semantics defines the "meaning" of sentences.
o defines the truth of each sentence w.r.t. each possible world (model)

❑ E.g., the language of arithmetic


o x+2 ≥ y is a sentence; x2+y > {} is not a sentence
o x+2 ≥ y is true iff the number x+2 is no less than the number y
o Sentence x+2 ≥ y is true in a world where x = 7, y = 1
o Sentence x+2 ≥ y is false in a world where x = 0, y = 6
Propositional logic: Syntax
❑ Propositional logic is the simplest logic and illustrates basic ideas of logic
❑ Logical constants
–True, false
❑ Atomic sentence
– A proposition symbol (P, Q, S, ...) representing a true or false statement
❑ The proposition symbols P1, P2, … are sentences (formulae)
If S is a sentence, ¬S is a sentence (negation)
If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
If S1 and S2 are sentences, S1 ⇒ S2 is a sentence (implication)
If S1 and S2 are sentences, S1 ⇔ S2 is a sentence (biconditional)
Examples of PL sentences
❑ “If it is hot and humid, then it is raining”
(P ∧ Q) → R
❑ “If it is humid, then it is hot”
Q→P
❑ “It is humid.”
Q

❑ We’re free to choose better symbols, btw:


Ho = “It is hot”
Hu = “It is humid”
R = “It is raining”
Propositional logic: Semantics
❑ Each model specifies a true/false value for each proposition symbol; with
three symbols there are 8 possible models.
E.g. P1,2 = false, P2,2 = true, P3,1 = false
❑ Rules for evaluating truth with respect to an interpretation m:
¬S is true iff S is false
S1 ∧ S2 is true iff S1 is true and S2 is true
S1 ∨ S2 is true iff S1 is true or S2 is true
S1 ⇒ S2 is true iff S1 is false or S2 is true
S1 ⇒ S2 is false iff S1 is true and S2 is false
S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true
❑ Simple recursive process evaluates an arbitrary sentence w.r.t. an
interpretation, e.g.,
¬P1,2 ∧ (P2,2 ∨ P3,1) : true ∧ (true ∨ false) = true ∧ true = true
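The recursive evaluation process can be sketched in Python, representing sentences as nested tuples (a hypothetical encoding of ours, not fixed by the slides):

```python
def pl_true(sentence, model):
    """Recursively evaluate a propositional sentence in a model.

    A sentence is a symbol (string) or a tuple:
    ('not', s), ('and', s1, s2), ('or', s1, s2),
    ('implies', s1, s2), ('iff', s1, s2).
    The model maps symbols to booleans.
    """
    if isinstance(sentence, str):
        return model[sentence]
    op, *args = sentence
    if op == 'not':
        return not pl_true(args[0], model)
    if op == 'and':
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == 'or':
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == 'implies':
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == 'iff':
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f'unknown operator {op!r}')

# The slide's example: not P1,2 and (P2,2 or P3,1), in the given model.
m = {'P12': False, 'P22': True, 'P31': False}
print(pl_true(('and', ('not', 'P12'), ('or', 'P22', 'P31')), m))  # True
```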
Truth tables for connectives
Some Terms
❑ The meaning or semantics of a sentence determines its interpretation.

❑ Given the truth values of all symbols in a sentence, it can be
“evaluated” to determine its truth value (True or False).

❑ A model for a KB is a “possible world” (an assignment of truth values to
propositional symbols) in which each sentence in the KB is True.
Model for a KB
Let the KB be [P ∧ Q → R, Q → P]
What are the possible models? Consider all possible assignments of T|F to
P, Q and R and check truth tables
P: it’s hot    Q: it’s humid    R: it’s raining
FFF: OK
FFT: OK
FTF: NO
FTT: NO
TFF: OK
TFT: OK
TTF: NO
TTT: OK
If KB is [P ∧ Q → R, Q → P, Q], then the only model is TTT
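The truth-table check above can be reproduced by enumerating all eight assignments (a small sketch; `implies` is a helper we define ourselves):

```python
from itertools import product

def implies(a, b):
    """Material implication: a -> b."""
    return (not a) or b

# Keep the assignments (p, q, r) that satisfy the KB [P and Q -> R, Q -> P].
models = [(p, q, r)
          for p, q, r in product([False, True], repeat=3)
          if implies(p and q, r) and implies(q, p)]
print(models)  # the 5 "OK" rows of the slide

# Adding the fact Q leaves only the model TTT:
print([(p, q, r) for p, q, r in models if q])  # [(True, True, True)]
```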
Validity and satisfiability
❑ A sentence is valid (a tautology) if it is true under all
interpretations, no matter what the world is actually like.
❑ Example: “It’s raining or it’s not raining.”, True, A ∨ ¬A, A ⇒ A, (A ∧ (A ⇒ B)) ⇒ B

❑ An inconsistent sentence (a contradiction, or unsatisfiable sentence) is a
sentence that is False under all interpretations. The world is never like
what it describes.
❑ Example: “It’s raining and it’s not raining.”, A ∧ ¬A

❑ A sentence is satisfiable if it is true in some model
❑ Example: A ∨ B, C
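Validity and satisfiability can both be decided by enumerating models: a sentence is valid iff every assignment is a model, and satisfiable iff at least one is. A brute-force sketch (the lambda encoding of formulas is our own):

```python
from itertools import product

def models_of(formula, symbols):
    """All assignments (dicts symbol -> bool) under which formula is True."""
    return [dict(zip(symbols, values))
            for values in product([False, True], repeat=len(symbols))
            if formula(dict(zip(symbols, values)))]

def valid(formula, symbols):
    # Valid (tautology): true in all 2^n models.
    return len(models_of(formula, symbols)) == 2 ** len(symbols)

def satisfiable(formula, symbols):
    # Satisfiable: true in at least one model.
    return len(models_of(formula, symbols)) > 0

print(valid(lambda m: m['A'] or not m['A'], ['A']))         # True  (A or not A)
print(satisfiable(lambda m: m['A'] and not m['A'], ['A']))  # False (contradiction)
print(satisfiable(lambda m: m['A'] or m['B'], ['A', 'B']))  # True
```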
Quiz
List the models for the following formulas:
1. A ∧ ¬B
2. (A ∧ B) ∨ (B ∧ C)
3. A ∨ B → C
4. ¬A ↔ B ↔ C
Knowledge base for the story
❑ RoommateWet
❑ RoommateWet➔(RoommateWetBecauseOfRain ∨
RoommateWetBecauseOfSprinklers)
❑ RoommateWetBecauseOfSprinklers➔SprinklersOn
❑ RoommateWetBecauseOfRain➔¬(RoommateCarryingUmbrella)
❑ UmbrellaGone
❑ UmbrellaGone➔(YouCarryingUmbrella ∨ RoommateCarryingUmbrella)
❑ ¬ (YouCarryingUmbrella)
Knowledge base for the story
❑ W
❑ W → (R ∨ S)
❑ S → O
❑ R → ¬C
❑ G
❑ G → (Y ∨ C)
❑ ¬Y
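Whether this KB forces the sprinklers on can be settled by model checking: enumerate all assignments to the seven symbols and verify that O holds in every model of the KB. A sketch using the slide's symbol names:

```python
from itertools import product

SYMS = ['W', 'R', 'S', 'O', 'C', 'G', 'Y']

def implies(a, b):
    return (not a) or b

def kb_holds(m):
    """All seven sentences of the story KB, conjoined."""
    return (m['W']
            and implies(m['W'], m['R'] or m['S'])
            and implies(m['S'], m['O'])
            and implies(m['R'], not m['C'])
            and m['G']
            and implies(m['G'], m['Y'] or m['C'])
            and not m['Y'])

models = [dict(zip(SYMS, vs)) for vs in product([False, True], repeat=7)]
entails_O = all(m['O'] for m in models if kb_holds(m))
print(entails_O)  # True: in every model of the KB the sprinklers are on
```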
Entailment
❑ Entailment means that one thing follows from another
KB╞ α

❑ Knowledge base KB entails sentence α if and only if α is true in all worlds
where KB is true
❑ E.g., the KB containing “Baramkeh is in Damascus” and “Damascus is in Syria” entails
“Baramkeh is in Syria”
❑ E.g., the KB containing x+y = 4 entails 4 = x+y

❑ Entailment is a relationship between sentences (i.e., syntax) that is based
on semantics
Exercise: Prove using truth tables, the following deductions
1. Double negative elimination
¬¬P ⊨ P
2. Conjunction introduction/elimination
(a) {P, Q} ⊨ P ∧ Q; (b) P ∧ Q ⊨ P; (c) P ∧ Q ⊨ Q.
3. Disjunction introduction/elimination
(a) P ⊨ P ∨ Q; (b) Q ⊨ P ∨ Q;
(c) {P ∨ Q, P → R, Q → R} ⊨ R
4. Bi-conditional introduction/elimination
(P → Q) ∧ (Q → P) ⊨ (P ↔ Q)
5. De Morgan
(a) ¬(P ∧ Q) ⊨ ¬P ∨ ¬Q;
(b) ¬(P ∨ Q) ⊨ ¬P ∧ ¬Q
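Each of these deductions can be checked mechanically: a set of premises entails a conclusion iff the conclusion is true in every assignment that satisfies the premises. A sketch of such a checker (the lambda-over-tuple encoding is our own):

```python
from itertools import product

def entails(premise, conclusion, n):
    """True iff conclusion holds in every model of premise over n symbols."""
    return all(conclusion(vs)
               for vs in product([False, True], repeat=n)
               if premise(vs))

# Double negation: not not P |= P
print(entails(lambda v: not not v[0], lambda v: v[0], 1))             # True
# De Morgan (a): not(P and Q) |= not P or not Q
print(entails(lambda v: not (v[0] and v[1]),
              lambda v: (not v[0]) or (not v[1]), 2))                 # True
# Sanity check of a non-entailment: P or Q does NOT entail P
print(entails(lambda v: v[0] or v[1], lambda v: v[0], 2))             # False
```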
Proofs of the Deduction Rules
Logical equivalence
Two sentences are logically equivalent iff they are true in the same models:
α ≡ β iff α ⊨ β and β ⊨ α
Quiz
Given that:
P = (A ∨ B ) ∧ (¬C ∨ ¬D ∨ E)
Q1 = A ∨ B
Q2= (A ∨ B ∨ C) ∧ ((B ∧ C ∧ D) → E)
Q3= (A ∨ B) ∧ (¬D ∨ E)

Does :
1- P ⊨ Q1?
2- P ⊨ Q2?
3- P ⊨ Q3?
Inference
❑ KB ├i α : sentence α can be derived from KB by procedure i (“i proves α”).
Logical inference is used to create new sentences that logically follow from
a given set of sentences (the KB).

❑ Soundness: i is sound if whenever KB ├i α, it is also true that KB ⊨ α
❑ An inference rule is sound if every sentence X produced by the rule
operating on a KB logically follows from the KB. (That is, the inference
rule does not create any contradictions)

❑ Completeness: i is complete if whenever KB ⊨ α, it is also true that KB ├i α
❑ An inference rule is complete if it is able to produce every expression that
logically follows from (is entailed by) the KB
Sound rules of inference
❑ Here are some examples of sound rules of inference.
A rule is sound if its conclusion is true whenever the premises are true.

❑ Each can be shown to be sound using a truth table

RULE               PREMISE          CONCLUSION
Modus Ponens       A, A → B         B
And Introduction   A, B             A ∧ B
And Elimination    A ∧ B            A
Double Negation    ¬¬A              A
Unit Resolution    A ∨ B, ¬B        A
Resolution         A ∨ B, ¬B ∨ C    A ∨ C
Other Inference Rules
❑ Modus Tollens
P → Q, ¬Q
¬P

❑ Chaining (Hypothetical Syllogism)

P → Q, Q → R
P → R
Example: Modus Ponens
❑ Latin for "mode that affirms"
❑ Whenever sentences of the form a ⇒ b and a are given, the
sentence b can be inferred
❑ R1: Green => Martian
❑ R2: Green
❑ Inferred: Martian
Example: And-Elimination
❑ Any of the conjuncts can be inferred
❑ R1: Martian ^ Green
❑ Inferred: Martian
❑ Inferred: Green
❑ Use truth tables if you want to confirm inference rules
Soundness of modus ponens

A      B      A → B   OK?
True   True   True    ✓
True   False  False
False  True   True
False  False  True
The only row in which both premises (A and A → B) are true is the first,
and there the conclusion B is also true, so the rule is sound.
Soundness of modus Tollens
(the way that denies by denying)
A B A→B B OK?
True True True

True False False

False True True

False False True


Soundness of the resolution inference rule
Resolution is not complete
❑ Incompleteness of the resolution inference rule:
P ∧ R ⊨ P ∨ R
but we cannot use resolution alone to derive P ∨ R from the clauses P and R,
since they contain no complementary literals to resolve on.
Proving things
❑ A proof is a sequence of sentences, where each sentence is either a
premise or a sentence derived from earlier sentences in the proof by one of
the rules of inference.
❑ The last sentence is the theorem (also called goal or query) that we want to
prove.
❑ Example for the “weather problem” given above.

1. Humid Premise “It is humid”


2. Humid→Hot Premise “If it is humid, it is hot”
3. Hot Modus Ponens(1,2) “It is hot”
4. (HotHumid)→Rain Premise “If it’s hot & humid, it’s raining”
5. HotHumid And Introduction(1,2) “It is hot and humid”
6. Rain Modus Ponens(4,5) “It is raining”
Inference Rules: Resolution
❑ Conjunctive Normal Form (CNF)
a conjunction of disjunctions of literals; the disjunctions are called clauses
E.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)

❑ Resolution inference rule (for CNF):


li …  lk, m1  …  mn
li  …  li-1  li+1  …  lk  m1  …  mj-1  mj+1 ...  mn

where li and mj are complementary literals.

❑ E.g., P1,3 ∨ P2,2,    ¬P2,2

P1,3
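With clauses represented as sets of literals, the resolution rule is a few lines of Python (the string encoding with a `-` prefix marking negation is an assumption of this sketch):

```python
def resolve(ci, cj):
    """All resolvents of two clauses (frozensets of literal strings).

    Negation is written as a '-' prefix, e.g. '-P22' is not-P22.
    """
    def neg(lit):
        return lit[1:] if lit.startswith('-') else '-' + lit

    resolvents = []
    for lit in ci:
        if neg(lit) in cj:
            # Drop the complementary pair and union the remaining literals.
            resolvents.append((ci - {lit}) | (cj - {neg(lit)}))
    return resolvents

# The slide's example: resolve (P1,3 or P2,2) with (not P2,2).
print(resolve(frozenset({'P13', 'P22'}), frozenset({'-P22'})))
# [frozenset({'P13'})]
```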
Conversion to CNF
B1,1 ⇔ (P1,2 ∨ P2,1)
1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α).
(B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)

2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β.

(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)

3. Move ¬ inwards using de Morgan's rules:

(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)

4. Apply the distributivity law (∨ over ∧) and flatten:

(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
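Every step of the conversion preserves logical equivalence, which can be confirmed by enumerating all eight assignments. A quick sketch comparing the original biconditional with the final CNF:

```python
from itertools import product

def original(b, p12, p21):
    """B1,1 <-> (P1,2 or P2,1)."""
    return b == (p12 or p21)

def cnf(b, p12, p21):
    """The three clauses produced by step 4."""
    return ((not b or p12 or p21)
            and (not p12 or b)
            and (not p21 or b))

equivalent = all(original(*vs) == cnf(*vs)
                 for vs in product([False, True], repeat=3))
print(equivalent)  # True: the CNF is logically equivalent to the input
```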
Quiz: Conversion to CNF
(P ∨ Q) ⇒ (R ∨ ¬P)
1. ¬(P ∨ Q) ∨ (R ∨ ¬P)
2. (¬P ∧ ¬Q) ∨ (R ∨ ¬P)
3. (¬P ∨ R ∨ ¬P) ∧ (¬Q ∨ R ∨ ¬P)
4. (¬P ∨ R) ∧ (¬Q ∨ R ∨ ¬P)
Using Inference Rules
1. It is not sunny this afternoon and it is colder than yesterday.
2. If we go swimming it is sunny.
3. If we do not go swimming then we will take a canoe trip.
4. If we take a canoe trip then we will be home by sunset.
We will be home by sunset ?

Propositions                         Sentences Representations
p  It is sunny this afternoon        1. ¬p ∧ q
q  it is colder than yesterday       2. r → p
r  we go swimming                    3. ¬r → s
s  we will take a canoe trip         4. s → t
t  we will be home by sunset
Using Inference Rules
Propositions
p it is sunny this afternoon
q it is colder than yesterday
r we go swimming
s we will take a canoe trip
t we will be home by sunset
We will be home by sunset ?

Sentences Representations
1. ¬p ∧ q
2. r → p
3. ¬r → s
4. s → t
Using the resolution rule (an example)
1. Anna is skiing or it is not snowing.
2. It is snowing or Bart is playing hockey.
Anna is skiing or Bart is playing hockey ??

Propositions
p Anna is skiing
q Bart is playing hockey
r it is snowing

Sentences Representations
1. p ∨ ¬r
2. r ∨ q
Resolution Rule (1, 2): p ∨ q, i.e., “Anna is skiing or Bart is playing hockey”
Resolution Refutation Principle
❑ Resolution refutation proves a theorem by negating the
statement to be proved and adding this negated goal to the
set of axioms that are known to be true.
❑ Use the resolution rule of inference to show that this leads
to a contradiction.
❑ Once the theorem prover shows that the negated goal is
inconsistent with the given set of axioms, it follows that the
original goal is entailed by the axioms.
Example (resolution refutation)
❑ Using the resolution refutation principle, show that C ∨ D is a
logical consequence of:
S = {A ∨ B, ¬A ∨ D, C ∨ ¬B}

❑ First, we add the negation of the logical consequence
(i.e., ¬(C ∨ D) ≡ ¬C ∧ ¬D) to the set S.
❑ Get S’ = {A ∨ B, ¬A ∨ D, C ∨ ¬B, ¬C, ¬D}.
❑ Now show that S’ is unsatisfiable by deriving contradiction using
resolution principle.
Example (resolution refutation)
Resolution Refutation Example
KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1, α = ¬P1,2
The moving robot example
If the robot's battery is OK (Bat_ok) and the block is liftable (Liftable),
then the robot arm moves (Moves):

Bat_ok ∧ Liftable → Moves

Given ¬Moves and Bat_ok, we can conclude ¬Liftable.
Example
1. ¬Battery-OK ∨ ¬Bulbs-OK ∨ Headlights-Work
2. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts
3. ¬Engine-Starts ∨ Flat-Tire ∨ Car-OK
4. Headlights-Work
5. Battery-OK
6. Starter-OK
7. ¬Empty-Gas-Tank
8. ¬Car-OK
9. ¬Flat-Tire
10. ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts (resolve 2, 5)
11. ¬Battery-OK ∨ Empty-Gas-Tank ∨ Engine-Starts (resolve 2, 6)
12. ¬Battery-OK ∨ ¬Starter-OK ∨ Engine-Starts (resolve 2, 7)
13. ¬Engine-Starts ∨ Flat-Tire (resolve 3, 8)
14. ¬Engine-Starts ∨ Car-OK (resolve 3, 9)
15. .....
Example

Battery-OK
Starter-OK
¬Empty-Gas-Tank
¬Car-OK
¬Flat-Tire

¬Battery-OK ∨ ¬Starter-OK ∨ Engine-Starts
¬Engine-Starts ∨ Flat-Tire

¬Battery-OK ∨ ¬Starter-OK ∨ Flat-Tire

¬Starter-OK ∨ Flat-Tire

Flat-Tire

False (empty clause)
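The refutation loop itself can be sketched as: repeatedly resolve all pairs of clauses, report "unsatisfiable" when the empty clause appears, or "satisfiable" when no new clause can be added. A Python sketch (clauses as frozensets of literal strings with a `-` prefix for negation; the symbol names below are shortened forms of the slide's):

```python
def resolution_refutation(clauses):
    """Return True iff the clause set is unsatisfiable (derives the empty clause)."""
    def neg(lit):
        return lit[1:] if lit.startswith('-') else '-' + lit

    clauses = set(clauses)
    while True:
        new = set()
        for ci in clauses:
            for cj in clauses:
                if ci == cj:
                    continue
                for lit in ci:
                    if neg(lit) in cj:
                        resolvent = (ci - {lit}) | (cj - {neg(lit)})
                        if not resolvent:
                            return True  # empty clause: contradiction found
                        new.add(resolvent)
        if new <= clauses:
            return False  # no new clauses: the set is satisfiable
        clauses |= new

# The car example: premises plus the facts, which together are inconsistent.
car = [frozenset(c) for c in [
    {'-BatteryOK', '-StarterOK', 'EmptyGasTank', 'EngineStarts'},
    {'-EngineStarts', 'FlatTire', 'CarOK'},
    {'BatteryOK'}, {'StarterOK'}, {'-EmptyGasTank'},
    {'-CarOK'}, {'-FlatTire'},
]]
print(resolution_refutation(car))  # True: refutation succeeds
```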
Resolution Heuristics
❑ Set-of-support heuristic:
❑ At least one ancestor of every inferred clause comes from the set of
support (typically the clauses derived from the negated goal)
❑ Shortest-clause heuristics:
❑Generate a clause with the fewest literals first
❑Unit Resolution

❑ Simplification heuristics:
❑ Remove any clause containing two complementary literals (tautology)
❑ If a symbol always appears with the same “sign”, remove all the clauses that
contain it (pure symbol)
Example (Set-of-Support)
1. ¬Battery-OK ∨ ¬Bulbs-OK ∨ Headlights-Work
2. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts
3. ¬Engine-Starts ∨ Flat-Tire ∨ Car-OK
4. Headlights-Work
5. Battery-OK
6. Starter-OK
7. ¬Empty-Gas-Tank
8. ¬Car-OK
9. ¬Flat-Tire
Example (Set-of-Support)
1. ¬Battery-OK ∨ ¬Bulbs-OK ∨ Headlights-Work
2. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts
3. ¬Engine-Starts ∨ Flat-Tire ∨ Car-OK
4. Headlights-Work
5. Battery-OK
6. Starter-OK
7. ¬Empty-Gas-Tank
8. ¬Car-OK
9. ¬Flat-Tire
10. ¬Engine-Starts ∨ Car-OK (resolve 3, 9)
11. ¬Engine-Starts (resolve 10, 8)
12. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank (resolve 2, 11)
13. ¬Starter-OK ∨ Empty-Gas-Tank (resolve 12, 5)
14. Empty-Gas-Tank (resolve 13, 6)
15. False (resolve 14, 7)
Note the goal-directed flavor.
Example (Shortest-Clause)
1. ¬Battery-OK ∨ ¬Bulbs-OK ∨ Headlights-Work
2. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts
3. ¬Engine-Starts ∨ Flat-Tire ∨ Car-OK
4. Headlights-Work
5. Battery-OK
6. Starter-OK
7. ¬Empty-Gas-Tank
8. ¬Car-OK
9. ¬Flat-Tire
Example (Shortest-Clause)
1. ¬Battery-OK ∨ ¬Bulbs-OK ∨ Headlights-Work
2. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts
3. ¬Engine-Starts ∨ Flat-Tire ∨ Car-OK
4. Headlights-Work
5. Battery-OK
6. Starter-OK
7. ¬Empty-Gas-Tank
8. ¬Car-OK
9. ¬Flat-Tire
10. ¬Engine-Starts ∨ Car-OK
11. ¬Engine-Starts
12. ¬Bulbs-OK ∨ Headlights-Work
13. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank
14. ¬Starter-OK ∨ Empty-Gas-Tank
15. Empty-Gas-Tank
16. False
Example (Pure Literal)
1. ¬Battery-OK ∨ ¬Bulbs-OK ∨ Headlights-Work
2. ¬Battery-OK ∨ ¬Starter-OK ∨ Empty-Gas-Tank ∨ Engine-Starts
3. ¬Engine-Starts ∨ Flat-Tire ∨ Car-OK
4. Headlights-Work
5. Battery-OK
6. Starter-OK
7. ¬Empty-Gas-Tank
8. ¬Car-OK
9. ¬Flat-Tire
Horn sentences
❑ A Horn sentence or Horn clause has the form:
P1 ∧ P2 ∧ P3 … ∧ Pn → Q
or alternatively, since (P → Q) ≡ (¬P ∨ Q):
¬P1 ∨ ¬P2 ∨ ¬P3 … ∨ ¬Pn ∨ Q
where the Ps and Q are non-negated atoms

❑ To get a proof for Horn sentences, apply Modus Ponens repeatedly until
nothing more can be derived
Forward and backward chaining
Horn Form (restricted)
KB = conjunction of Horn clauses

Horn clause =
proposition symbol; or
(conjunction of symbols) ⇒ symbol
E.g., C ∧ (B ⇒ A) ∧ (C ∧ D ⇒ B)

Modus Ponens (for Horn Form): complete for Horn KBs

α1, … , αn,   α1 ∧ … ∧ αn ⇒ β
β

Can be used with forward chaining or backward chaining.

These algorithms are very natural and run in linear time
Forward and backward chaining
❑ Horn clauses: disjunctions of literals of which at most one is positive
❑ Important because a Horn clause can be written as an implication whose
premise is a conjunction of positive literals and whose conclusion is a
single positive literal
❑ Definite clauses: exactly one positive literal
▪ Positive literal forms the head
▪ Negative literals form the body
Inference with Horn clauses can be done by forward or backward chaining in
a time that is linear in the size of the KB.
Forward chaining
Fire any rule whose premises are satisfied in the KB
Add its conclusion to the KB, until query is found
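The "fire any rule whose premises are satisfied" procedure can be sketched with a premise counter per rule, in the style of AIMA's PL-FC-Entails (the rule encoding as `(premises, conclusion)` pairs is our own):

```python
from collections import deque

def fc_entails(kb_rules, facts, query):
    """Forward chaining for a definite-clause KB.

    kb_rules: list of (premises, conclusion) pairs, premises a list of symbols.
    facts: symbols known to be true. Runs in time linear in the KB size.
    """
    count = [len(p) for p, _ in kb_rules]   # unsatisfied premises per rule
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(kb_rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:           # all premises proved: fire the rule
                    agenda.append(conclusion)
    return False

# The weather KB: Humid; Humid -> Hot; Hot and Humid -> Rain.
rules = [(['Humid'], 'Hot'), (['Hot', 'Humid'], 'Rain')]
print(fc_entails(rules, ['Humid'], 'Rain'))  # True
```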
Forward chaining example
Backward chaining
❑ Work backwards from the query q
❑To prove q by BC,
❑ check if q is known already, or
❑ prove by BC all premises of some rule concluding q

❑Avoid loops:
❑check if new sub-goal is already on the goal stack

❑Avoid repeated work:


❑ check if new sub-goal has already been proved true, or
❑ has already failed
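The goal-directed procedure above, including the loop check via a stack of open goals, can be sketched recursively (same `(premises, conclusion)` rule encoding as our forward-chaining sketch, an assumption not fixed by the slides):

```python
def bc_entails(rules, facts, query, stack=frozenset()):
    """Backward chaining for a definite-clause KB (recursive sketch).

    rules: list of (premises, conclusion) pairs; facts: set of known symbols.
    stack holds the goals currently being proved, to avoid infinite loops.
    """
    if query in facts:
        return True                # the goal is already known
    if query in stack:
        return False               # already trying to prove this goal: loop
    for premises, conclusion in rules:
        # Try every rule concluding the query; prove all its premises.
        if conclusion == query and all(
                bc_entails(rules, facts, p, stack | {query})
                for p in premises):
            return True
    return False

rules = [(['Humid'], 'Hot'), (['Hot', 'Humid'], 'Rain')]
print(bc_entails(rules, {'Humid'}, 'Rain'))  # True
```

Note that only rules whose conclusion matches the current goal are ever examined, which is the goal-directed behavior contrasted with forward chaining below.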
Backward chaining example
Forward vs. backward chaining
FC is data-driven, automatic, unconscious processing,
e.g., object recognition, routine decisions

May do lots of work that is irrelevant to the goal!

BC is goal-driven, appropriate for problem-solving,
e.g., Where are my keys? How do I get into a PhD program?

Complexity of BC can be much less than linear in the size of the KB (only
relevant facts are considered)
Forward and backward chaining
❑ Forward chaining is often preferable in cases where there are many rules with the same
conclusions.
❑ Rule systems for taxonomic hierarchies, e.g., the taxonomy of the animal kingdom:
❑ animal(X) :- sponge(X).
❑ animal(X) :- arthropod(X).
❑ animal(X) :- vertebrate(X). ...
❑ vertebrate(X) :- fish(X).
❑ vertebrate(X) :- mammal(X). ...
❑ mammal(X) :- carnivore(X). ...
❑ carnivore(X) :- dog(X).
❑ carnivore(X) :- cat(X). ...

❑ Now, suppose we have the fact "dog(fido)" and we query whether "animal(fido)".
❑ In forward chaining, we will successively add "carnivore(fido)", "mammal(fido)",
"vertebrate(fido)", and "animal(fido)". The query will then succeed immediately. The
total work is proportional to the height of the hierarchy.
❑ In backward chaining, the query " animal(fido)" will unify with the first rule, and
generate the subquery " sponge(fido)", which will initiate a search for Fido through all
the subdivisions of sponges, and so on. Ultimately, it searches the entire taxonomy of
animals looking for Fido.
Propositional logic is a weak language
❑ Hard to identify “individuals” (e.g., Mary, 3)
❑ Can’t directly talk about properties of individuals or relations between
individuals (e.g., “Bill is tall”)
❑ Generalizations, patterns, regularities can’t easily be represented (e.g.,
“all triangles have 3 sides”)
❑ First-Order Logic (abbreviated FOL or FOPC) is expressive enough to
concisely represent this kind of information
❑ FOL adds relations, variables, and quantifiers, e.g.,
❑ “Every elephant is gray”: ∀x (elephant(x) → gray(x))
❑ “There is a white alligator”: ∃x (alligator(x) ∧ white(x))
Summary
Logical agents apply inference to a knowledge base to derive new information and
make decisions
Basic concepts of logic:
syntax: formal structure of sentences
semantics: truth of sentences wrt models
entailment: necessary truth of one sentence given another
inference: deriving sentences from other sentences
soundness: derivations produce only entailed sentences
completeness: derivations can produce all entailed sentences
Resolution for propositional logic
Forward, backward chaining are linear-time, complete for Horn clauses
Propositional logic lacks expressive power
