
Uncertainty in Rule-Based Expert Systems

• Fundamental ideas
• Bayesian reasoning
• Certainty factors theory
• Comparison

Uncertainty
• Uncertainty can be defined as the lack of the exact
knowledge needed to reach a perfectly reliable
conclusion
• Information is commonly incomplete, inconsistent
and uncertain
→ uncertainty is unavoidable
• Without uncertainty, exact rules are feasible

Uncertainty
• Without uncertainty, exact rules are feasible
– A = TRUE → A = ~FALSE
– B = FALSE → B = ~TRUE
• Classical logic permits only exact reasoning:
TRUE, FALSE (clear-cut knowledge)

Source of Uncertainty
• We can identify four main sources of uncertainty:
– Weak implications: rule-based expert systems struggle with
vague associations; it is difficult to establish a concrete
correlation between the condition and the action part of a rule.
(The system needs the ability to handle vague associations,
e.g., by accepting a degree of correlation.)
– Imprecise language: we describe facts with terms such as
“often”, “sometimes” and “hardly”.
– Unknown data: the system should accept the value ‘unknown’
and proceed with approximate reasoning.
– Difficulty of combining the views of different experts
• Different experts may reach different conclusions on the same
question. (A weight may be attached to each expert, and a composite
conclusion calculated.)

Expert Systems and Uncertainty
• An ES should be able to manage uncertainty
because
– Uncertainty is unavoidable in real-world
applications
• A mechanism is needed to allow some uncertainty in
the expert system
• The most popular paradigms are:
– Bayesian reasoning
– Certainty factors

Definition of Probability
• Probability: proportion of cases in which the
event occurs
• Scientific measure of chance
• Range: [0, 1]

p = (number of occurrences of the event) / (number of possible outcomes)

Conditional and Joint Probability
• Conditional probability: the probability that event
A will occur given that event B has occurred

p(A | B) = p(A ∩ B) / p(B)

• Joint probability, p(A ∩ B): the probability that
both events A and B occur

Common Relationships
p(A ∩ B) = p(B ∩ A)

p(A ∩ B) = p(A | B) p(B)

p(A | B) p(B) = p(B | A) p(A)

p(A) = Σ p(A | Bi) p(Bi)   (sum over i = 1, …, n)

p(A | B) = p(B | A) p(A) / [p(B | A) p(A) + p(B | ¬A) p(¬A)]
Bayesian Reasoning
• Rules:
IF E is true
THEN H is true (with probability p),
where E is the evidence and H is the hypothesis
• When event E occurs, H occurs with
probability p(H | E)
• p(H | E) is the posterior probability

Bayesian Reasoning

p(H | E) = p(E | H) p(H) / [p(E | H) p(H) + p(E | ¬H) p(¬H)]

where
p(H) is the prior probability of hypothesis H being true;
p(E | H) is the probability of observing evidence E when hypothesis H is true;
p(¬H) is the prior probability of hypothesis H being false;
p(E | ¬H) is the probability of observing evidence E when hypothesis H is false.
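As a quick numeric sketch of this formula (the numbers below are illustrative only, not taken from the slides):

```python
# Illustrative sketch of Bayes' rule with made-up numbers (not from the slides).
p_H = 0.3             # prior probability that hypothesis H is true
p_E_given_H = 0.8     # probability of observing evidence E when H is true
p_E_given_notH = 0.1  # probability of observing E when H is false

posterior = (p_E_given_H * p_H) / (
    p_E_given_H * p_H + p_E_given_notH * (1 - p_H)
)
print(round(posterior, 3))  # 0.774: evidence E raises belief in H from 0.3 to about 0.77
```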

Multiple Hypotheses
• Select a hypothesis from a number of
hypotheses to explain the current evidence

p(Hi | E) = p(E | Hi) p(Hi) / Σ p(E | Hk) p(Hk)   (sum over k = 1, …, m)

Multiple Evidence, Multiple Hypotheses
• Select, from a number of hypotheses, the one
that is best supported by the existing pieces of
evidence

p(Hi | E1 E2 … En) = p(E1 E2 … En | Hi) p(Hi) / Σ p(E1 E2 … En | Hk) p(Hk)   (sum over k = 1, …, m)

Difficulty and Solution
• Difficulty: all the required conditional probabilities
must be known
• Solution: assume conditional independence among
the pieces of evidence
p(Hi | E1 E2 … En) = p(E1 | Hi) p(E2 | Hi) … p(En | Hi) p(Hi) / Σ p(E1 | Hk) p(E2 | Hk) … p(En | Hk) p(Hk)   (sum over k = 1, …, m)

Computation Example
                     Hypothesis
Probability      i = 1    i = 2    i = 3
p(Hi)            0.40     0.35     0.25
p(E1 | Hi)       0.3      0.8      0.5
p(E2 | Hi)       0.9      0.0      0.7
p(E3 | Hi)       0.6      0.7      0.9
Find P(Hi|E3), P(Hi|E1E3) and P(Hi|E1E2E3)
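A minimal Python sketch of this exercise, assuming conditional independence of the evidences as on the previous slide (the function and variable names are my own):

```python
# Posterior probabilities for the worked example above (three hypotheses, three evidences).
prior = [0.40, 0.35, 0.25]        # p(Hi) for i = 1, 2, 3
p_e = {                           # p(Ek | Hi) for i = 1, 2, 3
    1: [0.3, 0.8, 0.5],
    2: [0.9, 0.0, 0.7],
    3: [0.6, 0.7, 0.9],
}

def posteriors(evidences):
    """p(Hi | E...) assuming conditional independence of the evidences."""
    numerators = []
    for i, p_h in enumerate(prior):
        likelihood = 1.0
        for e in evidences:
            likelihood *= p_e[e][i]
        numerators.append(likelihood * p_h)
    total = sum(numerators)
    return [round(n / total, 2) for n in numerators]

print(posteriors([3]))        # p(Hi | E3)      -> [0.34, 0.34, 0.32]
print(posteriors([1, 3]))     # p(Hi | E1 E3)   -> [0.19, 0.52, 0.30]
print(posteriors([1, 2, 3]))  # p(Hi | E1 E2 E3)-> [0.45, 0.0, 0.55]
```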

Case Study: FORECAST
• See Table 3.3 (page 66)
• Two basic rules
– 1: Today is rain → Tomorrow is rain
– 2: Today is dry → Tomorrow is dry
– Error: 10 mistakes in 30 forecasts

How can probability be included in the rules?

Parameters for Belief in Hypothesis
• Likelihood of sufficiency (LS): a measure of the expert's
belief in hypothesis H if evidence E is present.

LS = p(E | H) / p(E | ¬H)

• Likelihood of necessity (LN): a measure of discredit to
hypothesis H if evidence E is missing.

LN = p(¬E | H) / p(¬E | ¬H)
Determination of LN and LS
• Directly provided by the expert
• High LS (LS >> 1): the rule strongly supports the
hypothesis when the evidence is present
• Low LN (0 < LN < 1): the rule strongly opposes
the hypothesis when the evidence is missing
• LN cannot be derived from LS

Probability Inclusion
• Assume each event is equally likely (p = 0.5)
• Two adapted rules:
– 1: Today is rain {LS 2.5, LN 0.6} → Tomorrow is
rain {prior 0.5}
– 2: Today is dry {LS 1.6, LN 0.4} → Tomorrow is
dry {prior 0.5}

Calculation for Overall Probability
• Objective: find the hypothesis with the maximum posterior probability
• Use the prior probability for the first calculation
• Update by LS if the evidence is true
• Update by LN if the evidence is false
• Continue until all pieces of evidence have been applied

Prior Odds
• Define the prior odds: the ratio of the probability of
occurrence to the probability of non-occurrence

O(H) = p(H) / (1 − p(H))

• Odds make the posterior probability easy to
calculate

Find Posterior Probability by O(H)
O(H | E) = LS × O(H)
O(H | ¬E) = LN × O(H)

p(H | E) = O(H | E) / (1 + O(H | E))

p(H | ¬E) = O(H | ¬E) / (1 + O(H | ¬E))
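A small sketch of this odds-based update, using the prior 0.5 and the LS/LN values of rule 1 from the FORECAST example that follows (function names are my own):

```python
# Sketch: updating belief in "tomorrow is rain" via prior odds, LS and LN.
def odds(p):
    return p / (1 - p)

def prob(o):
    return o / (1 + o)

prior = 0.5          # p(H): tomorrow is rain
LS, LN = 2.5, 0.6    # taken from rule 1 of the FORECAST example

o_h = odds(prior)                 # prior odds = 1.0
print(round(prob(LS * o_h), 3))   # evidence present (today is rain): 0.714
print(round(prob(LN * o_h), 3))   # evidence absent (today is dry):   0.375
```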
FORECAST: Rules (1)
• 1: Today is rain (LS 2.5, LN 0.6) → Tomorrow
is rain {prior 0.5}
• 2: Today is dry (LS 1.6, LN 0.4) → Tomorrow
is dry {prior 0.5}
• 3: Today is rain and rainfall is low (LS 10, LN
1) → Tomorrow is dry {prior 0.5}

FORECAST: Rules (2)
• 4: Today is rain; rainfall is low and
temperature is cold (LS 1.5, LN 1) →
Tomorrow is dry {prior 0.5}
• 5: Today is dry and temperature is warm (LS 2,
LN 0.9) → Tomorrow is rain {prior 0.5}
• 6: Today is dry; temperature is warm and sky
is overcast (LS 5, LN 1) → Tomorrow is rain
{prior 0.5}

Information about Today
• Weather: rain
– use rules with the event on weather
• Rainfall: low
– use rules with the event on rainfall
• Temperature: cold
– use rules with the event on temperature
• Cloud cover: overcast
– use rules with the event on cloud cover
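As a sketch of how the whole FORECAST knowledge base could be evaluated for these observations, the rules can be applied one by one, multiplying the odds of each hypothesis by LS when a rule's evidence is true and by LN when it is false (the encoding below and its variable names are my own, not the original implementation):

```python
# Sketch: sequential odds update for the six FORECAST rules, given today's observations.
# Each rule: (conditions, hypothesis, LS, LN). Evidence true -> multiply odds by LS,
# evidence false -> multiply odds by LN (LN = 1 means missing evidence has no effect).
today = {"weather": "rain", "rainfall": "low", "temperature": "cold", "sky": "overcast"}

rules = [
    ({"weather": "rain"},                                           "rain", 2.5, 0.6),
    ({"weather": "dry"},                                            "dry",  1.6, 0.4),
    ({"weather": "rain", "rainfall": "low"},                        "dry", 10.0, 1.0),
    ({"weather": "rain", "rainfall": "low", "temperature": "cold"}, "dry",  1.5, 1.0),
    ({"weather": "dry", "temperature": "warm"},                     "rain", 2.0, 0.9),
    ({"weather": "dry", "temperature": "warm", "sky": "overcast"},  "rain", 5.0, 1.0),
]

odds = {"rain": 1.0, "dry": 1.0}   # prior 0.5 for each hypothesis -> prior odds 1.0
for conditions, hypothesis, ls, ln in rules:
    evidence_true = all(today.get(k) == v for k, v in conditions.items())
    odds[hypothesis] *= ls if evidence_true else ln

for h, o in odds.items():
    print(h, round(o / (1 + o), 2))   # rain 0.69, dry 0.86 -> forecast: tomorrow is dry
```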
Bayesian Inference
• Describes the application domain as a set of
possible outcomes termed hypotheses
• Requires an initial probability for each
hypothesis in the problem (the prior probability)
• Updates probabilities using evidence
• Each piece of evidence may update the
probability of a set of hypotheses

Problems of Bayesian Method
• People are poor probability estimators
• Prior probabilities may be inconsistent with LS &
LN
• Conditional independence of evidence rarely
satisfied
• Requires massive amount of data
• Impractical for large knowledge bases

Certainty Factor Theory
• It is a popular alternative to Bayesian
reasoning
• It does not rely on statistical data about the
problem domain
• A certainty factor (cf) is used to measure the
expert's belief
Certainty Factor Theory
• It expresses belief in an event
– a fact or hypothesis
• It is based on evidence
– the expert's assessment
• A composite number can be used to guide
reasoning
• Hypotheses are ranked after all evidence has
been considered
Certainty Factor Theory
• The certainty factor, cf(x), is a measure of how
confident we are in x
• Range: −1 to +1
– cf = −1: definitely false (total disbelief)
– cf = +1: definitely true (total belief)
– cf = 0: unknown / neutral
• Certainty factors are relative measures
• They do not translate into measures of absolute belief
Strength of Belief
• The certainty factor, cf(x), combines belief and
disbelief into a single number based on some
evidence
• Measure of Belief, MB(H, E)
• Measure of Disbelief, MD(H, E)
• The strength of belief or disbelief in H depends on
the kind of evidence E observed.

Belief
• Positive cf implies evidence supports
hypothesis since MB > MD
• cf of 1 means evidence definitely supports the
hypothesis
• cf of 0 means either there is no evidence or the
belief is cancelled out by the disbelief.
• Negative cf implies that the evidence favors
negation of hypothesis since MB < MD
cf and Probabilities
cf = [MB(H, E) − MD(H, E)] / (1 − min[MB(H, E), MD(H, E)])

MB(H, E) = 1   if p(H) = 1
MB(H, E) = (max[p(H | E), p(H)] − p(H)) / (max[1, 0] − p(H))   otherwise

MD(H, E) = 1   if p(H) = 0
MD(H, E) = (min[p(H | E), p(H)] − p(H)) / (min[1, 0] − p(H))   otherwise

MB: Measure of Belief, MD: Measure of Disbelief
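A small sketch of these definitions in Python (my own function names; note that max[1, 0] = 1 and min[1, 0] = 0):

```python
# Sketch: certainty factor from prior p(H) and posterior p(H|E), per the definitions above.
def mb(p_h, p_h_given_e):
    """Measure of belief."""
    if p_h == 1:
        return 1.0
    return (max(p_h_given_e, p_h) - p_h) / (1 - p_h)    # max[1, 0] = 1

def md(p_h, p_h_given_e):
    """Measure of disbelief."""
    if p_h == 0:
        return 1.0
    return (min(p_h_given_e, p_h) - p_h) / (0 - p_h)    # min[1, 0] = 0

def cf(p_h, p_h_given_e):
    b, d = mb(p_h, p_h_given_e), md(p_h, p_h_given_e)
    return (b - d) / (1 - min(b, d))

print(round(cf(0.4, 0.9), 2))   # evidence raises p(H): positive cf (about 0.83)
print(round(cf(0.4, 0.1), 2))   # evidence lowers p(H): negative cf (about -0.75)
```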
Rules in Certainty Factors Theory
IF <evidence> THEN <hypothesis> {cf}
Term                   Certainty factor
Definitely not              −1.0
Almost certainly not        −0.8
Probably not                −0.6
Maybe not                   −0.4
Unknown                     −0.2 to +0.2
Maybe                       +0.4
Probably                    +0.6
Almost certainly            +0.8
Definitely                  +1.0
Example of Rules
• IF A is X
THEN B is Y {cf 0.7}
B is Z {cf 0.2}
• More than one value can be assigned to B
• For the remaining 10%, anything can happen,
including a value that has not yet been observed

Propagation of cf
• cf(H, E) = cf(E) × cf
• Example:
IF the sky is clear
THEN the forecast is sunny {cf 0.8}
Current cf for “the sky is clear” = 0.5
cf(H, E) = 0.5 × 0.8 = 0.4
→ It may be sunny

Propagation of cf
• Factor, cf, assigned by the rule is propagated
through the reasoning chain.
• Calculates the net certainty of the consequent
when the evidence for the antecedent is
uncertain

Stanford Certainty Factor Algebra
• There are rules to combine cfs of several facts
– cf(x1 AND x2) = min[cf(x1), cf(x2)]
– cf(x1 OR x2) = max[cf(x1), cf(x2)]
• A rule may also have a certainty factor, cf(rule)
– cf(action) = cf(condition) × cf(rule)

Multiple Evidences: Conjunctive Rules
• A hypothesis implied from a number of evidences
cf(H, E1 ∩ E2 ∩ … ∩ En) = min[cf(E1), cf(E2), …, cf(En)] × cf
• Example
IF sky is clear {cf 0.9}
AND the forecast is sunny {cf 0.7}
THEN the action is “wear sunglasses” {cf 0.8}

cf(H, E1 ∩ E2) = min(0.9, 0.7) × 0.8 = 0.7 × 0.8 = 0.56

Multiple Evidence: Disjunctive Rules
• A hypothesis implied from one or more evidences
cf(H, E1 ∪ E2 ∪ … ∪ En) = max[cf(E1), cf(E2), …, cf(En)] × cf
• Example
IF sky is overcast {cf 0.6}
OR the forecast is rain {cf 0.8}
THEN the action is “take an umbrella” {cf 0.9}
cf(H, E1 ∪ E2) = max(0.6, 0.8) × 0.9 = 0.8 × 0.9 = 0.72
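The two worked examples above can be reproduced with a short sketch (the helper names cf_and and cf_or are my own):

```python
# Sketch: cf propagation for conjunctive (AND) and disjunctive (OR) antecedents.
def cf_and(rule_cf, *evidence_cfs):
    return min(evidence_cfs) * rule_cf

def cf_or(rule_cf, *evidence_cfs):
    return max(evidence_cfs) * rule_cf

# "wear sunglasses": sky is clear {0.9} AND forecast is sunny {0.7}, rule cf 0.8
print(round(cf_and(0.8, 0.9, 0.7), 2))   # 0.56

# "take an umbrella": sky is overcast {0.6} OR forecast is rain {0.8}, rule cf 0.9
print(round(cf_or(0.9, 0.6, 0.8), 2))    # 0.72
```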

Rules Combination
• Rules that lead to the same hypothesis
• Rule 1: IF A is X
THEN C is Z {cf 0.8}
• Rule 2: IF B is Y
THEN C is Z {cf 0.6}
How to combine the two rules…

Combined cf

cf(cf1, cf2) = cf1 + cf2 × (1 − cf1)                    if cf1 > 0 and cf2 > 0

cf(cf1, cf2) = (cf1 + cf2) / (1 − min[|cf1|, |cf2|])    if one of cf1, cf2 < 0

cf(cf1, cf2) = cf1 + cf2 × (1 + cf1)                    if cf1 < 0 and cf2 < 0

Combine the rules according to the confidence in the
hypothesis contributed by each piece of evidence.

Consequent from multiple rules
• cf(E1) = cf(E2) = 1.0
• cf1(H, E1) = cf(E1) × cf (Rule 1) = 1.0 × 0.8 = 0.8
• cf2(H, E2) = cf(E2) × cf (Rule 2) = 1.0 × 0.6 = 0.6
• cf(cf1, cf2) = cf1(H, E1) + cf2(H, E2) × [1 − cf1(H, E1)]
= 0.8 + 0.6 × (1 − 0.8) = 0.92
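A sketch of the combination formula from the previous slide, reproducing the 0.92 above (the function name combine is my own):

```python
# Sketch: combining certainty factors from two rules that support the same hypothesis.
def combine(cf1, cf2):
    if cf1 > 0 and cf2 > 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))   # one positive, one negative

print(combine(0.8, 0.6))    # 0.92  (both rules fire with full evidence)
print(combine(0.8, -0.6))   # 0.5   (conflicting evidence: (0.8 - 0.6) / (1 - 0.6))
```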

FORECAST: Rules for cf (1)
• Control cf
• Control “threshold 0.01”
• 1: Today is rain → Tomorrow is rain {cf 0.5}
• 2: Today is dry → Tomorrow is dry {cf 0.5}
• 3: Today is rain and rainfall is low →
Tomorrow is dry {cf 0.6}

FORECAST: Rules for cf (2)
• 4: Today is rain; rainfall is low and
temperature is cold → Tomorrow is dry {cf
0.7}
• 5: Today is dry and temperature is warm →
Tomorrow is rain {cf 0.65}
• 6: Today is dry; temperature is warm and sky
is overcast → Tomorrow is rain {cf 0.55}
• Seek tomorrow
Information about Today
• Weather: rain
• Rainfall: low with cf of 0.8
• Temperature: cold with cf of 0.9
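As a sketch of how these inputs would flow through rules 1, 3 and 4 of the cf version of FORECAST, taking the cf of "today is rain" to be 1.0 (an assumption on my part) and combining the two rules that conclude "tomorrow is dry":

```python
# Sketch: FORECAST with certainty factors, given weather = rain (cf 1.0 assumed),
# rainfall = low (cf 0.8), temperature = cold (cf 0.9).
def combine(cf1, cf2):
    return cf1 + cf2 * (1 - cf1)     # both cfs are positive here

cf_rain_today, cf_low, cf_cold = 1.0, 0.8, 0.9

# Rule 1: today is rain -> tomorrow is rain {cf 0.5}
cf_rain_tomorrow = cf_rain_today * 0.5                                # 0.5

# Rule 3: today is rain AND rainfall is low -> tomorrow is dry {cf 0.6}
cf_dry_r3 = min(cf_rain_today, cf_low) * 0.6                          # 0.48

# Rule 4: today is rain AND rainfall is low AND temperature is cold -> tomorrow is dry {cf 0.7}
cf_dry_r4 = min(cf_rain_today, cf_low, cf_cold) * 0.7                 # 0.56

cf_dry_tomorrow = combine(cf_dry_r3, cf_dry_r4)                       # 0.7712
print(round(cf_rain_tomorrow, 2), round(cf_dry_tomorrow, 2))          # 0.5 0.77 -> forecast: dry
```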

Certainty Factors
• A practical alternative to Bayesian reasoning
• The heuristic manner of combining certainty
factors differs from the way in which they
would be combined if they were probabilities
• Not mathematically pure
• But it does mimic the thinking process of a human expert

Comparison of Bayesian Reasoning &
Certainty Factors
• Bayesian Reasoning
– Uses probability theory
– Works well in areas such as forecasting and
planning, or other areas where statistical data are
available and probability statements can be made
– Relies on statistical information
– If the assumption of conditional independence
cannot be made, the method may give
unsatisfactory results
Comparison of Bayesian Reasoning &
Certainty Factors
• Certainty Factors
– Lack the mathematical correctness of probability
theory
– Outperform Bayesian reasoning in areas such as
diagnostics, and particularly medicine
– Used where probabilities are not known,
or are too difficult or expensive to obtain
– Suited to evidential reasoning
– Provide better explanations of the control flow
Bayesian Reasoning VS cf
• Bayesian reasoning example: PROSPECTOR
• Certainty factor theory example: MYCIN

Both aim to enable an ES to quantify personal,
subjective and qualitative information.

