Example Applications

MYCIN                      Medical diagnosis
DENDRAL                    Molecular structure analysis
MACSYMA, SAINT, MATHLAB    Symbolic differentiation and integration
HEARSAY                    Speech understanding
R1 (XCON)                  Configuration of computer components

plus lots more …

Components of an expert system:
• Knowledge-base
• Inference engine
• Explanation generator
• User-interface
1
Factual Information and Production Rules

We can represent some knowledge in the form of production rules, of general form:

    IF condition THEN action

where condition is a propositional expression, or a well-formed formula in predicate logic. E.g.:

    IF
        patient has high temperature AND
        patient has sore throat AND
        patient's heterophil-agglutination test is positive
    THEN
        patient has symptoms of being infected by EB-virus

We may also need specific case knowledge, e.g. stored as frames:

    patient_details([name: N, age: A, occupation: O]).
    test_results([blood: B,
                  immunofluorescence: MF,
                  heteroagglu: HA,
                  microagglu: MA]).

Variables represent slots to be filled with values when we have the information.

Prolog Representation

We may also need factual information. E.g.:

    EB-virus is classified as herpesvirus-4.
    Herpesvirus-4 is transmitted by saliva.

In Prolog:

    viral_infection(Patient, 'EB-virus') :-
        temperature(Patient, high),
        symptoms(Patient, sore_throat),
        test(Patient, heteroagglu, positive).

    classify('EB-virus', herpesvirus_4).
    transmission(herpesvirus_4, saliva).

The natural inference mechanism for Prolog is backward chaining, i.e. start from the goal. E.g. suppose we have the following knowledge base:

    A if B, C, D
    B if E, F
    C if G

To discover A, find B, C and D.
To discover B, find E and F.
To discover C, find G.

D, E, F, G are data items supplied by the user. This is goal-driven inference.
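The goal-driven search above can be tried directly. As a sketch, the propositional letters become Prolog atoms, and the user-supplied items D, E, F, G are given here as plain facts for illustration:

```prolog
% 'A if B, C, D' etc., written as Prolog clauses.
a :- b, c, d.
b :- e, f.
c :- g.

% Data items that would normally be supplied by the user.
d.  e.  f.  g.
```

Querying ?- a. makes Prolog work backwards from the goal: a reduces to b, c, d; b reduces to e, f; c reduces to g; and each of d, e, f, g matches a fact, so the query succeeds.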
2
Limitations of Expert Systems

• Difficulty in capturing 'deep' knowledge of the problem domain. E.g. MYCIN lacks knowledge of human physiology.
• Lack of robustness and flexibility. Humans can go back to first principles.
• Inability to provide 'deep' explanations. Normally they can only say what they did, not why they took the particular approach.
• Difficulties in verification (e.g. 100s of rules, safety-critical applications …).
• Little or no learning from experience – performance does not improve on its own.

Restaurant Knowledge-Base Rules

    best_restaurant(X) :- serve_favourites(X), good_price(X),
                          close_location(X).

    serve_favourites(X) :- your_favourites(Y),
                           specialities(X, Z), included(Y, Z).

    good_price(X) :- your_price(Y), average_price(X, Y).

    close_location(X) :- your_location(Y),
                         restaurant_location(X, Z), close(Y, Z).

    included([], _).
    included([H|T], L) :- member(H, L), included(T, L).
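Before a query can succeed, the restaurant rules need base facts for your_favourites/1, your_price/1, your_location/1 and the restaurant data. A minimal sketch — every restaurant name and value below is invented for illustration:

```prolog
% Hypothetical user preferences.
your_favourites([dim_sum, noodles]).
your_price(cheap).
your_location(city_centre).

% Hypothetical restaurant data.
specialities(golden_wok, [dim_sum, noodles, congee]).
average_price(golden_wok, cheap).
restaurant_location(golden_wok, city_centre).

close(Loc, Loc).   % sketch: the same area counts as close

% ?- best_restaurant(X).
% X = golden_wok.
```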
3
User Interface (cont.)

    option(5) :- retract(already_asked(X)),
                 retract(already_answered(Y)),
                 fail.
    option(5) :- nl, write('Goodbye!').
    option(X) :- out_of_range(X),
                 nl, write('No such option – please try again.'),
                 menu.
    option(_) :- nl,
                 write('Sorry I cannot find such a restaurant'),
                 menu.

    out_of_range(X) :- not(member(X, [1, 2, 3, 4, 5])).

Inference Engine

Extend the Prolog meta-interpreter:

    process(true) :- !.
    process((Goal, Goals)) :- !, process(Goal), process(Goals).
    process(Goal) :- clause(Goal, SubGoals),
                     process(SubGoals).
    process(Goal) :- tobe_filled(Goal), Goal =.. [Pred, Arg],
                     already_asked(Pred), !,
                     already_answered(Goal).
    process(Goal) :- tobe_filled(Goal), Goal =.. [Pred, Arg],
                     Goalx =.. [Pred, X], ask_user(Goalx),
                     assert(already_asked(Pred)),
                     assert(already_answered(Goalx)),
                     Goal = Goalx.
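The interpreter relies on two helpers not defined on this slide: tobe_filled/1, which marks the predicates whose values come from the user, and ask_user/1, which does the prompting. A minimal sketch of what they might look like — the exact predicate list is an assumption based on the restaurant rules:

```prolog
% Hypothetical: which goals should be filled in by the user.
tobe_filled(your_favourites(_)).
tobe_filled(your_price(_)).
tobe_filled(your_location(_)).

% Hypothetical prompting helper: binds the single argument
% of Goal to whatever term the user types in.
ask_user(Goal) :-
    functor(Goal, Pred, 1),
    nl, write('Please enter a value for '), write(Pred),
    write(': '),
    read(Value),
    arg(1, Goal, Value).
```

With these in place, each user-supplied predicate is asked at most once: the first tobe_filled clause answers repeat queries from the already_answered/1 cache instead of prompting again.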
4
Extend User Interface (cont.)

    explain_step('match with a fact', _) :-
        !, write('match with a fact'), nl.
    explain_step(user_given, _) :-
        !, write('user given'), nl.
    explain_step(Proof, Indent) :-
        NewIn is Indent + 4, nl, nl, explain(Proof, NewIn).

'Why' Explanations

Extend the Prolog meta-interpreter:

    process(true, 'match with a fact', Rules) :- !.
    process((Goal, Goals), and(Proof, Proofs), Rules) :- !,
        process(Goal, Proof, Rules),
        process(Goals, Proofs, Rules).
    process(Goal, by(Goal, Proof), Rules) :-
        clause(Goal, SubGoals),
        process(SubGoals, Proof, [(Goal :- SubGoals)|Rules]).
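On a small knowledge base, the three-argument interpreter builds a proof term that explain/2 and explain_step/2 can later walk, while the Rules argument accumulates the chain of rules currently in use — which is exactly what a 'Why?' answer reports. A sketch, with an illustrative knowledge base:

```prolog
% Illustrative knowledge base:
a :- b, c.
b.
c.

% ?- process(a, Proof, []).
% Proof = by(a, and(by(b, 'match with a fact'),
%                   by(c, 'match with a fact'))).
```

Tracing the query: a matches its clause, pushing (a :- b, c) onto Rules; b and c each match a fact, so their sub-proofs are the constant 'match with a fact'.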