ACSC 368
Lecture 4: Expert Systems
Harris Papadopoulos

What is an Expert System?

An expert system is a computer system which may be able to perform as usefully as a human expert in areas such as:

• Interpretation of sensory data
• Planning actions
• Diagnosing faults or diseases and suggesting remedies
• Configuring industrial components
• Teaching
• Control of industrial plant
• Analysis of experimental results
• etc.

Real Life Expert Systems

Example                     Application
MYCIN                       Medical diagnosis
DENDRAL                     Molecular structure analysis
MACSYMA, SAINT, MATHLAB     Symbolic differentiation and integration
HEARSAY                     Speech understanding
R1 (XCON)                   Configuration of computer components
plus lots more …

Components of an Expert System

• Knowledge-base
• Inference engine
• Explanation generator
• User-interface

Architecture of an Expert System

[Diagram: the User interacts with the User Interface; the User Interface communicates with the Explanation Generator and the Inference Engine; the Inference Engine draws on the Knowledge Base.]

Advantages of Expert Systems

• Knowledge is independent of personnel
• Easy transfer of knowledge
• Knowledge is more predictable and consistent
• Cheaper to access than a human expert
• Knowledge is easier to document
• Knowledge is made explicit

But: Knowledge acquisition may be difficult

Production Rules

We can represent some knowledge in the form of production rules, of general form:

IF condition THEN action

where condition is a propositional expression, or a well-formed formula in predicate logic. E.g.:

IF
    patient has high temperature AND
    patient has sore throat AND
    patient's heterophil-agglutination test is positive
THEN
    patient has symptoms of being infected by EB-virus

Factual Information and Prolog Representation

We may also need factual information. E.g.:

EB-virus is classified as herpesvirus-4.
Herpesvirus-4 is transmitted by saliva.

In Prolog:

viral_infection(Patient, 'EB-virus') :-
    temperature(Patient, high),
    symptoms(Patient, sore_throat),
    test(Patient, heteroagglu, positive).

classify('EB-virus', herpesvirus_4).
transmission(herpesvirus_4, saliva).
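As a minimal illustration (the patient name and the case facts below are hypothetical, not part of the slides), the rule above succeeds for any patient whose facts satisfy its three conditions:

% Hypothetical case facts, for illustration only
temperature(john, high).
symptoms(john, sore_throat).
test(john, heteroagglu, positive).

% ?- viral_infection(Patient, 'EB-virus').
% Patient = john.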

Specific Case Knowledge

We may also need specific case knowledge, e.g. stored as frames:

patient_details([name: N, age: A, occupation: O]).

test_results([blood: B,
              immunofluorescence: MF,
              heteroagglu: HA,
              microagglu: MA]).

Variables represent slots to be filled with values when we have the information.

Inference Mechanisms

The natural inference mechanism for Prolog is backward chaining, i.e. start from the goal.
E.g. suppose we have the following knowledge base:

A if B, C, D
B if E, F
C if G

To discover A, find B, C and D
To discover B, find E and F
To discover C, find G

D, E, F, G are data items supplied by the user.
This is goal-driven inference.
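A sketch of this example knowledge base in ordinary Prolog (lower-case names are assumed, since A–G are only labels on the slide):

% Rules
a :- b, c, d.
b :- e, f.
c :- g.

% Data items supplied by the user
d.  e.  f.  g.

% ?- a.
% Prolog backward-chains: to prove a it tries b, c and d;
% to prove b it tries e and f; to prove c it tries g.
% true.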

Inference Mechanisms (cont.)

Another inference mechanism is forward chaining, i.e. start from the data (facts).
E.g. suppose we have the same knowledge base:

A if B, C, D
B if E, F
C if G

and we have the facts D, E, F, G.

G gives C
E and F give B
D with the derived B and C give A

This is data-driven inference.

‘How’ and ‘Why’ Explanations

• ‘How’ explanations tell the user how the system came up with a given solution; i.e. give the sequence of rules that were applied in order to get to the result.

• ‘Why’ explanations tell the user why a particular piece of information is needed; i.e. display the rule which needs the information in question.
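Returning to the forward-chaining example above, here is a minimal data-driven sketch in Prolog. The rule/2 and known/1 representation and the forward/0 driver are assumptions made for illustration; they are not part of the lecture's code:

:- dynamic known/1.

% Rules as head/body-list pairs
rule(a, [b, c, d]).
rule(b, [e, f]).
rule(c, [g]).

% Facts supplied by the user
known(d).  known(e).  known(f).  known(g).

% Repeatedly fire any rule whose body is fully known and whose head is not yet known
forward :-
    rule(Head, Body),
    \+ known(Head),
    forall(member(B, Body), known(B)),
    assertz(known(Head)),
    forward.
forward.

% ?- forward, known(a).
% true.   (c is derived from g, b from e and f, then a from b, c and d)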

Limitations of Expert Systems

• Difficulty in capturing ‘deep’ knowledge of the problem domain. E.g. MYCIN lacks knowledge of human physiology.
• Lack of robustness and flexibility. Humans can go back to first principles.
• Inability to provide ‘deep’ explanations. Normally they can only say what they did, not why they took the particular approach.
• Difficulties in verification (e.g. 100s of rules, safety-critical applications …).
• Little or no learning from experience – performance does not improve on its own.

Restaurant Knowledge-Base Rules

best_restaurant(X) :-
    serve_favourites(X), good_price(X), close_location(X).

serve_favourites(X) :-
    your_favourites(Y), specialities(X, Z), included(Y, Z).

good_price(X) :-
    your_price(Y), average_price(X, Y).

close_location(X) :-
    your_location(Y), restaurant_location(X, Z), close(Y, Z).

included([], _).
included([H|T], L) :- member(H, L), included(T, L).
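A quick check of how included/2 behaves (the queries below are illustrative; member/2 is assumed to be the standard list-membership predicate):

% included(Favourites, Specialities) succeeds when every favourite dish
% appears in the restaurant's speciality list.
% ?- included([duck, frog], [duck, frog, snake, snail]).
% true.
% ?- included([duck, lobster], [duck, frog, snake, snail]).
% false.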

Restaurant Knowledge-Base Facts

tobe_filled(your_favourites(X)).
tobe_filled(your_price(X)).
tobe_filled(your_location(X)).

specialities('Le Marecage', [chicken, duck, frog]).
specialities('Le Grand Couteau', [duck, frog, snake, snail]).
:
average_price('Le Marecage', cheap).
average_price('Le Grand Couteau', expensive).
:
restaurant_location('Le Marecage', 'Clamart').
restaurant_location('Le Grand Couteau', 'Paris 17').

Restaurant Knowledge-Base Facts (cont.)

:
close(X, X).
:
close('Paris 15', 'Paris 18').
close('Paris 18', 'Paris 17').

User Interface

expert :-
    nl, write('Hello! I am an expert on Paris restaurants.'),
    menu.

menu :-
    nl, write('Choose your question:'),
    nl, write('1 Which restaurant suits me best?'),
    nl, write('2 Which restaurant serves my favourite dishes?'),
    nl, write('3 Which restaurant charges my prices?'),
    nl, write('4 Which restaurant is close to me?'),
    nl, write('5 Exit'),
    nl, write('Please choose: '), read(X), option(X).

User Interface (cont.)

option(1) :- process(best_restaurant(X)), nl,
    write('The restaurant that suits you best is '),
    write(X), menu.
option(2) :- process(serve_favourites(X)), nl,
    write('The restaurant that serves your favourites is '),
    write(X), menu.
option(3) :- process(good_price(X)), nl,
    write('The restaurant that charges your prices is '),
    write(X), menu.
option(4) :- process(close_location(X)), nl,
    write('The restaurant that is closer to you is '),
    write(X), menu.
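The entry point is expert/0, so a session (assuming the file has been consulted) would begin roughly as follows:

% ?- expert.
%
% Hello! I am an expert on Paris restaurants.
% Choose your question:
% 1 Which restaurant suits me best?
% ...
% Please choose: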

User Interface (cont.)

option(5) :- retract(already_asked(X)),
    retract(already_answered(Y)), fail.
option(5) :- nl, write('Goodbye!').
option(X) :- out_of_range(X),
    nl, write('No such option – please try again.'),
    menu.
option(_) :- nl,
    write('Sorry I cannot find such a restaurant'),
    menu.

out_of_range(X) :- not(member(X, [1, 2, 3, 4, 5])).

Inference Engine

Extend the Prolog meta-interpreter:

process(true) :- !.
process((Goal, Goals)) :- !, process(Goal), process(Goals).
process(Goal) :- clause(Goal, SubGoals),
    process(SubGoals).
process(Goal) :- tobe_filled(Goal), Goal =.. [Pred, Arg],
    already_asked(Pred), !,
    already_answered(Goal).
process(Goal) :- tobe_filled(Goal), Goal =.. [Pred, Arg],
    Goalx =.. [Pred, X], ask_user(Goalx),
    assert(already_asked(Pred)),
    assert(already_answered(Goalx)),
    Goal = Goalx.
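A practical note (assuming SWI-Prolog or a similar system): predicates that are asserted and retracted at run time, and user-defined predicates that the meta-interpreter inspects with clause/2, may need to be declared dynamic, for example:

:- dynamic already_asked/1, already_answered/1.
:- dynamic best_restaurant/1, serve_favourites/1, good_price/1, close_location/1.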

More User Interface

ask_user(your_favourites(X)) :-
    nl, write('Please list your favourite dishes, e.g. [lobster, fish, clam] etc.'),
    nl, write(': '), read(X).
ask_user(your_price(X)) :-
    nl, write('What price band do you want to pay: cheap, middle, or expensive?'),
    nl, write(': '), read(X).
ask_user(your_location(X)) :-
    nl, write('Where do you live?'),
    nl, write(': '), read(X).

‘How’ Explanations

Extend the Prolog meta-interpreter:

process(true, 'match with a fact') :- !.
process((Goal, Goals), and(Proof, Proofs)) :- !,
    process(Goal, Proof), process(Goals, Proofs).
process(Goal, by(Goal, Proof)) :-
    clause(Goal, SubGoals), process(SubGoals, Proof).
process(Goal, by(Goal, user_given)) :-
    tobe_filled(Goal), Goal =.. [Pred, Arg],
    already_asked(Pred), !, already_answered(Goal).
process(Goal, by(Goal, user_given)) :- tobe_filled(Goal),
    Goal =.. [Pred, Arg], Goalx =.. [Pred, X],
    ask_user(Goalx), assert(already_asked(Pred)),
    assert(already_answered(Goalx)), Goal = Goalx.
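The second argument of process/2 builds a proof term out of by/2 and and/2 structures; for the best-restaurant query it would have roughly the shape below (the restaurant name and the SubProof placeholders are illustrative only):

% by(best_restaurant('Le Marecage'),
%    and(by(serve_favourites('Le Marecage'), SubProof1),
%        and(by(good_price('Le Marecage'), SubProof2),
%            by(close_location('Le Marecage'), SubProof3))))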

Extend User Interface

option(1) :- process(best_restaurant(X), Proof), nl,
    write('The restaurant that suits you best is '),
    write(X), can_explain(Proof), menu.
option(2) :- process(serve_favourites(X), Proof), nl,
    write('The restaurant that serves your favourites is '),
    write(X), can_explain(Proof), menu.
option(3) :- process(good_price(X), Proof), nl,
    write('The restaurant that charges your prices is '),
    write(X), can_explain(Proof), menu.
option(4) :- process(close_location(X), Proof), nl,
    write('The restaurant that is closer to you is '),
    write(X), can_explain(Proof), menu.

Extend User Interface (cont.)

can_explain(Proof) :-
    nl, write('Do you want to know how, yes or no? '),
    read(X), explain_respond(X, Proof).

explain_respond(no, _).
explain_respond(yes, Proof) :-
    explain(Proof, 2).

explain(by(Goal, Proof), Indent) :-
    tab(Indent), write(Goal), write(' by '),
    explain_step(Proof, Indent).
explain(and(Proof1, Proof2), Indent) :-
    explain(Proof1, Indent), nl, tab(Indent), write('and'),
    nl, nl, explain(Proof2, Indent).

Extend User Interface (cont.)

explain_step('match with a fact', _) :-
    !, write('match with a fact'), nl.
explain_step(user_given, _) :-
    !, write('user given'), nl.
explain_step(Proof, Indent) :-
    NewIndent is Indent + 4, nl, nl, explain(Proof, NewIndent).

‘Why’ Explanations

Extend the Prolog meta-interpreter:

process(true, 'match with a fact', Rules) :- !.
process((Goal, Goals), and(Proof, Proofs), Rules) :- !,
    process(Goal, Proof, Rules),
    process(Goals, Proofs, Rules).
process(Goal, by(Goal, Proof), Rules) :-
    clause(Goal, SubGoals),
    process(SubGoals, Proof, [(Goal :- SubGoals)|Rules]).

‘Why’ Explanations (cont.)

Extend the Prolog meta-interpreter:

process(Goal, by(Goal, user_given), Rules) :-
    tobe_filled(Goal), Goal =.. [Pred, Arg],
    already_asked(Pred), !, already_answered(Goal).
process(Goal, by(Goal, user_given), Rules) :-
    tobe_filled(Goal),
    Goal =.. [Pred, Arg], Goalx =.. [Pred, X],
    ask_user(Goalx, Rules), assert(already_asked(Pred)),
    assert(already_answered(Goalx)), Goal = Goalx.

Extend ask_user

ask_user(your_favourites(X), Rules) :-
    nl, write('Please list your favourite dishes, e.g. [lobster, fish, clam] etc.'),
    nl, write(': '), read(Y),
    respond(Y, your_favourites(X), Rules).
ask_user(your_price(X), Rules) :-
    nl, write('What price band do you want to pay: cheap, middle, or expensive?'),
    nl, write(': '), read(Y),
    respond(Y, your_price(X), Rules).
ask_user(your_location(X), Rules) :-
    nl, write('Where do you live?'),
    nl, write(': '), read(Y),
    respond(Y, your_location(X), Rules).

Extend ask_user (cont.)

respond(why, Goal, [Rule|Rules]) :- !, nl,
    display_rule(Rule), nl, ask_user(Goal, Rules).
respond(why, Goal, []) :- !, nl,
    write('No more explanation!'), nl, ask_user(Goal, []).
respond(Ans, Goal, Rules) :-
    Goal =.. [Pred, Arg], Arg = Ans.

display_rule((Goal :- SubGoals)) :-
    write(Goal), write(' :- '), write(SubGoals).

Extend option

option(1) :- process(best_restaurant(X), Proof, []), nl,
    write('The restaurant that suits you best is '),
    write(X), can_explain(Proof), menu.
option(2) :- process(serve_favourites(X), Proof, []), nl,
    write('The restaurant that serves your favourites is '),
    write(X), can_explain(Proof), menu.
option(3) :- process(good_price(X), Proof, []), nl,
    write('The restaurant that charges your prices is '),
    write(X), can_explain(Proof), menu.
option(4) :- process(close_location(X), Proof, []), nl,
    write('The restaurant that is closer to you is '),
    write(X), can_explain(Proof), menu.
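With the ‘why’ extension in place, a dialogue for option 4 might look roughly as follows (the user's answers are illustrative, the exact rendering of unbound variables depends on the Prolog system, and the final answer follows from the sample facts given earlier):

% Where do you live?
% : why.
% close_location(_A) :- your_location(_B), restaurant_location(_A, _C), close(_B, _C)
% Where do you live?
% : 'Paris 18'.
%
% The restaurant that is closer to you is Le Grand Couteau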
