EXPERT SYSTEMS


Ivan Bratko, Faculty of Computer and Information Science, University of Ljubljana


An expert system behaves like an expert in some narrow area of expertise
Typical examples:
 Diagnosis in an area of medicine
 Allocating seats between economy class and business class on future flights
 Diagnosing paper problems in rotary printing
A very fashionable area of AI in the period 1985-95


MYCIN (Shortliffe, Feigenbaum, late seventies) Diagnosis of infections


PROSPECTOR (Duda, Hart, et al., ~1980) Mineral (ore) exploration
INTERNIST (Pople, mid eighties) Internal medicine

Structure of expert systems

Structure of Expert Systems
Components:
 database of facts
 knowledge base
 inference engine
 explanation mechanism
Forward and Backward Chaining

Knowledge base
Expert systems differ from other information systems in that they hold knowledge. Knowledge is not the same as data or information: it consists not only of data and information but of their interrelationships, and it is active in that it can generate new understanding, consequences and predictions. While knowledge in humans is gained by learning, experience and experimentation, knowledge in a computer is often represented by rules. Someone who is very knowledgeable in a specific field is known as an expert, and the knowledge base contains the facts and rules (the knowledge) of the expert. Below is an example of how IF-THEN rules might be applied in our Animal-ID expert system:

IF animal has backbone THEN vertebrate
IF animal is vertebrate AND has hair THEN mammal
IF animal is mammal AND has pointed teeth AND has claws THEN carnivore

Database of facts
This holds the user's input about the current problem. The user may begin by entering as much as they know about the problem, or the inference engine may prompt for details or ask whether certain conditions exist. Known facts would be entered, and gradually a database of facts is built up which the inference engine will use to come to a decision. The quality and quantity of data gained from the user will influence the reliability of the decision. The following form gives a very simple example of how a database of facts might be built up in an expert system to identify types of animals:

Animal-ID
Backbone: Yes / No
Body parts: Two / Three / Four
Outer body: Hair / Feathers / Scales / Exoskeleton
Number of legs: Two / Four / Six / Eight
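The IF-THEN rules and the database of facts can be sketched in code. The following Python snippet is only an illustration (the data structures and variable names are invented here, not part of any real Animal-ID system): facts are a set of strings and each rule pairs a set of conditions with a conclusion, fired repeatedly until nothing new can be derived.

```python
# Illustrative sketch: facts the user has entered so far
facts = {"has backbone", "has hair", "has pointed teeth", "has claws"}

# Each rule pairs a set of conditions with a conclusion (IF ... THEN ...)
rules = [
    ({"has backbone"}, "vertebrate"),
    ({"vertebrate", "has hair"}, "mammal"),
    ({"mammal", "has pointed teeth", "has claws"}, "carnivore"),
]

# Fire any rule whose conditions are all satisfied, until nothing changes
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
```

Running this derives vertebrate, then mammal, then carnivore, mirroring the chain of rules above.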

Inference engine
The inference engine connects the knowledge base and the database of facts. It interprets the rules and draws conclusions: it begins with the available data, compares it with the facts and rules held in the knowledge base, and then infers or draws the most likely conclusion. It is based on the way that humans make decisions.

Forward and backward chaining
With rule-based expert systems there are two main types of reasoning: forward chaining and backward chaining. The user interface for both of these systems may be similar; it is how they use the rules that is different.

Forward chaining is a 'data driven' method of reasoning. It starts with the symptoms (the data) and works forward to find a solution. For example:
The animal has no backbone, therefore it is an invertebrate
The invertebrate has an exoskeleton, therefore it is an arthropod
The arthropod has three main body parts, therefore it is an insect
The insect has a triangular shaped head and elliptical shaped body, therefore it is a cockroach

Backward chaining is a 'goal driven' method of reasoning. It begins with a goal (a hypothesis) and then looks at the evidence (data and rules) to determine whether or not it is correct, working backwards to prove or disprove the hypothesis. Some expert systems use both backward and forward chaining.

Explanation mechanism
The explanation mechanism is an important feature of expert systems. It explains how the system has reached a decision and can also allow users to find out why it has asked a particular question. Because the system is rule based, it is able to provide the user with an explanation of which rules were used during the inference process. This allows the user to make a judgement on the reliability of the decision.

Certainty
In the examples from the Animal-ID expert system, all of the data held in the database of facts matched all of the IF-THEN rules, so the system could confidently (1, or 100%) predict that the conclusions were true. However, if some parts of the IF-THEN rules could not be matched with data held in the database of facts, the system could not be as confident. In our Animal-ID system, for example, the user may not know whether the animal has pointed teeth, so most but not all rules would match and the certainty value would be reduced to, say, 0.8.

Fuzzy logic
In expert systems there are usually many different variables, so how does the system know which are the most or least important? Fuzzy logic allows expert systems to reach decisions when there are no completely true or completely false answers. Each variable is assigned a value between 0 and 1, such as 0.4 (40%), and the variables can then be processed by the system to reach a decision.
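The goal-driven style can be made concrete with the Animal-ID rules from the passage above. This Python sketch is hypothetical (names and representation are invented here): to prove a goal by backward chaining, find a rule that concludes it and recursively prove each of its conditions.

```python
# Facts the user has already supplied
user_facts = {"has backbone", "has hair", "has pointed teeth", "has claws"}

# Invented encoding: (set of conditions, conclusion) per rule
rules = [
    ({"has backbone"}, "vertebrate"),
    ({"vertebrate", "has hair"}, "mammal"),
    ({"mammal", "has pointed teeth", "has claws"}, "carnivore"),
]

def prove(goal):
    """Goal-driven: a goal holds if it is a known fact, or if some rule
    concluding it has all of its conditions provable."""
    if goal in user_facts:
        return True
    return any(concl == goal and all(prove(c) for c in conds)
               for conds, concl in rules)

print(prove("carnivore"))   # check one hypothesis directly
print(prove("bird"))        # fails: no fact or rule supports it
```

Note the contrast with forward chaining: here nothing is derived until a specific hypothesis is posed, and only rules relevant to that hypothesis are examined.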

FEATURES OF EXPERT SYSTEMS
 Problem solving in the area of expertise
 Interaction with user during and after problem solving
 Relying heavily on domain knowledge
 Explanation: ability to explain results to user

REPRESENTING KNOWLEDGE WITH IF-THEN RULES
 Traditionally the most popular form of knowledge representation in expert systems
 Examples of rules:
  if condition P then conclusion C
  if situation S then action A
  if conditions C1 and C2 hold then condition C does not hold

DESIRABLE FEATURES OF RULES
 Modularity: each rule defines a relatively independent piece of knowledge
 Incrementability: new rules can be added relatively independently of other rules
 Support explanation
 Can represent uncertainty

TYPICAL TYPES OF EXPLANATION
 How explanation: answers user's questions of the form: How did you reach this conclusion?
 Why explanation: answers user's questions: Why do you need this information?

RULES CAN ALSO HANDLE UNCERTAINTY
 if Condition A then Conclusion B with certainty F

EXAMPLE RULE FROM MYCIN

if 1 the infection is primary bacteremia, and
   2 the site of the culture is one of the sterile sites, and
   3 the suspected portal of entry of the organism is the gastrointestinal tract
then there is suggestive evidence (0.7) that the identity of the organism is bacteroides

EXAMPLE RULES FROM AL/X
Diagnosing equipment failure on oil platforms (Reiter 1980)

if the pressure in V-01 reached relief valve lift pressure
then the relief valve on V-01 has lifted
[N = 0.001, S = 400]

if NOT the pressure in V-01 reached relief valve lift pressure, and
   the relief valve on V-01 has lifted
then the V-01 relief valve opened early (the set pressure has drifted)
[N = 0.005, S = 2000]

EXAMPLE RULE FROM AL3
Game playing (Bratko 1982)

if 1 there is a hypothesis, H, that a plan P succeeds, and
   2 there are two hypotheses,
     H1, that a plan R1 refutes plan P, and
     H2, that a plan R2 refutes plan P, and
   3 there are facts: H1 is false, and H2 is false
then 1 generate the hypothesis, H3, that the combined plan 'R1 or R2' refutes plan P, and
     2 generate the fact: H3 implies not(H)

A TOY KNOWLEDGE BASE: WATER IN FLAT
[Figures: flat floor plan; inference network]

IMPLEMENTING THE 'LEAKS' EXPERT SYSTEM
 Possible directly as Prolog rules:

leak_in_bathroom :-
    hall_wet,
    kitchen_dry.

 Employ Prolog interpreter as inference engine
 Deficiencies of this approach:
  Limited syntax
  Limited inference (Prolog's backward chaining)
  Limited explanation

TAILOR THE SYNTAX WITH OPERATORS
 Rules:

if hall_wet and kitchen_dry then leak_in_bathroom.

 Facts:

fact( hall_wet).
fact( window_closed).
fact( bathroom_dry).

BACKWARD CHAINING RULE INTERPRETER

is_true( P) :-
    fact( P).

is_true( P) :-
    if Condition then P,        % A relevant rule
    is_true( Condition).        % whose condition is true

is_true( P1 and P2) :-
    is_true( P1),
    is_true( P2).

is_true( P1 or P2) :-
    is_true( P1)
    ;
    is_true( P2).
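For readers less familiar with Prolog, the interpreter can be approximated in Python. This is a sketch under assumed representations (conditions encoded as nested ('and', ...) / ('or', ...) tuples, which is this sketch's invention), using the water-in-flat names from the slides.

```python
# Known facts of the 'water in flat' example
facts = {"hall_wet", "bathroom_dry", "window_closed"}

# One rule: "if hall_wet and bathroom_dry then leak_in_kitchen"
rules = {"leak_in_kitchen": ("and", "hall_wet", "bathroom_dry")}

def is_true(p):
    """Backward chaining: p is true if it is a fact, a satisfied
    and/or combination, or the conclusion of a rule whose condition
    is (recursively) true."""
    if isinstance(p, tuple):
        op, a, b = p
        return (is_true(a) and is_true(b)) if op == "and" \
               else (is_true(a) or is_true(b))
    if p in facts:                     # fact( P)
        return True
    cond = rules.get(p)                # a relevant rule...
    return cond is not None and is_true(cond)   # ...whose condition is true

print(is_true("leak_in_kitchen"))
```

As in the Prolog version, the query starts from the goal and works back through rule conditions to the stored facts.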

A FORWARD CHAINING RULE INTERPRETER

forward :-
    new_derived_fact( P),        % A new fact
    !,
    write( 'Derived: '), write( P), nl,
    assert( fact( P)),
    forward.                     % Continue

forward :-
    write( 'No more facts').     % All facts derived

FORWARD CHAINING INTERPRETER, CTD.

new_derived_fact( Concl) :-
    if Cond then Concl,          % A rule
    not fact( Concl),            % Rule's conclusion not yet a fact
    composed_fact( Cond).        % Condition true?

composed_fact( Cond) :-
    fact( Cond).                 % Simple fact

composed_fact( Cond1 and Cond2) :-
    composed_fact( Cond1),
    composed_fact( Cond2).       % Both conjuncts true

composed_fact( Cond1 or Cond2) :-
    composed_fact( Cond1)
    ;
    composed_fact( Cond2).
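A Python approximation of the forward chaining loop (not Bratko's code; conditions are encoded as nested ('and', ...) / ('or', ...) tuples, an assumption of this sketch): keep deriving new facts from rules until no rule can add anything.

```python
# Known facts and one rule of the 'water in flat' example
facts = {"hall_wet", "bathroom_dry", "window_closed"}
rules = [("leak_in_kitchen", ("and", "hall_wet", "bathroom_dry"))]

def composed_fact(cond):
    """True if the condition holds given the current facts."""
    if isinstance(cond, tuple):
        op, a, b = cond
        return (composed_fact(a) and composed_fact(b)) if op == "and" \
               else (composed_fact(a) or composed_fact(b))
    return cond in facts             # simple fact

def forward():
    """Data-driven: derive all new facts until none can be added."""
    while True:
        derived = [concl for concl, cond in rules
                   if concl not in facts and composed_fact(cond)]
        if not derived:
            print("No more facts")
            return
        for concl in derived:
            print("Derived:", concl)
            facts.add(concl)

forward()
```

Unlike the backward chainer, this runs with no goal in mind and simply reports everything it can conclude from the data.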

FORWARD VS. BACKWARD CHAINING
 Inference chains connect various types of information:
  data ... goals
  evidence ... hypotheses
  findings, observations ... explanations, diagnoses
  manifestations ... diagnoses, causes
 Backward chaining: "goal driven"
 Forward chaining: "data driven"

FORWARD VS. BACKWARD CHAINING
 What is better?
 Checking a given hypothesis: backward more natural
 If there are many possible hypotheses: forward more natural (e.g. monitoring: data-driven)
 Often in expert reasoning: combination of both, e.g. medical diagnosis
 Water in flat: observe hall_wet, infer forward leak_in_bathroom, check backward kitchen_dry

GENERATING EXPLANATION
 How explanation: How did you find this answer?
 Explanation = proof tree of the final conclusion
 E.g.:
System: There is leak in kitchen.
User: How did you get this?

:- op( 800, xfx, <=).

% is_true( P, Proof): Proof is a proof that P is true

is_true( P, P) :-
    fact( P).

is_true( P, P <= CondProof) :-
    if Cond then P,
    is_true( Cond, CondProof).

is_true( P1 and P2, Proof1 and Proof2) :-
    is_true( P1, Proof1),
    is_true( P2, Proof2).

is_true( P1 or P2, Proof) :-
    is_true( P1, Proof)
    ;
    is_true( P2, Proof).
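The proof-returning interpreter has a natural Python analogue. In this sketch (an illustrative encoding, not the original's) a proof is either the fact itself or a (conclusion, '<=', condition_proof) tuple, mirroring the `P <= CondProof` term.

```python
facts = {"hall_wet", "bathroom_dry", "window_closed"}
rules = {"leak_in_kitchen": ("and", "hall_wet", "bathroom_dry")}

def is_true(p):
    """Return a proof that p holds, or None if it cannot be proved."""
    if isinstance(p, tuple):
        op, a, b = p
        pa, pb = is_true(a), is_true(b)
        if op == "and":
            return (pa, "and", pb) if pa and pb else None
        return pa or pb                  # 'or': either sub-proof will do
    if p in facts:
        return p                         # a fact is its own proof
    cond = rules.get(p)
    if cond is not None:
        cond_proof = is_true(cond)
        if cond_proof:
            return (p, "<=", cond_proof) # mirrors  P <= CondProof
    return None

print(is_true("leak_in_kitchen"))
```

The returned tuple is exactly the proof tree a How explanation would display: the conclusion, the rule that produced it, and the proofs of its conditions.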

WHY EXPLANATION
 Exploring "leak in kitchen", the system asks: Was there rain?
 User (not sure) asks: Why do you need this?
 System: To explore whether water came from outside
 Why explanation = chain of rules between the current goal and the original (main) goal

UNCERTAINTY
 Propositions are qualified with: "certainty factors", "degree of belief", "subjective probability", ...
  Proposition: CertaintyFactor
  if Condition then Conclusion: Certainty

A SIMPLE SCHEME FOR HANDLING UNCERTAINTY
 c( P1 and P2) = min( c(P1), c(P2))
 c( P1 or P2) = max( c(P1), c(P2))
 Rule "if P1 then P2: C" implies: c(P2) = c(P1) * C
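The three combination rules translate directly into code. A minimal sketch with invented example numbers:

```python
def c_and(c1, c2):
    return min(c1, c2)           # c(P1 and P2) = min(c(P1), c(P2))

def c_or(c1, c2):
    return max(c1, c2)           # c(P1 or P2) = max(c(P1), c(P2))

def c_rule(c_cond, certainty):
    return c_cond * certainty    # "if P1 then P2: C" gives c(P2) = c(P1) * C

# Example: condition "hall_wet and bathroom_dry" held with certainties
# 0.9 and 0.8, fed through a rule of certainty 0.75
c_cond = c_and(0.9, 0.8)         # min(0.9, 0.8) = 0.8
print(c_rule(c_cond, 0.75))
```

The result is 0.8 * 0.75 = 0.6 (up to floating-point rounding): the conclusion is never more certain than either the rule or its condition.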

DIFFICULTIES IN HANDLING UNCERTAINTY
 In our simple scheme, for example:
Let c(a) = 0.5, c(b) = 0
Then c( a or b) = 0.5
Now suppose that c(b) increases to 0.5
Still: c( a or b) = 0.5
 This is counterintuitive!
 Proper probability calculus would fix this
 Problem with probability calculus: needs conditional probabilities (too many?!)
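The difficulty can be shown numerically. Under the max rule, raising c(b) up to the maximum never changes c(a or b); under probability calculus (assuming a and b independent, an assumption of this sketch), extra support for b does raise belief in the disjunction.

```python
def c_or_max(ca, cb):
    return max(ca, cb)               # the simple scheme

def p_or_indep(pa, pb):
    return 1 - (1 - pa) * (1 - pb)   # P(a or b) for independent a, b

print(c_or_max(0.5, 0.0), c_or_max(0.5, 0.5))      # 0.5 0.5  (no change)
print(p_or_indep(0.5, 0.0), p_or_indep(0.5, 0.5))  # 0.5 0.75 (belief rises)
```

The independence assumption is what lets the probabilistic version avoid listing conditional probabilities; dropping it is exactly the "too many conditional probabilities" problem the slide mentions.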

TWO SCHOOLS RE. UNCERTAINTY
 School 1: Probabilities impractical! Humans don't think in terms of probability calculus anyway!
 School 2: Probabilities are mathematically sound and are the right way!
 Historical development of this debate: School 2 eventually prevailed (Bayesian nets); see e.g. Russell & Norvig 2003 (AI: A Modern Approach), Bratko 2001 (Prolog Programming for AI)
