Tom M. Mitchell. Machine Learning [1]. McGraw Hill, 1997. Chapter 12, and his slides [2].
1 Motivation
                 Inductive Learning                Analytical Learning
Goal             Hypothesis fits data              Hypothesis fits domain theory
Justification    Statistical inference             Deductive inference
Advantages       Requires little prior knowledge   Learns from scarce data
Pitfalls         Scarce data, incorrect bias       Imperfect domain theory
2 The Learning Problem
Given training examples and a domain theory, determine a hypothesis that best fits the training examples and domain theory.
Combining Inductive and Analytical Learning (https://jmvidal.cse.sc.edu/talks/indanalytical/allslides.html, accessed 11/28/22)
3 KBANN
The Knowledge-Based Artificial Neural Network (KBANN [3]) algorithm uses prior knowledge to derive an initial hypothesis from which to begin the search.
It first constructs an ANN that classifies every instance exactly as the domain theory would.
So, if the domain theory is correct, then we are done!
Otherwise, we use Backpropagation to train the network further.
Domain theory:
Cup ← Stable, Liftable, OpenVessel
Stable ← BottomIsFlat
Liftable ← Graspable, Light
Graspable ← HasHandle
OpenVessel ← HasConcavity, ConcavityPointsUp
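KBANN's translation of each Horn clause into a sigmoid unit can be sketched as follows. This is a minimal sketch, assuming Mitchell's convention of a large positive weight W for each antecedent and a bias of -(n - 0.5)W for a clause with n antecedents, so the unit fires only when all antecedents are true; the value W = 4 is illustrative.

```python
import math

W = 4.0  # weight assigned to each clause antecedent (illustrative value)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def clause_unit(antecedent_values, w=W):
    """A sigmoid unit encoding one Horn clause: output is high only when
    every antecedent input is 1. Bias is -(n - 0.5) * w for n antecedents,
    so n true antecedents give net input 0.5 * w, and any false one
    drops the net input below zero."""
    n = len(antecedent_values)
    bias = -(n - 0.5) * w
    z = bias + sum(w * v for v in antecedent_values)
    return sigmoid(z)

# Graspable <- HasHandle: fires high when HasHandle is true
print(clause_unit([1.0]))        # well above 0.5
# OpenVessel <- HasConcavity, ConcavityPointsUp: one false antecedent
print(clause_unit([1.0, 0.0]))   # well below 0.5
```

After this construction, Backpropagation adjusts the weights away from these initial values wherever the domain theory disagrees with the training data.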
Training Examples:
[Table: training examples of cups and non-cups, each described by the attributes BottomIsFlat, ConcavityPointsUp, Expensive, Fragile, HandleOnTop, HandleOnSide, HasConcavity, HasHandle, Light, MadeOfCeramic, MadeOfPaper, and MadeOfStyrofoam; the table marks which attributes hold for each example.]
4 TangentProp
The TangentProp [4] algorithm incorporates prior knowledge into the error criterion minimized by gradient descent.
Specifically, the prior knowledge takes the form of known derivatives of the target function; the error criterion penalizes mismatches with these derivatives, weighted by a constant μ.
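The augmented error criterion can be sketched as below. This is a toy illustration, not TangentProp's actual implementation: the candidate function's derivative is estimated by finite differences rather than propagated through a network, and the μ value is assumed.

```python
mu = 0.5  # relative weight on the derivative-fit term (assumed value)

def tangent_prop_error(f, examples, eps=1e-4):
    """TangentProp-style error: squared output error plus mu times the
    squared mismatch between f's input derivative (estimated here by
    central finite differences) and the known target derivative."""
    total = 0.0
    for x, y, dy in examples:
        fx = f(x)
        dfx = (f(x + eps) - f(x - eps)) / (2 * eps)  # numeric df/dx
        total += (y - fx) ** 2 + mu * (dy - dfx) ** 2
    return total

# Each example is (input, target value, known target derivative).
# A candidate matching both values and derivatives of g(x) = 2x scores ~0.
examples = [(0.0, 0.0, 2.0), (1.0, 2.0, 2.0), (2.0, 4.0, 2.0)]
print(tangent_prop_error(lambda x: 2.0 * x, examples))
```

A candidate that fits the values but not the derivatives is still penalized, which is how the prior knowledge constrains the search even on scarce data.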
5 EBNN
The Explanation-Based Neural Network (EBNN [5]) algorithm extends TangentProp.
It represents the domain theory as a collection of previously learned neural networks, then learns the target function as another network.
Rather than requiring the user to supply derivatives, it computes the training derivatives itself, by explaining each training example in terms of the domain theory.
The value of μ is chosen independently for each example, according to how accurately the domain theory predicts that example.
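The per-example weighting can be sketched as below. The exact normalizer c and the linear 1 - error/c form are assumptions for illustration; the point is only that μ rises toward 1 when the domain-theory networks predict an example accurately, and falls toward 0 when they do not.

```python
def ebnn_mu(domain_prediction, observed, c=1.0):
    """Per-example weight on the domain theory's derivatives: trust
    the extracted derivatives more when the domain-theory networks
    predict this training example accurately (1 - error/c heuristic;
    c is an assumed normalizing constant)."""
    error = min(abs(domain_prediction - observed), c)
    return 1.0 - error / c

print(ebnn_mu(0.95, 1.0))  # accurate domain theory: weight near 1
print(ebnn_mu(0.10, 1.0))  # inaccurate domain theory: weight near 0
```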
6 FOCL
The FOCL [6] system is an extension of the purely inductive FOIL.
FOIL generates each candidate specialization by adding a single new literal to the clause preconditions.
FOCL generates additional specializations based on the domain theory. How?
We say that a literal is operational if it is allowed to be used in describing an output hypothesis.
At each point in the search, FOCL adds candidates by:
1. Considering each operational literal as a new precondition.
2. Creating an operational and logically sufficient condition for the target concept according to the domain theory, then pruning any unnecessary preconditions.
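Step 2 amounts to unfolding the domain theory until only operational literals remain. The sketch below does this for the Cup example; the dict encoding, the recursion, and some of the clauses are illustrative assumptions (pruning of unnecessary preconditions against the data is omitted).

```python
def operationalize(literal, theory, operational):
    """Unfold a propositional Horn-clause theory (dict: head -> body
    literals) until only operational literals remain, yielding a
    logically sufficient condition for the given literal."""
    if literal in operational:
        return [literal]
    body = []
    for antecedent in theory[literal]:
        body += operationalize(antecedent, theory, operational)
    return body

theory = {
    "Cup": ["Stable", "Liftable", "OpenVessel"],
    "Stable": ["BottomIsFlat"],
    "Liftable": ["Graspable", "Light"],
    "Graspable": ["HasHandle"],
    "OpenVessel": ["HasConcavity", "ConcavityPointsUp"],
}
operational = {"BottomIsFlat", "Light", "HasHandle",
               "HasConcavity", "ConcavityPointsUp"}
print(operationalize("Cup", theory, operational))
```

The resulting operational condition becomes one candidate specialization, competing in the search against the single-literal candidates from step 1.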
URLs
1. Machine Learning book at Amazon, http://www.amazon.com/exec/obidos/ASIN/0070428077/multiagentcom/
2. Slides by Tom Mitchell on Machine Learning, http://www-2.cs.cmu.edu/~tom/mlbook-chapter-slides.html
3. KBANN paper, http://citeseer.nj.nec.com/towell94knowledgebased.html
4. TangentProp paper, http://research.microsoft.com/~patrice/PS/tang_prop.ps
5. EBNN paper, http://citeseer.nj.nec.com/mitchell92explanationbased.html
6. FOCL paper, http://citeseer.nj.nec.com/pazzani92utility.html