Introduction
Given an event, predict its category.
Examples:
Who won a given ball game?
How should we file a given email?
What word sense was intended for a given occurrence of a word?
Event = list of features.
Examples:
Ball game: Which players were on offense?
Email: Who sent the email?
Disambiguation: What was the preceding word?
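For concreteness, a hypothetical Python encoding of one disambiguation event as a feature mapping; the feature names mirror the sample training data shown later in this section.

event = {
    "pos": "verb",              # part of speech of the ambiguous word
    "near(race)": False,        # does "race" occur nearby?
    "near(river)": True,
    "near(stockings)": False,
}
sense = "run3"                  # the category to be predicted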
Introduction
(Diagram: training events and categories are fed to a learner that builds a decision tree; the decision tree then assigns a category to new events.)
Decision Tree for PlayTennis
(Figure: the PlayTennis decision tree, rooted at Outlook. The corresponding WSD tree tests features such as near(stocking) and near(race), with yes/no branches leading to senses such as run3.)
(Note: decision trees for WSD tend to be quite large.)
WSD: Sample Training Data

pos    near(race)   near(river)   near(stockings)   word sense
noun   no           no            no                run4
verb   no           no            no                run1
verb   no           yes           no                run3
noun   yes          yes           yes               run4
verb   no           no            yes               run1
verb   yes          yes           no                run2
verb   no           yes           yes               run3

(The first four columns are the features; the last column is the sense to predict.)
Decision Tree for Conjunction
Outlook=Sunny ∧ Wind=Weak
(Figure: root Outlook; the Sunny branch tests Wind, with Strong → No and Weak → Yes; the Overcast and Rain branches are No.)
Decision Tree for Disjunction
Outlook=Sunny ∨ Wind=Weak
(Figure: root Outlook; the Sunny branch is Yes; the Overcast and Rain branches each test Wind, with Strong → No and Weak → Yes.)
Decision Tree for XOR
Outlook=Sunny XOR Wind=Weak
(Figure: root Outlook; every branch must go on to test Wind, since XOR depends on both attributes.)

In general, decision trees represent disjunctions of conjunctions; the PlayTennis tree, for example, corresponds to:
(Outlook=Sunny ∧ Humidity=Normal) ∨ (Outlook=Overcast) ∨ (Outlook=Rain ∧ Wind=Weak)
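As a quick sanity check of the XOR case, a small sketch assuming scikit-learn is available: fitting a tree on the four XOR rows and printing it shows that the learned tree has to test both attributes.

from sklearn.tree import DecisionTreeClassifier, export_text

# Rows encode (Outlook=Sunny, Wind=Weak) as 0/1; labels are their XOR.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

tree = DecisionTreeClassifier().fit(X, y)
# The printed tree splits on both features: no single test decides XOR.
print(export_text(tree, feature_names=["Outlook=Sunny", "Wind=Weak"]))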
When to Consider Decision Trees
Instances describable by attribute-value pairs
Target function is discrete valued
Disjunctive hypothesis may be required
Possibly noisy training data
Missing attribute values
Examples:
Medical diagnosis
Credit risk analysis
Object classification for robot manipulator (Tan 1993)
Top-Down Induction of Decision Trees (ID3)
(Figure: a partially grown tree rooted at Outlook, with Sunny, Overcast, and Rain branches ending in No/Yes leaves.)
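The slides name ID3 without spelling out its loop here, so the following is a minimal sketch, assuming discrete attributes, examples stored as dicts, and entropy-based attribute selection; helper names such as weighted_entropy are illustrative.

import math
from collections import Counter

def entropy(labels):
    # H(S) = -sum over classes of p * log2(p).
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def weighted_entropy(examples, labels, attr):
    # Expected entropy after splitting on attr; lower means higher information gain.
    total = len(labels)
    score = 0.0
    for v in set(e[attr] for e in examples):
        sub = [l for e, l in zip(examples, labels) if e[attr] == v]
        score += len(sub) / total * entropy(sub)
    return score

def id3(examples, labels, attributes):
    if len(set(labels)) == 1:            # pure node: return the class
        return labels[0]
    if not attributes:                   # no tests left: return majority class
        return Counter(labels).most_common(1)[0][0]
    # Greedy top-down step: pick the best attribute for this node.
    best = min(attributes, key=lambda a: weighted_entropy(examples, labels, a))
    branches = {}
    for v in set(e[best] for e in examples):
        pairs = [(e, l) for e, l in zip(examples, labels) if e[best] == v]
        sub_ex, sub_lab = zip(*pairs)
        branches[v] = id3(list(sub_ex), list(sub_lab),
                          [a for a in attributes if a != best])
    return (best, branches)              # internal node: (attribute, subtree per value)

Applied to the WSD table above, id3 would first split on whichever of pos, near(race), near(river), and near(stockings) gives the highest information gain.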
"If two theories explain the facts equally well, then the simpler theory is to be preferred." (Occam's razor)
Arguments in favor:
Fewer short hypotheses than long hypotheses
A short hypothesis that fits the data is unlikely to be a coincidence
A long hypothesis that fits the data might be a coincidence
Arguments opposed:
There are many ways to define small sets of hypotheses
Overfitting
Put simply: a random forest builds multiple decision trees and merges their predictions to obtain a more accurate and stable prediction.
Random Forest Classifier
Start from training data with N examples and M features.
Create bootstrap samples from the training data.
Construct a decision tree from each bootstrap sample.
At each node, choose the split feature from only m < M randomly selected features.
To classify a new example, take the majority vote over all trees (a sketch follows below).
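A minimal sketch of this procedure, assuming NumPy and scikit-learn's DecisionTreeClassifier; names such as random_forest_fit and n_trees are illustrative, not a standard API.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def random_forest_fit(X, y, n_trees=25, m=None, seed=0):
    # Grow n_trees trees, each on a bootstrap sample of (X, y).
    rng = np.random.default_rng(seed)
    N, M = X.shape
    m = m or max(1, int(np.sqrt(M)))      # common heuristic: m = sqrt(M) < M
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, N, size=N)  # bootstrap: N rows drawn with replacement
        tree = DecisionTreeClassifier(max_features=m)  # m candidate features per split
        trees.append(tree.fit(X[idx], y[idx]))
    return trees

def random_forest_predict(trees, X):
    # Majority vote across the per-tree predictions.
    votes = np.stack([t.predict(X) for t in trees])  # shape (n_trees, n_samples)
    out = []
    for column in votes.T:
        values, counts = np.unique(column, return_counts=True)
        out.append(values[np.argmax(counts)])
    return np.array(out)

In practice, sklearn.ensemble.RandomForestClassifier implements this bagging-plus-random-feature-subset scheme directly.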
Unknown Attribute Values
What if some examples have missing values of attribute A?
Use the training example anyway and sort it through the tree:
If node n tests A, assign the most common value of A among the other examples sorted to node n, or
assign the most common value of A among the other examples with the same target value, or
assign a probability p_i to each possible value v_i of A and pass a fraction p_i of the example to each descendant in the tree (see the sketch below).
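A sketch of the first two fill-in strategies, assuming examples are dicts in which None marks a missing value; the helper names are illustrative.

from collections import Counter

def most_common_value(examples, attr):
    # Strategy 1: the most common observed value of attr among the
    # examples sorted to this node.
    observed = [e[attr] for e in examples if e[attr] is not None]
    return Counter(observed).most_common(1)[0][0]

def most_common_value_by_class(examples, labels, attr, target):
    # Strategy 2: as above, but restricted to examples with the same target value.
    observed = [e[attr] for e, l in zip(examples, labels)
                if l == target and e[attr] is not None]
    return Counter(observed).most_common(1)[0][0]

# Strategy 3 (fractional examples) would instead send a weight p_i of the
# example down the branch for each value v_i, in proportion to how often
# v_i occurs among the examples at the node.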