
Decision Tree Example: Transport Attribute

Example taken from Callan, R. (2003). Artificial Intelligence. Basingstoke, UK: Palgrave Macmillan, pp. 242-247.
1. For the 20 examples, calculate the entropy for each value of the Transport
attribute:
a. How many examples are there with:
i. Transport = Average: |T1| = _____
ii. Transport = Poor: |T2| = _____
iii. Transport = Good: |T3| = _____
b. How many of these examples have a positive or negative class?
i. Transport = Average: positive _____, negative _____
ii. Transport = Poor: positive _____, negative _____
iii. Transport = Good: positive _____, negative _____
c. Calculate the probability of a positive or negative outcome for each value:
i. Transport = Average: p(positive) = _____, p(negative) = _____
ii. Transport = Poor: p(positive) = _____, p(negative) = _____
iii. Transport = Good: p(positive) = _____, p(negative) = _____
d. Calculate the entropy for each value:
i. E(Transport = Average): E(T1) = - ___ log2 ___ - ___ log2 ___ = ___
ii. E(Transport = Poor): E(T2) = - ___ log2 ___ - ___ log2 ___ = ___
iii. E(Transport = Good): E(T3) = - ___ log2 ___ - ___ log2 ___ = ___
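
To check your answers to step 1, the per-value entropies can be computed with a short Python sketch. The (positive, negative) counts below are placeholders only, not the actual tallies from Callan's 20 examples; substitute your answers from parts (a) and (b).

    from math import log2

    def entropy(pos, neg):
        """Entropy of a subset with pos positive and neg negative examples."""
        total = pos + neg
        # By convention 0 * log2(0) = 0, so zero counts are skipped.
        return sum(-c / total * log2(c / total) for c in (pos, neg) if c > 0)

    # Placeholder (positive, negative) counts -- replace with your own tallies.
    counts = {"Average": (6, 2), "Poor": (1, 5), "Good": (3, 3)}

    for value, (pos, neg) in counts.items():
        print(f"E(Transport = {value}) = {entropy(pos, neg):.3f}")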
2. For the 20 examples, calculate the entropy of the whole set of examples:
a. How many examples are there: |T| = _____
b. How many have a positive or negative class: positive _____, negative _____
c. Calculate the probabilities: p(positive) = _____, p(negative) = _____
d. Calculate the entropy: E(T) = - ___ log2 ___ - ___ log2 ___ = ___
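
Step 2 is the same calculation applied to the whole set. Again a minimal sketch, assuming a placeholder 10/10 split of the 20 examples; use your own tally from part (b).

    from math import log2

    pos, neg = 10, 10                        # placeholder split, not the worksheet answer
    total = pos + neg                        # |T| = 20
    p_pos, p_neg = pos / total, neg / total
    E_T = -p_pos * log2(p_pos) - p_neg * log2(p_neg)
    print(f"E(T) = {E_T:.3f}")               # 1.000 for an even split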
3. Calculate the information gain for the Transport attribute:
Gain(T, Transport) = E(T) - Σ_{j=1..V} (|Tj| / |T|) · E(Tj)
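
Here Tj is the subset of examples taking the j-th of the V values of Transport, so the gain is E(T) minus the size-weighted average of the per-value entropies from step 1. A self-contained Python sketch using the same placeholder counts as above (swap in your real tallies):

    from math import log2

    def entropy(pos, neg):
        total = pos + neg
        return sum(-c / total * log2(c / total) for c in (pos, neg) if c > 0)

    # Placeholder (positive, negative) counts per value of Transport.
    partition = {"Average": (6, 2), "Poor": (1, 5), "Good": (3, 3)}

    pos_T = sum(p for p, n in partition.values())
    neg_T = sum(n for p, n in partition.values())
    size_T = pos_T + neg_T                              # |T| = 20

    weighted = sum((p + n) / size_T * entropy(p, n)     # Σ |Tj|/|T| · E(Tj)
                   for p, n in partition.values())
    gain = entropy(pos_T, neg_T) - weighted             # Gain(T, Transport)
    print(f"Gain(T, Transport) = {gain:.3f}")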

log2 x = log10 x / log10 2
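
This base-conversion identity is given because most calculators only have a log10 key; it is easy to sanity-check against Python's built-in log2:

    from math import log10, log2

    x = 0.75
    print(log10(x) / log10(2))   # log2 x via base-10 logs
    print(log2(x))               # built-in, prints the same value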
Matthew Casey
