[Figure: decision trees. Each internal node tests a feature, each link/branch carries a decision rule, and each leaf holds an outcome. Example: a CarType node splits into Sports (High) and Family/Truck (Low).]
Play Tennis dataset:

outlook   temp  humidity  windy  play
sunny     hot   high      FALSE  no
sunny     hot   high      TRUE   no
overcast  hot   high      FALSE  yes
rainy     mild  high      FALSE  yes
rainy     cool  normal    FALSE  yes
rainy     cool  normal    TRUE   no
overcast  cool  normal    TRUE   yes
sunny     mild  high      FALSE  no
sunny     cool  normal    FALSE  yes
rainy     mild  normal    FALSE  yes
sunny     mild  normal    TRUE   yes
overcast  mild  high      TRUE   yes
overcast  hot   normal    FALSE  yes
rainy     mild  high      TRUE   no

• Class (y): play
• All of the feature values are categorical
• Binary classification problem
• We will apply the ID3 algorithm
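Since the slides will apply ID3, here is a minimal sketch of the recursion in Python (an illustrative version without pruning or numeric features; all names are my own):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def id3(rows, attrs, target='play'):
    """rows: list of dicts mapping feature names (and the target) to values."""
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1:                  # pure node -> leaf
        return labels[0]
    if not attrs:                              # no features left -> majority leaf
        return Counter(labels).most_common(1)[0][0]

    def split_entropy(a):                      # weighted entropy after splitting on a
        groups = {}
        for r in rows:
            groups.setdefault(r[a], []).append(r[target])
        return sum(len(g) / len(rows) * entropy(g) for g in groups.values())

    best = min(attrs, key=split_entropy)       # minimal entropy = maximal gain
    subtree = {}
    for value in {r[best] for r in rows}:
        subset = [r for r in rows if r[best] == value]
        subtree[value] = id3(subset, [a for a in attrs if a != best], target)
    return {best: subtree}
```

Calling `id3(rows, ['outlook', 'temp', 'humidity', 'windy'])` on the table above would return a nested dict representing the tree.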
The class prior probabilities over the $|C|$ classes sum to one:

$$\sum_{j=1}^{|C|} \Pr(c_j) = 1$$

The entropy of $D$ after splitting on attribute $A_i$ with $v$ values is the weighted sum of the partition entropies:

$$entropy_{A_i}(D) = \sum_{j=1}^{v} \frac{|D_j|}{|D|}\, entropy(D_j)$$
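The two formulas can be sketched in Python (a minimal illustration; the function names and the list-of-rows data representation are assumptions):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """-sum_j Pr(c_j) * log2 Pr(c_j) over the class labels in a partition."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def conditional_entropy(rows, attr_index, label_index=-1):
    """entropy_Ai(D): sum over partitions D_j of |D_j|/|D| * entropy(D_j)."""
    n = len(rows)
    partitions = {}
    for row in rows:
        partitions.setdefault(row[attr_index], []).append(row[label_index])
    return sum(len(p) / n * entropy(p) for p in partitions.values())
```

Information gain for attribute $A_i$ is then `entropy(labels) - conditional_entropy(rows, i)`.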
*** Exercise: calculate the entropy and information gain for the 'temp', 'humidity', and 'windy' features.
CSE4107: Artificial Intelligence (Spring-2020) 19
ML
Decision Tree (Math)
Play Tennis Dataset

outlook   temp  humidity  windy  play
sunny     hot   high      FALSE  no
sunny     hot   high      TRUE   no
overcast  hot   high      FALSE  yes
rainy     mild  high      FALSE  yes
rainy     cool  normal    FALSE  yes
rainy     cool  normal    TRUE   no
overcast  cool  normal    TRUE   yes
sunny     mild  high      FALSE  no
sunny     cool  normal    FALSE  yes
rainy     mild  normal    FALSE  yes
sunny     mild  normal    TRUE   yes
overcast  mild  high      TRUE   yes
overcast  hot   normal    FALSE  yes
rainy     mild  high      TRUE   no

Entropy of the full dataset (9 yes, 5 no): 0.940

Split on outlook: Entropy: 0.693; Information Gain: 0.940 – 0.693 = 0.247
Split on temp: Entropy: 0.911; Information Gain: 0.940 – 0.911 = 0.029
Split on humidity: Entropy: 0.788; Information Gain: 0.940 – 0.788 = 0.152
Split on windy: Entropy: 0.892; Information Gain: 0.940 – 0.892 = 0.048

outlook has the largest information gain, so ID3 selects it as the root split.
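The four gains can be reproduced with a short script over the 14-row table (a sketch; helper names are illustrative):

```python
from collections import Counter
from math import log2

# (outlook, temp, humidity, windy, play): the 14 rows of the Play Tennis dataset
DATA = [
    ('sunny', 'hot', 'high', 'FALSE', 'no'),     ('sunny', 'hot', 'high', 'TRUE', 'no'),
    ('overcast', 'hot', 'high', 'FALSE', 'yes'), ('rainy', 'mild', 'high', 'FALSE', 'yes'),
    ('rainy', 'cool', 'normal', 'FALSE', 'yes'), ('rainy', 'cool', 'normal', 'TRUE', 'no'),
    ('overcast', 'cool', 'normal', 'TRUE', 'yes'), ('sunny', 'mild', 'high', 'FALSE', 'no'),
    ('sunny', 'cool', 'normal', 'FALSE', 'yes'), ('rainy', 'mild', 'normal', 'FALSE', 'yes'),
    ('sunny', 'mild', 'normal', 'TRUE', 'yes'),  ('overcast', 'mild', 'high', 'TRUE', 'yes'),
    ('overcast', 'hot', 'normal', 'FALSE', 'yes'), ('rainy', 'mild', 'high', 'TRUE', 'no'),
]

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain(attr):
    """Information gain of splitting DATA on the named column."""
    i = ('outlook', 'temp', 'humidity', 'windy').index(attr)
    groups = {}
    for row in DATA:
        groups.setdefault(row[i], []).append(row[-1])
    cond = sum(len(g) / len(DATA) * entropy(g) for g in groups.values())
    return entropy([r[-1] for r in DATA]) - cond

for a in ('outlook', 'temp', 'humidity', 'windy'):
    print(a, round(gain(a), 3))   # outlook 0.247, temp 0.029, humidity 0.152, windy 0.048
```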
An example

Entropy of the full dataset D (9 Yes, 6 No out of 15 examples):

$$entropy(D) = -\frac{6}{15}\log_2\frac{6}{15} - \frac{9}{15}\log_2\frac{9}{15} = 0.971$$

Partitioning on Age gives three subsets of 5 examples each:

Age     Yes  No  entropy(Di)
young   2    3   0.971
middle  3    2   0.971
old     4    1   0.722

$$entropy_{Age}(D) = \frac{5}{15}\,entropy(D_1) + \frac{5}{15}\,entropy(D_2) + \frac{5}{15}\,entropy(D_3) = \frac{5}{15}(0.971) + \frac{5}{15}(0.971) + \frac{5}{15}(0.722) = 0.888$$
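The arithmetic can be checked directly from the class counts in the table (a small sketch; `entropy2` is my own helper name):

```python
from math import log2

def entropy2(yes, no):
    """Binary entropy from the two class counts."""
    total = yes + no
    return -sum((c / total) * log2(c / total) for c in (yes, no) if c)

# (Yes, No) counts per Age value, read from the table
partitions = {'young': (2, 3), 'middle': (3, 2), 'old': (4, 1)}
n = sum(y + no for y, no in partitions.values())                    # 15 examples
cond = sum((y + no) / n * entropy2(y, no) for y, no in partitions.values())
print(round(entropy2(9, 6), 3), round(cond, 3))   # 0.971 0.888
```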
An example (continued)

Partitioning on Own_house gives one pure subset of 6 examples and a mixed subset of 9:

$$entropy_{Own\_house}(D) = \frac{6}{15}\,entropy(D_1) + \frac{9}{15}\,entropy(D_2) = \frac{6}{15}(0) + \frac{9}{15}(0.918) = 0.551$$

Own_house yields a lower conditional entropy than Age (0.551 vs. 0.888), hence a larger information gain (0.971 – 0.551 = 0.420 vs. 0.971 – 0.888 = 0.083), so it is the better split.
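To compare the two candidate splits numerically, a small sketch (counts taken from the slides; helper names are assumptions):

```python
from math import log2

def entropy_counts(*counts):
    """Entropy from per-class counts (zero counts contribute nothing)."""
    n = sum(counts)
    return -sum((c / n) * log2(c / n) for c in counts if c)

N = 15
base = entropy_counts(9, 6)                       # 9 Yes, 6 No overall -> 0.971
# weighted (conditional) entropies after each candidate split
e_age = sum(5 / N * e for e in (entropy_counts(2, 3),
                                entropy_counts(3, 2),
                                entropy_counts(4, 1)))
e_own = 6 / N * entropy_counts(6) + 9 / N * entropy_counts(3, 6)   # one pure subset
print(round(base - e_own, 3), round(base - e_age, 3))  # gains: 0.42 0.083
```

The attribute with the lower conditional entropy (higher gain) wins the split.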