
SUBMITTED BY: ARAIB KHAN

Let us calculate the information gain for each feature of the provided dataset.

Entropy Calculations:

\[ Entropy(D) = - \frac{9}{14} \cdot \log_2\left(\frac{9}{14}\right) - \frac{5}{14} \cdot \log_2\left(\frac{5}{14}\right) \approx 0.940 \]
\[ Entropy(D_{\text{Sunny}}) = - \frac{3}{5} \cdot \log_2\left(\frac{3}{5}\right) - \frac{2}{5} \cdot \log_2\left(\frac{2}{5}\right) \]
\[ Entropy(D_{\text{Overcast}}) = 0 \]
\[ Entropy(D_{\text{Rain}}) = - \frac{3}{5} \cdot \log_2\left(\frac{3}{5}\right) - \frac{2}{5} \cdot \log_2\left(\frac{2}{5}\right) \]
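As a quick sanity check, these entropies can be computed with a small Python helper. This is a sketch assuming the usual 14-example weather dataset with 9 positive and 5 negative examples overall; the per-value counts in the comments are assumptions consistent with the figures above.

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a list of class counts."""
    total = sum(counts)
    # Skip zero counts: lim p->0 of -p*log2(p) is 0.
    return sum(-(c / total) * math.log2(c / total) for c in counts if c > 0)

# Whole dataset D: assumed 9 Yes, 5 No
print(f"Entropy(D)          = {entropy([9, 5]):.3f}")  # ≈ 0.940
# Outlook = Sunny: assumed 2 Yes, 3 No
print(f"Entropy(D_Sunny)    = {entropy([2, 3]):.3f}")  # ≈ 0.971
# Outlook = Overcast: assumed 4 Yes, 0 No -> pure partition, entropy 0
print(f"Entropy(D_Overcast) = {entropy([4, 0]):.3f}")
```

Note that a pure partition (all examples in one class) always has entropy 0, which is why \( Entropy(D_{\text{Overcast}}) = 0 \) above.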

Information Gain Calculations:

Outlook:
\[ Gain(\text{Outlook}) = Entropy(D) - \left(\frac{5}{14} \cdot Entropy(D_{\text{Sunny}}) + \frac{4}{14} \cdot Entropy(D_{\text{Overcast}}) + \frac{5}{14} \cdot Entropy(D_{\text{Rain}})\right) \]

Temperature:
\[ Gain(\text{Temperature}) = Entropy(D) - \left(\frac{4}{14} \cdot Entropy(D_{\text{Hot}}) + \frac{6}{14} \cdot Entropy(D_{\text{Mild}}) + \frac{4}{14} \cdot Entropy(D_{\text{Cool}})\right) \]

Humidity:
\[ Gain(\text{Humidity}) = Entropy(D) - \left(\frac{7}{14} \cdot Entropy(D_{\text{High}}) + \frac{7}{14} \cdot Entropy(D_{\text{Normal}})\right) \]

Wind:
\[ Gain(\text{Wind}) = Entropy(D) - \left(\frac{8}{14} \cdot Entropy(D_{\text{Weak}}) + \frac{6}{14} \cdot Entropy(D_{\text{Strong}})\right) \]

Results:
Outlook:
- \( Gain(\text{Outlook}) = 0.246 \)

Temperature:
- \( Gain(\text{Temperature}) = 0.029 \)

Humidity:
- \( Gain(\text{Humidity}) = 0.151 \)

Wind:
- \( Gain(\text{Wind}) = 0.048 \)

Since Outlook yields the highest information gain, it is selected as the root attribute of the decision tree.
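The four gains can be reproduced end-to-end in Python. The per-value class counts below are assumptions matching the standard 14-example weather data; small third-decimal differences (e.g. 0.247 vs. 0.246) arise because the results above round intermediate entropies before subtracting.

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of a list of class counts."""
    total = sum(counts)
    return sum(-(c / total) * math.log2(c / total) for c in counts if c > 0)

def info_gain(parent, children):
    """Parent entropy minus the size-weighted entropy of the child partitions."""
    n = sum(parent)
    remainder = sum(sum(c) / n * entropy(c) for c in children)
    return entropy(parent) - remainder

parent = [9, 5]  # assumed: 9 Yes, 5 No overall
splits = {
    "Outlook":     [[2, 3], [4, 0], [3, 2]],  # Sunny, Overcast, Rain
    "Temperature": [[2, 2], [4, 2], [3, 1]],  # Hot, Mild, Cool
    "Humidity":    [[3, 4], [6, 1]],          # High, Normal
    "Wind":        [[6, 2], [3, 3]],          # Weak, Strong
}
for feature, children in splits.items():
    print(f"Gain({feature}) = {info_gain(parent, children):.3f}")
```

Running this confirms that Outlook has the largest gain of the four features, so ID3 would split on it first.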
