
MULTIPLE DISCRIMINANT ANALYSIS

Group 1

Roll No       Name
13000118055 Sayak Khan

13000118035 Sritiman Adak

13000118054 Sayan Mukherjee

13000118069 Rajtilak Pal

13000118066 Raunak Bahadur Sinha

13000118036 Srinjoy Ghosh


CONTENTS
• Introduction
• Why MDA?
• Steps To Perform MDA
• Finding the W Unit Vector
• Transform the Within-Class Scattering
• Transform the Between-Class Scattering
• Final Expression
• Conclusion
• References

INTRODUCTION

1. MDA is a technique that distinguishes datasets from each other based on their characteristics.
2. It is a dimensionality reduction technique.
3. It does not directly perform classification, but it compresses high-dimensional signals into a low-dimensional space, which helps most classifiers overcome the overfitting problem that is predominant when working with data in a very high-dimensional space.
4. Along with the feature vectors, it also takes their class labels into account.

WHY MDA?

Drawback of Principal Component Analysis:

● PCA is also a dimensionality reduction method, but it does not take class label information into account. PCA may therefore remove discriminatory components from the data that are necessary for classification.
● For example, if PCA is applied to the two characters O and Q, the tail of Q, which is its discriminatory feature, may be removed. This will hamper classification.

So, to overcome this drawback and preserve the discriminatory components after dimensionality reduction, MDA is used.

Steps to Perform MDA
Prerequisite Information
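The steps below use the following notation, a standard setup we assume: the dataset consists of N feature vectors in an n-dimensional space, divided into c classes; class C_i contains n_i vectors; and l denotes the reduced target dimension.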

STEP 1
Computation of the mean vector (the centre of the i-th class) for every class present in the dataset, given by m_i.

Computation of the grand mean vector of the dataset, which is basically the centre of all the mean vectors m_i in the n-dimensional space, given by m.
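In the usual formulation, which we assume here, the two means are:

m_i = \frac{1}{n_i} \sum_{x \in C_i} x, \qquad m = \frac{1}{N} \sum_{i=1}^{c} n_i m_i

Written this way, the grand mean is the sample-weighted centre of the class means, which coincides with the mean of all N vectors in the dataset.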

STEP 2
Determination of the intra-class scattering (S_W) and the inter-class scattering (S_B).

Discrimination matrix: S_W^{-1} S_B
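Assuming the notation above, the standard definitions of the two scattering matrices are:

S_W = \sum_{i=1}^{c} \sum_{x \in C_i} (x - m_i)(x - m_i)^T, \qquad S_B = \sum_{i=1}^{c} n_i (m_i - m)(m_i - m)^T

S_W measures how tightly each class clusters around its own mean, while S_B measures how far the class means spread around the grand mean.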

STEP 3

Determination of the eigenvalues of the discrimination matrix, sorted in descending order.
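Concretely, this is the eigenvalue problem (stated in its usual form, which we assume the slide intended):

S_W^{-1} S_B w = \lambda w, \qquad \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n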

STEP 4

Determination of the unit eigenvectors of the discrimination matrix.

STEP 5

In order to reduce the dimension of each feature vector from n dimensions to l dimensions, where n >> l, the vector is projected onto the l unit eigenvectors corresponding to the largest eigenvalues.
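Putting Steps 1-5 together, here is a minimal NumPy sketch; the function name mda, the use of a pseudo-inverse for S_W, and the integer-label convention are illustrative assumptions, not part of the original slides:

import numpy as np

def mda(X, y, l):
    """Reduce the N x n data matrix X to N x l using the steps above.
    y holds an integer class label for each row of X."""
    classes = np.unique(y)
    n = X.shape[1]
    m = X.mean(axis=0)                      # Step 1: grand mean vector m
    Sw = np.zeros((n, n))                   # intra-class scattering S_W
    Sb = np.zeros((n, n))                   # inter-class scattering S_B
    for c in classes:                       # Step 2: build both matrices
        Xc = X[y == c]
        mi = Xc.mean(axis=0)                # Step 1: class mean vector m_i
        Sw += (Xc - mi).T @ (Xc - mi)
        d = (mi - m).reshape(-1, 1)
        Sb += len(Xc) * (d @ d.T)
    # Steps 3-4: eigen-decomposition of the discrimination matrix S_W^-1 S_B
    # (pinv is used in case S_W is singular)
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(vals.real)[::-1]     # eigenvalues in descending order
    W = vecs[:, order[:l]].real             # l leading unit eigenvectors
    W /= np.linalg.norm(W, axis=0)          # re-normalise to unit length
    return X @ W                            # Step 5: project onto the eigenvectors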

Finding the W Unit Vector
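In the usual derivation, assumed here, each n-dimensional feature vector x is projected through a matrix W whose columns are the unit vectors being sought:

y = W^T x, \qquad W = [\, w_1 \;\; w_2 \;\; \cdots \;\; w_l \,], \qquad \| w_j \| = 1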

Transform the Within-Class Scattering
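Substituting y = W^T x (and hence W^T m_i for each transformed class mean) into the definition of S_W gives the within-class scattering of the transformed space, a standard identity we assume the slide showed:

\tilde{S}_W = W^T S_W W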

Transform the Between-Class Scattering
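The same substitution transforms the between-class scattering:

\tilde{S}_B = W^T S_B W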

Finally:

● We have to maximise the between-class scattering and minimise the within-class scattering of the transformed space.
● For the quantity:
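The quantity in question is commonly the Fisher criterion, reconstructed here under the assumptions above:

J(W) = \frac{|\tilde{S}_B|}{|\tilde{S}_W|} = \frac{|W^T S_B W|}{|W^T S_W W|}

Maximising J(W) leads back to the eigenvalue problem of Step 3: the optimal columns of W are the unit eigenvectors of S_W^{-1} S_B with the largest eigenvalues.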

CONCLUSION

● Used by financial planners to evaluate potential investments when a number of variables are taken into account.
● Used for research purposes by researchers and statisticians.
● Used to focus on the most important data points and make a decision while considering a number of stocks.
● Used by financial professionals as a way to develop Markowitz efficient sets.

References

● https://www.slideshare.net/muhammadHasrath/multiple-discriminant-analysis-115529335
● https://en.wikipedia.org/wiki/Multiple_discriminant_analysis
● https://www.researchgate.net/file.PostFileLoader.html?id=54eb12afef97130f298b4576&assetKey=AS%3A273713604300800%401442269816239
● https://www.investopedia.com/terms/m/multiple-discriminant-analysis.asp

That’s All From Our End :)

THANK YOU

