
Introduction to linear classification models
Generative model

 A generative classifier tries to learn the model that generates the data behind the scenes by estimating the assumptions and distributions of that model.
 It then uses the learned model to predict unseen data, on the assumption that it captures the true data-generating process.
 Generative classifiers commonly learn a model of the joint probability p(x, y) of the input x and the label y, or equivalently the class-conditional likelihood p(x|y) together with the prior p(y), from which the posterior p(y|x) follows by Bayes' rule (a sketch follows this list).
 Examples of generative models include the Gaussian mixture model (GMM), the hidden Markov model (HMM), and naïve Bayes.
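To make this concrete, here is a minimal Gaussian naïve Bayes sketch; the function names and the 1e-9 variance floor are illustrative choices, not a reference implementation:

```python
# Generative classification: estimate the prior p(y) and the
# class-conditional likelihood p(x|y), then score new points via
# the joint p(x, y) = p(x|y) p(y).
import numpy as np

def fit_gaussian_nb(X, y):
    """Estimate per-class priors, feature means, and feature variances."""
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])        # p(y = c)
    means = np.array([X[y == c].mean(axis=0) for c in classes])  # E[x | y = c]
    varis = np.array([X[y == c].var(axis=0) + 1e-9 for c in classes])  # floor avoids /0
    return classes, priors, means, varis

def predict_gaussian_nb(X, classes, priors, means, varis):
    """Assign each row of X to the class maximizing log p(x|y) + log p(y)."""
    log_joint = np.stack([
        -0.5 * np.sum(np.log(2 * np.pi * varis[k]) + (X - means[k]) ** 2 / varis[k],
                      axis=1)
        + np.log(priors[k])
        for k in range(len(classes))
    ], axis=1)
    return classes[np.argmax(log_joint, axis=1)]
```

Because the priors and class-conditional densities are modelled explicitly, the same fit could also be used to sample synthetic (x, y) pairs, which is what makes the model "generative".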
Discriminative model

 A discriminative classifier models the boundary between classes (or the conditional probability p(y|x)) directly from the observed data, as sketched after this list.
 It makes fewer assumptions about the underlying distributions but depends heavily on the quality of the data.
 Examples of discriminative models include Support Vector Machines (SVM), Artificial Neural Networks (ANN), and decision trees (DT).
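For contrast, here is a minimal sketch of a discriminative classifier: binary logistic regression fit by gradient descent. The learning rate and step count are illustrative choices:

```python
# Discriminative classification: model p(y=1|x) = sigmoid(w.x + b)
# directly, without modelling how x itself is distributed.
import numpy as np

def fit_logistic_regression(X, y, lr=0.1, n_steps=1000):
    """Fit w, b by gradient descent on the negative log-likelihood."""
    n_samples, n_features = X.shape
    w, b = np.zeros(n_features), 0.0
    for _ in range(n_steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # p(y=1|x) for every sample
        grad = p - y                            # gradient of the log-loss
        w -= lr * (X.T @ grad) / n_samples
        b -= lr * grad.mean()
    return w, b   # only the decision rule is learned; p(x) is never modelled
```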
| Sr. No. | Generative Models | Discriminative Models |
| --- | --- | --- |
| 1 | Models how the data is generated, estimating the assumptions and distributions of the model. | Models the decision directly from the observed data. |
| 2 | More assumptions. | Fewer assumptions. |
| 3 | Less sensitive to data quality, since it leans on modelling assumptions rather than the data alone. | Requires good-quality data. |
| 4 | Riskier performance. | Tends to outperform generative classifiers when a lot of data is available. |
| 5 | Higher asymptotic error. | Lower asymptotic error. |
| 6 | The number of training examples needed to approach the asymptotic error grows only logarithmically. | The number of training examples needed grows linearly. |
| 7 | Learns the joint probability distribution p(x, y). | Learns the conditional probability distribution p(y|x). |
| 8 | Also known as a dynamic classifier. | Also known as a static classifier. |
| 9 | Examples: Hidden Markov Model, Naïve Bayes, Gaussian Mixture Model. | Examples: Logistic regression, Support Vector Machine, Artificial Neural Network, Decision tree. |
Discriminant Functions

 A discriminant is a function that takes an input vector x and assigns it to one of K classes, denoted C_k.
 To simplify the discussion, we consider first the case of two classes
and then investigate the extension to K > 2 classes.
Discriminant Functions with Two Classes
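In the simplest case, the discriminant is a linear function of the input vector,

y(\mathbf{x}) = \mathbf{w}^T \mathbf{x} + w_0,

where \mathbf{w} is the weight vector and w_0 is the bias. An input \mathbf{x} is assigned to class C_1 if y(\mathbf{x}) \ge 0 and to class C_2 otherwise, so the decision boundary y(\mathbf{x}) = 0 is a hyperplane in the input space.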
Multiple Classes
Contd.

 One approach (one-versus-the-rest) uses binary classifiers that each separate points in one class from points not in that class; another (one-versus-one) introduces K(K−1)/2 binary discriminants, one for every possible pair of classes, and each point is then classified according to a majority vote amongst the discriminant functions.
 However, both constructions run into the problem of ambiguous regions.
Solution
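These ambiguities can be avoided by using a single K-class discriminant comprising K linear functions

y_k(\mathbf{x}) = \mathbf{w}_k^T \mathbf{x} + w_{k0},

and assigning a point \mathbf{x} to class C_k if y_k(\mathbf{x}) > y_j(\mathbf{x}) for all j \ne k. The decision regions of such a discriminant are singly connected and convex.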
Generative Probabilistic Models
GMM
Example - GMM
 The Gaussian Mixture Model Classifier (GMM) is a basic but useful classification algorithm.
 In an image, every pixel is characterized by its intensity in the RGB color space. The probability of observing the current pixel value x_t is then given by the following equation in the multidimensional case:

P(x_t) = \sum_{i=1}^{K} \omega_{i,t} \, \eta(x_t; \mu_{i,t}, \Sigma_{i,t})

 where K is the number of distributions, \omega_{i,t} is the weight associated with the i-th Gaussian at time t, with mean \mu_{i,t} and covariance \Sigma_{i,t}, and \eta is a Gaussian probability density function:

\eta(x_t; \mu, \Sigma) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left( -\frac{1}{2} (x_t - \mu)^T \Sigma^{-1} (x_t - \mu) \right)

 For computational reasons, the RGB color components are assumed to be independent and to have the same variance, so the covariance matrix takes the form:

\Sigma_{i,t} = \sigma_{i,t}^2 I

 Each pixel is thus characterized by a mixture of K Gaussians (see the numeric sketch below).
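A minimal numeric sketch of the pixel model above; all parameter values here are hypothetical:

```python
# Evaluate P(x_t) = sum_i w_i * eta(x_t; mu_i, sigma_i^2 I) for one RGB pixel,
# using the diagonal covariance assumed above.
import numpy as np

def pixel_mixture_probability(x, weights, means, sigmas):
    """x: pixel (3,); weights: (K,); means: (K, 3); sigmas: (K,)."""
    n = x.shape[0]                               # dimensionality (3 for RGB)
    total = 0.0
    for w, mu, sigma in zip(weights, means, sigmas):
        norm = (2 * np.pi) ** (n / 2) * sigma ** n   # |sigma^2 I|^(1/2) = sigma^n
        dist2 = np.sum((x - mu) ** 2) / sigma ** 2   # squared Mahalanobis distance
        total += w * np.exp(-0.5 * dist2) / norm
    return total

# Hypothetical mixture of K = 2 Gaussians for a single pixel:
p = pixel_mixture_probability(
    x=np.array([120.0, 130.0, 125.0]),
    weights=np.array([0.7, 0.3]),
    means=np.array([[118.0, 128.0, 124.0], [200.0, 50.0, 60.0]]),
    sigmas=np.array([10.0, 25.0]),
)
```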


Latent variable

 Latent variables are variables that are not directly observed but are instead inferred (through a mathematical model) from other variables that are observed (directly measured).
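In the GMM above, the latent variable for each pixel is the index of the Gaussian component that generated it. Its posterior, the responsibility \gamma_i(x_t) of component i, follows from Bayes' rule:

\gamma_i(x_t) = \frac{\omega_{i,t} \, \eta(x_t; \mu_{i,t}, \Sigma_{i,t})}{\sum_{j=1}^{K} \omega_{j,t} \, \eta(x_t; \mu_{j,t}, \Sigma_{j,t})}

and it is exactly this quantity that the EM algorithm computes in its E-step when fitting the mixture.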
Advantages and disadvantages

 Advantages:
 It is a good algorithm for the classification of static postures and non-temporal pattern recognition.
 Disadvantages:
 For computational reasons, it can fail to work if the dimensionality of the problem is too high.
 The user must set the number of mixture components that the algorithm will try to fit to the training dataset; one common workaround is sketched below.
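A common way to handle the last point (a sketch, assuming scikit-learn is available; the data and candidate range are placeholders) is to fit several values of K and keep the one with the lowest Bayesian Information Criterion:

```python
# Model selection for K: fit candidate mixtures, keep the lowest-BIC one.
import numpy as np
from sklearn.mixture import GaussianMixture

X = np.random.default_rng(0).normal(size=(500, 3))  # placeholder training data
models = [GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 7)]
best = min(models, key=lambda m: m.bic(X))          # lowest BIC wins
print(best.n_components)
```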
