Training and Testing Multi-class Logistic Classifier

Pi19404
March 31, 2014

Contents

Training and Testing Multi-class Logistic Classifier
    0.1  Introduction
         0.1.1  Loss Function
         0.1.2  Theano
         0.1.3  Example
    0.2  Theano Code
Bibliography

Training and Testing Multi-class Logistic Classifier
0.1 Introduction
In this article we will look at the training and testing of a multi-class logistic classifier.



Logistic regression is a probabilistic, linear classifier. It is parametrized by a weight matrix W and a bias vector b. Classification is done by projecting data points onto a set of hyperplanes, the distance to which is used to determine a class membership probability. Mathematically this can be expressed as
P(Y = i | x; W, b) = e^{W_i x + b_i} / Σ_j e^{W_j x + b_j}

Corresponding to each class y_i, the logistic classifier is parameterized by a set of parameters W_i, b_i.

These parameters are used to compute the class probability. Given an unknown vector x, the prediction is performed as

y_pred = argmax_i P(Y = i | x; W, b)
       = argmax_i e^{W_i x + b_i} / Σ_j e^{W_j x + b_j}

Given a set of labelled training data {X_i, Y_i} where i ∈ {1, ..., N}, we need to estimate these parameters.
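The prediction rule above can be sketched in plain NumPy (a minimal illustration on made-up toy values for W, b and x, not the article's trained model):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: subtract the max before exponentiating."""
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def predict(W, b, x):
    """Class probabilities P(Y=i | x; W, b) and the argmax prediction y_pred."""
    p = softmax(W.dot(x) + b)
    return p, int(np.argmax(p))

# toy example: 3 classes, 2-dimensional input
W = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
b = np.zeros(3)
x = np.array([2.0, 0.5])
p, y_pred = predict(W, b, x)
print(p, y_pred)   # probabilities sum to 1; class 0 has the largest score W_0 x + b_0 here
```

Each row of W plays the role of one W_i, so the argmax over the softmax outputs is exactly the argmax over the linear scores.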


0.1.1 Loss Function 

Ideally we would like to compute the parameters so that the zero-one loss is minimized:

ℓ_{0,1} = Σ_{i=0}^{|D|} I( f(x^{(i)}) ≠ y^{(i)} )

f(x) = argmax_k P(Y = y_k | x, θ)

where P(Y = y_k | x, θ) is modelled using the logistic function.

The 0-1 loss function is not differentiable, hence optimizing it for large models is computationally infeasible. Instead we maximize the log-likelihood of the classifier given the training data D. Maximum likelihood estimation is used to perform this operation: estimate the parameters so that the likelihood of the training data is maximized under the model.

It is assumed that the data samples are independent, so the probability of the set is the product of the probabilities of the individual examples:
L(θ = {W, b}; D) = Π_{i=1}^{N} P(Y = y_i | X = x_i; W, b)

θ* = argmax_θ Σ_{i=1}^{N} log P(Y = y_i | X = x_i; W, b)
   = argmin_θ − Σ_{i=1}^{N} log P(Y = y_i | X = x_i; W, b)

It should be noted that the likelihood of the correct class is not the same as the number of right predictions. The log-likelihood can be considered a differentiable surrogate for the 0-1 loss function. In the present application the negative log-likelihood is used as the loss function, and the optimal parameters are learned by minimizing it.
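As a sketch, the negative log-likelihood of a labelled set under the softmax model can be computed directly in NumPy (toy data and weights below are made up for illustration; the log-sum-exp form is the standard numerically stable evaluation):

```python
import numpy as np

def negative_log_likelihood(W, b, X, y):
    """Mean -log P(Y = y_i | x_i; W, b) over a labelled set.

    Uses -log softmax_i = -(W_i x + b_i) + log Σ_j e^{W_j x + b_j},
    with the row maximum subtracted for numerical stability.
    """
    scores = X.dot(W.T) + b                       # shape (N, n_classes)
    m = scores.max(axis=1, keepdims=True)
    log_sum_exp = m.squeeze(1) + np.log(np.exp(scores - m).sum(axis=1))
    return np.mean(log_sum_exp - scores[np.arange(len(y)), y])

# toy data: two well-separated classes
X = np.array([[2.0, 0.0], [0.0, 2.0]])
y = np.array([0, 1])
W_good = np.array([[5.0, 0.0], [0.0, 5.0]])       # scores the right classes highly
W_bad = -W_good                                   # points the wrong way
b = np.zeros(2)

# the loss is small for parameters that give the correct classes high probability
assert negative_log_likelihood(W_good, b, X, y) < negative_log_likelihood(W_bad, b, X, y)
```

Minimizing this quantity is exactly the maximum-likelihood estimation described above, with the sum replaced by a mean (which rescales but does not move the optimum).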


In the present application gradient-based methods are used for minimization; specifically, stochastic gradient descent and conjugate gradient descent are used to minimize the loss function. The cost function is expressed as
L(θ; D) = − Σ_{i=1}^{N} log P(Y = y_i | X = x_i; W, b)
        = − Σ_{i=1}^{N} log [ e^{W_{y_i} x_i + b_{y_i}} / Σ_j e^{W_j x_i + b_j} ]
        = − Σ_{i=1}^{N} [ (W_{y_i} x_i + b_{y_i}) − log Σ_j e^{W_j x_i + b_j} ]

The first part of the sum is affine; the second is a log of a sum of exponentials, which is convex. Thus the loss function is convex, and we can compute the parameters corresponding to its global minimum using gradient descent methods. To do so we compute the derivatives of the loss function L(θ; D) with respect to the parameters: ∂ℓ/∂W and ∂ℓ/∂b.
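For softmax regression these derivatives have the well-known closed form dL/dscores = P − onehot(y). A NumPy sketch of the gradients and one stochastic-gradient-descent update follows (toy data; the learning rate 0.13 is an assumption borrowed from the train_classifier call later in the article):

```python
import numpy as np

def softmax_rows(S):
    """Row-wise softmax with the row maximum subtracted for stability."""
    S = S - S.max(axis=1, keepdims=True)
    E = np.exp(S)
    return E / E.sum(axis=1, keepdims=True)

def nll_gradients(W, b, X, y):
    """Mean gradients of the negative log-likelihood.

    dL/dscores = P - onehot(y), hence dW = (P - Y)^T X / N and db = mean(P - Y).
    """
    N = X.shape[0]
    P = softmax_rows(X.dot(W.T) + b)   # (N, n_classes)
    P[np.arange(N), y] -= 1.0          # P - onehot(y), in place
    return P.T.dot(X) / N, P.mean(axis=0)

def sgd_step(W, b, X, y, lr=0.13):
    """One (full-batch) gradient descent update; SGD uses a minibatch of X, y."""
    dW, db = nll_gradients(W, b, X, y)
    return W - lr * dW, b - lr * db

# toy problem: three points, one per class
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
y = np.array([0, 1, 2])
W, b = np.zeros((3, 2)), np.zeros(3)
for _ in range(200):
    W, b = sgd_step(W, b, X, y)
# after a few hundred steps the learned W, b classify all three points correctly
```

Because the loss is convex, these updates converge to the global minimum for any reasonable learning rate.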

0.1.2 Theano 

  

Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. It is an expression compiler, and can evaluate symbolic expressions when executed. Programs that would typically be implemented in C/C++ can be written concisely and efficiently in Theano.

Computing the gradients in most programming languages (C/C++, Matlab, Python) involves manually deriving the expressions for the gradient of the loss with respect to the parameters, ∂ℓ/∂W and ∂ℓ/∂b. This approach not only involves manual coding, but the derivatives can become difficult to compute for complex models.


With Theano this work is greatly simplified, as it performs automatic differentiation.
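What automatic differentiation (e.g. Theano's T.grad) saves you from can be illustrated by the manual route: deriving ∂ℓ/∂W by hand and checking it against finite differences. A NumPy sketch of such a gradient check (all values are random toy data):

```python
import numpy as np

def nll(W, b, X, y):
    """Mean negative log-likelihood of the softmax model (stable form)."""
    S = X.dot(W.T) + b
    S = S - S.max(axis=1, keepdims=True)
    logp = S - np.log(np.exp(S).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

def analytic_dW(W, b, X, y):
    """Hand-derived gradient: dW = (P - onehot(y))^T X / N."""
    S = X.dot(W.T) + b
    P = np.exp(S - S.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    P[np.arange(len(y)), y] -= 1.0
    return P.T.dot(X) / len(y)

def numeric_dW(W, b, X, y, eps=1e-6):
    """Central finite differences, one coordinate of W at a time."""
    G = np.zeros_like(W)
    for idx in np.ndindex(*W.shape):
        Wp, Wm = W.copy(), W.copy()
        Wp[idx] += eps
        Wm[idx] -= eps
        G[idx] = (nll(Wp, b, X, y) - nll(Wm, b, X, y)) / (2 * eps)
    return G

rng = np.random.RandomState(0)
W = rng.randn(3, 4); b = rng.randn(3)
X = rng.randn(5, 4); y = rng.randint(0, 3, size=5)
err = np.abs(analytic_dW(W, b, X, y) - numeric_dW(W, b, X, y)).max()
# err is tiny when the hand derivation is correct
```

With Theano the analytic_dW step disappears entirely: the gradient expression is generated symbolically from the cost expression.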

0.1.3 Example 

For demonstration we will use the MNIST dataset. The MNIST dataset consists of handwritten digit images; it is divided into 60,000 examples for the training set and 10,000 examples for testing. The official training set of 60,000 is further divided into an actual training set of 50,000 examples and 10,000 validation examples. All digit images have been size-normalized and centered in a fixed-size image of 28 x 28 pixels. In the original dataset each pixel of the image is represented by a value between 0 and 255, where 0 is black, 255 is white, and anything in between is a different shade of grey.

The dataset can be found at http://deeplearning.net/data/mnist/mnist.pkl.gz. It is pickled and can be loaded using the Python pickle package. It consists of training, validation and test sets. The feature vectors are of length 28 x 28 = 784, and the number of classes is 10.

A class called LogisticRegression is defined which encapsulates the methods used to perform training and testing of the multi-class logistic regression classifier.
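Reading the gzipped pickle is straightforward with Python's gzip and pickle modules. The sketch below is self-contained: rather than downloading the real mnist.pkl.gz, it writes a tiny stand-in file with the same (train, validation, test) structure and loads it back (the real file is a Python 2 pickle, so under Python 3 it needs encoding='latin1'):

```python
import gzip
import pickle
import numpy as np

def load_pickle_data(path):
    """Load a (train, validation, test) tuple from a gzipped pickle file,
    where each split is a (features, labels) pair, as in mnist.pkl.gz."""
    with gzip.open(path, 'rb') as f:
        # for the real mnist.pkl.gz under Python 3: pickle.load(f, encoding='latin1')
        return pickle.load(f)

# build a tiny stand-in file with the same structure as mnist.pkl.gz
splits = tuple(
    (np.zeros((n, 28 * 28), dtype=np.float32), np.zeros(n, dtype=np.int64))
    for n in (50, 10, 10)   # stand-ins for the 50,000 / 10,000 / 10,000 examples
)
with gzip.open('tiny_mnist.pkl.gz', 'wb') as f:
    pickle.dump(splits, f)

train, valid, test = load_pickle_data('tiny_mnist.pkl.gz')
print(train[0].shape)   # each row is a flattened 28 x 28 = 784-pixel image
```

The article's load_datasets.load_pickle_data wrapper presumably performs the same steps and additionally wraps the arrays in Theano shared variables.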


0.2 Theano Code 



The Python code for training and testing can be found in the git repository https://github.com/pi19404/OpenVision in the ImgML/LogisticRegression.py file; ImgML/load_datasets.py contains methods to load datasets from pickle files or SVM-format files.

""" symbolic expressions defining input and output vectors""" x=T.matrix('x'); y=T.ivector('y'); """ The mnist dataset in pickel format""" model_name1="/media/LENOVO_/repo/mnist.pkl.gz" """ creating object of class Logistic regression""" """ input is 28*28 dimension feature vector ,and output lables are digits from 0-9 """ classifier = LogisticRegression(x,y,28*28,10); """ loading the datasets""" [train,test,validate]=load_datasets.load_pickle_data(model_name1); """ setting the dataset""" classifier.set_datasets(train,test,validate); # #classifier.init_classifier(model_name1);n_out """ Training the classifiers""" classifier.train_classifier(0.13,1000,30); """ Saving the model """ classifier.save('1'); #x=classifier.train[0].get_value(borrow=True)[0]; #classifier.predict(x); """ Loading the model""" classifier.load('1') x=train[0].get_value(borrow=True); y=train[1].eval(); print 'True class:'+`y` xx,yy=classifier.predict(x); print 'Predicted class:' + `yy` classifier.testing();


C/C++ code has also been written using Eigen/OpenCV and incorporated into the OpenVision library. It can be found in the files ImgML/LogisticRegression.cpp and ImgML/LogisticRegression.hpp in https://github.com/pi19404/OpenVision.


Bibliography

[1] James Bergstra et al. "Theano: a CPU and GPU Math Expression Compiler". In: Proceedings of the Python for Scientific Computing Conference (SciPy). Austin, TX, June 2010. Oral presentation.