Radial Basis Function in Neural Network for Clustering Data

Published by: ARVIND on Mar 30, 2010
1. Abstract:
In this paper we investigate alternative designs of a Radial Basis Function (RBF) network used as the classifier in a face recognition system. The input to the RBF network is the projection of a face image onto the principal components. A database of 250 facial images of 25 persons is used for training and evaluation. Two RBF designs are studied: forward selection and the Gaussian mixture model. Both designs are also compared to the conventional Euclidean and Mahalanobis distance classifiers. A set of experiments evaluates the recognition rate of each method as a function of the number of principal components used to characterize the image samples. The results of the experiments indicate that the Gaussian mixture model RBF achieves the best performance while requiring fewer neurons in the hidden layer. The Gaussian mixture model approach also proves less sensitive to the choice of the training set.
2. Introduction:
Most node functions considered in the literature are monotonically non-increasing functions of their net inputs. This is not the best choice for some problems encountered in practice, where all samples of one class are clustered together. Although it is possible to solve such a problem using a one-hidden-layer feedforward network with sigmoid functions, the nature of the problem calls for a different, simpler solution. With a traditional feedforward network using sigmoid functions, four or five hidden nodes may be required for a simple problem. On the other hand, only one node would be sufficient to discriminate between the two classes if we could use a node function that approximates the circle. This is one motivation for a different class of node functions, especially for higher-dimensional problems.

A function is radially symmetric (or is an RBF) if its output depends on the distance of the input sample (vector) from another stored vector. Neural networks whose node functions are radially symmetric are referred to as RBF nets. Each commonly used RBF ρ is a non-increasing function of a distance measure u, which is its only argument, with ρ(u1) >= ρ(u2) whenever u1 < u2. The function ρ is applied to the Euclidean distance u = ||µ - i|| between the center (stored vector) µ and the input vector i. Vector norms other than the Euclidean distance may also be used, e.g. the generalized distance norm (µi - xj)'A'A(µi - xj) for some square matrix A, suggested by Tomaso Poggio and F. Girosi (1990). Generalized distance norms are useful because all coordinates of an input vector may not be equally important, but the main difficulty with this measure is in determining an appropriate matrix A. In RBF networks, the Gaussian described by the equation

    ρ(u) ∝ e^(-u²/(2σ²))

is the most widely used radially symmetric function. A simple model for the face recognition system is given here. RBF nets are generally called upon for use in function approximation problems, particularly for interpolation.
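The Gaussian response described above can be sketched in a few lines of NumPy. This is a minimal illustration only; the center and width values are arbitrary examples, not taken from the paper:

```python
import numpy as np

def gaussian_rbf(x, mu, sigma):
    """Gaussian radial basis function: output depends only on ||x - mu||."""
    u = np.linalg.norm(x - mu)            # Euclidean distance to the center
    return np.exp(-u**2 / (2 * sigma**2))

# The response is non-increasing in the distance from the center,
# i.e. rho(u1) >= rho(u2) whenever u1 < u2.
center = np.array([0.0, 0.0])
near = gaussian_rbf(np.array([0.1, 0.0]), center, sigma=1.0)
far = gaussian_rbf(np.array([2.0, 0.0]), center, sigma=1.0)
```

The monotonicity property quoted in the text follows directly: `near` is larger than `far` because its argument is closer to the center.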
3. Previous work on face recognition:
Earlier face recognition systems were based mainly on geometric facial features and template matching. In those works a face was characterized by a set of features such as mouth position, chin shape, and nose width and length, which are potentially insensitive to illumination conditions. Brunelli et al. (1993) compared this approach with a traditional template matching scheme, which produced higher recognition rates for the same face database (90% against 100%). Cox, Ghosn and Yianilos (1996) [11] proposed a mixture distance technique which achieved the best reported recognition rate among the geometric feature approaches using the same database. Those results were obtained in an experiment where features were extracted manually.

Turk and Pentland use the projections of the face images onto the principal components of the training images as the face features. Their method achieves recognition rates around 96%, 85% and 64%, respectively, for lighting, orientation and scale variation. Recognition rates around 95% are reported by Pentland for a database consisting of 3000 accurately registered and aligned faces. Available results on neural network based approaches come from experiments with few individuals, which makes it difficult to compare them with other reported approaches. All of those works rely on preprocessing to detect a face in a scene and to compensate for variations of lighting, position, rotation and scale.

The work reported here studies a face recognition system consisting of a standard PCA used for dimensionality reduction, followed by an RBF network acting as a classifier. As in most of the approaches mentioned before, the database used for the evaluation contains face images with moderate lighting, rotation, scale and viewing variation. A previous work has indicated that an RBF network performs better than conventional distance classifiers. The present work focuses on the study of alternative network optimizations to maximize the recognition rate.

The RBF network for face recognition has already been studied by Howell and Buxton. Instead of using principal components, they use either the image itself, the output of a difference-of-Gaussian filter, or the output of a Gabor filter as the input to the RBF network. Valentin, Abdi and Edelman used PCA followed by an RBF network to model how faces are stored in human memory. Their work neither compares the performance of the RBF network with any other classifier nor analyses alternative network designs. The main contribution of this work is a better understanding of how the parameters of the RBF network can be optimized to maximize its performance for the face recognition task.
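The PCA front end of the system just described (project each face image onto the principal components of the training images) can be sketched as follows. This is a toy sketch only: the array shapes and the number of components are illustrative, and random vectors stand in for real flattened face images:

```python
import numpy as np

def pca_project(train_images, image, n_components):
    """Project a face image onto the top principal components of the
    training set (dimensionality reduction before classification)."""
    # Rows of train_images are flattened face images.
    mean = train_images.mean(axis=0)
    centered = train_images - mean
    # Rows of vt are the principal directions, ordered by variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    return basis @ (image - mean)        # low-dimensional feature vector

rng = np.random.default_rng(0)
faces = rng.normal(size=(25, 64))        # stand-in for 25 flattened images
features = pca_project(faces, faces[0], n_components=10)
```

The resulting feature vector, rather than the raw image, is what the classifier sees; this is the dimensionality reduction step the paper attributes to standard PCA.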
4. Classification schemes:
A classifier is essentially a mapping of the input space onto a set of classes. The literature on pattern recognition presents a huge number of schemes to construct this mapping from data. In the present work, two basic schemes were tested: RBF networks, and minimum-distance-to-centroids classifiers with two different distance measures, Euclidean and Mahalanobis.
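The minimum-distance-to-centroids scheme can be sketched as follows. This is a minimal sketch covering both distance measures mentioned above; the centroids and covariance handling are illustrative, not the paper's exact setup:

```python
import numpy as np

def min_distance_classify(x, centroids, cov=None):
    """Assign x to the nearest class centroid.
    cov=None gives the Euclidean distance; passing a covariance
    matrix gives the Mahalanobis distance instead."""
    inv_cov = np.eye(len(x)) if cov is None else np.linalg.inv(cov)
    dists = [np.sqrt((x - mu) @ inv_cov @ (x - mu)) for mu in centroids]
    return int(np.argmin(dists))

# Two classes with well-separated centroids.
centroids = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
label = min_distance_classify(np.array([4.5, 5.2]), centroids)
```

With the identity matrix in place of the inverse covariance, the Mahalanobis distance reduces to the Euclidean one, which is why a single routine covers both measures.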
5. The RBF Network classifier:
The RBF network is a one-hidden-layer neural network with several forms of radial basis activation functions. The most common one is the Gaussian function defined by

    f(x) = exp(-||x - µ||² / (2σ²)),

where σ is the width parameter, µ is the vector determining the center of the basis function f, and x is the d-dimensional input vector. In an RBF network, a neuron of the hidden layer is activated whenever the input vector is close enough to its center. There are several heuristics for optimizing the basis function parameters and determining the number of hidden neurons needed for the best classification. This work discusses two training algorithms: forward selection and the Gaussian mixture model. The first one allocates one neuron to each group of faces of each individual; if different faces of the same individual are not close to each other, more than one neuron will be necessary. The second training method regards the basis functions as the components of a mixture density model, whose parameters are to be optimized by maximum likelihood. In this latter, the number k of basis functions is treated as an input to the model and is typically much less than the total number of input data points {x}.

The second layer of the RBF network, which is the output layer, comprises one neuron for each individual. Their output is a linear function of the outputs of the neurons in the hidden layer and is equivalent to an OR operator. The final classification is given by the output neuron with the greatest output. With RBF networks, the regions of the input space associated to each
Fig. RBF Network 
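The two-layer structure just described, Gaussian hidden units followed by a linear output layer with the decision given by the largest output, can be sketched as below. This is a minimal sketch: the centers, width, and weights are illustrative and are not produced by forward selection or maximum-likelihood training:

```python
import numpy as np

def rbf_forward(x, centers, sigma, weights):
    """Forward pass of a minimal RBF classifier.
    centers: (k, d) hidden-unit centers; weights: (n_classes, k)
    linear output weights. Returns the predicted class index."""
    # Hidden layer: one Gaussian activation per center.
    dists = np.linalg.norm(centers - x, axis=1)
    hidden = np.exp(-dists**2 / (2 * sigma**2))
    # Output layer: linear combination, one neuron per class.
    outputs = weights @ hidden
    # Final classification: the output neuron with the greatest output.
    return int(np.argmax(outputs))

# Two individuals, one hidden unit each; identity output weights make
# each output neuron respond to its own hidden unit.
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
weights = np.eye(2)
pred = rbf_forward(np.array([4.8, 5.1]), centers, sigma=1.0, weights=weights)
```

With more than one hidden unit per individual (as in the forward selection design), the corresponding row of the weight matrix simply combines several hidden activations into the same output neuron.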
