(IJCSIS) International Journal of Computer Science and Information Security, Vol. 8, No. 6, 2010
pass filter, respectively. The rows and columns of the image are processed separately and down-sampled by a factor of 2 in each direction, which may cause the loss of important features. This results in one low-pass image LL and three detail images HL, LH, and HH.
The one-level decomposition in the spatial domain is shown below. The LH channel contains image information of low horizontal frequency and high vertical frequency, the HL channel contains high horizontal frequency and low vertical frequency, and the HH channel contains high horizontal and high vertical frequencies. Three-level frequency decomposition is also shown below. Note that in multi-scale wavelet decomposition only the LL sub-band is successively decomposed.
Figure: A one-level wavelet analysis filter bank.
Figure: Wavelet frequency decomposition.
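As a concrete sketch of this one-level analysis, the following example implements a plain Haar filter bank with NumPy; the Haar filter choice and the averaging normalization are illustrative assumptions, not the paper's exact filter bank. Rows are filtered and down-sampled by 2, then columns, yielding the LL, LH, HL, and HH sub-bands.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar decomposition (illustrative sketch).

    Rows and columns are processed separately and down-sampled by 2
    in each direction, as described in the text.
    """
    # Row pass: low-pass (average) and high-pass (difference) along the
    # horizontal direction, keeping every other column.
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Column pass on each row result, keeping every other row.
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0  # approximation image
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0  # low horizontal, high vertical
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0  # high horizontal, low vertical
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0  # high horizontal, high vertical
    return LL, LH, HL, HH
```

For multi-scale decomposition, only the returned LL sub-band would be fed back into `haar_dwt2` at the next level, matching the note above that only the LL sub-band is successively decomposed.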
C. Wavelet Neural Network
A WNN combines a neural network with wavelet decomposition. The advantages of the WNN are high-speed learning and good convergence to the global minimum. The reason for applying a WNN to a problem such as classification is that the feature-extraction and representation properties of the wavelet transform are merged into the structure of the ANN, further extending its ability to approximate complicated patterns. The WNN can be considered an expanded perceptron. It is designed as a three-layer structure with an input layer, a wavelet layer, and an output layer. The topological structure of the WNN is illustrated in
the figure below. In a WNN, the positions and dilations of the wavelets are optimized as well as the weights. The basic neuron of a WNN is a multidimensional wavelet in which the dilation and translation coefficients are treated as neuron parameters. The output of the WNN is therefore a linear combination of several multidimensional wavelets.
Figure: The structure of the Wavelet Neural Network.
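To make the three-layer structure concrete, here is a minimal forward-pass sketch of such a network. The Mexican-hat mother wavelet, the random initialization, and all identifier names are illustrative assumptions; the paper does not fix these choices here.

```python
import numpy as np

def mexican_hat(t):
    # Mother wavelet: second derivative of a Gaussian ("Mexican hat") -
    # an assumed choice for illustration.
    return (1.0 - t ** 2) * np.exp(-(t ** 2) / 2.0)

class WaveletNeuralNetwork:
    """Three-layer WNN sketch: input layer -> wavelet layer -> linear output layer."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Weights connecting the input layer to the wavelet (hidden) layer.
        self.w_in = rng.standard_normal((n_hidden, n_in)) * 0.1
        # Per-neuron dilation and translation parameters of the wavelets.
        self.a = np.ones(n_hidden)
        self.b = np.zeros(n_hidden)
        # Weights connecting the wavelet layer to the output layer.
        self.w_out = rng.standard_normal((n_out, n_hidden)) * 0.1

    def forward(self, x):
        z = self.w_in @ x                        # project input into each wavelet neuron
        h = mexican_hat((z - self.b) / self.a)   # wavelet activation with dilation a, translation b
        return self.w_out @ h                    # output = linear combination of the wavelets
```

In training, `a`, `b`, `w_in`, and `w_out` would all be updated (e.g. by gradient descent), matching the point above that both the wavelet positions/dilations and the weights are optimized.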
In this WNN model, the hidden neurons have wavelet activation functions with two parameters, which represent the dilation and translation of the wavelet function; one set of weights connects the input layer to the hidden layer, and another connects the hidden layer to the output layer. Let x denote the WNN input for the n-th sample, y the corresponding output of the WNN, and d the expected output. One weight is the connection weight between node i (input layer) and node j (hidden layer), and another is the connection weight between node j (hidden layer) and node k (output layer). The notation also includes the number of samples, S (the number of output nodes), the number of input nodes, and the number of hidden-layer nodes.

III. WAVELET NEURAL NETWORK FOR OFF-LINE HANDWRITTEN SIGNATURE RECOGNITION

According to the fact that
no two genuine signatures of one person are precisely the same, many efforts have been made to comprehend the delicate nuances of a person's signature. Off-line signature recognition in particular requires more effort because of the absence of dynamic information, which cannot be extracted from a static image. The problems of translation, rotation, and scale variation of the signature image also remain when dealing with signature-image pixels.

This paper presents an implementation of an off-line handwritten signature recognition system that uses the DWT technique in the feature-extraction phase and a WNN in the classification phase to overcome all of the above problems. The DWT technique depends on analyzing whole signature shapes (the continuous case) instead of analyzing pixel intensities or segmented parts of the signature (the discrete case). Because of the down-sampling problem caused