(IJCSIS) International Journal of Computer Science and Information Security, Vol. 8, No. 5, 2010
Hybrid Model of Texture Classification using 2D Discrete Wavelet Transform and Probabilistic Neural Network
Reem Abd El-Salam El-Deeb
Department of Computer Science, Faculty of Computer and Information Sciences, Mansoura University, Egypt, P.O. Box 35516
Reemm_db@yahoo.com

Taher Hamza
Department of Computer Science, Faculty of Computer and Information Sciences, Mansoura University, Egypt

Elsayed Radwan
Department of Computer Science, Faculty of Computer and Information Sciences, Mansoura University, Egypt
Abstract— In this paper, we present a combinational approach for texture classification. The proposed method analyzes texture by the 2D Discrete Wavelet Transform (DWT); wavelet energy and some statistical features construct the features vector that characterizes the texture. For improving accuracy, the Probabilistic Neural Network (PNN), which is considered a good estimator of the probability density function, is used as a classifier that maps input feature vectors to the most appropriate texture classes. Two comparative evaluations have been done in order to ensure the effectiveness and efficiency of this model.
Keywords- Texture classification, feature extraction, discrete wavelet transform, probabilistic neural network
I. INTRODUCTION
Texture is the variation of data at scales smaller than the scales of interest [7]. Techniques for the analysis of texture in digital images are essential to a range of applications in areas as diverse as robotics, medicine and the geo-sciences. In biological vision, texture is an important cue allowing humans to discriminate objects. This is because the brain is able to decipher important variations in data at scales smaller than those of the viewed objects. Texture may be important as well in object recognition, as it tells us something about the material from which the object is made. In order to deal with texture in digital data, many techniques have been developed by image processing researchers [14][7].
 
Texture classification aims to assign texture labels to unknown textures, according to training samples and classification rules, by finding the best matched category for the given texture among existing textures. Two major issues are critical for texture classification: the texture feature extraction and the texture classification algorithms [4].
Texture feature extraction is considered the main basis of the efficiency of the texture classification algorithm. In order to design an effective algorithm for texture classification, it is essential to find a set of texture features with good discriminating power. Unfortunately, because of the scale dependency of texture, its feature extraction has become a difficult problem. There have been many studies on solving the texture classification problem based on various types of features and different methods of feature extraction. Most of the textural features are generally obtained from the application of a local operator, statistical analysis, or measurement in a transformed domain [14]. Features estimated from Laws' texture energy measures, Markov random field models, Gibbs distribution models and local linear transforms were found not to be robust enough to allow a one-to-one mapping between patterns and parameter sets for many reasons: the parameters computed rely on the model assumed, the neighborhoods used must not be self-contradictory, and they rely on the number of samples available for each combination of neighborhoods. In short, no model fits the observed textures perfectly, and so no model parameters are perfect in capturing all characteristics of a texture image [7][8]. Other studies used the Fourier transform domain, fractals and co-occurrence matrices. The co-occurrence features such as contrast, homogeneity, etc., were found to be the best of these features, and they are popular due to the perceptual meaning they have. However, they are not adequate for texture and object discrimination, as they throw away most of the information conveyed by the co-occurrence matrices [7].

In recent years, wavelet analysis has become a powerful tool for multi-resolution analysis. The Discrete Wavelet Transform (DWT) and the Gabor transform are extensively used for texture analysis. While the DWT uses fixed filter parameters for image decomposition across scales, the Gabor transform requires proper tuning of filter parameters for different scales of decomposition. Further, wavelet-based methods are shown to be efficient in detection, classification and segmentation for many reasons: the wavelet transform is able to de-correlate the data and achieve the same goal as the linear transformation, it provides orientation-sensitive information which is essential in texture analysis, and the computational complexity is significantly reduced by considering the wavelet decomposition [11][14].
 
As noted before, the efficiency of any classification system depends on effective characterization as well as
choosing the appropriate classifier. Some classifier algorithms, such as support vector machines, have been used in some works and faced problems because of high algorithmic complexity and extensive memory requirements [16]; the distance classifier has also been used for measurement of similarity and consequent labeling, but it suffered from some limitations in speed, and adding parameters may cause the classifier to fail [4].

In this study, a hybrid model based on a combinational approach is proposed, which combines the 2D Discrete Wavelet Transform (DWT) and the Probabilistic Neural Network (PNN) for solving the texture classification problem. In the hybrid configuration, the 2D DWT is used for texture analysis and for constructing a features vector that characterizes the texture image by capturing all essential information. The obtained features vectors are then fed into the PNN, which is used as a good estimator of the probability density function and helps in mapping each texture feature vector to the most appropriate class with fast and efficient performance. To illustrate the effectiveness of this model, two comparative evaluations have been done. The first was among a variety of wavelet filters, to find the feature extractor that provides the best characterization. The other was between the PNN and a Backpropagation Neural Network (NN) as classifiers, according to the mean success rates.

This paper is organized as follows. In Section II, the Discrete Wavelet Transform (DWT), the Probabilistic Neural Network (PNN) and wavelet energy are described. The hybrid model of 2D DWT and PNN is described in Section III. The effectiveness of the proposed hybrid model for classification of texture images and the comparative evaluations are demonstrated in Section IV. Finally, Section V presents the discussion and conclusion.

II. PRELIMINARIES
 
A. Discrete wavelet transform
 
Wavelets are functions that satisfy certain mathematical requirements. They are used to cut up data into different frequency components and then study each component with a resolution matched to its scale. The basic idea of the wavelet transform is to represent any arbitrary function as a superposition of wavelets. Any such superposition decomposes the given function into different scale levels, where each level is further decomposed with a resolution adapted to that level [12].

By applying the DWT, the image is divided, i.e., decomposed, into four sub-bands and critically sub-sampled, as shown in Figure 1(a). These four sub-bands arise from separable applications of vertical and horizontal filters. The sub-bands labeled LH1, HL1 and HH1 represent the finest-scale wavelet coefficients, i.e., the detail images, while the sub-band LL1 corresponds to the coarse-level coefficients, i.e., the approximation image. To obtain the next coarse level of wavelet coefficients, the sub-band LL1 alone is further decomposed and critically sampled. This results in a two-level wavelet decomposition, as shown in Figure 1(b). This process continues until some final scale is reached. The values, or transformed coefficients, in the approximation and detail images (sub-band images) are the essential features, which are shown here to be useful for texture analysis and discrimination. As textures have non-uniform pixel value variations, they can be characterized by the values in the sub-band images, their combinations, or features derived from these bands [11].
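The snippet below is a minimal sketch of this sub-band decomposition using the PyWavelets library; the choice of library, the 'db1' (Haar) filter and the random test image are illustrative assumptions, not prescribed by the paper.

```python
import numpy as np
import pywt

image = np.random.rand(128, 128)  # stand-in for a texture image

# One-level decomposition: approximation band plus three detail sub-bands
# (horizontal, vertical and diagonal details), as in Figure 1(a).
approx1, (horiz1, vert1, diag1) = pywt.dwt2(image, 'db1')

# Two-level decomposition: only the approximation band is decomposed again,
# as in Figure 1(b). Coefficients are ordered from coarsest to finest:
# [approx2, (level-2 details), (level-1 details)].
coeffs = pywt.wavedec2(image, 'db1', level=2)
approx2, details2, details1 = coeffs
```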
B. Energy
Energy is one of the most commonly used features for texture analysis [4]. Wavelet energy reflects the distribution of energy along the frequency axis over scale and orientation and has proven to be very powerful for texture classification. The energy of a sub-band containing N coefficients x_{i,j} is defined as in equation (1) [5]:

$E = \frac{1}{N} \sum_{i,j} x_{i,j}^{2}$  (1)
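As a sketch of how equation (1) could be turned into a feature vector, the code below computes the energy of every sub-band of a multi-level decomposition; the function names, wavelet and level are illustrative assumptions, and the paper's full feature vector also includes additional statistical measures.

```python
import numpy as np
import pywt

def subband_energy(band):
    """Equation (1): the mean of the squared coefficients of one sub-band."""
    return float(np.mean(np.square(band)))

def wavelet_energy_features(image, wavelet='db1', level=2):
    """Energy of the final approximation band and of every detail sub-band."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    features = [subband_energy(coeffs[0])]            # approximation band
    for detail_bands in coeffs[1:]:                   # one (H, V, D) tuple per level
        features.extend(subband_energy(b) for b in detail_bands)
    return np.asarray(features)
```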
C. Probabilistic neural network
It has been shown that, by replacing the sigmoid activation function often used in neural networks with an exponential function, a neural network can be formed which computes nonlinear decision boundaries. The resulting network is considered an estimator of the probability density functions, which can be used to map input patterns to output patterns and to classify patterns. This technique yields decision surfaces which approach the Bayes optimal under certain conditions [3].

The PNN is one of these networks, called radial basis networks. It is an artificial neural network with a radial basis function (RBF) as its transfer function. An RBF is a bell-shaped function that scales a variable nonlinearly [15]. This network provides a general solution to pattern classification problems by following an approach developed in statistics, called Bayesian classifiers [6]. The PNN is suitable for these kinds of classification problems for several reasons: its training speed is many times faster than that of a standard feed-forward backpropagation network, it can approach a Bayes-optimal result under certain easily met conditions, and it is robust to noisy examples.

The most important advantage of the PNN is that training is easy and instantaneous: weights are not "trained" but assigned. Existing weights are never altered; only new vectors are inserted into the weight matrices during training. So, it can be used in real time.
Figure 1. Image decomposition: (a) one-level, (b) two-level.
Since the training and running procedure can be implemented by matrix manipulation, the speed of the PNN is very fast [15].

The probabilistic neural network uses a supervised training set to develop distribution functions within a pattern layer. These functions, in the recall mode, are used to estimate the likelihood of an input feature vector being part of a learned category, or class. The learned patterns can also be combined, or weighted, with the a priori probability, also called the relative frequency, of each category to determine the most likely class for a given input vector. If the relative frequency of the categories is unknown, then all categories can be assumed to be equally likely, and the determination of the category is based solely on the closeness of the input feature vector to the distribution function of a class [6][1].

Probabilistic neural networks can be used for classification problems. When an input is presented, the first layer computes distances from the input vector to the training input vectors and produces a vector whose elements indicate how close the input is to a training input. The second layer sums these contributions for each class of inputs to produce, as its net output, a vector of probabilities. Finally, a compete transfer function on the output of the second layer picks the maximum of these probabilities, and produces a 1 for that class and a 0 for the other classes. The architecture of this system is shown in Figure 2 [6].

It is assumed that there are a number of input vector/target vector pairs. Each target vector has one element per class; one of these elements is 1 and the rest are 0. Thus, each input vector is associated with one of the classes. The first-layer input weights W1 are set to the transpose of the matrix formed from the training pairs, P'. When an input is presented, the ||dist|| box produces a vector whose elements indicate how close the input is to the vectors of the training set. These elements are multiplied, element by element, by the bias and sent to the radial basis transfer function. An input vector close to a training vector is represented by a number close to 1 in the output vector a1. If an input is close to several training vectors of a single class, it is represented by several elements of a1 that are close to 1. The second-layer weights W2 are set to the matrix T of target vectors. Each target vector has a 1 only in the row associated with that particular class of input, and 0's elsewhere. The multiplication Ta1 sums the elements of a1 due to each of the input classes. Finally, the second-layer transfer function, compete, produces a 1 corresponding to the largest element of n2, and 0's elsewhere. Thus, the network classifies the input vector into a specific class because that class has the maximum probability of being correct [9].

The bias b allows the sensitivity of the radial basis neuron to be adjusted. Each bias in the first layer is set to 0.8326/SPREAD. This determines the width of an area in the input space to which each neuron responds. SPREAD should be large enough that neurons respond strongly to overlapping regions of the input space [13].

The Probabilistic Neural Network is based on Bayesian classification and the estimation of the probability density function that is necessary to classify the input vectors into one of the target classes, approaching the Bayesian optimality [15].
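A minimal NumPy sketch of the recall pass just described is given below: an RBF pattern layer holding the stored training vectors, a summation layer that pools pattern activations per class, and a compete layer that picks the class with the largest pooled response. The class name and the Gaussian form of the kernel are assumptions; `spread` plays the role of the SPREAD parameter mentioned above.

```python
import numpy as np

class SimplePNN:
    """Toy probabilistic neural network: pattern, summation and compete layers."""

    def __init__(self, spread=0.1):
        self.spread = spread  # kernel width, analogous to SPREAD in the text

    def fit(self, X, y):
        # Training is instantaneous: weights are assigned (stored), not learned.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        self.classes = np.unique(self.y)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X, dtype=float):
            # Pattern layer: RBF activation for each stored training vector.
            d2 = np.sum((self.X - x) ** 2, axis=1)
            act = np.exp(-d2 / (2.0 * self.spread ** 2))
            # Summation layer: pool activations per class.
            scores = np.array([act[self.y == c].sum() for c in self.classes])
            # Compete layer: pick the class with the maximum pooled response.
            preds.append(self.classes[int(np.argmax(scores))])
        return np.asarray(preds)
```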
 
III. HYBRID MODEL FOR TEXTURE CLASSIFICATION
The texture classification scheme is based on two principles: choosing features that provide the best characterization of the texture image, and working with a fast, easy and robust classifier in order to reach the best classification result.

In this study a viable algorithm with high precision and low computational load is proposed to classify texture images using the wavelet transform in combination with a probabilistic neural network. In the proposed combinatory configuration the DWT and the PNN function as black boxes in a complementary manner. The functionality involved can be divided into two phases:

i. Texture characterization phase, and
ii. PNN classification phase.

The texture characterization phase starts by taking the texture images as input; with the help of the DWT, the texture images are analyzed and features vectors are constructed. The obtained features vectors are then fed into the PNN for training, which starts the PNN classification phase; this phase continues with testing and ends with displaying the classification result, as illustrated in Figure 3.
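As a sketch of these two phases end to end, the snippet below reuses the hypothetical wavelet_energy_features() and SimplePNN helpers sketched in Section II; the names and parameter values are illustrative assumptions, and the paper's actual feature vector also contains further statistical features.

```python
import numpy as np

def classify_textures(train_images, train_labels, test_images,
                      wavelet='db1', level=2, spread=0.1):
    # Phase i: texture characterization - one DWT-based feature vector per image.
    X_train = np.array([wavelet_energy_features(im, wavelet, level) for im in train_images])
    X_test = np.array([wavelet_energy_features(im, wavelet, level) for im in test_images])
    # Phase ii: PNN classification - assign weights from the training set, then label.
    pnn = SimplePNN(spread=spread).fit(X_train, train_labels)
    return pnn.predict(X_test)
```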
Figure 2. PNN Architecture
