
Texture Recognition

In partial fulfillment of the Requirements for CMP605 Image Processing and Computer Vision

Presented by

Ahmed Mohamed Ahmed El Sheikh
Jihad Ibrahim

May 2012

Contents

Abstract
Acknowledgement
Chapter 1: Introduction
  1.1 Motivation and Justification
    Applications
  1.2 Problem Definition
  1.3 Summary of Approach
  1.4 Report Overview
Chapter 2: Literature Survey
  Model-based Approaches [3,11,15]
  Statistical Approaches [3]
  Structural Approach [3]
  Transform Methods
Chapter 3: Necessary Background
  Wavelet Transform
  Singular Value Decomposition
Chapter 4: System Description
  Selected Approaches
  Declined Approaches
  System Block Diagram
  Histogram and GLCM Statistics
  Laws' Masks
  Wavelet Decomposition
  Curvelet Decomposition
  Gabor Filter Bank
  Singular Value Decomposition
  Classifiers
  Classifier Fusion [9]
Chapter 5: Results
  Histogram and GLCM Statistics
  Laws' Masks
  Wavelet Decomposition
  Curvelet Decomposition
  Gabor Filter Feature
  Singular Value Decomposition
  Fusion Results
Chapter 6: Conclusion and Future Work
  Conclusions
  Future Work
References
Appendix A. Development Tools and Environment
  Used Toolboxes and Functions
Appendix B. Sample MATLAB Functions and Scripts
  FeatureExtract Function
  Wavelet Feature Function
  SVD Feature

Abstract

Image texture, defined as a function of the spatial variation in pixel intensities (gray values), is useful in a variety of applications and has been a subject of intense study by many researchers. One immediate application of image texture is the recognition of image regions using texture properties; texture is the most important visual cue in identifying these types of homogeneous regions. This project is concerned with exploiting various texture analysis techniques. Existing techniques are classified and implemented for comparison purposes, and a proposed technique, Singular Value Decomposition, is examined and compared with the existing ones. This technique gives a good spatial representation that differs from the other techniques and is therefore useful in decision fusion. We classify between 20 textures; each texture has 40 samples [1], of which 30 are used for training and 10 for testing. Implementation is done using MATLAB®. The highest recognition rate achieved using a single feature is 82%, and the highest accuracy using fusion techniques is 93%. Finally, we suggest some possible future work in this field.

Acknowledgement

We thank Dr. Ahmed for his well-guided self-study course and outlined project, from which we have gained significant experience.

Chapter 1: Introduction

1.1 Motivation and Justification

Texture analysis techniques are not only used with textures in the strict meaning of the word; many computer vision applications encounter textures in the descriptive meaning. In some of the mature domains (such as remote sensing) texture has already played a major role, while in other fields (such as surface inspection) new applications of texture are being found. So, we thought that exploiting the various techniques for texture analysis would be a good experience and increase the benefit from the course.

Applications

Areas of application are endless. Texture analysis can be used in automated inspection, document processing, medical image processing and remote sensing, among others.

1.2 Problem Definition

Although there are many definitions given for a texture in computer vision, we found the following to be the most relevant: "A region in an image has a constant texture if a set of local statistics or other local properties of the picture function are constant, slowly varying, or approximately periodic." We are testing the various analysis techniques that showed promising results in the literature survey, which will be discussed later, given the fact that we have no prior knowledge of the problem. Our texture dataset [1] contains 25 types of natural and synthetic textures, each with 40 samples. We are only classifying between 20 of them, shown in Fig. 1. We use 30 samples from each class for the training phase and 10 for testing, i.e., a total of 600 training samples and 200 testing samples. We are using neural networks for the classification stage.

Figure 1 Samples of dataset

Figure 2 Samples of dataset (cont.)

1.3 Summary of Approach

The main flow of our system can be summarized in the following diagram:

Processing (if necessary)
• Resizing
• Equalization

Analysis
• Feature Extraction

Classification
• Learning phase (targets given)
• Recognition (targets required)

The input to the above system is the raw grayscale images, and the output is the recognized class (in the recognition phase). A final stage is added to fuse the decisions of various classifiers to improve the recognition rate.

1.4 Report Overview

Chapter 2 will be a literature survey of the texture analysis techniques that were previously implemented. Chapter 3 will give some necessary background for the implemented algorithms. Chapter 4 will be a detailed system description. Chapter 5 will discuss the results. And finally, Chapter 6 will suggest some future work in the field.

Chapter 2: Literature Survey

Texture recognition techniques can be classified into one of the following categories: model-based, statistical, structural and transform. We shall now discuss the concept on which each of them is based and give various example approaches for each.

Model-based Approaches [3,11,15]

These methods consider texture as a random process governed by some parameters. The analysis of texture images is executed by defining a model and estimating each texture group's parameters. The estimation of the parameters can serve to classify and to segment textures. This category of analysis also offers a good possibility to recreate realistic examples of natural textures.

AR Model [3,13,15]

The autoregressive (AR) model assumes a local interaction between image pixels, in that a pixel intensity is a weighted sum of the neighboring pixel intensities. Assuming image f is a zero-mean random field, a causal AR model can be defined as

fs = Σ_{r ∈ Ns} θr fr + es    (1)

where 'fs' is the image intensity at site 's', 'es' denotes an independent and identically distributed noise, 'Ns' is a neighborhood of 's' and 'θ' is a vector of model parameters. Causal AR models have an advantage of simplicity and efficiency in parameter estimation.
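The report's implementation is in MATLAB; purely as an illustrative sketch, the least-squares estimation of the AR parameters θ in Eq. (1) can be written in Python/NumPy as below. The causal neighborhood offsets and the striped test texture are assumptions made for this demonstration, not part of the original system.

```python
import numpy as np

def estimate_ar_params(img, offsets=((0, -1), (-1, 0), (-1, -1), (-1, 1))):
    """Least-squares estimate of theta for f_s = sum_r theta_r f_r + e_s.
    offsets are (row, col) displacements forming the causal neighborhood N_s."""
    img = img - img.mean()              # the model assumes a zero-mean random field
    h, w = img.shape
    rows, targets = [], []
    # Each interior pixel is regressed on its causal neighbors.
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            rows.append([img[i + di, j + dj] for di, dj in offsets])
            targets.append(img[i, j])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

# Toy example: a vertically striped texture; the predictive weight concentrates
# on the neighbor directly above (same column, hence same stripe value).
rng = np.random.default_rng(0)
stripes = np.tile([0.0, 1.0], (32, 16)) + 0.01 * rng.standard_normal((32, 32))
theta = estimate_ar_params(stripes)
```

For the striped pattern, the left and diagonal neighbors carry the opposite stripe value while the neighbor above carries the same one, so the fitted combination of weights predicts the pixel almost perfectly.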

Markov Random Fields Model [3,15]

A Markov random field (MRF) is a probabilistic process in which all interactions are local: the probability that a cell is in a given state is entirely determined by the probabilities for the states of neighboring cells. Direct interaction occurs only between immediate neighbors; however, global effects can still occur as a result of propagation.

Let 'S' be a set of locations; here, for simplicity, assume S is a grid: S = { (i, j) | i, j are integers }. The neighbours of s(i, j) in S are defined as

∂((i, j)) = { (k, l) | 0 < (k − i)² + (l − j)² ≤ r, r constant }    (2)

A subset 'C' of 'S' is a clique if any two different elements of 'C' are neighbours. Example of an 8-neighborhood (r = 2):

Figure 3 8-neighborhood and its cliques

Fractal Model [11,14]

The fractal model is used with natural surfaces that have a statistical quality of roughness and self-similarity at different scales. If the size of the measuring tool is taken as λ, the measured quantity will be

M(λ) = K λ^(−D)    (3)

where D is known as the fractal dimension, on which the fractal model depends. The fractal dimension (FD) can then be defined as

FD = log(Nr) / log(r^(−1))    (4)

where Nr is the number of non-overlapping copies of a set similar to the original set, scaled down by a ratio r. However, the fractal dimension is not sufficient to capture all textural properties: there may be perceptually very different textures that have very similar fractal dimensions.

Another measure, called lacunarity, has been suggested in order to capture the textural property that lets one distinguish between such textures.

Statistical Approaches [3]

Statistical methods analyze the spatial distribution of gray values by computing local features at each point in the image and deriving a set of statistics from the distributions of the local features. The reason behind this is the fact that the spatial distribution of gray values is one of the defining qualities of texture. Depending on the number of pixels defining the local feature, statistical methods can be further classified into first-order (one pixel), second-order (two pixels) and higher-order (three or more pixels) statistics.

1st Order Statistics Based Approach (Histogram) [12]

First-order texture measures are statistics calculated from the original image values and do not consider pixel neighborhood relationships. Features that can be used with this method are: mean, variance, dispersion, mean square value (average energy), entropy, skewness and kurtosis. This approach suffers from the limitation that it provides no information about the relative position of pixels to each other. For example, two completely different images, each with 50% black and 50% white pixels (such as a checkerboard and a salt-and-pepper noise pattern), may produce the same gray level histogram; therefore we cannot distinguish between them using first-order statistical analysis.

2nd Order Statistics Based Approach (Co-occurrence Matrices) [10,3]

Grey level co-occurrence matrices are two-dimensional histograms of the occurrence of pairs of grey levels for a given displacement vector. The grey level co-occurrence matrix Pd for a displacement vector d = (dx, dy) is defined as follows: the entry (i, j) of Pd is the number of occurrences of the pair of gray levels (i, j) that are a distance d apart.
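The checkerboard versus salt-and-pepper limitation described above is easy to verify numerically. The following Python/NumPy sketch (an illustration only, not part of the report's MATLAB code) builds both binary patterns and confirms that their gray level histograms are identical:

```python
import numpy as np

# A 16x16 checkerboard: highly structured, half zeros and half ones.
checker = np.indices((16, 16)).sum(axis=0) % 2

# A salt-and-pepper pattern with the exact same pixel values, randomly placed.
rng = np.random.default_rng(4)
pepper = rng.permutation(checker.ravel()).reshape(16, 16)

# First-order statistics see only the value counts, which are identical.
h1 = np.bincount(checker.ravel(), minlength=2)
h2 = np.bincount(pepper.ravel(), minlength=2)
```

Both histograms come out as 128 black and 128 white pixels, so every first-order statistic (mean, variance, entropy, and so on) agrees between the two images even though their spatial structure is completely different.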

For example, consider the following 4x4 image containing 3 different grey values, with displacement vector d = (1, 0):

1 1 0 0
1 1 0 0
0 0 2 2
0 0 2 2

Pd =
4 0 2
2 2 0
0 0 2

Note: different authors define the co-occurrence matrix a little differently in two ways:
• By defining the relationship operator p by an angle θ and a distance d, and
• By ignoring the direction of the position operator and considering only the (bidirectional) relative relationship.

Features that can be used with this method are:

Figure 4 Texture features of GLCM

where μx and μy are the means, and σx and σy are the standard deviations, of Pd(x) and Pd(y).
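The worked example above can be reproduced with a few lines of Python/NumPy (shown here as an illustration; the report's system uses MATLAB's graycomatrix instead):

```python
import numpy as np

def glcm(img, d=(1, 0), levels=3):
    """Grey level co-occurrence matrix: P[i, j] counts pairs of gray levels
    (i, j) separated by displacement d = (dx, dy), dx along columns."""
    dx, dy = d
    h, w = img.shape
    P = np.zeros((levels, levels), dtype=int)
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[img[y, x], img[y2, x2]] += 1
    return P

img = np.array([[1, 1, 0, 0],
                [1, 1, 0, 0],
                [0, 0, 2, 2],
                [0, 0, 2, 2]])
P = glcm(img, d=(1, 0))
# P reproduces the matrix in the worked example:
# [[4 0 2]
#  [2 2 0]
#  [0 0 2]]
```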

Laws' Energy Filters [2]

This method involves the application of simple filters to digital images. The basic filters used were common Gaussian, edge-detector and Laplacian-type filters, and were designed to highlight points of high 'texture energy' in the image. By identifying these high-energy points and smoothing the various filtered images, textures can be characterized efficiently.

The Laws' masks are constructed by convolving together just three basic 1x3 masks:

L3 = [1 2 1], E3 = [−1 0 1], S3 = [−1 2 −1]    (5)

The initial letters of these masks indicate Local averaging, Edge detection and Spot detection. In fact, these basic masks span the entire 1x3 subspace and form a complete set. Similarly, the 1x5 masks obtained by convolving pairs of these 1x3 masks together form a complete set:

L5 = [1 4 6 4 1], E5 = [−1 −2 0 2 1], S5 = [−1 0 2 0 −1], R5 = [1 −4 6 −4 1], W5 = [−1 2 0 −2 1]    (6)

where R5 is the Ripple detection mask and W5 is the Wave detection mask. 2D masks can be obtained by the outer product of pairs of the above masks. Masks that don't average to zero are not used, because they are more sensitive to the image intensity than to the texture itself.

The filtered images are then averaged over larger moving windows to get macro-features of the texture; the average is the absolute sum of all elements of the filtered image F in a given window of size (2p+1)x(2p+1):

E(i, j) = Σ_{u=i−p..i+p} Σ_{v=j−p..j+p} |F(u, v)|    (7)

Second-level feature extraction can be in the form of statistical measures of the energy matrix.

Structural Approach [3]

This technique aims at representing texture by well-defined primitives (micro-texture) and a hierarchy of spatial arrangements (macro-texture, or grammar) of those primitives. To describe the texture, one must define the primitives and the placement rules. The choice of a primitive (from a set of primitives), and the probability of the chosen primitive being placed at a particular location, can be a function of the location or of the primitives near the location. The advantage of the structural approach is that it provides a good symbolic description of the image; however, this feature is more useful for synthesis than for analysis tasks. The abstract descriptions can be ill-defined for natural textures because of the variability of both micro- and macro-structure and the lack of a clear distinction between them.

Transform Methods

Wavelet Decomposition [4,5]

The texture image is decomposed using any type of wavelets up to a certain level of decomposition, producing 4 matrices at each layer from the decomposition of the blur of the previous layer (see the graphical interpretation below and the wavelets' section in Ch. 3 for further illustration). Various statistical measures of these resulting matrices are taken for each of the resulting layers. Examples of these statistics are the mean, standard deviation and covariance.

Graphical interpretation

Figure 5 Wavelet decomposition & produced matrices

LL: Low frequency information (blur).
LH: Vertical high frequency component (horizontal edges).
HL: Horizontal high frequency component (vertical edges).
HH: Diagonal high frequency component (oblique edges).

Curvelet Transform [6]

The curvelet transform is one of the modified wavelet families. The main difference is that curvelets have an orientation parameter in 2D and higher-dimensional spaces. This orientation gives a sparser representation of curved edges. Yet this feature is not of significance in our task, as there are no clear edges in most of the textures; besides, the various orientations in the scenery require averaging over the various curvelet orientations, causing some sort of blurring in a curved way.

Figure 6 Curvelet transform

Gabor Filter [5]

The Gabor filter is also one of the wavelet families. Like curvelets, it has an orientation parameter, but its main advantage is that it achieves minimum uncertainty between the time and frequency domains. The wavelet used has the form of a complex sinusoid modulated by a Gaussian window. With the rotated coordinates x' = x cos θ + y sin θ and y' = −x sin θ + y cos θ, the rotated Gaussian envelope is

g(x, y) = exp(−(x'² + γ² y'²) / (2σ²))    (8)

and the oriented complex sinusoid is

s(x, y) = exp(j (2πx'/λ + ψ))    (9)

Statistics such as the mean and standard deviation for each frequency and scale are calculated and averaged over the various orientations.
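A Gabor kernel of the standard form above can be built directly, as in this Python/NumPy sketch (the parameter values are arbitrary choices for illustration; the report's system uses a MATLAB File Exchange implementation instead):

```python
import numpy as np

def gabor_kernel(ksize=31, sigma=4.0, theta=0.0, lam=8.0, gamma=0.5, psi=0.0):
    """Complex 2D Gabor kernel: a Gaussian envelope, rotated by theta, that
    modulates a complex sinusoid of wavelength lam along the rotated x axis."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)     # rotated coordinates
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2 * sigma**2))
    carrier = np.exp(1j * (2 * np.pi * x_t / lam + psi))
    return envelope * carrier

g = gabor_kernel()
```

At the center of the kernel the envelope is 1 and the carrier phase is ψ, while the response decays toward the corners; a filter bank is obtained by sweeping theta, sigma and lam.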

Chapter 3: Necessary Background

In this chapter we include two of the most essential pieces of background for understanding the implementation of the chosen techniques that were mentioned previously in the literature survey.

Wavelet Transform

Historical Motivation

Fourier Transform

The Fourier transform (FT) decomposes a given signal into an orthogonal set of sinusoids of different frequencies:

Xk = Σ_{n=0}^{N−1} xn e^(−j2πkn/N)    (10)

where xn is the signal in the time domain (TD) and Xk is the signal in the frequency domain (FD). These sinusoids have deterministic frequencies but extend throughout the whole time domain; the FT therefore lacks localization in time (or space in 2D).

Short-Time Fourier Transform (STFT)

If we wish to gain some information on our location in the time (space) domain, we can perform the FT on a window of the signal:

X(m, k) = Σ_n xn w[n − m] e^(−j2πkn/N)    (11)

where w[n] is the windowing function in the time domain. Since multiplication in the TD corresponds to convolution in the FD, this results in what is called leakage: a simple rectangular window corresponds to a sinc function in the FD, so the deltas of the frequencies become wider. Therefore we are limited by a certain resolution between the frequency and time domains, and this resolution is fixed because the window size is fixed.
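The leakage effect described above is easy to see numerically. In this illustrative Python/NumPy sketch (not part of the report's MATLAB code), a tone that falls exactly on an FFT bin concentrates in that bin, while a tone half a bin away spreads energy across its neighbors because the implicit rectangular window convolves the spectrum with a sinc:

```python
import numpy as np

N = 64
t = np.arange(N)
# A tone exactly on bin 8, and one half a bin off (frequency 8.5/N).
on_bin = np.abs(np.fft.fft(np.sin(2 * np.pi * 8.0 * t / N)))
off_bin = np.abs(np.fft.fft(np.sin(2 * np.pi * 8.5 * t / N)))

on_sorted = np.sort(on_bin)[::-1]    # magnitudes in descending order
off_sorted = np.sort(off_bin)[::-1]
```

For the on-bin tone, only the two conjugate bins carry energy and everything else is numerically zero; the off-bin tone leaves substantial magnitude in many bins, which is the widened "delta" the text refers to.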

Figure 7 FT on a window of the signal

Multi-Resolution Analysis (Wavelets)

Since the STFT has a fixed resolution, we can use a variable-size window to obtain various resolutions. This results in an overcomplete set such as the wavelets:

ψm,n(t) = a0^(−m/2) ψ(a0^(−m) t − n b0)    (12)

cm,n = Σ_t x(t) ψm,n(t)    (13)

where ψ is the wavelet basis function, m is the scaling factor and n is the translation factor. The time-frequency resolution can be interpreted by the following figure: as we get more localized in time, we lose localization in frequency, and vice versa.

Figure 8 Time frequency resolution

This is in contrast to the STFT, which has a fixed resolution that can be interpreted by the following figure.

Figure 9 STFT's fixed resolution

Various modifications to the original wavelets have arisen, such as curvelets, ridgelets and Gabor filters.

Curvelet Transform

This transformation was motivated by the fact that wavelets in 2D don't sparsely represent curved edges. So, a family of wavelets with an orientation parameter was devised to overcome this complication. The space-frequency resolution map can be regarded as the following figure; it can be thought of as taking the time axis of the wavelets' resolution map and rotating it towards the frequency axis clockwise.

Figure 10 Space frequency resolution map

Gabor Filter

Gabor thought of this resolution problem as analogous to the Heisenberg Uncertainty Principle and found that ΔtΔf ≥ 1/(4π). By seeking the equality, he found that the wavelet that gives the highest resolution in both time and frequency is a complex sinusoid modulated by a Gaussian window (as was shown in the literature survey).
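The bound ΔtΔf ≥ 1/(4π), and the fact that a Gaussian window attains it with equality, can be checked numerically. This Python/NumPy sketch (an illustration only; the discretization parameters are arbitrary) measures the RMS widths of a Gaussian in time and frequency:

```python
import numpy as np

sigma = 1.0
t = np.linspace(-20, 20, 2 ** 14)
dt = t[1] - t[0]
g = np.exp(-t**2 / (2 * sigma**2))

# RMS time width from the normalized energy density |g|^2.
p_t = np.abs(g)**2
p_t /= p_t.sum() * dt
dt_rms = np.sqrt(np.sum(t**2 * p_t) * dt)

# RMS frequency width from the normalized energy density |G|^2.
G = np.fft.fftshift(np.fft.fft(g))
f = np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
df = f[1] - f[0]
p_f = np.abs(G)**2
p_f /= p_f.sum() * df
df_rms = np.sqrt(np.sum(f**2 * p_f) * df)

product = dt_rms * df_rms   # Gaussian attains the minimum 1/(4*pi)
```

Any other window (rectangular, triangular, and so on) gives a strictly larger time-bandwidth product, which is why the Gabor filter uses the Gaussian envelope.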

Singular Value Decomposition

The SVD is motivated by the following geometric fact: the image of the unit sphere S under any m x n matrix A is a hyperellipse.

Figure 11 Unit sphere changes to hyperellipse

First, we define the n singular values of A, written σ1, σ2, .... These are the lengths of the n principal semiaxes of AS. It is conventional to assume that the singular values are numbered in descending order. Next, we define the n left singular vectors of A. These are the unit vectors {u1, u2, ...} oriented in the directions of the principal semiaxes of AS, numbered to correspond with the singular values; thus the vector σi ui is the ith largest principal semiaxis of AS. Finally, we define the n right singular vectors of A. These are the unit vectors {v1, v2, ...} ∈ S that are the pre-images of the principal semiaxes of AS, numbered so that A vj = σj uj.

This mathematical identity motivated the proposed technique, as the singular values are thought to be characteristic of a given basis and of the matrix operating on it.
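The defining identity A vj = σj uj, and the descending ordering of the singular values, can be verified directly on a random matrix. This Python/NumPy sketch is an illustration only (the test matrix is arbitrary):

```python
import numpy as np

# A maps the unit sphere to a hyperellipse whose jth principal semiaxis is
# sigma_j * u_j, with v_j its pre-image: A v_j = sigma_j u_j.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)

# Check the identity for every singular triplet (rows of Vt are the v_j).
for j in range(3):
    assert np.allclose(A @ Vt[j], s[j] * U[:, j])

# Singular values come back numbered in descending order, as the text assumes.
assert s[0] >= s[1] >= s[2] >= 0
```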

Chapter 4: System Description

Selected Approaches

Statistical
o Histogram and GLCM.
o Laws' Masks.
Transform
o Wavelet decomposition.
o Curvelet decomposition.
o Gabor filter bank.
Proposed Technique
o Singular Value Decomposition (SVD).

Declined Approaches

Model-based methods.
Structural approaches.

Reasoning: the model-based and structural approaches are better used in synthesis than in recognition. The search for the parameters of an assumed model, or for a building unit along with a placement rule, fails in the case of natural textures, due not only to the variability in natural scenes but also to the variability in the image acquisition view (zooming in and out, inclination and rotation).

System Block Diagram

Read Image → Pre-processing → Image Analysis → Classification → Multiple Classifier Fusion → Final Decision

We shall now describe the implementation procedure used for each approach and the implemented fusion techniques.

Histogram and GLCM Statistics

Procedure
1. Calculate the histogram of the image.
2. Calculate the mean, variance, skewness, kurtosis, energy and entropy of the histogram.
3. Calculate the GLCM using the MATLAB command 'graycomatrix' for the four main orientations.
4. Get the statistical properties of the GLCM using the MATLAB command 'graycoprops', along with the mean and standard deviation.
5. Take the average of the above values to overcome rotations in the scenery.

Summary of Histogram Properties [3]
Summary of GLCM Properties [7]

Laws' Masks

Procedure
1. Form the required mask by the outer product of the 1x5 masks, e.g. E5L5 and L5E5.
2. Calculate the normalization image by convolving with L5L5.
3. Convolve the zero-average 5x5 mask with the image and normalize.
4. Average the convolved image using a 15x15 mask.
5. Repeat the above steps using the reversed mask (e.g. L5E5 for E5L5).
6. Take the average of the final two images.
7. Take five statistical parameters for the averaged image: mean, STD, skewness, kurtosis and entropy.

Summary of Laws' Masks Properties [8]
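A stripped-down version of this procedure can be sketched in Python/NumPy (an illustration only: it uses a single E5L5 mask and omits the L5L5 normalization and reversed-mask steps of the full MATLAB pipeline):

```python
import numpy as np

L5 = np.array([1, 4, 6, 4, 1], dtype=float)    # Local averaging
E5 = np.array([-1, -2, 0, 2, 1], dtype=float)  # Edge detection

def conv2_valid(img, k):
    """Naive 'valid'-mode 2D convolution with a small kernel."""
    kh, kw = k.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    kf = k[::-1, ::-1]  # flip the kernel for true convolution
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kf)
    return out

def laws_energy(img, mask, window=15):
    """Texture energy: |filtered image| averaged over a window x window box."""
    filtered = np.abs(conv2_valid(img, mask))
    box = np.ones((window, window)) / window**2
    return conv2_valid(filtered, box)

E5L5 = np.outer(E5, L5)  # zero-mean 5x5 mask sensitive to vertical variation
rng = np.random.default_rng(2)
smooth = np.ones((40, 40))
noisy = rng.standard_normal((40, 40))
e_noisy = laws_energy(noisy, E5L5).mean()
e_smooth = laws_energy(smooth, E5L5).mean()
```

Because E5L5 sums to zero, a flat region produces exactly zero energy while a rough texture produces a large one, which is the "texture energy" contrast the method exploits.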

Wavelet Decomposition

Procedure [4]
1. Calculate up to the 3rd-level Haar decomposition using the MATLAB 'dwt2' command 3 times in succession on the 'blur'.
2. For each of the high frequency details at each level, obtain the mean and STD.
3. Obtain the mean and STD of the final 'blur'.

Curvelet Decomposition

Procedure
1. Calculate the curvelet decomposition using the CurveLab Toolbox for MATLAB via wrapping.
2. For each level of decomposition, take the average over the various angles.
3. For each of the above averages, calculate the mean and STD.

Gabor Filter Bank

Procedure
1. Filter the image at various scales, orientations and frequencies using a MATLAB File Exchange implementation.
2. For each scale and frequency, average over all orientations.
3. Calculate the mean and STD of this average.

Singular Value Decomposition

Procedure
1. Split the image into 5x5 cells.
2. For each cell, calculate the SVD.
3. From each cell, take the largest 5 singular values.
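The wavelet procedure above can be sketched in Python/NumPy with a hand-rolled Haar step standing in for MATLAB's dwt2 (an illustrative simplification; the averaging convention and random test image are assumptions):

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2D Haar transform on an even-sized image, returning
    (LL, LH, HL, HH): the blur plus the three detail bands."""
    a, b = img[0::2, :], img[1::2, :]
    lo, hi = (a + b) / 2, (a - b) / 2                 # transform the rows
    def split_cols(m):
        return (m[:, 0::2] + m[:, 1::2]) / 2, (m[:, 0::2] - m[:, 1::2]) / 2
    LL, HL = split_cols(lo)
    LH, HH = split_cols(hi)
    return LL, LH, HL, HH

def wavelet_feature(img, levels=3):
    """Mean and STD of each detail band per level plus the final blur:
    3 levels x 3 bands x 2 stats + 2 = 20 values, matching the report."""
    feats, blur = [], img.astype(float)
    for _ in range(levels):
        blur, LH, HL, HH = haar_dwt2(blur)
        for band in (LH, HL, HH):
            feats += [band.mean(), band.std()]
    feats += [blur.mean(), blur.std()]
    return np.array(feats)

rng = np.random.default_rng(3)
f = wavelet_feature(rng.standard_normal((64, 64)))
```

The resulting 20-dimensional vector matches the wavelet feature size reported in Chapter 5.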

Classifiers

Classification is done using neural networks trained on the extracted features using the MATLAB nprtool.

Classifier Fusion [9]

Each trained classifier has different knowledge of the dataset. Fusion helps combine these differences to achieve a higher recognition rate. There are various techniques, but we implemented only three for comparison: Confidence, Voting and Decision Templates.

Confidence

Take the decision of the classifier that has the highest soft output.

Procedure: for each test sample, generate a vector containing the highest output value from all classifiers for each class.

Voting

A majority count leads to ties, so we use the soft outputs to vote.

Procedure: sum the outputs of all classifiers for a given test sample.

Decision Templates

The idea is based on the fact that the classifier outputs can be regarded as a second layer of features.

Procedure:
1. For each class, prepare a matrix containing the average of the outputs of each classifier when this class is introduced for classification. This matrix is called the decision template for this class.
2. When taking the decision, prepare a matrix containing the outputs of all classifiers for the test sample. This matrix is called the decision profile for the test sample.
3. Calculate the Frobenius norm of the difference between the decision profile and the decision templates of all classes, and then decide on the minimum.
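The decision-template procedure above can be sketched compactly in Python/NumPy (an illustration with a toy 2-classifier, 2-class setup; the soft-output values are invented for the demonstration):

```python
import numpy as np

def fit_templates(profiles, labels, n_classes):
    """Decision template per class: the average decision profile
    (classifiers x classes) over that class's training samples."""
    profiles = np.asarray(profiles, dtype=float)
    return np.stack([profiles[labels == c].mean(axis=0) for c in range(n_classes)])

def classify(profile, templates):
    """Pick the class whose template is nearest in Frobenius norm."""
    dists = [np.linalg.norm(profile - t) for t in templates]
    return int(np.argmin(dists))

# Each profile row holds one classifier's soft outputs over the classes.
profiles = np.array([
    [[0.9, 0.1], [0.8, 0.2]],   # training sample of class 0
    [[0.8, 0.2], [0.7, 0.3]],   # training sample of class 0
    [[0.2, 0.8], [0.1, 0.9]],   # training sample of class 1
    [[0.3, 0.7], [0.2, 0.8]],   # training sample of class 1
])
labels = np.array([0, 0, 1, 1])
templates = fit_templates(profiles, labels, 2)
```

A fresh profile close to class 0's typical outputs, such as [[0.85, 0.15], [0.75, 0.25]], is assigned to class 0, and symmetrically for class 1, because averaging makes each template a prototype in the space of classifier outputs.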

Diagram for Decision Template Fusion:

Figure 12 Decision template fusion

Note: although the above techniques enhance the recognition rate, as will be shown in the results chapter, they are computationally expensive, especially the decision templates, which act as a second training/classification phase. So using any of the mentioned techniques is actually a compromise between the required accuracy and the expense of lengthy calculations.

Chapter 5: Results

In this section we discuss the results of the previously mentioned techniques, highlighting the main classification errors that each feature causes.

Histogram and GLCM Statistics

Feature vector size: 12

Recognition Rate

The various distances taken for the neighboring pixels showed different recognition rates:

1-pixel neighbor distance: 69%
4-pixel neighbor distance: 80%
8-pixel neighbor distance: 67%

Confusion

The following three classes cause the main confusion. It is very obvious that Granite and Bark have almost the same histogram and GLCM distributions. Pebbles is confused with them as a result of the fact that we use neither the whole GLCM nor the whole histogram for classification, but rather statistics of each of them.

Granite  Bark  Pebbles

Laws' Masks

Feature vector size: 70

Recognition Rate

79%

Confusion

The main confusion is between Granite and Bark, the same as the GLCM and Histogram confusion. This also makes sense, since both techniques belong to the same category, not to mention that their statistics are similar.

Wavelet Decomposition

Feature vector size: 20

Recognition Rate

82%

Confusion

Granite and Bark are again the main confusing classes, but here the reasons differ: the high frequencies along the vertical and horizontal directions, along with the blur, are quite similar for both classes. The wavelet feature shows better performance than the statistical methods because the wavelet decomposition provides a better representation of the image's information than the statistical features. The feature extraction phase of the WD costs almost the same as the statistical methods, but in the classification phase the WD requires more calculations due to the higher dimensionality.

Note: although the wavelet feature shows better performance, it has higher dimensionality than the statistical feature.

Curvelet Decomposition

Feature vector size: 20

Recognition Rate

73%

Confusion

Curvelets have an orientation control parameter that differs from the regular wavelets. To overcome the variation in scene rotations, averaging is done during feature extraction over the various orientations at a certain level of decomposition. This averaging causes some sort of blurring or brushing in a curved way, which mainly confuses the classifier between the Granite and Carpet classes.

Granite  Carpet

Gabor Filter Feature

Feature vector size: 18

Recognition Rate

75%

Confusion

The same as the curvelet; this is due to the averaging over the various orientations. The higher recognition rate is due to the fact that the Gabor filter has control over the frequency of the complex sinusoid of the filter, which in turn gives more information.

Singular Value Decomposition

Feature vector size: 125

Recognition Rate

65%

Confusion

Due to the fact that some textures have very close bases for their vector spaces, the SVD causes confusion such as that shown below. Although this feature has a poor recognition rate, it improves the recognition rate of the fused decision, because its confusion is unquestionable for the other features while it enhances their other decisions.

Marble  Floor  Wood 1  Wood 2

Fusion Results

Used Classifiers:
• Statistical with 1- and 4-pixel distances
• Laws
• Wavelets, Curvelets and Gabor
• SVD

Recognition Rate for the Different Techniques
• Confidence: 84%
• Voting: 91.5%
• Decision templates: 93%

The above results may vary when different combinations of the implemented classifiers are used. And, as mentioned before, these fusion techniques are computationally expensive, yet the increase in the recognition rate over the single-classifier accuracy ranges from 2 to 11%.

Chapter 6: Conclusion and Future Work

Conclusions

The texture recognition system implemented with the mentioned methods is fully functional. The methods' performance was determined by 3 main factors: recognition rate (direct result), vector size (cost) and confusion (drawback). The best performances achieved by the chosen categories were very close, which means that their data representation and compression are similar to some extent. To improve those factors and to get the most accurate results, a new stage was added to the system (multiple classifier fusion) that leads to a significant improvement of the results.

Future Work

Searching for the optimum parameters (e.g. neighbor distance and orientation) for each analysis technique, as using the same parameters for all types of textures might not be the optimum solution.
Decision fusion using different types of classifiers, like K-Nearest Neighbor and Discriminant Function Analysis.
Exploring texture analysis techniques when used with colored images.
Expanding the analysis to motion textures.

References
[1] S. Lazebnik, C. Schmid, and J. Ponce, "A Sparse Texture Representation Using Local Affine Regions," IEEE Transactions on Pattern Analysis and Machine Intelligence.
[2] E. Davies, "Introduction to texture analysis," Machine Vision Group, Department of Physics, Royal Holloway, University of London.
[3] A. Materka and M. Strzelecki, "Texture Analysis Methods – A Review," Institute of Electronics, Technical University of Lodz.
[4] R. Parekh, "Improving Texture Recognition using Combined GLCM and Wavelet Features," School of Education Technology, Jadavpur University, Kolkata, India.
[5] J. Zhang and T. Tan, "Brief review of invariant texture analysis methods," National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences, Beijing, People's Republic of China.
[6] I. J. Sumana, M. M. Islam, D. Zhang, and G. Lu, "Content Based Image Retrieval Using Curvelet Transform," Gippsland School of Information Technology, Monash University, Churchill, Victoria 3842, Australia.
[7] M. Tuceryan, "Texture Analysis," Department of Computer and Information Science, Indiana University–Purdue University at Indianapolis, 723 W. Michigan St., Indianapolis, IN 46202-5132.
[8] M. Rachidi, A. Marchadier, C. Gadois, E. Lespessailles, C. Chappard, and C. L. Benhamou, "Laws' masks descriptors applied to bone texture analysis: an innovative and discriminant tool in osteoporosis."
[9] L. I. Kuncheva, "Switching Between Selection and Fusion in Combining Classifiers: An Experiment."
[10] "Introduction to Texture Analysis," Machine Vision Group, Department of Physics, Royal Holloway, University of London.
[11] X. Lladó Bardera, "Chapter 5: Texture recognition," Departament d'Electrònica, Informàtica i Automàtica, Universitat de Girona.
[12] García Crespi, Cuenca Asensi, Ibarra Pico, Lorenzo Quintanilla, and Morales Benavente, "A comparative study of texture analysis algorithms in textile inspection applications," Departamento de Tecnología Informática y Computación, Universidad de Alicante, Campus de San Vicente.
[13] Wen Yong, Yao-wei, and Yan-fei, "A regularized simultaneous autoregressive model for texture classification," Institute of Computing Technology, Chinese Academy of Sciences; Graduate School of the Chinese Academy of Sciences, Beijing.
[14] T. Kasparis, N. Tzannes, M. Bassiouni, and Q. Chen, "Texture description using fractal and energy features," Department of Electrical and Computer Engineering and Department of Computer Science, University of Central Florida, Orlando, FL 32816.
[15] J. Zhang and T. Tan, "Brief review of invariant texture analysis methods," National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences.

Appendix A: Development Tools and Environment

The project is implemented completely using MATLAB.

Used Toolboxes and Functions
1. Wavelet Toolbox: used for calculating the 2D decomposition of the images using Haar wavelets.
2. Statistics Toolbox: used for calculating the various statistics, such as the 2D mean and standard deviation, as well as higher-order statistics such as skewness, kurtosis, and entropy.
3. Gabor Filter Functions: a function shared at the MATLAB File Exchange central. It can be obtained from the following thread: http://www.mathworks.com/matlabcentral/fileexchange/5237
4. Neural Network Toolbox: used for training the feed-forward neural networks used for classification.
5. CurveLab Toolbox: a free toolbox for academic use only, obtained from http://www.curvelet.org/software.html. It is used to calculate the Curvelet decomposition for the given images.
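As a rough illustration of what the Wavelet Toolbox's single-level 2D Haar decomposition computes, the transform can also be written directly in NumPy. This is an independent sketch, not the toolbox implementation: the function name is made up, the sub-band labels follow one common convention, and normalization conventions may differ from MATLAB's dwt2.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2D Haar transform on an image with even
    dimensions: pairwise average/difference along rows, then along
    columns, giving an approximation band A and three half-resolution
    detail bands H, V, D."""
    img = np.asarray(img, dtype=float)
    # Row step: pairwise averages (low-pass) and differences (high-pass)
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    # Column step applied to both halves
    A = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)
    V = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)
    H = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)
    D = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)
    return A, H, V, D

A, H, V, D = haar_dwt2(np.arange(16.0).reshape(4, 4))
print(A.shape)  # (2, 2)
```

Feeding the approximation band A back into the function repeats the decomposition, which is how the three-level analysis used in this project is built.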

Appendix B: Sample MATLAB Functions and Scripts

FeatureExtract Function
This function takes a function handle to the desired feature-extraction function, which returns a feature vector for a given image, and outputs a mat file containing easily accessed feature matrices and corresponding target matrices which are ready for training and testing.

Function's code
function FeatureExtract(VectorSize, TrainSamples, FeatureFunction, OutFileName)
%% Data Container Initialization
NumberOfTextures = 20;                 % Number of Textures
FHandle = str2func(FeatureFunction);   % Feature Extracting Function
TrainCell = cell(NumberOfTextures, 1);
TestCell = cell(NumberOfTextures, 1);
[ TrainCell{:} ] = deal( zeros(VectorSize, TrainSamples) );
[ TestCell{:} ] = deal( zeros(VectorSize, 40 - TrainSamples) );
TrainTargets = zeros(NumberOfTextures, TrainSamples * NumberOfTextures);
TestTargets = zeros(NumberOfTextures, (40 - TrainSamples) * NumberOfTextures);

%% Feature Extraction
Exclude = [15 7 4 3 21];
counter = 1;
for t = 1:25 % NumberOfTextures
    if ~isempty(Exclude(Exclude == t)), continue; end
    Tno = num2str(t, '%02d');
    %% Train Samples
    for i = 1:TrainSamples
        I = imread(['Images/T' Tno '/T' Tno '_' num2str(i, '%02d') '.jpg']);
        TrainCell{counter}(:, i) = FHandle(I);
    end
    Loc = (counter - 1) * TrainSamples + 1;
    Target = zeros(NumberOfTextures, 1);
    Target(counter) = 1;
    TrainTargets(:, Loc : Loc + TrainSamples - 1) = repmat(Target, 1, TrainSamples);
    %% Test Samples
    for j = TrainSamples + 1 : 40
        I = imread(['Images/T' Tno '/T' Tno '_' num2str(j, '%02d') '.jpg']);
        TestCell{counter}(:, j - TrainSamples) = FHandle(I);
    end
    Loc = (counter - 1) * (40 - TrainSamples) + 1;
    Target = zeros(NumberOfTextures, 1);
    Target(counter) = 1;
    TestTargets(:, Loc : Loc + (40 - TrainSamples) - 1) = repmat(Target, 1, 40 - TrainSamples);
    disp(['Texture number ' Tno ' is done']);
    counter = counter + 1;
end
%% Saving The Data
TrainMat = [TrainCell{:}];
TestMat = [TestCell{:}];
save(OutFileName, 'TrainCell', 'TestCell', 'TrainMat', 'TestMat', 'TrainTargets', 'TestTargets');
end

WaveletFeature Function
This function takes an image and returns a feature vector of size 20.

Function's code
function v = WaveletFeature(I)
% Vector Size is 20
B = cell(3, 1); H = cell(3, 1); V = cell(3, 1); D = cell(3, 1);
[B{1}, H{1}, V{1}, D{1}] = dwt2(I, 'haar');
[B{2}, H{2}, V{2}, D{2}] = dwt2(B{1}, 'haar');
[B{3}, H{3}, V{3}, D{3}] = dwt2(B{2}, 'haar');
v = [mean2(B{3}); std2(B{3})];
for i = 1:3
    CH = cov(H{i}); DH = diag(CH); RH = CH ./ sqrt(DH * DH');
    CV = cov(V{i}); DV = diag(CV); RV = CV ./ sqrt(DV * DV');
    CD = cov(D{i}); DD = diag(CD); RD = CD ./ sqrt(DD * DD');
    v = [v; mean2(RH); mean2(RV); mean2(RD); std2(H{i}); std2(V{i}); std2(D{i})];
end
end

SVDFeature Function
This function takes an image and returns a feature vector of size 125.

Function's code
function v = SVDFeature(I)
% Vector Size is 125
Id = double(I);
[Row, Col] = size(I);
RowDiv = 5;
ColDiv = 5;
C = mat2cell(Id, Row/RowDiv * ones(1, RowDiv), Col/ColDiv * ones(1, ColDiv));
v = [];
for i = 1:ColDiv
    for j = 1:RowDiv
        d = svd(C{i, j});
        v = [v; d(1:5)];
    end
end
end
