
Pattern Anal Applic (2013) 16:69–81

DOI 10.1007/s10044-011-0207-0

THEORETICAL ADVANCES

Rotation invariant features for color texture classification and retrieval under varying illumination

B. Sathyabama · M. Anitha · S. Raju · V. Abhaikumar

Received: 27 April 2010 / Accepted: 3 March 2011 / Published online: 22 March 2011
© Springer-Verlag London Limited 2011

Abstract This article proposes a new quaternion-based method for rotation invariant color texture classification under illumination variance with respect to direction and spectral band. The color of an object varies according to the spectral power distribution, object illumination, and viewing geometry of the light source. The quaternion representation of color is shown to be effective: it treats the color channels as a single unit rather than as separate components. New texture signatures are extracted by calculating the norm of the quaternion Fourier spectrum. These signatures are proved to be invariant under both image rotation and illumination rotation. Moreover, the features are also invariant to the choice of color space. The robustness of different color spaces against varying illumination is examined in color texture classification with 45 samples of 15 Outex texture classes. Comparative results show that the proposed method is efficient in rotation invariant texture classification.

Index terms Quaternion · Quaternion Fourier transform · Rotation invariant color texture classification

Electronic supplementary material The online version of this article (doi:10.1007/s10044-011-0207-0) contains supplementary material, which is available to authorized users.

B. Sathyabama (corresponding author) · M. Anitha · S. Raju · V. Abhaikumar
Electronics and Communication Engineering, Thiagarajar College of Engineering, Thiruparankundram, India
e-mail: sbece@tce.edu
URL: http://www.tce.edu/

M. Anitha
e-mail: mmanitha2005@gmail.com

S. Raju
e-mail: rajuabhai@tce.edu

1 Introduction

The escalating growth of computer vision applications has increased the need for faster and more accurate image analysis algorithms. One application of image analysis that has been studied for a long time is texture analysis [1]. Texture classification and retrieval are subsets of texture analysis that are particularly well suited to the automatic grading of products such as ceramic tiles, textiles, foliage, wood, and granules. Early methods for texture classification focused on statistical, structural, model-based, and signal-processing analysis of texture images; representative methods include the co-occurrence matrix method [2], morphology-based approaches [3], Markov random fields, and wavelet transforms [4]. In general, their classification results are good as long as the training and test samples have identical or similar orientations [5]. However, the rotations of real-world textures vary arbitrarily, severely affecting the performance of the former methods and suggesting the need for rotation invariant methods of texture classification [6].

Rotation invariant texture analysis can be obtained either by learning rotation invariance during a training phase or by extracting rotation invariant features from the input textures. Kashyap and Khotanzad [7] were among the first researchers to study rotation-invariant texture classification, using a circular autoregressive model. Subsequently, the multiresolution autoregressive model, hidden Markov model, Gaussian Markov random field model, and autocorrelation model have been proposed by various researchers. Many Gabor and wavelet-based algorithms have also been proposed for rotation invariant texture classification. However, such methods do not consider the effect of rotation of the illuminant direction.


The direction of illumination has a binding influence on the directional characteristics of texture images: the image-forming process under directed illumination acts as a directional filter of the texture. This implies that the directional properties of a texture are not intrinsic to the surface, but are considerably affected by variation in the illuminant tilt. The majority of texture feature sets used in classification and retrieval exploit these directional characteristics, yet among the work reported on texture classification only a few studies include the effect of illumination, and those only for grayscale images [8].

However, the color of the image can aid the process of texture classification and retrieval [9]. The perceived color of a surface is determined not only by the color of the surface but also by the color of the light, and changes in the angle of illumination incident upon a surface texture can also significantly alter its color characteristics.

A comprehensive review of the techniques which deal with color texture can be found in [10, 11]. They can be classified into three groups: parallel, sequential, and integrative [12]. Parallel approaches consider texture and color as separate phenomena. In sequential approaches, the first step is to apply a color indexing method to the original color images; the indexed images can then be processed as grayscale textures. In this framework the co-occurrence matrix has received a great deal of attention, due to its straightforward extension to indexed images [13]. Integrative models are based on the spatial relationship of pixels; these approaches [14] can further be subdivided into single-band, if the data from each channel are considered separately, or multiple-band, if two or more channels are considered jointly.

Multiple-band approaches have also been obtained as extensions of classical grayscale texture descriptors such as wavelets, Gabor filters, co-occurrence matrices, the log-polar transform, local binary patterns, and Markov random fields. Color information is vital for achieving illumination invariance [15]. In the above-mentioned methods, classical grayscale texture descriptors are applied to each color channel separately, which in turn results in loss of information [16].

This article proposes a quaternion-based rotation invariant classification method for color textures under illumination changes. Section 2 presents the overview of the proposed rotation invariant texture classification scheme, Sect. 3 reports the texture classification and retrieval experimental results, and Sect. 4 concludes the article.

2 Rotation invariant texture classification

The method presented here relies on two building blocks: a rotation and illuminant invariant classification and retrieval of color textures. Figure 1 shows the complete rotation invariant texture classification scheme using quaternions. A color image pixel is converted to a quaternion pixel by placing the three components (such as the red, green, and blue components in the case of RGB images) into the three imaginary parts of the quaternion, leaving the real part zero. The Fourier spectrum of the resulting quaternion image is obtained using the QFFT. From it a radial plot is computed, and the rotation invariant texture signature, the peak distribution norm vector (PDNV), is extracted for both the rotated and unrotated textures. Classification and retrieval results are then obtained by calculating the sum of square distance (SSD) between the PDNVs of training and test textures.

Fig. 1 Texture classification using quaternions (input image → quaternion formation → Fourier spectrum using QFFT → radial plot → rotation invariant texture signatures → classifier)

2.1 Illumination

The image intensity I(m, n) of a surface is represented using Lambert's law as

I(m, n) = n · L    (1)

where n is the unit vector normal to the surface at the point (m, n) and

L = (cos τ sin σ, sin τ sin σ, cos σ)    (2)

is the normalized illumination vector toward the light source. Here, τ and σ are the tilt and slant angles, respectively, both measured with respect to the illumination as shown in Fig. 2. The amount of illumination depends on the angle of rotation of the illumination source. Thus, an image-forming process using directed illumination acts as a directional filter of the texture image, which can be measured as an angular variation. Such an effect is likely to have important implications for texture classification schemes.

Fig. 2 Definition of axis and illumination angles
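To make Eqs. (1) and (2) concrete, the short Python sketch below (an illustration only, not the authors' code; the random height map, patch size, and angle values are hypothetical) renders a synthetic Lambertian surface under two illuminant tilt angles, showing the directional-filter effect described above.

```python
import numpy as np

def illumination_vector(tilt, slant):
    """Normalized illumination vector L of Eq. (2); angles in radians."""
    return np.array([np.cos(tilt) * np.sin(slant),
                     np.sin(tilt) * np.sin(slant),
                     np.cos(slant)])

def lambert_intensity(normals, tilt, slant):
    """Eq. (1): I(m, n) = n . L for a field of unit surface normals.

    normals: array of shape (M, N, 3) holding unit normal vectors.
    """
    L = illumination_vector(tilt, slant)
    return np.clip(normals @ L, 0.0, None)   # clamp self-shadowed (negative) values

if __name__ == "__main__":
    # Hypothetical bumpy surface: normals derived from a random height map.
    rng = np.random.default_rng(0)
    height = rng.standard_normal((64, 64))
    gy, gx = np.gradient(height)
    normals = np.dstack([-gx, -gy, np.ones_like(height)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    # Same surface, two illuminant tilt angles: the rendered texture differs,
    # which is the directional-filter effect described in Sect. 2.1.
    img_0 = lambert_intensity(normals, tilt=np.deg2rad(0), slant=np.deg2rad(60))
    img_90 = lambert_intensity(normals, tilt=np.deg2rad(90), slant=np.deg2rad(60))
    print(img_0.mean(), img_90.mean())
```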
2.2 Role of illumination in color texture

Color is an important aspect of any object recognition scheme, as color changes considerably with variation in illumination, object pose, and camera viewpoint. It provides a highly discriminative information cue that is robust against unreliable imaging conditions.

Changes in environment illumination usually alter the images acquired by the camera, causing colors to appear different; this is known as the color constancy problem [17]. Generally the color components (R, G, and B, for example) are handled as separate monochromatic images. This is satisfactory for linear filtering operations, but it is not adequate for more sophisticated image analysis involving correlation between images which may have been captured under different illumination conditions. The spectral stability of illumination is an important aspect of color machine vision: the physical properties of lamps may change the spectrum of the illumination over time [18]. The reflection from each color channel varies with the change in illumination; therefore, taking illumination changes into account while defining the rotation invariant feature for color texture gives significant gains in both classification and retrieval.

2.3 Quaternion for color

Quaternions have been popular in mathematics, physics, and engineering for decades [19]. Using quaternions to represent colors has recently been proposed and studied [20]. Color-sensitive filtering can be achieved using a quaternion-valued filter [21], and two color edge detection methods have been defined on quaternions [22]: one is based on chromaticity cancellation, which generates a gray color for non-boundary regions; the other is based on estimation of the homogeneity of regions. Quaternion-based trilateral filtering has been proposed that locally adapts color and changes the shape of the filter to smooth colors while preserving edges. Furthermore, quaternion Fourier analysis has been implemented efficiently by decomposing it into a complex Fourier equivalent, which allows convolution and cross correlation on quaternion-valued images in the frequency domain [23]. Quaternion texture analysis, quaternion singular value decomposition, and quaternion principal component analysis have also been implemented and applied to several applications, such as the segmentation of color images [24].

The quaternion representation of color has been shown to be effective in the context of color texture analysis in digital color images [25]. The advantage of using quaternion arithmetic is that a color can be represented and analyzed as a single entity; a color texture sample projected onto a subset of a quaternion basis provides a concise description of the texture, so quaternions provide a new way to process color and texture in combination. The quaternion representation of a color image I(m, n), as proposed by Sangwine, combines a color 3-tuple (RGB or LMS) of I(m, n) into a single hypercomplex number as

Iq(m, n) = R(m, n) i + G(m, n) j + B(m, n) k    (3)

A color can then be processed as a unit, rather than as three separate channels. The three color channels of each pixel are converted into a quaternion pixel by placing the three components into the three imaginary parts of the quaternion, leaving the real part zero and making it a purely imaginary (pure) quaternion.
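As an illustration of Eq. (3), the following sketch packs an RGB image into an array of pure quaternions. The (M, N, 4) array layout with the real component in channel 0 is simply one convenient convention for illustration, not something prescribed by the paper.

```python
import numpy as np

def to_quaternion_image(rgb):
    """Pack a color image into pure quaternions, Eq. (3):
    Iq(m, n) = 0 + R(m, n) i + G(m, n) j + B(m, n) k.

    rgb: float array of shape (M, N, 3); returns shape (M, N, 4) with the
    real component in channel 0 set to zero.
    """
    q = np.zeros(rgb.shape[:2] + (4,), dtype=float)
    q[..., 1:] = rgb
    return q

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    rgb = rng.random((4, 4, 3))                  # stand-in for a color texture patch
    iq = to_quaternion_image(rgb)
    print(iq.shape, np.allclose(iq[..., 0], 0))  # (4, 4, 4) True
```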
2.4 Rotating with quaternions

The rotation operator for any vector, for a rotation by θ around a unit direction n, can be represented by

Rθ = (cos(θ/2), n sin(θ/2))    (4)

Here, the 3D vector (0, Iq) rotated by θ can be written as

R[Iq] = Rθ⁻¹ Iq Rθ    (5)

since

R[Iq] = (cos(θ/2), −n sin(θ/2)) (0, Iq) (cos(θ/2), n sin(θ/2))
      = (0, Iq (cos²(θ/2) − sin²(θ/2)) + 2 n (n · Iq) sin²(θ/2) + 2 (n × Iq) cos(θ/2) sin(θ/2))
      = (0, Iq cos θ + n (n · Iq)(1 − cos θ) + (n × Iq) sin θ)

This form of rotation representation is used here to cover both illumination changes and image rotation. Illumination changes appear as angular variations θ1, as described in Sect. 2.1; apart from this, the rotation of the texture image itself is also measured as an angular variation.
This provides a pair of angular changes (θ1 and θ2), where θ1 represents the illumination rotation and θ2 represents the image rotation.

According to the illumination change, the vector Iq rotated by θ1 is represented as

Rθ1⁻¹ Iq Rθ1    (6)

where Rθ1 = (cos(θ1/2), n sin(θ1/2)), and a further rotation by θ2 is represented as

Rθ2⁻¹ (Rθ1⁻¹ Iq Rθ1) Rθ2    (7)

where Rθ2 = (cos(θ2/2), n sin(θ2/2)). This can be rewritten as

Rθ2⁻¹ (Rθ1⁻¹ Iq Rθ1) Rθ2 = (Rθ1 Rθ2)⁻¹ Iq (Rθ1 Rθ2)

with

Rθ1 Rθ2 = (cos(θ1/2), n sin(θ1/2)) (cos(θ2/2), n sin(θ2/2))
        = (cos(θ1/2) cos(θ2/2) − sin(θ1/2) sin(θ2/2), n (sin(θ1/2) cos(θ2/2) + cos(θ1/2) sin(θ2/2)))
        = (cos((θ1 + θ2)/2), n sin((θ1 + θ2)/2))    (8)

From the above equation it is inferred that rotating any quaternion by θ1 and then by θ2 is equivalent to a single rotation by θ1 + θ2. Hence any rotation invariant texture signature in the quaternion domain will be invariant to both orientation and illumination changes.
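The composition property of Eq. (8) can be checked numerically. The sketch below (illustrative only; the gray-axis choice, the angles, and the sample color are made up) implements the Hamilton product and the sandwich rotation of Eqs. (4) and (5), and verifies that rotating a color pixel by θ1 and then θ2 about the same axis equals a single rotation by θ1 + θ2.

```python
import numpy as np

def qmult(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(a):
    """Quaternion conjugate; equal to the inverse for a unit quaternion."""
    return np.array([a[0], -a[1], -a[2], -a[3]])

def rotor(theta, n):
    """R_theta of Eq. (4): (cos(theta/2), n sin(theta/2)) for a unit axis n."""
    n = np.asarray(n, dtype=float)
    n /= np.linalg.norm(n)
    return np.concatenate([[np.cos(theta / 2)], np.sin(theta / 2) * n])

def rotate(q, R):
    """Sandwich product R^-1 q R for a unit rotor R, as in Eq. (5)."""
    return qmult(qconj(R), qmult(q, R))

if __name__ == "__main__":
    axis = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # hypothetical common axis
    t1, t2 = 0.4, 0.9                               # illumination and image rotation angles
    color = np.array([0.0, 0.8, 0.3, 0.1])          # pure quaternion pixel (0, R, G, B)

    # Rotating by t1 and then t2 equals a single rotation by t1 + t2 (Eq. 8).
    twice = rotate(rotate(color, rotor(t1, axis)), rotor(t2, axis))
    once = rotate(color, rotor(t1 + t2, axis))
    print(np.allclose(twice, once))                 # True
```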
2.5 Quaternion Fourier transform

The QFT plays a vital role in the representation of signals and images. The idea of computing the quaternion Fourier transform (QFT) of a color image has only recently been realized. It is possible, of course, to separate a color image into three scalar images and compute the Fourier transforms of these images separately, but this study is concerned with the computation of a single, holistic Fourier transform which treats a color image as a vector field. Such a transform cannot be a single complex transform, since color images have inherently 3-D pixels (regardless of the color space in which they are represented).

The QFT was first proposed by Ell [26], who also introduced its use in the analysis of 2D linear time-invariant dynamic systems. Later, an extended investigation of important properties of the two-sided QFT, mainly for real signals, was carried out and applied to signal and image processing. The first application of a QFT to color images was reported in 1996 by Sangwine, using a discrete version of Ell's transform.

The discrete quaternion Fourier transform of an image Iq(m, n), based on Ell's formula, is

Fq(u, v) = (1/√(MN)) Σ_{m=0}^{M−1} Σ_{n=0}^{N−1} e^{−μ2π(mu/M + nv/N)} Iq(m, n)    (9)

where the discrete array Iq(m, n) is of dimension M × N. The inverse transform is

Iq(m, n) = (1/√(MN)) Σ_{u=0}^{M−1} Σ_{v=0}^{N−1} e^{μ2π(mu/M + nv/N)} Fq(u, v)    (10)

It transforms an image into a quaternion-valued frequency-domain signal. Here the fast algorithm of the quaternion Fourier transform (QFFT) has been employed.

An important point is that the QFT depends on the definition of μ, which is any unit pure quaternion [23]. Such a transform can easily be computed using classical Fourier transforms and a symplectic decomposition. The QFT is separable, and may thus be evaluated by 1D summation over the rows and columns of the input array, or by a fast algorithm.

Hypercomplex Fourier transforms have shown their applicability by allowing many image processing techniques that depend on Fourier transforms to be generalized to color. In particular, using the more recent transform definition, hypercomplex auto- and cross-correlation and vector phase correlation have been proposed. Pei et al. [28] published work on the efficient computation of quaternion Fourier transforms and linear quaternion filters. Bülow and Sommer [16] published work on hypercomplex Fourier transforms applied to grayscale images; they utilized a hypercomplex Fourier transform because of its symmetry properties when applied to real-valued images, but did not consider its application to invariant texture analysis of color images.
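A minimal sketch of the QFFT follows, assuming the transform axis μ = i; in that special case the Cayley-Dickson split q = (w + x i) + (y + z i) j reduces the left-sided QFT of Eq. (9) to two ordinary complex 2-D FFTs (a general axis would additionally require a change of basis, which is omitted here). This is one possible reading of the symplectic-decomposition idea mentioned above, not the authors' implementation.

```python
import numpy as np

def qfft_axis_i(q):
    """Left-sided discrete QFT of a quaternion image (Eq. 9) for the special
    case in which the transform axis is mu = i.

    q: array of shape (M, N, 4) holding (w, x, y, z) components.
    Returns an array of the same shape holding the quaternion spectrum.

    Writing q = (w + x i) + (y + z i) j, the left-sided factor
    exp(-i 2 pi (m u / M + n v / N)) acts on each complex part independently,
    so the QFT reduces to two ordinary 2-D FFTs.
    """
    simplex = q[..., 0] + 1j * q[..., 1]     # w + x i
    perplex = q[..., 2] + 1j * q[..., 3]     # y + z i   (coefficient of j)
    M, N = q.shape[:2]
    norm = np.sqrt(M * N)                    # unitary normalisation, as in Eq. (9)
    Fs = np.fft.fft2(simplex) / norm
    Fp = np.fft.fft2(perplex) / norm
    return np.stack([Fs.real, Fs.imag, Fp.real, Fp.imag], axis=-1)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    rgb = rng.random((8, 8, 3))
    q = np.zeros((8, 8, 4))
    q[..., 1:] = rgb                         # pure quaternion image, Eq. (3)
    Fq = qfft_axis_i(q)
    # Parseval-style check: the unitary normalisation preserves energy.
    print(np.isclose((q ** 2).sum(), (Fq ** 2).sum()))   # True
```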

2.6 Rotation invariant texture signatures

Fig. 3 Radial plot computation from the QFFT spectrum (concentric frequency annuli F(0), F(1), F(2), ..., F(n) around the spectrum origin)

The rotation invariant texture signatures are extracted from the radial plot of the Fourier spectrum. The radial plot
is calculated by integrating all the contributions at each radial frequency in the Fourier spectrum, as shown in Fig. 3. The peak distribution norm vector (PDNV), computed from the radial plot, provides a set of rotation invariant texture signatures.

The norm is taken of the quaternion Fourier spectrum. The Fourier transform of the quaternion image has both real and imaginary components; however, the norm of the quaternion Fourier spectrum is always a real quantity.

The norm N(Fq) is a real-valued function, and the norm of a product of quaternions is the product of their norms, with N(Rθ⁻¹) = N(Rθ) = 1 for a unit rotation quaternion. Therefore

N(R[Fq]) = N(Rθ⁻¹ Fq Rθ) = N(Rθ⁻¹) N(Fq) N(Rθ) = N(Rθ) N(Fq) N(Rθ) = N(Fq)    (11)

The norm of the unrotated spectrum Fq thus equals the norm of its rotated version. The energy (amplitude), here the norm, of an image should be the same irrespective of the orientation of the image, and any transform applied to the image should essentially be able to demonstrate that. This property of rotation invariance is achieved by the PDNV signatures.
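The sketch below illustrates how a radial plot and a PDNV-style signature could be computed from the quaternion spectrum. The exact construction of the PDNV is not spelled out in this excerpt, so the unit-norm scaling of the radial plot used here is an assumption made purely for illustration.

```python
import numpy as np

def radial_plot(Fq, n_bins=50):
    """Radial plot of a quaternion spectrum: the norm |Fq(u, v)| summed over
    the annulus at each radial frequency (Fig. 3).

    Fq: array of shape (M, N, 4) as returned by the QFFT.
    """
    mag = np.sqrt((Fq ** 2).sum(axis=-1))        # quaternion norm, a real quantity
    mag = np.fft.fftshift(mag)                   # put the DC term at the centre
    M, N = mag.shape
    v, u = np.indices((M, N))
    r = np.hypot(v - M // 2, u - N // 2).astype(int)
    plot = np.zeros(n_bins)
    for k in range(n_bins):
        plot[k] = mag[r == k].sum()
    return plot

def pdnv(Fq, n_bins=50):
    """Peak distribution norm vector; assumed here to be the radial plot
    scaled to unit Euclidean norm (an illustrative choice)."""
    p = radial_plot(Fq, n_bins)
    return p / (np.linalg.norm(p) + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    Fq = rng.standard_normal((64, 64, 4))        # stand-in for a QFFT spectrum
    print(pdnv(Fq)[:5])
```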
3 Experiment and results

To demonstrate the significance of the proposed approach, a set of texture classification and retrieval experiments has been conducted. In these experiments, textures under a set of different illuminations are taken from the Outex database. Each texture in the database is captured using three different simulated illuminants provided in the light source: 2300 K horizon sunlight, denoted 'horizon'; 2856 K incandescent CIE A, denoted 'inca'; and 4000 K fluorescent TL84, denoted 'tl84', as shown in Fig. 4.

Fig. 4 Texture (seeds005) under three different illuminations: horizon, inca, and tl84

All the textures used have orientations of 30°, 45°, 60°, 90°, 135°, 180°, 225°, 270°, 315°, and 360°. The size of the original images is 746 × 538 pixels; for the experiment, samples of size 100 × 100 are taken. The authors have used fifteen texture samples from this database, shown in Fig. 5.

Fig. 5 Input datasets

3.1 Presentation of experimental results

This approach to texture classification uses the peak distribution norm vector (PDNV) of the quaternion, derived from the radial plot, as the rotation invariant texture signature. These rotation invariant texture signatures are compared with those of the training classes using the sum of square distance (SSD) to provide rotation invariant texture classification. Results for each of these stages are given below.
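A minimal sketch of the matching step follows: each test signature is assigned to the training texture with the smallest SSD, which is the matching rule used throughout Sect. 3. The toy signatures and class names in the example are hypothetical.

```python
import numpy as np

def ssd(a, b):
    """Sum of square distances between two PDNV signatures."""
    d = np.asarray(a) - np.asarray(b)
    return float((d * d).sum())

def classify(test_sig, train_sigs):
    """Assign the test texture to the training class with minimum SSD.

    train_sigs: dict mapping class name -> PDNV signature.
    Returns the best class name and all distances.
    """
    distances = {name: ssd(test_sig, sig) for name, sig in train_sigs.items()}
    best = min(distances, key=distances.get)
    return best, distances

if __name__ == "__main__":
    # Toy signatures standing in for PDNVs of three training textures.
    rng = np.random.default_rng(3)
    train = {"canvas": rng.random(50), "seeds": rng.random(50), "wood": rng.random(50)}
    test = train["seeds"] + 0.01 * rng.standard_normal(50)   # noisy rotated sample
    label, dists = classify(test, train)
    print(label)   # seeds
```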
3.2 Rotation invariant texture signatures

The radial plots computed for the sample textures are given in Fig. 6; they are insensitive to rotation and illumination [8]. This is the important characteristic of the radial plot that makes it a suitable feature for rotation invariant color texture classification. The radial plot has high energy content near the origin and progressively lower values at higher frequencies. The energy of an image should be the same irrespective of its orientation; this is achieved by calculating the peak distribution norm vector of the quaternion radial plot. In the following, the proposed quaternion approach is compared with the existing methods.

Fig. 6 Input texture (Granular002), spectrum and radial plot

Fig. 7 a Classification results for outex textures. b Classification results for vistex textures
Table 1 Sum of square distance for various textures using quaternion

Columns (test textures, left to right): Barleyrice002-tl84-100dpi-00, Canvas006-tl84-100dpi-00, Wallpaper020-tl84-100dpi-00, Granular002-tl84-100dpi-00, Carpet006-tl84-100dpi-00, Seeds013-tl84-100dpi-00, Flakes010-tl84-100dpi-00, Seeds005-tl84-100dpi-00, Wood012-tl84-100dpi-00, Tile004-tl84-100dpi-00, Canvas041-tl84-100dpi-00, Chips023-tl84-100dpi-00, Canvas040-tl84-100dpi-00, Cardboard001-inca-100dpi-00, Gravel001-tl84-100dpi-00

Barleyrice002-horizon-100dpi-00: 1.8451 44.9666 42.2301 23.6765 21.0735 32.7590 13.0404 9.9124 8.2235 3.8830 16.5778 16.2075 11.9839 22.0712 29.8205
Canvas006-horizon-100dpi-00: 46.5737 1.3226 88.1373 47.5374 25.3273 13.5672 33.3633 36.5057 38.3605 46.6691 29.9790 30.1407 34.3896 24.4449 16.7138
Wallpaper020-horizon-100dpi-00: 37.7514 82.7944 4.3610 14.7236 59.1359 70.5363 50.8694 47.7224 45.8398 37.4491 54.1113 54.0464 49.8281 59.6415 67.3939
Granular002-horizon-100dpi-00: 17.0340 62.2539 24.7368 6.3102 38.4872 49.9979 30.2290 27.0804 25.2002 16.9754 33.6116 33.4266 29.2054 39.1566 46.9345
Carpet006-horizon-100dpi-00: 24.5188 20.7815 66.0721 47.4073 4.1956 8.5289 11.3437 14.4618 16.2998 24.6053 7.9313 8.1023 12.3443 5.5306 5.6014
Seeds013-horizon-100dpi-00: 30.6594 14.6136 72.2726 53.6083 9.5295 2.5311 17.4575 20.5941 22.4586 30.8101 14.1449 14.2306 18.4816 8.6955 5.2824
Flakes010-horizon-100dpi-00: 17.5212 27.7906 59.0685 40.4018 4.8947 15.5388 4.4277 7.4868 9.3025 17.6040 1.2433 1.3635 5.4222 9.0320 12.5141
Seeds005-horizon-100dpi-00: 9.3070 35.9897 50.9133 32.2505 12.3987 23.7470 4.0025 1.0056 1.1559 9.5237 7.4660 7.1724 3.1130 17.2460 20.7308
Wood012-horizon-100dpi-00: 9.3332 36.1394 50.7207 32.0589 12.6557 23.8749 4.5132 1.9787 1.6970 9.2868 7.4776 7.4315 3.3806 17.3202 20.8011
Tile004-horizon-100dpi-00: 4.3245 41.3742 45.4834 26.8235 17.8058 29.1168 9.4840 6.3722 4.5137 4.0855 12.7013 12.6129 8.4499 22.5473 26.0281
Canvas041-horizon-100dpi-00: 17.5007 27.8143 59.1814 40.5272 4.2524 15.6106 4.4778 7.5153 9.3727 17.8123 2.6239 1.4401 5.3930 9.3832 12.8014
Chips023-horizon-100dpi-00: 20.1027 25.7185 61.6626 43.0445 2.1686 13.6862 7.6980 10.4450 12.2332 20.5120 5.4551 4.8279 8.1168 8.1835 11.2760
Canvas040-horizon-100dpi-00: 11.9839 34.3896 49.8281 29.2054 12.3443 18.4816 2.2794 3.1130 3.3806 8.4499 4.9064 14.1737 1.6270 14.7466 18.2296
Cardboard001-horizon-100dpi-00: 25.1338 20.3433 66.5379 47.8850 5.5631 8.1601 12.0278 15.1126 16.9074 25.0885 8.4365 8.8500 13.0477 2.9238 4.9176
Gravel001-horizon-100dpi-00: 28.7547 16.7456 70.1625 51.5117 8.4336 4.6882 15.6169 18.7198 20.5251 28.7172 12.0597 12.4158 16.6384 2.2371 1.3062

Bold values in the original table indicate where the minimum sum of square distance is achieved for the corresponding texture.
Table 2 Sum of square distance for various textures using Gray-FFT

Columns (test textures, left to right): Barleyrice002-tl84-100dpi-00, Canvas006-tl84-100dpi-00, Wallpaper020-tl84-100dpi-00, Granular002-tl84-100dpi-00, Carpet006-tl84-100dpi-00, Seeds013-tl84-100dpi-00, Flakes010-tl84-100dpi-00, Seeds005-tl84-100dpi-00, Wood012-tl84-100dpi-00, Tile004-tl84-100dpi-00, Canvas041-tl84-100dpi-00, Chips023-tl84-100dpi-00, Canvas040-tl84-100dpi-00, Cardboard001-inca-100dpi-00, Gravel001-tl84-100dpi-00

Barleyrice002-horizon-100dpi-00: 3.3236 34.7059 39.5061 44.7120 28.6726 49.6962 7.6014 18.1942 36.0083 31.6246 23.4432 41.7469 19.4364 55.0926 63.1888
Canvas006-horizon-100dpi-00: 48.2927 14.2829 42.7792 47.5374 33.7857 20.9040 39.0053 36.7314 27.9759 34.9953 22.2183 32.0609 27.9019 32.2314 32.0984
Wallpaper020-horizon-100dpi-00: 46.5173 36.2853 5.3312 8.8985 19.6179 33.8468 40.5580 30.5097 21.6926 15.9808 11.0473 20.0775 15.7130 26.6862 37.8403
Granular002-horizon-100dpi-00: 53.4243 37.4511 16.2279 8.6043 27.6116 27.3410 46.3936 33.6194 20.0348 23.8916 22.0800 12.3964 24.9384 22.8817 34.4456
Carpet006-horizon-100dpi-00: 38.9488 9.9494 23.3460 31.0602 13.7115 20.1727 29.8295 25.7425 15.0221 14.7895 9.6425 24.3856 18.7221 21.9262 27.3719
Seeds013-horizon-100dpi-00: 52.2966 20.5703 32.7696 33.9466 30.4654 5.7161 42.8758 34.1076 17.9851 29.4409 28.9808 21.4109 33.6597 19.7265 23.8152
Flakes010-horizon-100dpi-00: 15.1357 27.5866 27.7796 33.0922 17.7122 39.9873 9.6892 9.3036 25.1015 20.2225 19.4075 38.5676 16.7836 44.0590 52.9523
Seeds005-horizon-100dpi-00: 29.8353 22.5573 22.6522 24.1342 17.6091 26.6836 22.1626 8.8806 12.3286 17.6052 13.3467 29.5192 7.1939 33.4568 43.1340
Wood012-horizon-100dpi-00: 61.5482 35.3804 25.0058 23.0410 32.5984 19.7091 53.0735 41.8297 22.9903 29.3607 17.4440 13.8236 23.0545 11.4350 20.5776
Tile004-horizon-100dpi-00: 38.9722 21.4896 10.1671 19.1277 7.5043 24.4835 31.1132 23.3582 12.1891 4.5189 7.5916 22.5624 16.4300 21.0630 30.6617
Canvas041-horizon-100dpi-00: 15.7228 39.5132 40.6489 42.4328 26.1709 42.7994 7.0985 15.8759 44.0617 27.3030 14.2205 34.0452 16.0496 31.7312 40.4696
Chips023-horizon-100dpi-00: 29.1740 35.7821 23.6551 20.5696 10.8815 28.4518 26.8416 11.8982 26.2054 16.0578 14.1737 8.6264 15.3742 26.4597 38.2907
Canvas040-horizon-100dpi-00: 41.7469 32.0609 20.0775 12.3964 24.3856 21.4109 7.0104 29.5192 13.8236 22.5624 10.9685 28.0689 7.4445 33.1173 42.0001
Cardboard001-horizon-100dpi-00: 57.5478 27.9546 29.3124 28.6199 31.7947 9.1131 48.4534 37.9409 19.2160 29.5726 38.9047 23.8358 38.8723 13.6327 20.1170
Gravel001-horizon-100dpi-00: 69.4027 34.5136 40.4730 43.4081 41.2330 22.2022 60.0088 52.8653 34.2493 39.6674 49.5201 40.8989 52.1863 15.0108 5.8563

Bold values in the original table indicate where the minimum sum of square distance is achieved for the corresponding texture.
3.3 Texture classification

Many texture classification approaches presented in the past are image-rotation invariant but do not consider illumination changes. In the proposed approach, the rotation invariant texture signatures are extracted for the unrotated and rotated versions of the texture images while taking illumination effects into account. Then, the sum of square distance (SSD) between the peak distribution norm vectors of the training and test textures is calculated.

In order to prove the efficiency of the proposed method in classifying textures, methods such as Gray FFT, Log Polar, and Gabor have been included for comparison. The classification results for each texture at ten different angles of rotation are shown in Fig. 7. The sum of square distance (SSD) for each texture is compared and tested against all the textures in the Outex and Vistex databases separately.

The SSD computed using the proposed method for the Outex database is tabulated in Table 1, and the results using Gray-FFT are tabulated in Table 2. For the texture barleyrice002, the classifier was trained using the 30° rotation and tested against all the textures with different rotations in the database. Using Gray-FFT an SSD of 3.3236 is obtained, whereas the proposed QFFT approach attains 1.8451. Using Gray-FFT, five textures obtained their minimum sum of square distance with a different texture (misclassification); this is reduced to three with the QFFT method.

For the selected 15 textures of the Outex data set, the proposed method yields a classification accuracy of 97.46%, while 92.07, 83.55, and 76.46% are achieved by the Gray FFT, Log Polar, and Gabor methods, respectively, as can be seen in Fig. 7a. Similarly, for the selected Vistex textures, classification accuracies of 91.09, 86.56, 81.56, and 77.21% are obtained for the proposed QFFT, Gray FFT, Log Polar, and Gabor methods, respectively, as shown in Fig. 7b.

3.4 Retrieval results

Figures 8 and 9 show the results of correct retrievals and retrieval efficiency for a variety of methods. Many different methods for measuring the performance of a retrieval system have been created and used by researchers; the most common evaluation measures, namely precision and recall, usually presented as a precision-versus-recall graph, are used here. Precision and recall alone contain insufficient information: recall can always be made 1 simply by retrieving all images, and precision can similarly be kept high by retrieving only a few images, so precision and recall should either be used together or the number of images retrieved should be specified. Thus three graphs (Precision versus number of retrievals, Recall versus number of retrievals, and Precision versus Recall) have been plotted for the conventional GLCM [2, 11], moment invariants [29], Gray FFT methods, and the proposed QFFT-based method.

Fig. 8 a Precision graph. b Recall graph. c Retrieval efficiency graph (proposed QFT compared with Gray-FFT, moment-based, and color-GLCM methods versus the number of retrievals)
As can be seen in Fig. 8a, b, the precision and recall values of the proposed method for different numbers of retrievals are greater than those of the conventional methods.

Precision = (No. of relevant images retrieved) / (Total no. of images retrieved)

Recall = (No. of relevant images retrieved) / (Total no. of relevant images in the database)

Apart from precision and recall, the other measures of performance evaluation are the error rate and the retrieval efficiency. These are defined as follows:

Retrieval efficiency = (No. of relevant images retrieved) / (Total no. of images retrieved), if the number of images retrieved is less than or equal to the number of relevant images; otherwise, Retrieval efficiency = (No. of relevant images retrieved) / (Total no. of relevant images)

Error rate = (No. of non-relevant images retrieved) / (Total no. of images retrieved)
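The four measures defined above translate directly into code. The following sketch follows the piecewise definition of retrieval efficiency given above; the retrieval counts in the example are hypothetical.

```python
def precision(n_relevant_retrieved, n_retrieved):
    """Fraction of retrieved images that are relevant."""
    return n_relevant_retrieved / n_retrieved

def recall(n_relevant_retrieved, n_relevant_total):
    """Fraction of all relevant images in the database that were retrieved."""
    return n_relevant_retrieved / n_relevant_total

def error_rate(n_relevant_retrieved, n_retrieved):
    """Fraction of retrieved images that are non-relevant."""
    return (n_retrieved - n_relevant_retrieved) / n_retrieved

def retrieval_efficiency(n_relevant_retrieved, n_retrieved, n_relevant_total):
    """Piecewise definition used above: precision while the number retrieved
    does not exceed the number of relevant images, recall otherwise."""
    if n_retrieved <= n_relevant_total:
        return n_relevant_retrieved / n_retrieved
    return n_relevant_retrieved / n_relevant_total

if __name__ == "__main__":
    # Hypothetical query: 16 images retrieved, 10 relevant images in the
    # database, 9 of them among the retrieved set.
    print(precision(9, 16), recall(9, 10), retrieval_efficiency(9, 16, 10))
```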

Fig. 9 a Retrieval efficiency for outex textures. b Retrieval efficiency for vistex textures

From the definition of retrieval efficiency, it can be seen to depend on precision and recall. The same measures, precision and recall, are computed for the Gray FFT, Log Polar, and Gabor methods on the Outex and Vistex datasets; from these the retrieval efficiency is calculated, as shown in Fig. 9a, b, and the average retrieval efficiency is tabulated in Table 3. The proposed method proves to be the best compared with the other methods in terms of both classification and retrieval.

Table 3 Average retrieval efficiency comparison

Methods     Outex (%)   Vistex (%)
QFFT        95.41       90.13
Gray FFT    91.16       85.90
Log Polar   80.84       80.77
Gabor       73.71       76.33
3.5 Color space independence

The effect of different color spaces on color texture classification is still a subject of controversial debate in the computer vision community. A number of color spaces have been proposed that aim to make the image data robust to the presence of noise, illumination changes, complex backgrounds, and poor-quality image capture equipment. These color spaces have been extensively evaluated, and it has been found that there is considerable variation in the results depending on which color space is used: very different classification performances are observed using the same texture features and the same experimental image data, so the choice of color space for texture analysis is a crucial one. To overcome this, color normalization procedures have been developed, and they give somewhat better results across the various color spaces. In the quaternion representation, however, the color is processed as a single entity.

Figure 10 shows the plot of the proposed texture feature with respect to the radial distance for various color spaces. From the figure it can be seen that the peaks and valleys occur at the same points for all the color spaces.

Fig. 10 Comparison of color spaces

The magnitudes of the peaks differ between color spaces. If these magnitudes are normalized, the resulting plot provides color space-independent texture features, as shown in Fig. 11. Hence the proposed texture features in the quaternion domain provide color space-independent features.

Fig. 11 Comparison of color spaces (after normalization)
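The normalization step described above can be sketched as follows. Dividing each color space's radial plot by its peak magnitude is one simple choice of normalization (an assumption for illustration, since the text does not fix a particular scaling), after which the plots for the different color spaces coincide.

```python
import numpy as np

def normalize_radial_plots(plots):
    """Scale each color space's radial plot so their magnitudes agree.

    plots: dict mapping a color-space name (e.g. 'RGB', 'XYZ') to a 1-D
    radial plot.  After dividing by the maximum value, peaks and valleys of
    the different color spaces line up in both position and height, giving
    color space-independent features.
    """
    return {name: np.asarray(p, dtype=float) / max(np.max(p), 1e-12)
            for name, p in plots.items()}

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    base = np.abs(rng.standard_normal(50)).cumsum()[::-1]    # shared decaying shape
    plots = {"RGB": 1.0 * base, "XYZ": 2.3 * base, "LMS": 0.7 * base}
    norm = normalize_radial_plots(plots)
    print(all(np.allclose(norm["RGB"], v) for v in norm.values()))   # True
```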
3.6 Efficiency

The classification accuracy of the grayscale version using the FFT is comparatively low, so the authors prefer using the color image. For a color image, the color bands can be processed in two ways: separately computing the FFT for the individual bands, or computing the QFFT after combining the three bands into a quaternion. The QDFT, or its fast implementation the QFFT, requires fewer real multiplications than three complex DFTs/FFTs and hence is computed more efficiently for a color image, as well as requiring less memory. A directly implemented QDFT requires 75% of the real multiplications and additions required by three complex DFTs. A 2D Cooley-Tukey decimation in space or frequency requires 75% of the additions required by three complex FFTs of the same type, but the number of multiplications is only slightly lower, at 8/9 of the number required by the three complex FFTs. This is because, in the complex FFT, the product of two twiddle factors may be evaluated by table lookup, whereas in the QFFT this combination of two twiddle factors cannot be carried out because they lie in orthogonal complex fields. The QFFT of an RGB image with
M rows and M columns requires 8M² log₂M real additions and 8M²(log₂M − 1) real multiplications, the reduction in multiplications being due to the omitted unity twiddle factors in the first stage of the algorithm.
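The operation counts quoted above can be evaluated directly. The sketch below computes the QFFT counts for a few image sizes and, using the ratios stated in this section for the 2D Cooley-Tukey case, rough counts for three separate complex FFTs; the chosen sizes are arbitrary.

```python
import math

def qfft_real_op_counts(M):
    """Operation counts quoted above for the QFFT of an M x M RGB image."""
    adds = 8 * M**2 * math.log2(M)
    mults = 8 * M**2 * (math.log2(M) - 1)
    return adds, mults

if __name__ == "__main__":
    for M in (128, 256, 512):
        adds, mults = qfft_real_op_counts(M)
        # Per the ratios stated above (75% of additions, 8/9 of multiplications),
        # three separate complex FFTs would need roughly the following counts.
        print(f"M={M}: QFFT adds={adds:.3g}, mults={mults:.3g}, "
              f"3xFFT adds~{adds / 0.75:.3g}, mults~{mults / (8 / 9):.3g}")
```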
The QFT opens up a very wide range of possibilities in color image processing, since it allows the spectrum of a color image to be handled as a whole rather than as separate color component spectra. All the existing frequency-domain techniques which have been developed for complex spectra and FFTs should generalize to quaternion spectra and FFTs, although there may be cases where the non-commutativity of quaternion multiplication requires careful analysis. Further, there is scope for the use of the QFT on monochrome images, because of the true 2D nature of the transform, as pointed out by Ell [26]; this has not yet been explored. In digital signal processing, Ueda and co-workers [27] have worked on digital filters with hypercomplex coefficients, mainly as a way of implementing parallel real filters efficiently. Clearly, there is scope to explore the application of this idea in image processing, especially with color images.

4 Conclusion

In this article, a novel quaternion-based rotation-invariant texture classification and retrieval method has been proposed. The rotation-invariant texture signature, the peak distribution norm vector, is constructed from the radial plot of the quaternion Fourier transform and is then used for classification. The feature extraction process in the quaternion domain is efficient, with fewer mathematical computations and lower memory requirements. Experimental results show that the proposed algorithm achieves high classification accuracy and retrieval efficiency for color images.

The emphasis of this paper has been on extracting rotation-invariant texture features; further work could investigate algorithms for efficient texture segmentation.

Acknowledgment The authors are very much thankful to the reviewers for their valuable comments, and also to the Management and the Department of Electronics and Communication Engineering of Thiagarajar College of Engineering for their support and assistance in carrying out this work.
References

1. Arvis V, Debain C, Berducat M, Benassi A (2004) Generalization of the co-occurrence matrix of color images: application to color texture classification. Image Anal Stereol 23:63–67
2. Davis LS, Johns SA, Aggarwal JK (1979) Texture analysis using generalized co-occurrence matrices. IEEE Trans Pattern Anal Machine Intell PAMI-1:251–259
3. Serra J, Wong AKC (1973) Mathematical morphology applied to fibre composite materials. Film Sci Technol 6:141–158
4. Mallat S (1989) A theory for multiresolution signal decomposition: the wavelet representation. IEEE Trans Pattern Anal Machine Intell 11(7):674–693
5. Idrissa M, Acheroy M (2002) Texture classification using Gabor filters. Pattern Recogn Lett 23(9):1095–1102
6. Arivazhagan S, Ganesan L, Padam SP (2006) Texture classification using Gabor wavelets based rotation invariant features. Pattern Recogn Lett 27:1976–1982
7. Kashyap RL, Khotanzad A (1986) A model based method for rotation invariant texture classification. IEEE Trans Pattern Anal Machine Intell PAMI-8(4):472–481
8. Chantler M, Schmidt M, Petrou M, McGunnigle G (2002) The effect of illuminant rotation on texture filters: Lissajous's ellipses. Proc Eur Conf Comput Vision 3:289–303
9. Vertan C, Boujemaa N (2000) Color texture classification by normalized color space representation. In: 15th International conference on pattern recognition (ICPR'00), vol 3, Barcelona, pp 35–84
10. Petrou M, García Sevilla P (2006) Image processing: dealing with texture. Wiley Interscience
11. Haralick RM, Shanmugam K, Dinstein I (1973) Textural features for image classification. IEEE Trans Syst Man Cybernet 3:610–621
12. Palm C (2004) Color texture classification by integrative co-occurrence matrices. Pattern Recogn 37:965–976
13. Sertel O, Lozanski G, Shana'ah A, Catalyurek U, Saltz J, Gurcan S (2008) Texture classification using nonlinear color quantization: application to histopathological image analysis. In: Proceedings of the IEEE international conference on acoustics, speech and signal processing, Las Vegas, pp 597–600
14. Bianconi F, Fernández A, Gonzalez E, Caride D, Calvino A (2009) Rotation-invariant colour texture classification through multilayer CCR. Pattern Recogn Lett 30:765–773
15. van de Sande KEA, Gevers T, Snoek CGM (2010) Evaluating color descriptors for object and scene recognition. IEEE Trans Pattern Anal Mach Intell 32:1582–1596
16. Bülow T, Sommer G (2001) Hypercomplex signals—a novel extension of the analytic signal to the multidimensional case. IEEE Trans Signal Proc 49(11):2844–2852
17. Kakumanu P, Makrogiannis S, Bourbakis N (2007) A survey of skin-color modeling and detection methods. Pattern Recogn 40(3):1106–1122
18. Kauppinen H, Silvén O (1996) The effect of illumination variations on color-based wood defect classification. In: 13th ICPR, Vienna, pp 828–832
19. Hamilton WR (1844) On quaternions; or on a new system of imaginaries in algebra. Philos Mag 25:489–495
20. Sangwine SJ, Ell TA (1999) Hypercomplex auto- and cross-correlation of color images. In: IEEE international conference on image processing (ICIP'99), Kobe, pp 319–322
21. Evans CJ, Ell TA, Sangwine SJ (2000) Hypercomplex color-sensitive smoothing filters. In: ICIP 2000, Vancouver, Canada, vol 1, pp 541–544
22. Sangwine SJ, Evans CJ, Ell TA (2000) Colour-sensitive edge detection using hypercomplex filters. In: Proceedings of the tenth European signal processing conference (EUSIPCO), vol I, Tampere, Finland, pp 107–110
23. Sangwine SJ, Evans CJ, Ell TA (2001) Hypercomplex Fourier transforms of color images. In: Proceedings of the IEEE international conference on image processing (ICIP 2001), vol I, Thessaloniki, Greece, pp 137–140
24. Shi L, Funt B (2005) Quaternion color texture. In: AIC Colour 2005, Proceedings of the tenth congress of the International Colour Association, Granada
25. Sangwine SJ (1996) Fourier transforms of colour images using quaternion, or hypercomplex, numbers. Electron Lett 32(21):1979–1980
26. Ell TA, Sangwine SJ (2007) Hypercomplex Fourier transforms of color images. IEEE Trans Image Proc 16(1):22–35
27. Ukomoto M, Ueda K, Takahashi S-I (1995) Realization of four real coefficient transfer functions by one hypercomplex transfer function. In: Proceedings of the 12th European conference on circuit theory and design (ECCTD'95), Istanbul Technical University, Istanbul, pp 975–978
28. Pei S-C, Ding J-J, Chang J-H (2001) Efficient implementation of quaternion Fourier transform, convolution, and correlation by 2-D complex FFT. IEEE Trans Signal Proc 49(11):2783–2797
29. Hu M (1962) Visual pattern recognition by moment invariants. IRE Trans Inform Theory 8:179–187