
An Application of Linear Predictive Coding and Computational Geometry to Iris Recognition


Masoud Alipour, Ali Farhadi, Nima Razavi
Institute for Studies in Theoretical Physics and Mathematics, Tehran, Iran

Received 3 November 2004; accepted 15 December 2006

ABSTRACT: The aim of this work is to present a method in computer vision for person identification via iris recognition. The method makes essential use of computational geometry and LPC. © 2007 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 16, 230–233, 2006; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ima.20083

Key words: computer vision; texture classification; biometrics; linear predictive coding; computational geometry

Correspondence to: Nima Razavi, Department of Informatik, ETH Zurich; e-mail: nimar2@ipm.ir
Present address of Ali Farhadi: Department of Computer Science, University of Illinois, Urbana.
Present address of Masoud Alipour: Department of Informatik, EPF Lausanne.
Authors are listed in alphabetical order.
© 2007 Wiley Periodicals, Inc.

I. INTRODUCTION
Problems of image analysis are treated by different methods in computer vision depending on the context and application. Texture-, shape-, or silhouette-based analysis and the use of wavelets, the FFT, or other filters are commonly used techniques for understanding images. Motivated by earlier work on segmentation of images and texture classification [Farhadi and Shahshahani, 2003], the authors apply a combination of linear predictive coding (LPC), Fourier and discrete cosine transforms, and computational geometry to biometric person identification via iris recognition.

For this application, it is necessary to quantify the differences between certain transforms of an image. The main mathematical tool developed in this work for this purpose comes from computational geometry. It is a notion roughly related to a discretized version of the curvature function of a surface. It is our experience that it is beneficial to avoid analytical tools that require smoothing of surfaces given as nondifferentiable manifolds, and to rely instead on definitions that are intrinsically discrete.

This article is organized as follows: Section II describes background work on iris printing and recognition; the following section gives a brief exposition of the Hough transform, LPC, and the computational geometry used in this work. The final section describes the feature vector and the results for iris printing and identification.

II. BACKGROUND ON IRIS PRINTING
The problem of person identification by iris recognition has been studied by a number of authors. Although the individuality of iris structure has been known for a long time, the pioneering work on iris recognition was [Daugman, 1995]. A more refined version with outstanding empirical results is presented in Daugman [2004], and a good introduction to the subject is given in Wildes [1997]. That article contains the description of a complete system, from image acquisition to the analysis of the iris data, and references to related works. One difficulty with the method described in Wildes [1997] is that serious hardware constraints are imposed on the image acquisition system. The approach of this article significantly reduces these constraints. Multichannel Gabor filters are utilized in Zhu et al. [2000] and Ma et al. [2002] for the analysis of iris data, and an improved version is presented in Ma et al. [2002]. Combining a frequency analysis with the Gabor filter and using a coding method similar to that of Daugman [Daugman, 1995], a different method is presented in TMTR.

The method presented in this article depends on texture analysis of an iris image. It has several advantages, which may be important for practical applications. Some of these advantages are

1. It is robust to changes in lighting, which reduces the constraints on image acquisition.
2. It is independent of color: it works as well on dark eyes as on blue ones, and it is not affected by the wearing of colored contact lenses.
3. There is no need to straighten out the images; transforms suitable for circular images are applied directly. Consequently, the technique is insensitive to orientation.

III. TECHNICAL METHODS
A. Circular Hough Transform. Let X be a binary image represented as a matrix of zeros and ones. The purpose of the circular Hough transform is to identify circles in the image X. For every fixed r > 0 and every pixel v = (x, y), consider all circles of radius r that pass through v. Let A_{v,r} be the binary matrix that is one at the centers of these circles and zero elsewhere; thus A_{v,r} is one on a circle and zero everywhere else. Now set

    A_r = Σ_v A_{v,r}.


Figure 1. Hough space (for a fixed radius).

As r > 0 varies, this construction gives a 3D matrix, which we call the circular Hough space associated to the binary image X and denote by H_X. Notice that a point in H_X represents a circle in X and vice versa.

We fix a threshold ν, which reflects the number of circles to be identified in the image, and calculate the first ν maxima in H_X. Let (v, r) ∈ R^2 × R^+ be a maximum point of H_X (Fig. 1). Then (v, r) represents a circle of radius r centered at the point v in the original binary image X. The Hough transform was introduced by Paul Hough for recognizing complex patterns in 1962, and circular Hough transforms are discussed in Kimme et al. [1975].
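To make the accumulation step concrete, it can be sketched as follows. This is a simplified NumPy illustration written for this exposition; the function name, the 360-step angular sampling, and the radius grid are our own choices, not the authors' code.

```python
import numpy as np

def circular_hough(edge_img, radii):
    """Accumulate A_r = sum_v A_{v,r} for each radius r.

    edge_img : 2D binary array (1 on edge pixels, 0 elsewhere).
    radii    : list of candidate radii.
    Returns the Hough space H[k, y, x], indexed by radius and circle center.
    """
    h, w = edge_img.shape
    H = np.zeros((len(radii), h, w))
    ys, xs = np.nonzero(edge_img)                        # edge pixels v = (x, y)
    thetas = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    for k, r in enumerate(radii):
        # centers of all circles of radius r passing through each edge pixel
        cx = np.rint(xs[:, None] + r * np.cos(thetas)).astype(int)
        cy = np.rint(ys[:, None] + r * np.sin(thetas)).astype(int)
        keep = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
        np.add.at(H[k], (cy[keep], cx[keep]), 1)         # H[k] += A_{v,r} for every v
    return H
```

The first few maxima of H then mark the (center, radius) pairs of the detected circles, as described above.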
B. LPC Surfaces. Next, we describe the construction of LPC surfaces for a general image. By a neighborhood N of the origin, we mean the configuration of n = 20 pixels shown below, with • representing the origin:

     ·  1  2  3  ·
     4  5  6  7  8
     9 10  • 11 12
    13 14 15 16 17
     · 18 19 20  ·

that is, the 5 × 5 block centered at the origin with its four corner pixels and the center removed.

Define N_p = p + N for every point p. For convenience, we enumerate the pixels in the neighborhood N_p of p (excluding p itself) by counting from left to right and top to bottom. Thus, pixels 6 and 15 are directly above and below p, respectively, while 10 and 11 are to the left and right of p. A window W is a fixed subset of the plane which, by translation (with overlaps), scans the entire image. A window may be a w × w square region of the image, but for our application to eye printing it is convenient to allow more general shapes. For every pixel q ∈ W, let X(q) denote the value (or brightness) of q. We determine coefficients A_p ∈ R such that Σ_{q∈W} |X(q) − Y(q)|^2 is minimized, where

    Y(q) = Σ_{p∈N} A_p X(q + p).

As the window W varies, the coefficients A_1^W, ..., A_n^W define points in the space R^n. Equivalently, the coefficients A_i^W trace out curves in R^2, which are mathematically equivalent to the points in R^n.

Each window is specified by a pair of coordinates in the plane, since the windows differ only by translations; for example, for a square window we can use the coordinates of its upper left corner pixel as the coordinates of the window. Let Z = Z(x, y) be a function of the LPC coefficients attached to the window corresponding to the point (x, y). The LPC surface (Fig. 2) is the graph of the function

    (x, y) → Z(x, y).

The function Z may be a local linear function or global in nature, such as the FFT or the DCT of a local linear function of the coefficients. The function Z (or its graph S) may be a nonsmooth surface, which to some extent reflects the perceptual contents of an image.

Figure 2. LPC surface of an iris. [Color figure can be viewed in the online issue, which is available at www.interscience.wiley.com.]
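As an illustration of the least-squares fit above, the following sketch computes the 20 coefficients for one window. It assumes the window is supplied as a NumPy array of brightness values and uses the neighborhood enumeration described earlier; the variable names and the convention of fitting only interior pixels are ours.

```python
import numpy as np

# The 20-pixel neighborhood N: the 5 x 5 block around the origin with the four
# corner pixels and the center removed, enumerated left to right, top to bottom.
NEIGHBORHOOD = [(dy, dx) for dy in range(-2, 3) for dx in range(-2, 3)
                if not (abs(dy) == 2 and abs(dx) == 2) and (dy, dx) != (0, 0)]

def lpc_coefficients(window):
    """Minimize sum_q |X(q) - Y(q)|^2 with Y(q) = sum_p A_p X(q + p) over one window.

    window : 2D array of brightness values, larger than the 5 x 5 neighborhood.
    Returns the 20 coefficients A_1, ..., A_20 for this window.
    """
    h, w = window.shape
    # interior pixels q for which every neighbor q + p lies inside the window
    qy, qx = np.mgrid[2:h - 2, 2:w - 2]
    qy, qx = qy.ravel(), qx.ravel()
    X = window[qy, qx]                                   # target values X(q)
    M = np.stack([window[qy + dy, qx + dx]               # predictors X(q + p)
                  for dy, dx in NEIGHBORHOOD], axis=1)
    A, *_ = np.linalg.lstsq(M, X, rcond=None)
    return A                                             # shape (20,)
```

A function Z(x, y) of these coefficients (for instance, the quantity G_1 introduced in Section IV) then gives the height of the LPC surface over the window anchored at (x, y).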

C. Computational Geometry. The oscillations of an LPC surface contain significant information about an image. To quantify this content, we triangulate the surface S. There are different methods for triangulating a surface, but not all of them preserve the key elements required for the analysis. A version of the Delaunay triangulation yields a triangulated surface in which the local profusion of triangles is a measure of the oscillations of the surface. To each triangle s, we assign its centroid r_s, which is a point in space with coordinates (x_s, y_s, z_s). Let F(s) = (x_s, y_s) be its projection on the (x, y)-plane. Partition the plane into nonoverlapping p × p squares and, for each square Q, let F(Q) be the number of projections F(s) of centroids that lie in Q. The choice of F(Q) is determined by the particular application.

It may be worthwhile to explain the motivation for the particular counting function F(Q) used in this context. For a triangulated surface T, one may introduce a notion of discrete curvature as the function defined at the vertices by

    κ(v_T) = 6 − (number of vertices incident on v_T)

for an interior vertex v_T; replacing 6 with 3, we define the curvature of a boundary vertex. This notion of curvature seems to be the correct discrete analogue of Gaussian curvature, and some notions of the differential geometry of surfaces can be translated into the discrete framework with this concept. As pointed out earlier, replacing the discrete curvature [Meyer et al., 2003] by the usual Gaussian curvature of a smoothing of the LPC surface (e.g., by splines) introduces significant computational complexity and loss of important information. The function F(Q) qualitatively and quantitatively has behavior similar to the discrete curvature function but is more convenient for our applications.
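The paper does not spell out which version of the Delaunay triangulation is used, so the sketch below only illustrates the counting step: given some triangulation of S (an array of triangles with 3D vertices), it projects the centroids and tallies them over p × p squares. The array layout and the histogram-based binning are our own choices.

```python
import numpy as np

def centroid_counts(triangles, width, height, p=5):
    """Counting function F(Q) for a triangulated LPC surface.

    triangles : array of shape (m, 3, 3): m triangles, 3 vertices, (x, y, z) each.
    Returns a grid F with F[i, j] = number of projected centroids in square Q_ij.
    """
    centroids = triangles.mean(axis=1)            # centroids r_s = (x_s, y_s, z_s)
    cx, cy = centroids[:, 0], centroids[:, 1]     # projections F(s) = (x_s, y_s)
    F, _, _ = np.histogram2d(cx, cy,
                             bins=[np.arange(0, width + p, p),
                                   np.arange(0, height + p, p)])
    return F
```

The mean, variance, and kurtosis of these counts are the quantities that later enter the feature vector (Section IV).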


IV. IRIS PRINTING
The process of eye printing and identification in our method has four steps, viz.,

1. Image denoising.
2. Locating the iris.
3. Feature extraction:
   a. Determination of LPC coefficients and the LPC surface S.
   b. Geometry of the LPC surface and its statistics.
   c. Decorrelation of LPC coefficients.
   d. Geometry of the projection of DCT coefficients of decorrelated coefficients.
   e. Computation of the circular DCT C of the original image.
   f. Statistics of C.
4. Person identification.

For image denoising, we used the Haar wavelet, which is a standard tool of image processing. The purpose of applying the Haar wavelet is twofold: it removes some of the high frequencies, which could interfere with the analysis of the image, and it reduces the image size appropriately to facilitate the application of the Hough transform.

Figure 3. Edge image.

Figure 4. Sector windows.
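A sketch of this preprocessing step, assuming the PyWavelets package (the paper does not name an implementation): each decomposition level discards the detail subbands and halves both image dimensions.

```python
import numpy as np
import pywt

def haar_reduce(img, levels=1):
    """Denoise and shrink an image by keeping only the Haar approximation
    coefficients at each level; the cH, cV, cD detail subbands are discarded."""
    out = np.asarray(img, dtype=float)
    for _ in range(levels):
        out, (_cH, _cV, _cD) = pywt.dwt2(out, 'haar')
    return out
```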


Table I. Feature vectors (14-element) of 10 sample irises are shown. Columns 1–20 correspond to the 20 iris images; rows 1–14 to the components of the feature vector.
     1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
 1   0.074874 0.074976 0.059577 0.060132 0.071996 0.071905 0.095784 0.095779 0.095128 0.095774 0.14997 0.149 0.3287 0.3538 0.109 0.109 0.010 0.10946 1 0.149
 2   0.79759 0.79859 0.64949 0.64903 0.79819 0.79886 0.99173 1 0.79359 0.79546 0.0067762 0.0072534 0.010887 0.011 0.0183 0.008 0.0072 0.0186 6.44e-006 6.33e-006
 3   0.1384 0.13928 0.044774 0.045308 0 0 0 0 0.099988 0.99997 0.0992 0.0989 0.12876 0.13 0.109 0.109 0.099 0.10966 0.14521 0.14999
 4   99.8919 100 32.1249 32.6756 0 0 0 0.011224 67.0029 68.0565 0.0073 0.007 0.021 0.0208 0.0183 0.0183 0.017357 0.0186 1.6958e-006 1.6965e-006
 5   33.0389 32.9949 72.7396 23.0754 49.4388 49.4389 49.4228 49.4299 49.4423 49.452 1.7529e-006 2.38e-006 0.0036 0.0036 0.011 0.0063 0.0069 0.011 3.97e-006 3.97e-006
 6   0.41608 0.41778 0.45624 0.35644 0.88796 0.88745 1 0.99985 0.40349 0.43019 0.120 0.111 0.095 0.095 0.087 0.078485 0.079 0.0893 0.12 0.11999
 7   0.13116 0.12945 0.6484 0 0.34417 0.34417 0.34406 0.34404 0.34424 0.34436 7.2699e-006 8.6532e-006 0.0036 0.0036 0.011 0.006 0.007 0.011 3.77e-006 3.75e-006
 8   999.897 999.9535 321.4869 303.8407 0 0.39608 0.38575 0.41763 0.31885 0.47695 3.96e-006 4.27e-006 0.021 0.020 0.018 0.0073 0.00724 0.0186 2.8498e-006 2.8395e-006
 9   0 0 0.574 0.31208 0.38462 0.38463 0.38465 0.38462 0.38467 0.38479 0.0063 0.0063 0.0063 0.006 0.0099 0.004762 0.0064 0.0102 4.8002e-007 4.7966e-007
10   0.19296 0.19135 0.55402 0.55991 1 0.99146 0.33515 0.32558 0.45099 0.46718 50.36 51.56 63.36 63.44 57.48 57.4 58 58.56 63.69 63.72
11   9.9739 8.5461 99.7755 100 13.713 12.945 0.061167 0 39.6443 41.828 38762 26508.608 18439 18772.673 35101.902 32438.172 20607.262 35842 583 582.002
12   13.6758 10.1836 3.6521 5.0859 5.2472 7.2083 57.7198 57.9466 20.8985 19.9473 3.019 3.308 1.4661 1.453 1.6402 1.2969 1.3094 1.6046 1.22 1.216
13   6.6413 21.948 100 5.1378 6.7477 6.2415 9.7985 9.8491 1.3096 1.323 0.29004 0.29025 0.77707 0.76485 0.30432 0.26305 0.26372 0.30436 0.098842 0.098832
14   0 0.71253 0.959 1 0.47976 0.47956 0.90672 0.88534 0.17724 0.17206 18.609 19.3257 25.3 24.2 18.2568 23.8179 23.5525 18.398 10.396 10.3862
Table II. Quantized distances between the 20 iris images (rows and columns 1–20).
     1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20
 1   0  1  8  7  4  4  4  4  6  6  4  4  4  4  3  3  3  3  3  3
 2   1  0  8  8  5  5  5  5  7  7  5  5  4  4  4  4  4  4  4  4
 3   8  8  0  1  8  8  7  8  7  7  9  9  8  8  8  8  8  8  8  8
 4   7  8  1  0  7  7  7  7  6  6  8  8  8  8  7  7  7  7  7  7
 5   4  5  8  7  0  0  5  5  7  7  5  5  5  5  4  4  4  4  4  4
 6   4  5  8  7  0  0  5  5  7  7  5  5  5  5  4  4  4  4  4  4
 7   4  5  7  7  5  5  0  0  7  7  7  5  5  5  7  7  4  7  4  4
 8   4  5  8  7  5  5  0  0  7  7  7  5  5  5  7  7  4  7  4  4
 9   6  7  7  6  7  7  7  7  0  2  5  8  7  7  4  4  7  4  6  6
10   6  7  7  6  7  7  7  7  2  0  5  8  7  7  4  4  7  4  6  6
11   4  5  9  8  5  5  7  7  5  5  0  0  5  5  3  4  4  3  7  7
12   4  5  9  8  5  5  5  5  8  8  0  0  5  5  3  4  4  3  4  4
13   4  4  8  8  5  5  5  5  7  7  5  5  0  2  4  3  3  4  4  4
14   4  4  8  8  5  5  5  5  7  7  5  5  2  0  4  3  3  4  4  4
15   3  4  8  7  4  4  7  7  4  4  3  3  4  4  0  1  3  2  7  7
16   3  4  8  7  4  4  7  7  4  4  4  4  3  3  1  0  2  3  6  6
17   3  4  8  7  4  4  4  4  7  7  4  4  3  3  3  2  0  1  3  3
18   3  4  8  7  4  4  7  7  4  4  3  3  4  4  2  3  1  0  7  7
19   3  4  8  7  4  4  4  4  6  6  7  4  4  4  7  6  3  7  0  2
20   3  4  8  7  4  4  4  4  6  6  7  4  4  4  7  6  3  7  2  0

For the determination of the iris, we make use of the circular Hough transform. Initially, an edge detection algorithm is applied to the reduced and denoised image (Figure 3). The circular Hough transform of the resulting binary image is calculated and normalized by division by 2π√r, where r is the radius of the circle and refers to the coordinate along the third axis. The absolute maximum in the normalized Hough space represents the outer or ciliary boundary of the iris. To obtain the inner or pupilary boundary, we make use of the fact that it is a circle (almost) concentric with the outer boundary whose radius lies in the interval [r/5, r/2]. The iris is then extracted from the image. The observation about the radius and center of the pupilary boundary reduces the complexity of the computation from O(n^3) to O(n^2).
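One possible realization of this localization step, reusing the circular_hough accumulator sketched in Section III.A. The edge image, the radius grid, and the simplification of keeping the pupil exactly concentric with the outer boundary (the paper only assumes "almost" concentric) are our own choices.

```python
import numpy as np

def locate_iris(edge_img, radii):
    """Return ((cx, cy, r_outer), (cx, cy, r_inner)) for the two iris boundaries."""
    radii = list(radii)
    H = circular_hough(edge_img, radii)            # accumulator from the earlier sketch
    # normalize each radius slice by 2*pi*sqrt(r) to remove the bias toward large circles
    Hn = H / (2 * np.pi * np.sqrt(np.asarray(radii, dtype=float)))[:, None, None]
    k, cy, cx = np.unravel_index(np.argmax(Hn), Hn.shape)
    r_outer = radii[k]
    # the pupilary boundary is (almost) concentric and its radius lies in [r/5, r/2],
    # so only those radius slices need to be searched
    ks = [i for i, r in enumerate(radii) if r_outer / 5 <= r <= r_outer / 2]
    ki = max(ks, key=lambda i: Hn[i, cy, cx])      # keep the same center for simplicity
    return (cx, cy, r_outer), (cx, cy, radii[ki])
```

Restricting the pupil search to a narrow radius band around the already-found center is what reduces the complexity, as noted above.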
The first step in feature extraction is the calculation of the LPC coefficients of the iris. The window shape is adapted to the circular nature of the iris by using windows which, in (r, θ)-space (polar coordinates), are rectangular (Fig. 4). The 20 LPC coefficients are calculated for overlapping windows scanning the entire image. For each window, the results are reorganized into three groups according to the following scheme:

    G_1 = (A_6 + A_10 + A_11 + A_15) / 4,    G_2 = (A_5 + A_7 + A_14 + A_16) / 4,

and G_3 is the average of the remaining coefficients.
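In code, the grouping might look as follows. This sketch uses 0-based indexing, so A[5] is the paper's A_6; the reading of G_1 as the axial neighbors of p and G_2 as the diagonal neighbors follows from the neighborhood enumeration in Section III.B.

```python
import numpy as np

def group_coefficients(A):
    """Reorganize the 20 LPC coefficients of one window into (G1, G2, G3)."""
    A = np.asarray(A, dtype=float)
    g1 = (A[5] + A[9] + A[10] + A[14]) / 4    # A6, A10, A11, A15: pixels above/left/right/below p
    g2 = (A[4] + A[6] + A[13] + A[15]) / 4    # A5, A7, A14, A16: the four diagonal neighbors
    g3 = np.delete(A, [4, 5, 6, 9, 10, 13, 14, 15]).mean()   # average of the remaining 12
    return g1, g2, g3
```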
From G_1, we obtain the LPC surface, which is denoted by S_X. This surface may be regarded as the graph of a function on (r, θ)-space. Its DCT, based on (r, θ) windows, is calculated. The lowest frequency (the (0, 0) coefficient) is discarded, and the next nine lowest coefficients, moving in the obvious zigzag manner, are retained. As the window moves by varying θ, the differences of the nine coefficients for consecutive windows are calculated. The resulting quantities are averaged as the window scans the entire LPC surface, and they form the first nine components of the feature vector.

The surface S_X is then triangulated as explained in the preceding section. The corresponding function F(Q), counting the number of centroids within each window of size 5 × 5, is evaluated. The mean, variance, and kurtosis of F(Q) are components 10, 11, and 12 of the feature vector.
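A sketch of how the first nine components could be computed. The zigzag order, the DCT normalization, and the use of absolute differences are our reading of the description above; in particular, the paper does not state whether signed or absolute differences are averaged.

```python
import numpy as np
from scipy.fft import dctn

def zigzag_indices(n):
    """First n index pairs of a 2D array in zigzag order, after skipping (0, 0)."""
    order = sorted(((i, j) for i in range(n + 1) for j in range(n + 1)),
                   key=lambda ij: (ij[0] + ij[1],
                                   ij[0] if (ij[0] + ij[1]) % 2 else ij[1]))
    return order[1:n + 1]

def dct_difference_features(windows):
    """windows: sequence of 2D patches of the LPC surface S_X taken at consecutive
    values of theta (each at least 4 x 4).  Returns the averaged differences of the
    nine lowest nontrivial zigzag DCT coefficients (feature components 1-9)."""
    idx = zigzag_indices(9)
    coeffs = np.array([[dctn(w, norm='ortho')[i, j] for (i, j) in idx]
                       for w in windows])                  # shape (num_windows, 9)
    return np.abs(np.diff(coeffs, axis=0)).mean(axis=0)    # average over consecutive windows
```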
The differences of the LPC coefficients of contiguous windows in (r, θ) are calculated. (Here, contiguous is with respect to the values of θ.) The FFTs of the 20 coefficients are calculated, and their real parts are averaged exactly as we did for calculating G_1, G_2, and G_3. These numbers determine a scatter plot in R^3, and the volume of the convex closure of the scatter plot is the 13th component of the feature vector.
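The volume of the convex closure can be obtained directly from SciPy's convex hull routine; a minimal sketch, assuming the scatter plot is given as an array of shape (m, 3):

```python
import numpy as np
from scipy.spatial import ConvexHull

def convex_hull_volume(points):
    """13th feature component: volume of the convex closure of a scatter plot in R^3."""
    return ConvexHull(np.asarray(points, dtype=float)).volume
```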
To the denoised image, we apply the circular DCT, discard the lowest frequency, and calculate the kurtosis of each circle. One obtains a set of positive numbers parametrized by the radii of the concentric circles. The variance of this set of numbers is the 14th coefficient of the feature vector.

The feature vectors are classified via a standard nearest neighbor algorithm. The test results for 20 digital images from the eyes of 10 different subjects are shown in Table I. The vertical entries (1–14) represent the fourteen components of the feature vector, and the horizontal entries refer to the 20 irises whose results are exhibited.

A quantization was introduced to modify the distances between feature vectors assigned to irises. With this quantization, distances ≤ 2 indicate a correct match and larger distances an incorrect one. In Table II, the entries on rows and columns 2k − 1 and 2k refer to the same eye but to digital pictures taken under different lighting conditions. Entries corresponding to 15, 16 and 17, 18 refer to different eyes of the same person, which explains the proximity of the quantized distances.
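A sketch of the identification step. The Euclidean metric and the uniform quantization step are placeholders, since the paper specifies neither the distance nor the quantization rule, only that quantized distances of at most 2 count as a correct match.

```python
import numpy as np

def identify(probe, gallery, step):
    """Nearest-neighbor identification over a gallery of 14-component feature vectors.

    Returns the index of the nearest gallery iris and whether the quantized
    distance to it is small enough (<= 2) to be declared a match.
    """
    gallery = np.asarray(gallery, dtype=float)
    d = np.linalg.norm(gallery - np.asarray(probe, dtype=float), axis=1)
    best = int(np.argmin(d))
    quantized = int(np.floor(d[best] / step))      # map the raw distance to an integer level
    return best, quantized <= 2
```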

ACKNOWLEDGMENTS
The authors thank Professor M. Shahshahani for his unequivocal support and help during this project, and Dr. A. Katanforoush and Dr. A. Nowzari for technical consultation on the key issues.

REFERENCES
J. Daugman, High confidence personal identification by rapid video analysis of iris, IEEE Conf. Publication No. 408, European Convention on Security and Detection, May 16–18, 1995.
J. Daugman, How iris recognition works, IEEE Trans Circuits Syst Video Technol 14 (2004), 21–30.
A. Farhadi and M. Shahshahani, Image segmentation via local higher order statistics, Int J Imaging Syst Technol 13 (2003), 215–223.
C. Kimme, D.H. Ballard, and J. Sklansky, Finding circles by an array of accumulators, Commun Assoc Comput Machinery 18 (1975), 120–122.
L. Ma, Y. Wang, and T. Tan, Iris recognition using circular symmetric filters, Int Conf Pattern Recogn 2 (2002), 414–417.
L. Ma, Y. Wang, and T. Tan, Iris recognition based on multichannel Gabor filtering, Proc ACCV 2002 1 (2002), 279–283.
M. Meyer, M. Desbrun, P. Schröder, and A.H. Barr, Discrete differential-geometry operators for triangulated 2-manifolds, in Visualization and Mathematics III, Springer-Verlag, Heidelberg, 2003.
R.P. Wildes, Iris recognition: An emerging biometric technology, Proc IEEE 85 (1997), 1348–1363.
Y. Zhu, Y. Wang, and T. Tan, Biometric personal identification based on iris patterns, Int Conf Pattern Recogn 2 (2000), 805–808.
