o(i, j) = \frac{1}{2} \tan^{-1}\left( \frac{V_y(i, j)}{V_x(i, j)} \right)   (1)
where,

V_x(i, j) = \sum_{u=i-4}^{i+4} \sum_{v=j-4}^{j+4} 2 \, \partial_x(u, v) \, \partial_y(u, v)   (2)
V_y(i, j) = \sum_{u=i-4}^{i+4} \sum_{v=j-4}^{j+4} \left( \partial_x^2(u, v) - \partial_y^2(u, v) \right)   (3)
The value of o(i, j) is the least squares estimate of the local ridge orientation in the block centred at pixel (i, j). Mathematically, it represents the direction that is orthogonal to the dominant direction of the Fourier spectrum of the 8 × 8 window.
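The block orientation estimate of Eqs. (1)-(3) can be sketched in NumPy as follows. The function and variable names are illustrative, and `np.gradient` stands in for whatever gradient operator (e.g. Sobel) was actually used; treat this as a sketch of the technique, not the authors' code.

```python
import numpy as np

def ridge_orientation(image, half=4):
    """Least-squares block orientation o(i, j) per Eqs. (1)-(3)."""
    dy, dx = np.gradient(image.astype(float))  # pixel gradients d_y, d_x
    vx_term = 2.0 * dx * dy                    # summand of V_x, Eq. (2)
    vy_term = dx * dx - dy * dy                # summand of V_y, Eq. (3)
    size = 2 * half                            # 8 x 8 window for half = 4
    h, w = image.shape
    orient = np.zeros((h // size, w // size))
    for bi in range(orient.shape[0]):
        for bj in range(orient.shape[1]):
            r, c = bi * size, bj * size
            vx = vx_term[r:r + size, c:c + size].sum()
            vy = vy_term[r:r + size, c:c + size].sum()
            # Eq. (1): o = (1/2) arctan(V_y / V_x); atan2 keeps the quadrant.
            orient[bi, bj] = 0.5 * np.arctan2(vy, vx)
    return orient
```

Using `arctan2` rather than a plain arctangent avoids the division by zero that Eq. (1) would otherwise suffer in blocks with vertical ridges.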
4. Smooth the orientation field in a local neighbourhood. In order to perform smoothing (low pass filtering), the orientation image needs to be converted into a continuous vector field, which is defined as

\Phi_x(i, j) = \cos( 2 o(i, j) )   (4)

and

\Phi_y(i, j) = \sin( 2 o(i, j) )   (5)

where \Phi_x and \Phi_y are the x and y components of the vector field, respectively.
With the resulting vector field, the low pass filtering can be performed as,
( ) ( ) ( )
= =
=
2 /
2 /
2 /
2 /
1
, , ,
w
w u
w
w v
x x
wv j wu i v u W j i
(6)
and
International Journal of Application or Innovation in Engineering& Management (IJAIEM)
Web Site: www.ijaiem.org Email: editor@ijaiem.org, editorijaiem@gmail.com
Volume 2, Issue 11, November 2013 ISSN 2319 - 4847
Volume 2, Issue 11, November 2013 Page 261
\Phi'_y(i, j) = \sum_{u=-w_\Phi/2}^{w_\Phi/2} \sum_{v=-w_\Phi/2}^{w_\Phi/2} W(u, v) \, \Phi_y(i - u w, j - v w)   (7)
where W(\cdot) is a two-dimensional low pass filter with unit integral and w_\Phi × w_\Phi specifies the filter size.
Note that the smoothing operation is performed at the block level. For our experimentation we have used a 5 × 5 mean filter.
The smoothed orientation field O at (i, j) is computed as,

O(i, j) = \frac{1}{2} \tan^{-1}\left( \frac{\Phi'_y(i, j)}{\Phi'_x(i, j)} \right)   (8)
5. Compute the sine component of the smoothed orientation image O, using

E(i, j) = \sin( O(i, j) )   (9)
6. Initialize R, a label image used to indicate the core point.
7. For each pixel (i, j) in E, compute the difference in the pixel intensities of those pixels having different orientations in O.
8. Find the maximum value in R and assign its co-ordinates to the core.
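Steps 4-8 above can be sketched as one routine. Step 7 is stated loosely in the text, so the neighbourhood-difference reading below is one plausible interpretation rather than the paper's code; all names are illustrative.

```python
import numpy as np

def find_core(orient, k=5):
    """Smooth the block orientation field via its continuous vector form
    (Eqs. 4-8), take the sine component (Eq. 9), and return the block
    whose local orientation change is largest (steps 6-8)."""
    phi_x, phi_y = np.cos(2 * orient), np.sin(2 * orient)   # Eqs. (4)-(5)
    pad = k // 2
    px = np.pad(phi_x, pad, mode='edge')
    py = np.pad(phi_y, pad, mode='edge')
    sx = np.zeros_like(phi_x)
    sy = np.zeros_like(phi_y)
    for di in range(k):                     # k x k mean filter, Eqs. (6)-(7)
        for dj in range(k):
            sx += px[di:di + phi_x.shape[0], dj:dj + phi_x.shape[1]]
            sy += py[di:di + phi_y.shape[0], dj:dj + phi_y.shape[1]]
    O = 0.5 * np.arctan2(sy, sx)            # Eq. (8); the 1/k^2 factor cancels
    E = np.sin(O)                           # Eq. (9)
    R = np.zeros_like(E)                    # label image, step 6
    R[1:-1, 1:-1] = sum(                    # step 7: difference to neighbours
        np.abs(E[1:-1, 1:-1] - E[1 + di:E.shape[0] - 1 + di,
                                 1 + dj:E.shape[1] - 1 + dj])
        for di in (-1, 0, 1) for dj in (-1, 0, 1))
    return np.unravel_index(np.argmax(R), R.shape)   # step 8
```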
4. REGION OF INTEREST EXTRACTION
Let I(x, y) denote the gray level at pixel (x, y) in an M × N fingerprint image and let (x_c, y_c) denote the core point. The region of interest is defined by a collection of sectors S_i, where the i-th sector S_i is computed in terms of parameters (r, \theta) as follows:
S_i = \left\{ (x, y) \;\middle|\; b(T_i + 1) \le r < b(T_i + 2), \; \theta_i \le \theta < \theta_{i+1}, \; 1 \le x \le N, \; 1 \le y \le M \right\}   (10)
where,

T_i = i \; \mathrm{div} \; k   (11)

\theta_i = (i \; \mathrm{mod} \; k) \left( \frac{2\pi}{k} \right)   (12)

r = \sqrt{ (x - x_c)^2 + (y - y_c)^2 }   (13)

\theta = \tan^{-1}\left( \frac{y - y_c}{x - x_c} \right)   (14)
b is the width of each band and k is the number of sectors considered in each band.
We use six concentric bands around the center point. Each band is 18 pixels wide (b = 18) and segmented into eight sectors (k = 8). The innermost band is not used for feature extraction because the sectors in the region near the center contain very few pixels. Thus, a total of 8 × 5 = 40 sectors (S_0 through S_39) are defined.
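The sector membership of Eqs. (10)-(14) can be sketched as a small lookup function. The exact band-offset convention is an assumption about the paper's indexing; the skipped innermost band follows the text.

```python
import numpy as np

def sector_index(x, y, xc, yc, b=18, k=8, bands=5):
    """Map pixel (x, y) to its sector index per Eqs. (10)-(14).
    Returns None outside the five usable bands; the innermost band
    (r < b) is skipped, as the text describes."""
    r = np.hypot(x - xc, y - yc)                       # Eq. (13)
    theta = np.arctan2(y - yc, x - xc) % (2 * np.pi)   # Eq. (14), in [0, 2*pi)
    band = int(r // b) - 1          # first usable band starts at r = b
    if band < 0 or band >= bands:
        return None
    sector = int(theta // (2 * np.pi / k))             # Eq. (12) inverted
    return band * k + sector        # i = band * k + sector, so T_i = i div k
```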
5. GABOR FILTERS USED FOR FINGERPRINT FEATURE EXTRACTION
By applying properly tuned Gabor filters to a fingerprint image, the true ridge and furrow structures can be greatly accentuated. These accentuated ridge and furrow structures constitute an efficient representation of a fingerprint image. The general form of a 2D Gabor filter is defined by (6). A fingerprint image is decomposed into eight component images corresponding to eight different values of the filter orientation \theta_k = (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°) with respect to the x-axis.
6. IMPLEMENTED ALGORITHM
In the proposed algorithm, the filter frequency f is set to the reciprocal of the inter-ridge distance, since most local ridge structures of fingerprints come with well-defined local frequency and orientation. The average inter-ridge distance is approximately 10 pixels in a 500 dpi fingerprint image. If f is too large, spurious ridges may be created in the filtered image, whereas if f is too small, nearby ridges may be merged into one. The bandwidth of the Gabor filters is determined by \sigma_x and \sigma_y. If the values of \sigma_x and \sigma_y are too large, the filter is more robust to noise, but is more likely to smooth the image to the extent that the ridge and furrow details in the fingerprint are lost. On the other hand, if they are too small, the filter is not effective in removing noise. In the proposed algorithm, the values of \sigma_x and \sigma_y were empirically determined and both were set to 4.0, and the filter frequency f is set to 0.1.
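An even-symmetric Gabor kernel with these parameter values (f = 0.1, \sigma_x = \sigma_y = 4.0) can be built as below. The kernel size and the rotation convention are illustrative choices, not taken from the paper.

```python
import numpy as np

def even_gabor(theta, f=0.1, sigma_x=4.0, sigma_y=4.0, size=33):
    """Even-symmetric 2-D Gabor kernel tuned to ridge angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates so the cosine wave runs across ridges at angle theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-0.5 * (xr ** 2 / sigma_x ** 2 + yr ** 2 / sigma_y ** 2))
            * np.cos(2 * np.pi * f * xr))

# Eight orientations: 0, 22.5, ..., 157.5 degrees.
bank = [even_gabor(np.deg2rad(a)) for a in np.arange(0, 180, 22.5)]
```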
Before decomposing the fingerprint image I(x, y), normalize the region of interest N_i(x, y) in each sector separately to a constant mean and variance. Normalization is done to remove the effects of sensor noise and finger pressure differences. Let I(x, y) denote the gray value at pixel (x, y), M_i and V_i the estimated mean and variance of the sector S_i respectively, and N_i(x, y) the normalized gray-level value at pixel (x, y). For all the pixels in sector S_i, the normalized image is
N_i(x, y) = \begin{cases} M_0 + \sqrt{ \dfrac{V_0 \, (I(x, y) - M_i)^2}{V_i} }, & \text{if } I(x, y) > M_i \\[4pt] M_0 - \sqrt{ \dfrac{V_0 \, (I(x, y) - M_i)^2}{V_i} }, & \text{otherwise} \end{cases}   (15)
where M_0 and V_0 are the desired mean and variance values, respectively.
Normalization is a pixel-wise operation that does not change the clarity of the ridge and furrow structures. If normalization is done on the entire image, then it cannot compensate for the intensity variations in the different parts of the finger due to finger pressure differences. Normalizing each sector separately alleviates this problem. In the proposed algorithm, both M_0 and V_0 were set to a value of 100.
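The per-sector normalization of Eq. (15) can be sketched directly; the zero-variance guard is an addition not spelled out in the text, and the names are illustrative.

```python
import numpy as np

def normalize_sector(pixels, m0=100.0, v0=100.0):
    """Normalize one sector's gray values to mean m0 and variance v0
    per Eq. (15). `pixels` is a 1-D array of the sector's pixels."""
    mi, vi = pixels.mean(), pixels.var()    # estimated M_i and V_i
    if vi == 0:                             # flat sector: nothing to stretch
        return np.full(pixels.shape, m0)
    dev = np.sqrt(v0 * (pixels - mi) ** 2 / vi)
    return np.where(pixels > mi, m0 + dev, m0 - dev)

sector = np.array([90.0, 110.0, 95.0, 105.0])
out = normalize_sector(sector)
```

Because the transform only rescales deviations from the sector mean, the output has mean M_0 and variance V_0 exactly.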
After setting all the parameters of the Gabor filters, the even Gabor feature at sampling point (X, Y) can be calculated using,

G(X, Y, \theta_k, f, \sigma_x, \sigma_y) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} N_i(X + x, Y + y) \, g(x, y, \theta_k, f, \sigma_x, \sigma_y)   (16)

where N_i(\cdot, \cdot) denotes a sector of the normalized fingerprint image I(x, y) of size M × N, having 256 gray levels.
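Eq. (16) is a single correlation tap: a sum over the kernel support of image values times kernel values. A minimal sketch, with `window` standing for the image patch starting at the sampling point (X, Y):

```python
import numpy as np

def gabor_feature(window, kernel):
    """Even Gabor feature of Eq. (16) at one sampling point: the sum
    over the M x N kernel support of N_i(X + x, Y + y) * g(x, y, ...).
    Real code would slide (X, Y) over the image to build a component
    image; names here are illustrative."""
    M, N = kernel.shape
    return float((window[:M, :N] * kernel).sum())
```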
Figure 2 (a)-(h) Gabor features of a fingerprint image for \theta_k = (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°)
Figure 3 (a) Original image (b) Tessellated image (c) Reconstructed image using four Gabor filters (d) Reconstructed image using eight Gabor filters
The magnitude Gabor features at the sample point and those of its neighbouring points within three pixels are similar, while the others are not. This is because the magnitude Gabor feature has the shift-invariant property. A fingerprint image I(x, y) is thus normalized and convolved with each of the eight Gabor filters to produce eight component images.
Convolution with a 0° oriented filter accentuates ridges parallel to the x-axis and smoothes ridges that are not parallel to the x-axis. Filters tuned to other directions work in a similar way. According to the experimental results, the eight component images capture most of the ridge directionality information present in a fingerprint image (see Figure 2) and thus form a valid representation. This is illustrated by reconstructing a fingerprint image by adding together all eight filtered images. The reconstructed image is similar to the original image, but the ridges have been enhanced. Filtered and reconstructed images from four and eight filters for the fingerprint are shown in Figure 3 and Figure 4.
Figure 4 (a) Original image (b) Tessellated image (c) Reconstructed image using four Gabor filters (d) Reconstructed
image using eight Gabor filters.
6.1 Minutiae Extraction
Minutiae represent local ridge details. Ridge endings and ridge bifurcations are the two popular minutiae used for fingerprint matching applications. A ridge bifurcation is the point on an image where a ridge branches out into two, and a ridge ending is the open end of a ridge. These features are unique to each fingerprint and are used for fingerprint recognition. A template image is created for all the detected ridge bifurcations and ridge endings in an image after false minutiae rejection, as shown in Figure 5. The minutiae matching score is a measure of similarity of the minutiae sets of the query and template images. The similarity score is normalized to the [0, 100] range.
Figure 5 Minutiae set of (a) query image and (b) template image
6.2 Finger code generation
To generate the Gabor filter-based finger code from the fingerprint image, the following steps are performed sequentially:
Step 1: Find the core point of each fingerprint image.
Step 2: Tessellate the region of interest around the reference point into 40 sectors and filter the fingerprint image with the set of Gabor filters to give the filtered sector images N_{i\theta_k}(x, y).
Step 3: Compute the average absolute deviation of each sector of each filtered image,

F_i = \frac{1}{n_i} \sum_{(x, y) \in S_i} \left| N_i(x, y) - P_i \right|   (17)

where n_i is the number of pixels in the sector S_i and P_i is the mean of pixel values in the sector S_i.
Thus, the average absolute deviation of each sector of the eight filtered images defines the 320 components (8 × 5 × 8) of the finger code. The query and template finger codes are then matched and the matching score is found. The minutiae and finger code matching scores are then combined to generate a single matching score.
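The finger code of Eq. (17) reduces to one average absolute deviation per (filter, sector) pair. A sketch, where the flat list-of-arrays layout for the filtered sector pixels is an assumption about the data representation:

```python
import numpy as np

def finger_code(filtered_sectors):
    """Finger code per Eq. (17): for every (filter, sector) pair, the
    average absolute deviation of the sector's pixels from the sector
    mean P_i. `filtered_sectors` is a list of 1-D pixel arrays, one per
    (filter, sector) pair."""
    code = []
    for pixels in filtered_sectors:
        p_i = pixels.mean()                      # P_i
        code.append(np.abs(pixels - p_i).mean()) # F_i
    return np.array(code)        # 8 filters x 40 sectors -> 320 components
```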
7. EXPERIMENTAL RESULTS
Although the fingerprint databases of NIST, MSU, and FBI are sampled at 500 dpi, fingerprint images can be recognized at 200 dpi by the human eye. The recognition of low quality images is efficient and practicable for a small-scale fingerprint recognition system. In the proposed system we have used inked fingerprint images from each person (two images) and captured them in digital format with a scanner at 200 dpi and 256 gray-level resolution. The minutiae and finger code are stored in the database as a template. The minutiae features are unique to each fingerprint and are used for fingerprint recognition.
Figure 6 (a)-(h) Finger codes for \theta_k = (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°) and (i) original image.
Figure 7 (a)-(h) Finger codes for \theta_k = (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°) and (i) original image.
Figure 6 and Figure 7 show the finger codes of two fingerprints belonging to different persons. From these figures we find that the finger codes of different persons do not match. This reveals that using both the minutiae and the generated finger codes provides more security when used for criminal identification from fingerprints found at the scene of a crime. The fingerprint matching is based on the Euclidean distance between the two corresponding finger codes and hence is extremely fast.
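The distance computation itself is a one-liner. A sketch; the `threshold` parameter is a hypothetical knob, since the text only discusses the exact-zero (perfect match) case reported in Table 1:

```python
import numpy as np

def match(code_query, code_template, threshold=0.0):
    """Finger code matching by Euclidean distance between the two
    320-component vectors; a distance of zero is a perfect match."""
    diff = np.asarray(code_query) - np.asarray(code_template)
    d = float(np.linalg.norm(diff))
    return d, d <= threshold
```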
Another experiment was performed to find the Euclidean distance between a test image and the rest of the images of the same group. Table 1 shows the test results for 10 fingerprint images, each with its own set of 8 distorted images. The distortion was carried out with respect to brightness, contrast, partial cut of images, blurriness, etc. The tabulated results of the implemented system show that a Euclidean distance of zero indicates a perfect match. The test data show that the Euclidean distance and its mean play a vital role in identifying any given input image (latent) against its corresponding stored template images (reference prints). The implemented system performs well on the whole database.
Table 1 Computation of Euclidean distance and mean using the proposed algorithm

Serial No  Image ID  Euclidean Distance  Mean of Euclidean Distance
1          101_1     0                   (5570.8424 / 8) = 696.3553
           101_2     1322.2815
           101_3     904.1864
           101_4     840.7033
           101_5     963.627
           101_6     522.3945
           101_7     462.6671
           101_8     554.9826
2          102_1     859.8459            (6927.1888 / 8) = 865.8986
           102_2     1118.2438
           102_3     941.0052
           102_4     914.3823
           102_5     939.1802
           102_6     799.4998
           102_7     756.8328
           102_8     598.1988
3          103_1     1114.7228           (6358.405 / 8) = 794.800625
           103_2     768.2789
           103_3     879.9089
           103_4     1142.6031
           103_5     750.9579
           103_6     705.0551
           103_7     445.9306
           103_8     550.9477
4          104_1     741.2131            (4261.459 / 8) = 532.682375
           104_2     578.8144
           104_3     692.6348
           104_4     584.2756
           104_5     454.4189
           104_6     392.6432
           104_7     361.272
           104_8     456.187
5          105_1     796.7528            (5927.0577 / 8) = 740.8822125
           105_2     841.3898
           105_3     809.0306
           105_4     957.4854
           105_5     685.9827
           105_6     756.334
           105_7     528.4521
           105_8     551.6303
6          106_1     652.761             (5701.8574 / 8) = 712.732175
           106_2     708.5278
           106_3     918.1781
           106_4     731.8565
           106_5     647.9269
           106_6     611.8686
           106_7     835.2303
           106_8     595.5082
7          107_1     675.2977            (5999.043 / 8) = 749.880375
           107_2     654.7358
           107_3     899.821
           107_4     910.4479
           107_5     577.1496
           107_6     902.5137
           107_7     740.1758
           107_8     638.9015
8          108_1     645.7223            (4846.51872 / 8) = 605.81484
           108_2     509.1288
           108_3     888.2192
           108_4     496.0903
           108_5     597.2147
           108_6     659.3944
           108_7     419.7498
           108_8     630.9985
9          109_1     794.6275            (5317.2688 / 8) = 664.6586
           109_2     754.6599
           109_3     877.7669
           109_4     914.9374
           109_5     565.109
           109_6     475.1324
           109_7     552.8026
           109_8     382.2331
10         110_1     932.6882            (5546.1557 / 8) = 693.2694625
           110_2     740.8111
           110_3     755.3441
           110_4     655.5515
           110_5     502.4248
           110_6     592.0504
           110_7     719.5817
Figure 8 Enrollment of the fingerprint image from the subject.
Snapshots of the GUI for various imprints are provided below. The fingerprint image is accepted from the subject using a fingerprint sensor. Then enrolment of the fingerprint image from the subject in the form of a feature vector is carried out, as shown in Figure 8. Matching the query fingerprint image found at the crime scene against the fingerprint images available in the database is then carried out, as shown in Figure 9.
Figure 9 Matching the query fingerprint image found at the crime scene with the fingerprint images available in the database.
8. CONCLUSION
The proposed matching algorithm, which uses both minutiae (point) information and texture (region) information, is more accurate. Results obtained on fingerprints captured in digital format with a scanner at 200 dpi and 256 gray-level resolution show that a combination of minutiae-based score matching and texture-based (local as well as global) information leads to a substantial improvement in the overall matching performance. The filter frequency f and the values of \sigma_x and \sigma_y that determine the bandwidth of the Gabor filter should be selected properly. If f is too large, spurious ridges may be created in the filtered image, whereas if f is too small, nearby ridges may be merged into one. Similarly, if the values of \sigma_x and \sigma_y are too large, the filter is more robust to noise, but is more likely to smooth the image to the extent that the ridge and furrow details in the fingerprint are lost. On the other hand, if they are too small, the filter is not effective in removing noise. Fingerprint matching using the Euclidean distance between the query and the template finger codes is extremely fast. This reveals that, by setting the parameters to appropriate values, the method is more efficient and suitable than conventional methods as an automated system for criminal identification based on fingerprints found at the crime scene. Also, the Euclidean distance and its mean play a vital role in identifying any given input image against its corresponding stored template images.
Authors:
M.P. Deshmukh: He received his M.E. from MNREC, Allahabad, and is presently pursuing his PhD at North Maharashtra University, Jalgaon (M.S.). He has 24 years of teaching experience.
Prof. (Dr.) P.M. Patil: He has 25 years of experience and is at present Director & Principal, RMD SIT, Warje, Pune. He has several publications in national and international journals, and a number of research students are pursuing their PhD under his guidance.