
**BIOMETRIC PALMPRINT VERIFICATION**

I Ketut Gede Darma Putra

Department of Electrical Engineering, Faculty of Engineering
Udayana University, Bukit Jimbaran, Bali - Indonesia
email: duglaire@yahoo.com

Abstract — This paper proposes a new technique to extract palmprint features based on fractal codes. The palmprint feature representation is formed from the positions of range blocks and the directions between the positions of range and domain blocks in the fractal codes. Each palmprint representation is divided into a set of n blocks, and the mean value of each block is used to form the feature vector. A normalized correlation metric is used to measure the degree of similarity between two palmprint feature vectors. We collected 1050 palmprint images, 5 samples from each of 210 persons. Experimental results show that the proposed method achieves an acceptable accuracy rate, with FRR = 1.7544 and FAR = 0.6998.

Keywords: biometrics, fractal codes, fractal dimension, feature extraction, palmprint recognition

I. INTRODUCTION

Personal verification has become an important and highly demanded technique for security access systems in this information era. Traditional automatic personal recognition can be divided into two categories: token-based, such as a physical key, an ID card, or a passport, and knowledge-based, such as a password or a PIN. However, these approaches have limitations. In the token-based approach, the "token" can easily be stolen or lost; in the knowledge-based approach, the "knowledge" can be guessed or forgotten [21]. To reduce the security problems of traditional methods, biometric verification techniques have been intensively studied and developed to improve the reliability of personal verification.

Biometric approaches use human physiological or behavioral features to identify a person. The most widely used biometric features are fingerprints, and the most reliable are irises. However, it is very difficult to extract small minutiae features from unclear fingerprints, and iris input devices are very expensive [19]. Other biometric features, such as face, voice, hand geometry, and handwriting, are less accurate: faces and voices can be mimicked easily, and hand geometry and handwriting can be faked easily.

Palmprint is a relatively new physiological biometric [18]. There are many unique features in a palmprint image that can be used for personal recognition: principal lines, wrinkles, ridges, minutiae points, singular points, and texture are all regarded as useful features for palmprint representation [21]. A palmprint has several advantages over other available features: low-resolution images and low-cost capture devices can be used, it is very difficult or impossible to fake a palmprint, and its characteristics are stable and unique [18].

Recently, many verification/identification technologies using palmprint biometrics have been developed [2],[3],[4],[5],[11],[12],[13],[18],[21]. Zhang et al. [21] applied a 2-D Gabor filter to obtain the texture features of palmprints. Pang et al. [13] used pseudo-orthogonal moments to extract palmprint features. Li et al. [12] transformed the palmprint from the spatial to the frequency domain using the Fourier transform and then computed ring and sector energy features. Connie et al. [2] extracted palmprint texture features using PCA and ICA. Wu et al. [18] extracted line feature vectors (LFV) using the magnitudes and orientations of the gradient at points on palm lines. Kumar et al. [11] combined palmprints and hand geometry for a verification system; each palmprint was divided into overlapping blocks, and the standard deviation of each block was used to form the feature vector.

In this paper, we propose a new technique to extract palmprint features based on fractal codes. This technique differs from the methods in [4] and [5].

II. IMAGE ACQUISITION

All palm images were captured using a Sony DSC-P72 digital camera at a resolution of 640 x 480 pixels. Each person was requested to put his/her left hand palm-down on a board with a black background. Some pegs on the board control the hand's orientation, translation, and stretching. A sample of the hand and peg positions on the black board is shown in Figure 1 (a).

III. PALMPRINT EXTRACTION AND NORMALIZATION

This paper uses a new technique to extract the ROI (region of interest) of the palmprint. The technique applies the center of mass (centroid) method in two stages. Its steps can be explained as follows.

a. The gray-level hand image is thresholded to obtain a binary hand image. The threshold value is computed automatically using the Otsu method. A median filter is then applied to remove white (non-object) pixels outside the hand object.
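Step (a) can be sketched with a minimal NumPy implementation of Otsu's method (the median-filter cleanup is omitted here, and the toy image and function name are illustrative, not from the paper):

```python
import numpy as np

def otsu_threshold(gray):
    """Compute Otsu's threshold for an 8-bit grayscale image by
    exhaustively searching the level that maximizes the
    between-class variance of background/foreground pixels."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Binarize a toy "hand on black background" image.
img = np.zeros((8, 8), dtype=np.uint8)
img[2:6, 2:6] = 200                      # bright hand region
binary = (img >= otsu_threshold(img)).astype(np.uint8)
```

In practice the threshold separates the bright hand from the black background in one pass over the histogram.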

(IJCSIS) International Journal of Computer Science and Information Security, Vol. 9, No. 2, February 2011, http://sites.google.com/site/ijcsis/, ISSN 1947-5500

b. Each acquired hand image needs to be aligned in a preferred direction so as to capture the same features for matching. The moment orientation method is applied to the binary image to estimate the orientation of the hand. In this method, the angle of rotation θ is the difference between the normal axis and the major axis of the object's ellipse, computed as follows:

θ = (1/2) arctan( 2μ_{1,1} / (μ_{2,0} − μ_{0,2}) )    (1)

μ_{p,q} = Σ_m Σ_n (m − m̄)^p (n − n̄)^q    (2)

where μ_{p,q} represents the (p,q)-th central moment and (m̄, n̄) represents the center of area, defined as

m̄ = (1/N) Σ_m Σ_n m,    n̄ = (1/N) Σ_m Σ_n n,    (3)

where N represents the number of object pixels. Furthermore, the grayscale and binary images are rotated by θ degrees.
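Equations (1)-(3) reduce to a few lines of NumPy. This sketch uses arctan2 for numerical robustness when μ_{2,0} = μ_{0,2}; the function name and the diagonal-bar test image are illustrative:

```python
import numpy as np

def orientation_angle(binary):
    """Estimate object orientation from central moments (Eqs. 1-3).

    Returns the rotation angle theta (radians) of the major axis,
    computed from the second-order central moments of the object
    pixels in a binary image."""
    m, n = np.nonzero(binary)            # coordinates of object pixels
    m_bar, n_bar = m.mean(), n.mean()    # centroid (Eq. 3)
    dm, dn = m - m_bar, n - n_bar
    mu11 = (dm * dn).sum()               # central moments (Eq. 2)
    mu20 = (dm ** 2).sum()
    mu02 = (dn ** 2).sum()
    # Eq. (1); arctan2 handles the mu20 == mu02 case gracefully.
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

# A diagonal bar should report an orientation of about 45 degrees.
bar = np.eye(20, dtype=np.uint8)
theta = np.degrees(orientation_angle(bar))
```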

c. A bounding-box operation is applied to the rotated binary image to get the smallest rectangle that contains the binary hand image. The original hand image, the binarized image, and the bounded image are shown in Figure 1 (a), (b), and (c), respectively.

d. The centroid of the bounded image is computed using equation (3), and based on this centroid the bounded binary and original images are segmented to 200 x 200 pixels. The segmented image and its centroid position are shown in Figure 1 (d) and (e).

e. The centroid of the segmented binary image is computed, and based on this centroid the ROI of the grayscale palmprint image is cropped to a size of 128 x 128 pixels. The first and second centroid positions in the binary and gray-level images are shown in Figure 1 (f) and (g).
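Steps (d) and (e) amount to two centroid-centred crops (200 x 200, then 128 x 128). A sketch, under the assumption that windows are clipped to stay inside the image (the blob image is a stand-in for a real hand image):

```python
import numpy as np

def crop_around_centroid(img, binary, size):
    """Crop a size x size window of `img` centred on the centroid of
    the object pixels in `binary`, clipping the window so it stays
    inside the image bounds."""
    m, n = np.nonzero(binary)
    cm, cn = int(m.mean()), int(n.mean())
    half = size // 2
    top = min(max(cm - half, 0), img.shape[0] - size)
    left = min(max(cn - half, 0), img.shape[1] - size)
    return img[top:top + size, left:left + size]

# Toy 480 x 640 "hand image" with a bright object blob.
gray = np.zeros((480, 640), dtype=np.uint8)
gray[100:300, 200:400] = 150
binary = (gray > 0).astype(np.uint8)
seg = crop_around_centroid(gray, binary, 200)                   # step d
roi = crop_around_centroid(seg, (seg > 0).astype(np.uint8), 128)  # step e
```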

This method is simple. It has been tested on 1050 palmprint images acquired from 210 persons, and the results show that it is reliable.

Before the feature extraction phase, the extracted ROIs are normalized using the normalization method in [11] to reduce possible imperfections in the image due to non-uniform illumination. The method is as follows:

I'(x, y) = φ_d + λ_d   if I(x, y) > φ
I'(x, y) = φ_d − λ_d   otherwise    (4)

λ_d = sqrt( ρ_d {I(x, y) − φ}² / ρ )    (5)

where I and I' represent the original grayscale palmprint image and the normalized image, respectively; φ and ρ represent the mean and variance of the original image, respectively; and φ_d and ρ_d are the desired values for the mean and variance, respectively. This research uses φ_d = 180 and ρ_d = 180 for all experiments.
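Equations (4)-(5) can be sketched as below, assuming φ and ρ are taken over the whole ROI (the random test image is only a placeholder for a real palmprint ROI):

```python
import numpy as np

def normalize(img, phi_d=180.0, rho_d=180.0):
    """Mean/variance normalization of Eqs. (4)-(5).

    phi and rho are the mean and variance of the input image;
    phi_d and rho_d are the desired mean and variance (both 180
    in the paper)."""
    img = img.astype(float)
    phi, rho = img.mean(), img.var()
    lam = np.sqrt(rho_d * (img - phi) ** 2 / rho)       # Eq. (5)
    return np.where(img > phi, phi_d + lam, phi_d - lam)  # Eq. (4)

rng = np.random.default_rng(0)
palm = rng.integers(0, 256, size=(128, 128))
norm = normalize(palm)
```

By construction, the mean-squared deviation of the output around φ_d equals ρ_d exactly, which is what makes the result robust to illumination differences.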

(a) (b) (c)
(d) (e) (f) (g)

Figure 1. Extraction of the palmprint: (a) original image, (b) binary image of (a), (c) bounded object, (d) and (e) position of the first centroid of mass in the segmented binary and gray-level images, respectively, (f) and (g) position of the second centroid of mass in the segmented binary and gray-level images, respectively.

IV. FEATURE EXTRACTION

There are three main steps to extracting the palmprint features based on fractal codes in the proposed method. These steps can be explained as follows.

A. Extraction of fractal codes of palmprint images

Fractal codes of palmprint images are obtained using the partitioned iterated function system (PIFS) method. In the PIFS method, each image is partitioned into range blocks and domain blocks; the domain blocks are usually larger than the range blocks. The relation between a pair of range block R_i and domain block D_i is noted as

R_i = w_i(D_i)    (6)

where w_i is a contractive mapping that describes the similarity relation between R_i and D_i, and is usually defined as an affine transformation as below:

        ⎡x⎤   ⎡a_i  b_i   0 ⎤ ⎡x⎤   ⎡e_i⎤
    w_i ⎢y⎥ = ⎢c_i  d_i   0 ⎥ ⎢y⎥ + ⎢f_i⎥    (7)
        ⎣z⎦   ⎣ 0    0   s_i⎦ ⎣z⎦   ⎣o_i⎦

where x_i and y_i represent the top-left coordinates of R_i, and z_i is the brightness value of the block. The matrix elements a_i, b_i, c_i, and d_i are the parameters of the spatial rotations and flips of D_i, s_i is the contrast scaling, and o_i is the luminance offset. The vector elements e_i and f_i are the spatial offsets. In this paper, the domain region is twice the range size, so the magnitudes of a_i, b_i, c_i, and d_i are 0.5. The fractal code f_i below is usually used in practice [19]:

f_i = ( (x_{R_i}, y_{R_i}), size_i, (x_{D_i}, y_{D_i}), θ_i, s_i, o_i )    (8)


where (x_{R_i}, y_{R_i}) and (x_{D_i}, y_{D_i}) represent the top-left coordinate positions of the range block and domain block, respectively, and size_i is the size of the range block. The fractal code of a palmprint image is denoted as follows:

F = ∪_{i=1}^{N} f_i    (9)

where N represents the number of fractal codes. The inequality below is used to indicate whether a range block and the corresponding domain block are similar:

d(R, D) ≤ ε    (10)

where d(R, D) represents the rmse value and ε is the threshold (tolerance) value. The range block and the corresponding domain block are considered similar if d(R, D) is less than or equal to ε; otherwise, the blocks are regarded as not similar.
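The PIFS search of Eqs. (6)-(10) can be sketched as a brute-force NumPy routine. This is an illustrative simplification: the eight spatial isometries θ_i of Eq. (8) are omitted, and the block sizes, tolerance, and toy ramp image are assumptions, not the paper's settings:

```python
import numpy as np

def encode_pifs(img, rsize=4, tol=8.0):
    """Brute-force PIFS encoding sketch.

    For each rsize x rsize range block, search every domain block of
    twice its size (downsampled by 2x2 averaging) for the best affine
    match R ~ s*D + o, and keep the code when rmse <= tol (Eq. 10)."""
    h, w = img.shape
    dsize = 2 * rsize
    # Pre-compute all candidate (downsampled) domain blocks.
    domains = []
    for dy in range(0, h - dsize + 1, dsize):
        for dx in range(0, w - dsize + 1, dsize):
            d = img[dy:dy + dsize, dx:dx + dsize].astype(float)
            d = d.reshape(rsize, 2, rsize, 2).mean(axis=(1, 3))
            domains.append(((dx, dy), d))
    codes = []
    for ry in range(0, h, rsize):
        for rx in range(0, w, rsize):
            r = img[ry:ry + rsize, rx:rx + rsize].astype(float)
            best = None
            for (dx, dy), d in domains:
                # Least-squares contrast s and offset o for R ~ s*D + o.
                dv, rv = d.ravel(), r.ravel()
                var = dv.var()
                s = 0.0 if var == 0 else np.cov(dv, rv, bias=True)[0, 1] / var
                o = rv.mean() - s * dv.mean()
                rmse = np.sqrt(((s * dv + o - rv) ** 2).mean())
                if best is None or rmse < best[0]:
                    best = (rmse, (rx, ry), rsize, (dx, dy), s, o)
            if best[0] <= tol:          # similarity test, Eq. (10)
                codes.append(best[1:])  # fractal code f_i, Eq. (8) sans theta
    return codes

img = np.tile(np.arange(32, dtype=float), (32, 1))  # smooth ramp image
codes = encode_pifs(img)
```

A smooth ramp is exactly self-similar under s = 0.5, so every range block finds a zero-error match; real palmprints only pass the test of Eq. (10) where range and domain blocks are structurally similar.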

B. Palmprint feature representation

The first step of this method is the forming of the angle image A as follows:

A(j, k) = α_i,   j = 1, 2, 3, …, M_1,   k = 1, 2, 3, …, M_2    (11)

α_i = arctan( (y_{D_i} − y_{R_i}) / (x_{D_i} − x_{R_i}) )   if j = x_{R_i} and k = y_{R_i};
α_i = 0   otherwise    (12)

where (x_{D_i}, y_{D_i}) represents the top-left coordinate of the domain block (see formula (8)) and α_i represents the angle between the range and domain blocks. The angle image is not a binary image representation. The criteria below are added to compute the direction α_i:

if x_R < x_D and y_R ≥ y_D then α_i = α_i
if x_R > x_D and y_R ≥ y_D then α_i = 180 − α_i
if x_R > x_D and y_R ≤ y_D then α_i = 180 + α_i
if x_R < x_D and y_R ≤ y_D then α_i = 360 − α_i
if x_R = x_D and y_R ≥ y_D then α_i = 90
if x_R = x_D and y_R ≤ y_D then α_i = 270    (13)

The criterion size_i = min(size) means the palmprint feature representation is formed using only the coordinates of the smallest-size range blocks. The representation is then filtered as follows:

I'(x, y) = I(x, y) ∗ h_{m×n}(x, y)    (14)

where h(x, y) is a filter whose components are all one. Figure 2(b) shows the palmprint feature image of Figure 2(a).
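One plausible reading of Eqs. (11)-(13) is the sketch below, where the quadrant rules map each range-to-domain direction onto 0-360 degrees (the helper name and the sample fractal codes are hypothetical, and the treatment of signs inside the arctan is our assumption):

```python
import numpy as np

def block_angle(xr, yr, xd, yd):
    """Direction (degrees, 0-360) from a range block at (xr, yr) to its
    matched domain block at (xd, yd), following the quadrant rules of
    Eq. (13)."""
    if xr == xd:
        return 90.0 if yr >= yd else 270.0
    a = np.degrees(np.arctan(abs(yd - yr) / abs(xd - xr)))  # Eq. (12)
    if xr < xd and yr >= yd:
        return a
    if xr > xd and yr >= yd:
        return 180.0 - a
    if xr > xd and yr <= yd:
        return 180.0 + a
    return 360.0 - a          # xr < xd and yr <= yd

# Build the angle image A (Eq. 11) from a few hypothetical codes:
# each entry is ((x_R, y_R), (x_D, y_D)).
codes = [((0, 0), (4, 4)), ((8, 8), (8, 0)), ((4, 0), (0, 4))]
A = np.zeros((16, 16))
for (xr, yr), (xd, yd) in codes:
    A[yr, xr] = block_angle(xr, yr, xd, yd)
```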

C. Palmprint feature vector

The palmprint feature vector V is obtained by dividing the palmprint feature image into 16 x 16 blocks and computing the mean value of each block, which gives the feature vector V = (v_1, v_2, …, v_N), where N = 256 and v_i is the mean value of block i.
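The block-mean feature vector of Section IV-C can be computed with a single reshape. A sketch, with a ramp image standing in for a real 128 x 128 feature image:

```python
import numpy as np

def feature_vector(feat_img, blocks=16):
    """Divide the feature image into blocks x blocks sub-blocks and
    take each sub-block's mean, yielding V = (v_1, ..., v_N) with
    N = blocks**2 (256 for a 16 x 16 grid)."""
    h, w = feat_img.shape
    bh, bw = h // blocks, w // blocks
    v = feat_img[:blocks * bh, :blocks * bw].reshape(
        blocks, bh, blocks, bw).mean(axis=(1, 3))
    return v.ravel()

img = np.arange(128 * 128, dtype=float).reshape(128, 128)
V = feature_vector(img)
```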

(a) (b)
(c) (d)

Figure 2. Palmprint feature extraction: (a) original image, (b) image I, (c) image I', (d) block feature representation.

Figure 2 (d) shows the palmprint feature representation in 16 x 16 sub-blocks. Figure 3 shows an example of three groups of palmprints from the same palm and from palms with similar/different line structures. The features of these palmprints are plotted in Figure 4. The results show that the features of three palm images from the same person are closer to each other than the features of three palm images from different persons with similar or different line structures.

V. PALMPRINT FEATURE MATCHING

The degree of similarity between two palmprint feature vectors is computed as follows:

d_rs = 1 − (x_r − x̄_r)(x_s − x̄_s)^T / { [ (x_r − x̄_r)(x_r − x̄_r)^T ]^{1/2} [ (x_s − x̄_s)(x_s − x̄_s)^T ]^{1/2} }    (15)

where x̄_r and x̄_s are the means of the palmprint features x_r and x_s, respectively. The equation computes one minus the normalized correlation between palmprint feature vectors x_r and x_s. The values of d_rs lie between 0 and 2; d_rs will be close to 0 if x_r and x_s are obtained from two images of the same palm, and far from 0 otherwise.
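Eq. (15) is one minus the normalized (Pearson) correlation of the two feature vectors, which can be sketched as:

```python
import numpy as np

def match_score(xr, xs):
    """One minus the normalized correlation of Eq. (15).

    Returns 0 for identical patterns (up to gain and offset) and
    values up to 2 for anti-correlated feature vectors."""
    xr = xr - xr.mean()
    xs = xs - xs.mean()
    return 1.0 - (xr @ xs) / np.sqrt((xr @ xr) * (xs @ xs))

a = np.array([1.0, 2.0, 3.0, 4.0])
b = 2.0 * a + 5.0           # same pattern, different gain/offset
c = a[::-1].copy()          # reversed pattern
```

Invariance to gain and offset is a useful side effect here: two ROIs of the same palm captured under different lighting still score near 0.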

Figure 4 compares the feature components of the palmprints shown in Figure 3, and their matching scores are listed in Table 1. The matching scores of group A are close to 0, while the matching scores of groups B and C are far from 0. The average scores of groups A, B, and C are 0.1762, 0.5057, and 0.6452, respectively. It is easy to distinguish group A from groups B and C using these scores.

(a1) (a2) (a3)

Group A: palmprints from the same person


(b1) (b2) (b3)
Group B: palmprints from different persons with similar line structure

(c1) (c2) (c3)
Group C: palmprints from different persons with different line structure

Figure 3. Example of three groups of palmprints

Table 1. Matching scores of groups A, B, and C in Figure 3

Group A    a1      a2      a3      Average
a1         0       0.1957  0.1404
a2         0.1957  0       0.1925  0.1762
a3         0.1404  0.1925  0

Group B    b1      b2      b3      Average
b1         0       0.5352  0.3056
b2         0.5352  0       0.6763  0.5057
b3         0.3056  0.6763  0

Group C    c1      c2      c3      Average
c1         0       0.6900  0.6177
c2         0.6900  0       0.6280  0.6452
c3         0.6177  0.6280  0

VI. EXPERIMENTS AND RESULTS

We collected palm images from 210 persons of both sexes and different ages, 5 samples from each person, so our database contains 1050 images. The resolution of each hand image is 640 x 480 pixels. The palmprint images, of size 128 x 128 pixels, were automatically extracted from the hand images as described in Section III. The averages of the first three images from each user were used for training, and the rest were used for testing.

The performance of the verification system is obtained by matching each testing palmprint image against all training palmprint images in the database. A match is counted as correct if the two palmprint images are from the same palm, and as incorrect otherwise.

(a)
(b)
(c)

Figure 4. Comparison of the feature components of the palmprint groups shown in Figure 3: (a), (b), and (c) are the feature components of groups A, B, and C, respectively. Red, green, and blue denote the first, second, and third palmprint in each group, respectively.

Figure 5. Distribution of three feature components (v_22, v_24, v_26) of 1050 palmprints in feature space.
[3-D scatter plot; axes v_22, v_24, v_26.]


(a) (b)

Figure 6. Performance of the verification system: (a) genuine and imposter distributions, (b) FAR/FRR/EER at various thresholds.

Table 2. FRR/FAR at various threshold values

Threshold   FRR      FAR
0.4386      2.0734   0.4734
0.4586      1.9139   0.5158
0.4626      1.7544   0.6998
0.4746      1.4354   0.9160
0.4786      1.2759   1.3552
0.4986      1.1164   2.1480
0.5386      1.1164   2.2881

Figure 6 (a) shows the probability distributions of the genuine and imposter parts with tolerance value ε = 3 and feature vector length 256 (16 x 16 blocks). The genuine and imposter distributions are estimated from the correct and incorrect matching scores, respectively. The false acceptance rate (FAR) and false rejection rate (FRR) at various thresholds are shown in Figure 6 (b). The equal error rate (EER) of the verification system is 1.2758. Table 2 shows the system performance (FRR/FAR) at several threshold values.

The main advantage of using PIFS codes in this paper is that both the palmprint features and the palmprint image can be obtained directly from the compressed domain (the fractal codes).
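For reference, FRR and FAR at a given threshold can be estimated from lists of genuine and imposter matching scores as below (the score samples are hypothetical, not the paper's data):

```python
import numpy as np

def far_frr(genuine, imposter, threshold):
    """FRR: fraction of genuine scores rejected (score > threshold);
    FAR: fraction of imposter scores accepted (score <= threshold).
    Scores are distances d_rs of Eq. (15); rates returned in percent."""
    genuine = np.asarray(genuine)
    imposter = np.asarray(imposter)
    frr = 100.0 * (genuine > threshold).mean()
    far = 100.0 * (imposter <= threshold).mean()
    return far, frr

# Hypothetical score samples just to exercise the function.
gen = [0.10, 0.18, 0.25, 0.51]
imp = [0.35, 0.55, 0.64, 0.70]
far, frr = far_frr(gen, imp, 0.45)
```

Sweeping the threshold and finding where the two curves cross gives the EER reported above.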

VII. CONCLUSIONS AND FUTURE WORK

In this paper, we introduced a fractal-characteristics-based feature extraction and representation method for palmprint verification. The experimental results show that the proposed method can achieve an acceptable accuracy rate, with FRR = 1.7544 and FAR = 0.6998. In the future, we will combine the proposed method with the wavelet transform to extract palmprint features while retaining the block operation.

REFERENCES

[1] Chih-Lung Lin, "Biometric Verification Using Palmprints and Vein-patterns of Palm-dorsum", http://thesis.lib.ncu.edu.tw/etd-db/etd-search/
[2] Connie T., Andrew Teoh, Michael Goh, David Ngo, 2003, "Palmprint Recognition with PCA and ICA", sprg.massye.ac.nz/ivcnz/proccedings/ivcnz_41.pdf
[3] C.L. Lin, Biometric Verification Using Palmprints and Vein-patterns of Palm-dorsum, 2004, http://thesis.lib.ncu.edu.tw/etd-db/etd-search/
[4] Darma Putra, I.K.G., Adhi Susanto, A. Harjoko & T.S. Widodo, Palmprint Verification based on Fractal Codes and Fractal Dimensions, Proceedings of the Eighth IASTED International Conference on Signal and Image Processing, Honolulu, Hawaii, 2006, pp. 323-328.
[5] Darma Putra, Adhi Susanto, Agus Harjoko, Thomas Sri Widodo, 2006, Biometrics Palmprint Verification Using Fractal Method, EECCIS Proceedings, Part 2, pp. 22-23, Brawijaya University, Malang, Indonesia.
[6] Duta N., Jain A.K., Mardia K.V., 2002, Matching of Palmprints, Pattern Recognition Letters, 23, pp. 477-485.
[7] Ekinci Murat, Vasif V. Nabiyev, Yusuf Ozturk, 2003, A Biometric Personal Verification Using Palmprint Structural Features and Classifications, IJCI Proceedings of Intl. XII, Vol. 1, No. 1.
[8] Jain A.K., 1995, Fundamentals of Digital Image Processing, Second Printing, Prentice-Hall, Inc.
[9] Jain A.K., Ross A., and Pankanti S., 1999, A Prototype Hand Geometry-based Verification System, www.research.ibm.com/ecvg/publications.html
[10] Jain A.K., Introduction to Biometrics System, http://biometrics.cse.msu.edu/.
[11] Kumar A., David C.M. Wong, Helen C. Shen, Anil K. Jain, 2004, "Personal Verification using Palmprint and Hand Geometry Biometric", http:/biometrics.cse.msu.edu/Kumar_AVBPA2003.pdf
[12] Li Wen-xin, David Zhang, Shuo-qun Xu, 2002, Palmprint Recognition Based on Fourier Transform, Journal of Software, Vol. 13, No. 5.
[13] Pang Y., Andrew T.B.J., David N.C.L., Hiew Fu San, 2003, Palmprint Verification with Moments, Journal of WSCG, Vol. 12, No. 1-3, ISSN 1213-6972, Science Press.
[14] Sarraille, J., 2002, Developing Algorithms For Measuring Fractal Dimension, http://ishi.csustan.edu
[15] Shu W., Zhang D., 1998, Automated personal identification by palmprint, Opt. Eng., Vol. 37, No. 8, pp. 2359-2363.
[16] Tao Y., Thomas R.I., Yuan Y.T., Extraction of Rotation Invariant Signature Based On Fractal Geometry, http://cs.tamu.edu


[17] Wohlberg B., Gerhard de Jager, 1999, A Review of the Fractal Image Coding Literature, IEEE Transactions on Image Processing, Vol. 8, No. 12.
[18] Wu Xiang-Quan, Kuan-Quan Wang, David Zhang, 2004, An Approach to Line Feature Representation and Matching for Palmprint Recognition, Journal of Software, Vol. 15, No. 6.
[19] Yokoyama T., Sugawara K., Watanabe T., Similarity-based image retrieval system using partitioned iterated function system codes, The 8th International Symposium on Artificial Life and Robotics, January 24-26, 2006, Oita, Japan.
[20] Yokoyama T., Watanabe T., Koga H., Similarity-Based Retrieval Method for Fractal Coded Images in the Compressed Data Domain.
[21] Zhang D., Wai-Kin Kong, Jane You, Michael Wong, 2003, Online Palmprint Identification, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 25, No. 9.
[22] Zhang D., and W. Shu, Two novel characteristics in palmprint verification: datum point invariance and line feature matching, Pattern Recognition, Vol. 32, pp. 691-702, 1999.

AUTHOR PROFILE

Dr. I Ketut Gede Darma Putra is a lecturer in the Department of Electrical Engineering and Information Technology, Udayana University, Bali, Indonesia. He obtained his master's and doctorate degrees in informatics engineering from the Department of Electrical Engineering, Gadjah Mada University, Indonesia. His research interests include biometrics, image processing, expert systems, and soft computing.

