(IJCSIS) International Journal of Computer Science and Information Security, Vol. 8, No. 6, September 2010
Person Identification System using Static-dynamic Signatures Fusion
Dr. S.A Daramola
Department of Electrical and Information Engineering, Covenant University, Ota, Nigeria
dabaslectcu@yahoo.com

Prof. T.S Ibiyemi
Department of Electrical Engineering, University of Ilorin, Ilorin, Nigeria
tsibiyemi@yahoo.com
Abstract — Off-line signature verification systems rely on the static image of a signature for person identification. An imposter can easily imitate the static image of a genuine user's signature because it lacks dynamic features. This paper proposes a person identity verification system using fused static-dynamic signature features. A computationally efficient technique is developed to extract and fuse static and dynamic features taken from the offline and online signatures of the same person. The training stage uses the fused features to generate couple reference data, and the classification stage compares couple test signatures with the reference data based on set threshold values. The system's performance against imposter attacks is encouraging in comparison with previous single-sensor offline signature identification systems.

Keywords- fused static-dynamic signature; feature extraction; forgeries
I. INTRODUCTION
Person identity verification is the problem of authenticating an individual using physiological or behavioral characteristics such as face, iris, fingerprint, signature, speech and gait. The problem can be solved manually or automatically; biometric systems automatically use a biometric trait captured by one or two sensors to validate the authenticity of a person. Automatic person identity verification based on the handwritten signature can be classified into two categories, on-line and off-line, differentiated by the way signature data is acquired from the input sensor. In the off-line technique the signature is written on a piece of paper and later scanned into a computer system, so only the shape of the signature image is available; in the on-line technique the signature is written on a digitizer, which also makes dynamic information such as speed and pressure available [1] [2].

In this paper a combination of offline and online signatures is used for person identification. The process verifies a signature signed on paper and on an electronic digitizer concurrently, so the physical presence of the signer is required during both registration and verification. This type of system is particularly useful in banks, where the physical presence of the holders of savings and current accounts is required before money can be withdrawn. In Nigeria many banks manually identify account holders using the face and the static signature, and in the process genuine users are rejected as imposters because of high intra-class variation in signatures. Fraud resulting from signature forgery must be prevented, particularly among people whose signatures closely resemble one another. Fusing dynamic and static signatures strengthens the identification of people who are physically present in a paper documentation environment.

A detailed review of on-line signature verification, including a summary of off-line work up to the mid-1990s, was reported in [1] [2]. Alessandro et al. [3] proposed a hybrid on/off-line handwritten signature verification system divided into two modules: the acquisition and training module uses online-offline signatures, while the verification module deals with offline signatures. Soltane et al. [4] presented a soft decision-level fusion approach for a combined behavioral speech-signature biometric verification system, and Rubesh et al. [5] presented an online multi-parameter 3D signature and cryptographic algorithm for person identification. Ross et al. [6] presented hand-face multimodal fusion for biometric person authentication, while Kiskus et al. [7] fused biometric data using classifiers.

Of the approaches mentioned above, some used online signature data to strengthen system performance at the registration or training stage, while others combined online signature data with data from other biometric modalities, such as speech or face, as a means of person identification. The system proposed in this novel framework is based on the fusion of static and dynamic signature data at the feature level for person identification. The universal acceptance of the signature, together with the compatibility of offline and online signature features, makes the proposed system more robust, accurate and friendly in comparison with previous multi-biometric modality systems or single-sensor offline systems for person identification.

Section 2 describes the system, the signature preprocessing, and the feature extraction and fusion technique; it also presents the signature training, threshold selection and classification. Section 3 shows the experimental results and, finally, conclusions are drawn in Section 4.
II. PROPOSED SYSTEM
The system block diagram is shown in Fig. 1. The offline and online data are collected at the same time from the same user during the registration/training and verification exercises. The offline signatures are preprocessed to remove unwanted noise introduced during scanning, whereas the online signatures are not preprocessed, in order to preserve the timing characteristics of the signature. Discriminative static and dynamic features are extracted separately from the offline and online signatures respectively. At the feature level the two signatures are fused together to obtain robust static-dynamic features. These features are used to generate couple reference data during training and for signature classification.
A. Data Acquisition
The signature database consists of a total of 300 offline and 300 online handwritten genuine signature images and 100 forged signatures. The genuine signatures were collected from 50 people; each user contributed 6 offline and 6 online signature samples. The 100 skilled forgeries consist of offline and online signatures collected from 25 forgers, each of whom contributed 4 samples. The raw signature data available from our digitizer consists of a three-dimensional series as represented by (1).
$$S(t) = [x(t),\ y(t),\ p(t)], \qquad t = 0, 1, 2, \ldots, n \qquad (1)$$

where $(x(t), y(t))$ is the pen position at time $t$ and $p(t) \in \{0, 1, \ldots, 1024\}$ represents the pen pressure.
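Concretely, the raw series in (1) maps onto an $(n+1) \times 3$ array. The following minimal sketch is ours; the sample values are invented purely for illustration:

```python
import numpy as np

# One online signature as an (n+1) x 3 array with columns x(t), y(t), p(t);
# pressure values lie in {0, 1, ..., 1024}. Sample values are illustrative.
signature = np.array([
    [120.0, 340.0,   0.0],   # pen touches down with zero pressure
    [122.0, 338.0, 310.0],
    [125.0, 333.0, 512.0],
    [131.0, 329.0, 498.0],
])

x, y, p = signature[:, 0], signature[:, 1], signature[:, 2]
print(f"{len(signature)} sampling points, pressure {p.min():.0f}..{p.max():.0f}")
```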
B. Offline Signature Preprocessing
The scanned offline signature images may contain noise caused by document scanning, which has to be removed to avoid errors in subsequent processing steps. The gray-level image is convolved with a Gaussian smoothing filter to obtain a smoothed image. The smoothed gray image is then converted into a binary image and thinned to one pixel wide.
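A minimal sketch of these three steps, assuming OpenCV and scikit-image are available; the 5x5 kernel and the choice of Otsu thresholding are our assumptions, since the paper does not name them:

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

def preprocess_offline(gray: np.ndarray) -> np.ndarray:
    """Smooth, binarize and thin a scanned grayscale signature image."""
    # Gaussian smoothing to suppress scanning noise (5x5 kernel is an assumption).
    smoothed = cv2.GaussianBlur(gray, (5, 5), 0)
    # Binarization; Otsu's global threshold is an assumption, the paper names none.
    _, binary = cv2.threshold(smoothed, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Thin the binary strokes to one pixel wide.
    return skeletonize(binary > 0).astype(np.uint8)
```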
C. Offline Feature Extraction
The feature extraction algorithm for the static signature is stated as follows:

(1) Locate the signature image bounding box.
(i) Scan the binary image from top to bottom to obtain the signature image height.
(ii) Scan the binary image from left to right to obtain the signature image width.

(2) Centralize the signature image.
(i) Calculate the centre of gravity of the signature image using (2):

$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x(i), \qquad \bar{y} = \frac{1}{N}\sum_{j=1}^{N} y(j) \qquad (2)$$

(ii) Then move the signature image centre to coincide with the centre of the predefined image space.

(3) Partition the image into four sub-image parts.
(i) Through point $\bar{x}$, make a horizontal split across the signature image.
(ii) Through point $\bar{y}$, make a vertical split across the signature image.
Figure 1: System block diagram. The offline signature on paper passes through preprocessing and offline feature extraction; the online signature on the digitizer passes through online feature extraction; the two feature sets are fused, and the fused couple reference features feed the training and classification stages, which compare couple test features against the reference to produce the result.
(4) Partition each of the sub-image parts into four rectangular parts.
(i) Locate the centre of each of the sub-image parts using (2).
(ii) Repeat steps 3(i) and 3(ii) for each of the sub-image parts in order to obtain a set of 16 sub-image parts.

(5) Partition each of the 16 sub-image parts into four signature cells.
(i) Locate the centre of each of the sub-image parts using (2).
(ii) Repeat steps 3(i) and 3(ii) for each of the sub-image parts in order to obtain a set of 64 sub-image cells.

(6) Calculate the angle of inclination of each sub-image centre in each cell to the lower right corner of the cell.
(i) Locate the centre of each of the 64 sub-image cells using (2).
(ii) Calculate the angle that each centre point makes with the lower right corner of the cell.

The features extracted at this stage constitute the offline feature vector, represented as $F = f_1, f_2, f_3, f_4, \ldots, f_{64}$. The details and diagrams of the feature extraction are given in [8].
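A compact sketch of steps (2) to (6) follows. It reflects our reading of the algorithm: it omits the bounding-box and recentring operations of steps (1) and 2(ii), and the coordinate rounding and empty-cell handling are our assumptions:

```python
import numpy as np

def centroid(cell: np.ndarray) -> tuple:
    """Centre of gravity (x-bar, y-bar) of the foreground pixels, as in (2)."""
    ys, xs = np.nonzero(cell)
    if xs.size == 0:                          # empty cell: fall back to geometric centre
        return cell.shape[1] / 2.0, cell.shape[0] / 2.0
    return xs.mean(), ys.mean()

def split4(cell: np.ndarray) -> list:
    """Split a cell into four parts through its centre of gravity (steps 3-5)."""
    cx, cy = centroid(cell)
    cx, cy = int(round(cx)), int(round(cy))
    return [cell[:cy, :cx], cell[:cy, cx:], cell[cy:, :cx], cell[cy:, cx:]]

def offline_features(binary: np.ndarray) -> np.ndarray:
    """64 angle features f1..f64 from three levels of centroid splitting (step 6)."""
    cells = [binary]
    for _ in range(3):                        # 1 -> 4 -> 16 -> 64 cells
        cells = [part for cell in cells for part in split4(cell)]
    feats = []
    for cell in cells:
        cx, cy = centroid(cell)
        h, w = cell.shape
        # Angle the cell centroid makes with the lower-right corner of the cell.
        feats.append(np.arctan2(h - cy, w - cx))
    return np.array(feats)                    # offline feature vector F
```

Three rounds of centroid splitting give 4, 16 and then 64 cells, matching the counts in steps (3) to (5).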
D. Online Signature Extraction
Three on-line signature features are extracted at each sampling point from the raw data: $\Delta p/\Delta x$, $\Delta p/\Delta y$ and $v$. Here $\Delta x$ is the change of $x$ between two successive sampling points, $\Delta y$ is the change of $y$ between two successive sampling points, and $\Delta p$ is the change of $p$ between two successive sampling points; $\Delta p/\Delta y$ is the ratio of $\Delta p$ to $\Delta y$, $\Delta p/\Delta x$ is the ratio of $\Delta p$ to $\Delta x$, and $v$ is the pen speed between two successive sampling points [9]. These features are obtained using (3), (4) and (5).
$$v = \sqrt{(\Delta x)^{2} + (\Delta y)^{2}} \qquad (3)$$

$$\frac{\Delta p}{\Delta y} = \frac{p(t) - p(t-1)}{y(t) - y(t-1)} \qquad (4)$$

$$\frac{\Delta p}{\Delta x} = \frac{p(t) - p(t-1)}{x(t) - x(t-1)} \qquad (5)$$
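A short sketch of (3), (4) and (5); the eps guard against zero denominators is our addition, since the paper does not say how $\Delta x = 0$ or $\Delta y = 0$ is handled:

```python
import numpy as np

def online_features(x: np.ndarray, y: np.ndarray, p: np.ndarray, eps: float = 1e-6):
    """Per-point features v, dp/dy and dp/dx of equations (3)-(5)."""
    dx, dy, dp = np.diff(x), np.diff(y), np.diff(p)  # successive-point differences
    v = np.sqrt(dx**2 + dy**2)                       # (3)
    dp_dy = dp / (dy + eps)                          # (4); eps guards dy = 0 (assumption)
    dp_dx = dp / (dx + eps)                          # (5); eps guards dx = 0
    return v, dp_dy, dp_dx
```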
 
E. Fusion of Offline and Online Features
This technique computes a fused feature vector that contains information from both the offline and online signatures and uses this feature vector for subsequent processing. Information from two input sensors can be combined at the data acquisition stage, at the feature extraction stage or at the decision-level stage, and the accuracy of the system depends on the level of fusion and on the discriminative ability of the fused data. In [4] the fusion of voice and signature data was done at the decision level, while in [6] the fusion of hand and face was done at the feature level. In this work the offline features are combined with the online features at the feature level. The compatibility of signature data captured from the same person by different sensors makes the fusion possible without any loss of information. The steps involved are as follows: given the extracted static feature vector $F = f_1, f_2, f_3, f_4, \ldots, f_{64}$, the mean and variance of the feature vector are calculated using (6) and (7) respectively:

$$\mu_{off} = \frac{1}{N}\sum_{i=1}^{N} f_i \qquad (6)$$

$$\sigma_{off}^{2} = \frac{1}{N}\sum_{i=1}^{N}\left(f_i - \mu_{off}\right)^{2} \qquad (7)$$
),,(
 x p y pv
using the varianceof the offline feature
)(
2
off 
 
 . The three fused features (SF1,SF2 and SF3) become: 
.,,
222
off off off 
x p y pv
   
 
F. Training and Threshold Setting
Each registered user submitted 12 genuine signatures to the system, out of which 8 signatures are fused together to generate 4 couple reference features. These features are used to generate 6 distance values by cross-aligning the couple reference features to the same length using Dynamic Time Warping (DTW). The distance values measure the variation within each user's signatures and are used to set a user-specific threshold for accepting or rejecting a couple test signature. Given four couple reference signature samples R1, R2, R3 and R4, these features are cross-aligned to obtain 6 distance values, as shown in Fig. 2. The mean ($m$) and standard deviation ($\sigma$) of the distances $d_{12}$, $d_{13}$, $d_{14}$, $d_{23}$, $d_{24}$ and $d_{34}$ are calculated and used to set the threshold ($t$) for each user based on each of the fused features, as given in (8):
$$t = m + \sigma \qquad (8)$$
Figure 2. Cross-alignment of couple reference features: references R1, R2, R3 and R4 are pairwise aligned to give the six distances $d_{12}$, $d_{13}$, $d_{14}$, $d_{23}$, $d_{24}$ and $d_{34}$.
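The training-stage computation can be sketched as follows; the DTW recurrence is the standard textbook one, and the margin added to the mean $m$ in our reading of (8) is an assumption rather than the authors' exact rule:

```python
import numpy as np
from itertools import combinations

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Plain O(len(a)*len(b)) dynamic time warping on 1-D feature sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def user_threshold(references: list) -> float:
    """Threshold from the 6 pairwise DTW distances of 4 reference sequences.

    The rule t = m + sigma follows our reading of equation (8); treat the
    exact margin as an assumption.
    """
    dists = [dtw_distance(r1, r2) for r1, r2 in combinations(references, 2)]
    return float(np.mean(dists) + np.std(dists))
```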