DEDICATION





This is dedicated to my son, Samuel Abhishek, who is now with the Lord and
whom I loved so much,
and also to my Mom, Dad & Wife,
who offered me unconditional love and support throughout the course of this thesis.

Sunil V. K. Gaddam






(iv)


ACKNOWLEDGEMENTS

For the LORD gives wisdom,
and from his mouth come knowledge and understanding.
Proverbs 2:6
Trust in the Lord with all your heart, and lean not on your own understanding; In
all your ways acknowledge Him, And He shall direct your paths.
Proverbs 3:5-6
To our God and Father be glory for ever and ever. Amen.
Philippians 4:20
In the last few years, I have learnt the importance of relying on Jesus
completely. I thank You, Lord, for showing me the way.
Time flies, and I shall soon leave Delhi, a place I shall cherish long after my
post-graduation. This year has been pretty tough, and I would not have survived
with what little sanity I possess intact without the help of the many people who
have been so kind and helpful to me during all these years; all of you have made a mark in my life!
First and foremost, I want to express my greatest gratitude to my thesis supervisor
Prof. Dr. Manohar Lal, Director, School of Computer and Information Sciences
(SOCIS), IGNOU, for accepting me as a doctoral student. He is such a wonderful
advisor, mentor, and motivator. Under his guidance, I have learned a great deal
about conducting research, including how to find a good research problem, write a
convincing technical paper, and prioritize research tasks.
I sincerely thank him for his able guidance and supervision, for his valuable
suggestions, comments, and corrections of the manuscript, and for his constant
support and recognition.
I am also grateful to Prof. Dr. V. N. Rajasekharan Pillai, Vice-Chancellor, Indira
Gandhi National Open University (IGNOU), for his support and encouragement in
submission of this thesis.
Many thanks also go to the members of the Trust of Bharat Institute of Technology
(BIT), Meerut, for their support and encouragement.
I would also like to thank all of the staff members at the SOCIS office of IGNOU
for all of their help and support.
Further, I would like to thank my friends Ms. J. Smita and Mr. Binu D. for their
technical support and interesting discussions. I also acknowledge the moral support
of my friends and colleagues, especially Mr. Kshitiz Saxena and Ms. Mahima Jain.
Finally, I owe a thousand thanks to my parents, Staya Raj & Saramma, my uncle
Prasanna Kumar, my brother Rajesh, my sister Baby, my wife Prabha, and my children
Benny, Vinny & Blessy, who accompanied me warmly through all these years, for
their incredible love, prayers, enthusiasm, and encouragement; and to my fellow
believers for their timely counsel and prayer.
Sunil V. K. Gaddam
(v)


TABLE OF CONTENTS

Declaration  (ii)
Certificate  (iii)
Dedication  (iv)
Acknowledgements  (v)
Table of Contents  (vi)-(xi)
Abstract  (xii)-(xiii)
List of Publications  (xiv)-(xvii)
List of Abbreviations/Acronyms  (xviii)
List of Figures  (xix)-(xxv)
List of Tables  (xxv)

CHAPTER 1: INTRODUCTION TO NETWORK SECURITY  1-19
1.1 Need for Security of Computer Networks
1.2 Security Goals
1.3 Security Attacks
1.3.1 Types and Sources of Network Threats
1.4 Security Mechanisms
1.5 Security Services
1.6 Security Techniques
1.6.1 Cryptography
1.6.1.1 Symmetric-key Encipherment
1.6.1.2 Asymmetric-key Encipherment  10
1.6.1.3 Hashing  11
1.6.2 Steganography  13
1.7 Identity Proof and Authentication Mechanisms  13
1.8 Aims and Objectives of Thesis  14
1.9 Original Contributions  16
1.10 Organisation of the Thesis  16

CHAPTER 2: BIOMETRIC SYSTEM SECURITY  20-46
2.1 Introduction  20
2.2 The Need for Biometric Systems  20
2.3 Biometrics: A Reliable Authentication Mechanism  23
2.3.1 Biometric Characteristics  24
2.3.2 Simple Biometric System Components  25
2.3.3 Biometric Modes of Operation  26
2.3.4 Information Flow in Biometric System  28
2.4 Biometric Technologies and Classifications  29
2.5 Biometric Modalities  31
2.6 Comparison of Various Biometric Technologies  31
2.7 Performance Measurements of Biometric System  33
2.8 Merging Biometrics and Cryptography for Reliable Network Security  36
2.9 Fingerprint as a Biometric Modality  37
2.9.1 Fingerprint Recognition  37
2.9.2 Fingerprint Uniqueness  38
2.9.3 Fingerprint Categories  38
2.9.4 Fingerprint Classification (based on Pattern Types)  39
2.9.5 Types of Minutiae  40
2.9.6 Fingerprint Matching Techniques  42
2.10 Fingerprint Systems  44
2.11 Summary  46

CHAPTER 3: CANCELLABLE BIOMETRIC SYSTEM  47-60
3.1 Introduction  47
3.2 Problems with the Existing Biometric Security Systems  47
3.3 Cancellable Biometrics  50
3.4 How Cancellable Biometrics Work  53
3.5 Implementation of Cancellable Fingerprints  54
3.6 Registration  56
3.7 Transformations  56
3.7.1 Transformation on Signal Level  57
3.7.2 Transformation on Feature Level  58
3.8 Selection of Transformation Function  58
3.9 Summary  60

CHAPTER 4: LITERATURE SURVEY  61-76
4.1 Introduction  61
4.2 Biometrics is Not New!  61
4.3 A Brief History of Biometrics  63
4.4 Biometric Technologies from the Past to the Present  66
4.5 Condensed Timeline of Biometric Technologies  68
4.6 Review of Cancellable Biometrics  70
4.7 Summary  76

CHAPTER 5: THEORETICAL BACKGROUND  77-90
5.1 Introduction  77
5.2 Cancellable Biometric Systems  77
5.3 Bio-Crypto Key Generation  78
5.4 Concepts Utilized in the Proposed System  79
5.4.1 Histogram Equalization  80
5.4.2 Filters  81
5.4.2.1 Gabor Filter  83
5.4.2.2 Wiener Filter  84
5.4.3 Adaptive Threshold  85
5.4.4 Morphological Operations  86
5.4.5 Minutiae Extraction  87
5.4.5.1 Binarization  87
5.4.5.2 Ridge Thinning Algorithm  88
5.4.6 AES Encryption  89
5.5 Evaluation Schemes for Biometric Systems  89
5.6 Summary  90

CHAPTER 6: MOTIVATION FOR THE RESEARCH  91-99
6.1 Introduction  91
6.2 Motivating Algorithms  91
6.2.1 Ratha et al.'s Work on Cancellable Template Generation for Fingerprints  94
6.2.2 S. Tulyakov et al.'s Work on Symmetric Hash Functions for Secure Fingerprint Biometric Systems  97
6.3 Summary  99

CHAPTER 7: SUGGESTED NEW APPROACHES TO CANCELLABLE BIOMETRIC BASED SECURITY  100-113
7.1 Introduction  100
7.2 New Approaches Proposed for Cancellable Fingerprints  100
7.3 First Proposed Method: New-Fangled Approach for Cancellable Biometric Key Generation for Fingerprints  101
7.3.1 Extracting Minutiae Points from Fingerprint  101
7.3.1.1 Pre-processing  102
7.3.1.2 Region of Interest (ROI) Selection  103
7.3.1.3 Minutiae Extraction  104
7.3.2 Secured Feature Matrix Generation  106
7.3.3 Key Generation from Secured Feature Matrix (SFm)  107
7.4 Second Proposed Method: Development of Bio-Crypto Key from Fingerprints Using Cancellable Templates  107
7.4.1 Extraction of Minutiae Points from Fingerprint  108
7.4.1.1 Pre-processing  108
7.4.1.2 Region of Interest (ROI) Selection  109
7.4.1.3 Minutiae Extraction  109
7.4.2 Secured Cancellable Template Generation  109
7.4.3 Cryptographic Key Generation from Secured Cancellable Template  111
7.5 Summary  112

CHAPTER 8: EXPERIMENTAL RESULTS AND ANALYSIS  114-141
8.1 Introduction  114
8.2 Technology Evaluation Scheme of Biometric Systems  114
8.3 Experimental Environment and Datasets  115
8.4 Evaluation Metrics  117
8.5 Experimental Results  118
8.5.1 Experimental Results of the Proposed Efficient Cancellable Biometric Key Generation Scheme (New-Fangled Approach)  118
8.5.2 Experimental Results of the Proposed Efficient Approach for Cryptographic Key Generation from Fingerprint (Bio-Crypto Key)  121
8.5.3 Experimental Results of the Method Introduced by Nalini K. Ratha et al.  123
8.5.4 Experimental Results of the Approach Introduced by S. Tulyakov et al.  126
8.6 Performance Analysis of the Algorithms  127
8.6.1 Performance Analysis of the Proposed Efficient Cancellable Biometric Key Generation Scheme (Algorithm 1)  128
8.6.2 Performance Analysis of the Proposed Efficient Approach for Cryptographic Key Generation from Fingerprint (Algorithm 2)  130
8.6.3 Performance Analysis of Nalini K. Ratha et al.'s Algorithm  132
8.6.4 Performance Analysis of S. Tulyakov et al.'s Algorithm  135
8.7 Comparative Analysis of the Proposed Methods with the Previous Approaches  137
8.7.1 Comparison of the Proposed Methods with the Previous Approaches over FVC 2002 Database1 (DB1)  138
8.7.2 Comparison of the Proposed Methods with the Previous Approaches over FVC 2002 Database2 (DB2)  138
8.7.3 Comparison of the Proposed Methods with the Previous Approaches over FVC 2002 Database3 (DB3)  139
8.7.4 Comparison of the Proposed Methods with the Previous Approaches over FVC 2002 Database4 (DB4)  140
8.8 Summary  141

CHAPTER 9: SECURITY ANALYSIS  142-145
9.1 Introduction  142
9.2 Security Analysis of Ratha et al.'s Work  142
9.3 Security Analysis of S. Tulyakov et al.'s Work  143
9.4 Security Analysis of the Proposed First Method (Cancellable Biometric Key Generation Scheme for Cryptography)  144
9.5 Security Analysis of the Proposed Second Method (Bio-Crypto Key from Fingerprints Using Cancellable Templates)  145
9.6 Summary  145

CHAPTER 10: CONCLUSION AND FUTURE WORK  146-147
10.1 Conclusion  146
10.2 Scope for Future Work  147

GLOSSARY (Technical Terminology)  148-156
BIBLIOGRAPHY (References)  157-172

(xi)


ABSTRACT

Networks make information available from one corner of the world to
another almost instantaneously. However, the growing use of the
Internet, a network of networks, by individuals and organizations has
presented formidable problems of identity fraud, organised crime,
money laundering, theft of intellectual property, and a myriad of other
cyber crimes. The world is witnessing attempts at hacking crucial
information systems, including those of defence installations such as
the Pentagon in the USA, which may endanger the security of an entire
nation. Since the incidents of September 11, 2001, and even earlier,
security has been at the forefront of the concerns of America and other
nations, and with the growing importance of corporate data privacy, the
need and demand for biometric security solutions has never been higher.
Hence, the study of methods for analysing the security requirements of
such systems, and of their consequent design, implementation, and
deployment, forms the primary scope of the discipline of Network Security.
Biometric security systems face a number of problems arising from
the fact that a person's biometric data is generally stored in the
system itself. The problems arise especially when that data is
compromised. Standard password-based security systems can cancel a
compromised password and reissue a new one, but biometric traits
cannot be changed or cancelled. Thus, in such situations, the
advantage of biometrics-based security becomes a disadvantage. The
concept of cancellable biometrics can upgrade existing biometric
security systems so that they gain the advantages of password-based
systems without losing their inherent superiority. In this thesis, we
discuss the problems with existing biometric technologies and then
show that cancellable biometric systems are one of the important
solutions for securing computing and information systems.
(xii)


In this thesis, we propose two new algorithms, a New-Fangled
Approach for Cancellable Biometric Key Generation and the Development of
a Bio-Crypto Key from Fingerprints Using Cancellable Templates, for
secure fingerprint biometric systems. The thesis also presents the
experimental results and an analysis of the proposed methods against
the existing approaches. The experimental analyses have been carried
out on well-known fingerprint databases to evaluate the performance
of the approaches. Receiver Operating Characteristic (ROC) curves,
plotting the False Non-Match Rate (FNMR) against the False Match Rate
(FMR), are used to show the relative effectiveness of the various
methods. We also discuss the performance of the proposed approaches,
which is significantly better than that of the previous methods.
The thesis further presents a security analysis of the proposed methods
along with the existing approaches. To demonstrate the effectiveness of
the proposed approaches, the security analysis examines the different
transformations used in the various methods to establish their
non-invertibility. Finally, conclusions are drawn through extensive
analysis to ensure security against impostor attacks, and we close with
a discussion on the biometrics of the future.
The above-mentioned work is based on a number of publications in
reputed journals and in the proceedings of international and national
conferences. (The list of publications is enclosed.)

(xiii)


LIST OF PUBLICATIONS
Publications in Peer Reviewed International Journals
1. Sunil V.K. Gaddam, Manoharlal, Development of Bio-Crypto Key from
Fingerprints Using Cancellable Templates, accepted for publication (ISSN: 0975-3397) in the International Journal on Computer Science and Engineering
(IJCSE), Vol.3, No.2, February 2011, PP.797-805.
2. Sunil V.K. Gaddam, Manoharlal, Efficient Cancellable Biometric Key Generation
Scheme for Cryptography, published in the International Journal of
Network Security (IJNS), Vol.11, No.2, PP.61-69, September 2010.
3. Surendra Rahamatkar, Sunil Vijaya K Gaddam & S. Qamar, Mechanism for
Termination Detection in Wireless Mobile Adhoc Network, published in
International Journal of Hybrid Computational Intelligence 1(1) January
2008; pp. 79-89.
4. Surendra Rahamatkar, Sunil Vijaya K Gaddam & S. Qamar, Application and Use
of Object Model for Version & Configuration Control in Distributed S.D.Es,
published in International Journal of Hybrid Computational Intelligence
1(1) January 2008; pp. 91-102.
Publications in the Proceedings of International Conferences
5. Sunil V.K. Gaddam, Dr. Manoharlal, A New Approach for Formulating
Randomized Cryptographic Key Generation Using Cancellable
Biometrics, published in the proceedings of the 2010 International
Conference on Security and Management (SAM'10), a part of
WORLDCOMP'10, held during 12-15 July, 2010, in Las Vegas,
Nevada, USA.
6. Sunil V.K. Gaddam, Dr. Manoharlal, An Effective Method for Revocable
Biometric Key Generation, published in the proceedings of The 2009
International Conference on Security and Management (SAM'09), a part
of WORLDCOMP'09 which was held during 13-16 July, 2009, in Las Vegas,
Nevada, USA.
7. Sunil V.K. Gaddam, Dr. Manoharlal, A Review on Next Generation
Networks: Convergence and QoS, in the proceedings of The 2009
International Conference on Wireless Networks (ICWN'09), a part of
WORLDCOMP'09 which was held during 13-16 July, 2009, in Las Vegas, Nevada,
USA.
8. Sunil V.K. Gaddam, Dr. Manoharlal, Dr. Rajesh C. Phoha, New-Fangled
Approach for Cancelable Biometric Key Generation, published in the
proceedings of the International Conference on Computing,
Communicating and Networking (ICCCN-2008), during 18-20 December
2008, organised by Chettinad College of Engineering & Technology, Karur,
sponsored by IEEE ED Society India, Tamilnadu, India.
9. Surendra Rahamatkar, Sunil Vijaya K Gaddam, Samuel Qamar, WiFi and NET
SPOT: Effect of Wireless LAN Technology in MIET Campus, in the proceedings of
International Conference on Emerging Technologies & Applications in
Engineering, Technology and Sciences (ICETAETS 2008), during 13-14
January 2008, organised by Department of Computer Science, Saurashtra
University, Rajkot, Gujarat, India.
10. Sunil V. K. Gaddam, Prof. Ram Chakka, Prof. Manoharlal, "A Review on Network
Management Architectures", in the proceedings of the International Conference
on Recent Trends in Automation and its Adaption to Industries (PICA 2006) in
July, 2006, at Nagpur, Maharashtra, India.
11. Sunil V. K. Gaddam, Prof. Ram Chakka, Prof. Manoharlal, "Critical Issues and
Solutions in Network Management Architectures", in the proceedings of the
International Conference on Internet Computing (ICOMP'06), a part of
WORLDCOMP'06, held in June, 2006, in Las Vegas, USA.
12. Sunil Vijaya Kumar G., Dr. Rajesh C. Phoha, Prof. P.C. Saxena, Prof. Manohar Lal,
and Dr. Mukul K. Sinha, "Management of Networks: Part-I" in the proceedings
of NAFEN's 14th International Conference on Technology Innovations &
Financing Opportunities in Infrastructure Industries (2nd INFRATECH-2000), New
Delhi, December 2000.
13. Sunil Vijaya Kumar G., Dr. Rajesh C. Phoha, Prof. P.C. Saxena, Prof. Manohar Lal,
and Dr. Mukul K. Sinha, "Management of Networks: Part-II" in the proceedings
of NAFEN's 14th International Conference on Technology Innovations &
Financing Opportunities in Infrastructure Industries (2nd INFRATECH-2000), New
Delhi, December 2000.
14. Sunil Vijaya Kumar G., Dr. Rajesh C. Phoha, Prof. P.C. Saxena, Prof. Manohar Lal,
and Dr. Mukul K. Sinha, "Network Management Architectures", in the proceedings
of International Conference Systemics, Cybernetics and Informatics (SCI-2000)
On Advances in Information Technology, Organised by International School for
Information Technology (ISIT) of NIRSA, Hyderabad, A.P., December 2000.
15. Sunil Vijaya Kumar G., Dr. Rajesh C. Phoha, Prof. P.C. Saxena, Prof. Manohar Lal,
and Dr. Mukul K. Sinha, "Network Management Protocols", in the proceedings of
International Conference Systemics, Cybernetics and Informatics (SCI-2000) On
Advances in Information Technology, Organised by International School for
Information Technology (ISIT) of NIRSA, Hyderabad, A.P., December 2000.
Publications in the Proceedings of National Conferences
16. Sunil V.K. Gaddam, Prabhavathamma Pydicalva, A Review on Information and
Communication Technology in Agriculture Extension and Development, in the
proceedings of National Conference on Emerging Technologies in Computer
Science- ETCS-09, Organized By Computer Science Department, Meerut Institute
of Engg. & Technology, Meerut U.P.
17. Sunil V.K. Gaddam, Mahima Jain, Digital Management for Enterprise An
Integrated Framework, in the proceedings of National Conference on Emerging
Technologies in Computer Science- ETCS-09, Organized By Computer Science
Department, Meerut Institute of Engg. & Technology, Meerut U.P.
18. Sunil V.K. Gaddam, Mohit Kumar, Instructional Design for Learning on the World
Wide Web, in the proceedings of National Seminar on Total Quality Management
in Pedagogy (TQM_P), Sponsored by AICTE on 27th May 2008 at Meerut
Institute of Engineering & Technology (MIET), Meerut 250 005, UP.
(xv)


19. Sunil V.K. Gaddam, Mohit Kumar, Re-conceptualisation of the Teaching and
Learning Process in the Contemporary Digital Age, in the proceedings of
National Seminar on Total Quality Management in Pedagogy (TQM_P), Sponsored
by AICTE on 27th May 2008 at Meerut Institute of Engineering & Technology
(MIET), Meerut 250 005, UP.
20. Sunil V.K. Gaddam, Surendra Rahamatkar, Samuel Qamar, Comparison of
Termination Detection Scheme in Mobile Distributed Network, in the proceedings
of National Conference on Methods and Models in Computing (NCM2C 2007),
during 13-14 December 2007, conducted by SC&SS, Jawaharlal Nehru
University, New Delhi.
21. Sunil V.K. Gaddam, Surendra Rahamatkar, Dharmendra Sharma, Pradeep Pant,
MIET-Net-Spot WiFi: Emerging WLAN Technology in MIET Campus, in the
proceedings of the National Conference on "Emerging Technologies in Computer
Science (ETCS 2007)", during September 22-23, 2007 conducted by Meerut
Institute of Engineering & Technology, Meerut, Uttar Pradesh.
22. Sunil V.K. Gaddam, Surendra Rahamatkar, P.K. Bharti, Futuristic Developments
in Communication Paradigm in Distributed and Ubiquitous Computing, in the
proceedings of the National Conference on "Emerging Technologies in Computer
Science (ETCS 2007)", during September 22-23, 2007 conducted by Meerut
Institute of Engineering & Technology, Meerut, Uttar Pradesh.
23. Sunil V.K. Gaddam, Vijaya Lakshmi, Design of Framework to Prevent the
Unauthorized Administrative or Super User Transactions, in the proceedings of
the National Conference on "Emerging Technologies MKCE- Confluence '07", in
March 15, 2007 conducted by M. Kumarsamy College of Engineering,
Thalavapalayam, Karur, Tamil Nadu
24. Sunil V.K. Gaddam, K. Sreekanth, Implementation of Personal Number Service
using VoIP, in the proceedings of the National Conference on "TechnoZion 07",
during January 26-27, 2007 conducted by National Institute of Technology (NIT),
Warangal, A.P.
25. Sunil V.K. Gaddam, Arun Kumar, Data Security and Authentication, in the
proceedings of the National Conference on "Recent Trends in Electronics and
Communications - NCRTEC-2007", in January 25, 2007 conducted by G.Pulla
Reddy Engg College, Kurnool, A.P.
26. Sunil Vijaya Kumar G., Prof. Manohar Lal, Dr. Rajesh C. Phoha, "Network
Management - A New Paradigm: Part - I", in the proceedings of IEEE ACE 2002,
Organised by IEEE Calcutta Section at Science City, Kolkata in December 2002.
27. Sunil Vijaya Kumar G., Prof. Manohar Lal, Dr. Rajesh C. Phoha, "Network
Management - A New Paradigm: Part - II", in the proceedings of IEEE ACE
2002, Organised by IEEE Calcutta Section at Science City, Kolkata in December
2002.
28. G. Sunil Vijaya Kumar, Dr. Rajesh C. Phoha, Prof. Manohar Lal, "Internet
Management", in the proceedings of 15th National Convention of Computer
Engineers on E-Goverance: Challenges and prospects (ego 2000), Organised by
The Institution of Engineers (India), Kerala State Centre, Trivandrum, Kerala,
October 2000.

(xvi)


29. G. Sunil Vijaya Kumar, Prof. Manohar Lal, "OSI Management" in the proceedings
of 15th National Convention of Computer Engineers on E-Goverance: Challenges
and prospects (ego 2000), Organised by The Institution of Engineers (India),
Kerala State Centre, Trivandrum, Kerala, October 2000.
30. G. Sunil Vijaya Kumar, Prof. P.C. Saxena, "Management of Telecommunications
Management (TMN)" in the proceedings of 15th National Convention of Computer
Engineers on E-Goverance: Challenges and prospects (ego 2000), Organised by
The Institution of Engineers (India), Kerala State Centre, Trivandrum, Kerala,
October 2000.
31. G. Sunil Vijaya Kumar, Dr. Rajesh C. Phoha, Prof. Manohar Lal, "Integrated
Network Management Architecture", in the proceedings of the National
Conference on Quality, Reliability and Management (NCQRM 2000), September
2000
32. G. Sunil Vijaya Kumar, Dr. Rajesh C. Phoha, Prof. Manohar Lal, "Integrated Web-Based Network Management Architecture", in the proceedings of the National
Conference on Quality, Reliability and Management (NCQRM 2000), Conducted
by Priyadarshini Engineering College, Vaniyambadi, T.N., September 2000.
33. G. Sunil Vijay Kumar, "INMP: A new paradigm proposal to avoid loopholes in
SNMP Client Server", in the Proceedings of an all India seminar on IT Application
in Engineering and Technology, conducted by Institution of Engineers (India) &
KSRM College of Engineering, Cuddapah, A.P., March 2000.
34. G. Sunil Vijay Kumar, "Web-Based Network Management: An Integrated
Management Solution", in the Proceedings of an all India seminar on IT
Application in Engineering and Technology, conducted by Institution of Engineers
(India) & KSRM College of Engineering, Cuddapah, A.P., March 2000.
35. G. Sunil Vijay Kumar, M. Srinivasulu, "Design of Data warehousing for Business
Applications", in the Proceedings of an all India seminar on IT Application in
Engineering and Technology, conducted by Institution of Engineers (India) &
KSRM College of Engineering, Cuddapah, A.P., March 2000.

(xvii)


LIST OF ABBREVIATIONS/ ACRONYMS

AES Advanced Encryption Standard


DB Database
DES Data Encryption Standard
DET Detection Error Trade-off
EER Equal Error Rate
FAR False Accept Rate
FMR False Match Rate
FNMR False Non-Match Rate
FpVTE Fingerprint Vendor Technology Evaluation
FRR False Reject Rate
FTC or FCR Failure to Capture Rate
FTE or FER Failure to Enroll Rate
FVC Fingerprint Verification Competition
ID Identification
MAC Message Authentication Code
MD Message Digest
NGRA Number of Genuine Recognition Attempts
NIRA Number of Impostor Recognition Attempts
PIN Personal Identification Number
PSDs Power Spectral Densities
ROC Receiver Operating Characteristic
ROI Region of Interest
RSA Rivest, Shamir and Adleman Algorithm
SHA Secure Hash Algorithm
(xviii)


LIST OF FIGURES
Figure No.  Name of the Figure  Page No.

1.1  Block diagram of a generic cryptography  08
1.2  A simple cryptography model  08
1.3  Secret key (symmetric) cryptography  09
1.4  A simple symmetric key cryptography model  09
1.5  Public key (asymmetric) cryptography  10
1.6  A simple asymmetric key cryptography model  10
1.7  A simple hashing model  12
2.1  Simplified logical block diagram of a biometric system  26
2.2  Enrollment process  27
2.3  Verification and identification process  27
2.4  Information flow in biometric systems  29
2.5  Typology of biometric mechanisms  30
2.6  Error trade-off in a biometric system  35
2.7  A sample fingerprint  38
2.8  Three major fingerprint classifiers  39
2.9  A fingerprint image with the core and four minutiae points  40
2.10  Fingerprint ridge patterns and minutiae examples  41
2.11  Fingerprint pattern recognition system  44
2.12  Typical structure of a fingerprint system  45
3.1  Transformation using signal domain  50
3.2  Transformation using feature domain  51
3.3  Construction of cancellable fingerprints using feature domain  55
6.1  Cartesian transformation, which maps each cell to some random cell, with collisions  96
6.2  Polar transformation, where each sector is mapped into some other random sector after transformation  96
6.3  S. Tulyakov et al.'s secure fingerprint biometric systems using symmetric hash functions  98
7.1 (a)-(b)  (a) Original fingerprint image; (b) Histogram equalized image  102
7.2  Fingerprint after binarization  104
8.1 (a)-(d)  Sample fingerprint images taken from the four fingerprint databases: sample images of (a) DB1; (b) DB2; (c) DB3; (d) DB4  116
8.2 (a)-(g)  Intermediate results of the New-Fangled Approach for the sample image from DB1: (a) Input fingerprint image; (b) Histogram equalized image; (c) Gabor filtered image; (d) Binarized image; (e) Region of Interest (ROI); (f) Fingerprint image with minutiae points; (g) Generated 256-bit key  119
8.3 (a)-(g)  Intermediate results of the New-Fangled Approach for the sample image from DB2: (a) Input fingerprint image; (b) Histogram equalized image; (c) Gabor filtered image; (d) Binarized image; (e) Region of Interest (ROI); (f) Fingerprint image with minutiae points; (g) Generated 256-bit key  119
8.4 (a)-(g)  Intermediate results of the New-Fangled Approach for the sample image from DB3: (a) Input fingerprint image; (b) Histogram equalized image; (c) Gabor filtered image; (d) Binarized image; (e) Region of Interest (ROI); (f) Fingerprint image with minutiae points; (g) Generated 256-bit key  120
8.5 (a)-(g)  Intermediate results of the New-Fangled Approach for the sample image from DB4: (a) Input fingerprint image; (b) Histogram equalized image; (c) Gabor filtered image; (d) Binarized image; (e) Region of Interest (ROI); (f) Fingerprint image with minutiae points; (g) Generated 256-bit key  120
8.6 (a)-(g)  Intermediate results of the Bio-Crypto Key Generation Approach for the sample image from DB1: (a) Input fingerprint image; (b) Histogram equalized image; (c) Wiener filtered image; (d) Region of Interest (ROI); (e) Thinned image; (f) Fingerprint image with minutiae points; (g) Generated 256-bit key  121
8.7 (a)-(g)  Intermediate results of the Bio-Crypto Key Generation Approach for the sample image from DB2: (a) Input fingerprint image; (b) Histogram equalized image; (c) Wiener filtered image; (d) Region of Interest (ROI); (e) Thinned image; (f) Fingerprint image with minutiae points; (g) Generated 256-bit key  122
8.8 (a)-(g)  Intermediate results of the Bio-Crypto Key Generation Approach for the sample image from DB3: (a) Input fingerprint image; (b) Histogram equalized image; (c) Wiener filtered image; (d) Region of Interest (ROI); (e) Thinned image; (f) Fingerprint image with minutiae points; (g) Generated 256-bit key
8.9 (a)-(d)  Intermediate results of the Bio-Crypto Key Generation Approach for the sample image from DB4: (a) Input fingerprint image; (b) Histogram equalized image; (c) Wiener filtered image; (d) Region of Interest (ROI)
8.9 (e)-(g)
8.10 (a)-(e)
8.11 (a)-(c)
Intermediate results of Bio-Crypto Key Generation Approach for
the sample image from DB4: Thinned image
Intermediate results of Bio-Crypto Key Generation Approach for
the sample image from DB4: Fingerprint Image with minutiae
points
Intermediate results of Bio-Crypto Key Generation Approach for
the sample image from DB4: Generated 256-bit key
Intermediate results of Ratha et al.s Approach for the sample
image from DB1: Input Fingerprint Image
Intermediate results of Ratha et al.s Approach for the sample
image from DB1: Fingerprint image with orientation field
Intermediate results of Ratha et al.s Approach for the sample
image from DB1: Fingerprint Image with minutiae points
Intermediate results of Ratha et al.s Approach for the sample
image from DB1: Minutiae points after applying Cartesian
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB1: Minutiae points after applying Polar
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB2: Input Fingerprint Image
Intermediate results of Ratha et al.s Approach for the sample
image from DB2: Fingerprint image with orientation field
Intermediate results of Ratha et al.s Approach for the sample
image from DB2: Fingerprint Image with minutiae points
(xxii)

122
122
122
122
122
122
122
123
123
123
123
123
123
123
124
124
124
124

124
124
124
124

8.11: (d)

8.11: (e)
8.12: (a)
8.12: (b)
8.12: (c)
8.12: (d)

8.12: (e)
8.13: (a)
8.13: (b)
8.13: (c)
8.13: (d)

8.13: (e)
8.14: (a)
8.14: (b)
8.14: (c)
8.14: (d)
8.15: (a)
8.15: (b)
8.15: (c)
8.15: (d)

Intermediate results of Ratha et al.s Approach for the sample


image from DB2: Minutiae points after applying Cartesian
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB2: Minutiae points after applying Polar
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB3: Input Fingerprint Image
Intermediate results of Ratha et al.s Approach for the sample
image from DB3: Fingerprint image with orientation field
Intermediate results of Ratha et al.s Approach for the sample
image from DB3: Fingerprint Image with minutiae points
Intermediate results of Ratha et al.s Approach for the sample
image from DB3: Minutiae points after applying Cartesian
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB3: Minutiae points after applying Polar
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB4: Input Fingerprint Image
Intermediate results of Ratha et al.s Approach for the sample
image from DB4: Fingerprint image with orientation field
Intermediate results of Ratha et al.s Approach for the sample
image from DB4: Fingerprint Image with minutiae points
Intermediate results of Ratha et al.s Approach for the sample
image from DB4: Minutiae points after applying Cartesian
transformation
Intermediate results of Ratha et al.s Approach for the sample
image from DB4: Minutiae points after applying Polar
transformation
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB1: Input Fingerprint Image
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB1: Fingerprint Image with minutiae points
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB1: Minutia points with its nearest
neighbour (n=5)
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB1: Fingerprint hash value
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB2: Input Fingerprint Image
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB2: Fingerprint Image with minutiae points
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB2: Minutia points with its nearest
neighbour (n=5)
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB2: Fingerprint hash value
(xxiii)

124

124
125
125
125
125

125
125
125
125
125

125
126
126
126
126
126
126
126
126

8.16: (a)
8.16: (b)
8.16: (c)
8.16: (d)
8.17: (a)
8.17: (b)
8.17: (c)
8.17: (d)
8.18
8.19
8.20
8.21
8.22
8.23
8.24
8.25
8.26
8.27
8.28
8.29
8.30
8.31
8.32

Intermediate results of S. Tulyakov et al.s Approach for the


sample image from DB3: Input Fingerprint Image
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB3: Fingerprint Image with minutiae points
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB3: Minutia points with its nearest
neighbour (n=5)
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB3: Fingerprint hash value
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB4: Input Fingerprint Image
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB4: Fingerprint Image with minutiae points
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB4: Minutia points with its nearest
neighbour (n=5)
Intermediate results of S. Tulyakov et al.s Approach for the
sample image from DB4: Fingerprint hash value
Performance graph (FMR (t) and FNMR (t)) of New-Fangled
Approach on DB1
Performance graph (FMR (t) and FNMR (t)) of New-Fangled
Approach on DB2
Performance graph (FMR (t) and FNMR (t)) of New-Fangled
Approach on DB3
Performance graph (FMR (t) and FNMR (t)) of New-Fangled
Approach on DB4
Performance graph (FMR (t) and FNMR (t)) of Bio-Crypto Key
Generation Approach on DB1
Performance graph (FMR (t) and FNMR (t)) of Bio-Crypto Key
Generation Approach on DB2
Performance graph (FMR (t) and FNMR (t)) of Bio-Crypto Key
Generation Approach on DB3
Performance graph (FMR (t) and FNMR (t)) of Bio-Crypto Key
Generation Approach on DB4
Performance graph (FMR (t) and FNMR (t)) of Ratha et al.s
Approach on DB1
Performance graph (FMR (t) and FNMR (t)) of Ratha et al.s
Approach on DB2
Performance graph (FMR (t) and FNMR (t)) of Ratha et al.s
Approach on DB3
Performance graph (FMR (t) and FNMR (t)) of Ratha et al.s
Approach on DB4
Performance graph (FMR (t) and FNMR (t)) of S. Tulyakov et
al.s Approach on DB1
Performance graph (FMR (t) and FNMR (t)) of S. Tulyakov et
al.s Approach on DB2
Performance graph (FMR (t) and FNMR (t)) of S. Tulyakov et
al.s Approach on DB3
(xxiv)

127
127
127
127
127
127
127
127
128
129
129
130
130
131
131
132
133
133
134
134
135
136
136

8.33

Performance graph (FMR (t) and FNMR (t)) of S. Tulyakov et


al.s Approach on DB4

137

8.34

ROC curve of FVC 2002 Database1 (DB1)

138

8.35

ROC curve of FVC 2002 Database2 (DB2)

139

8.36

ROC curve of FVC 2002 Database3 (DB3)

140

8.37

ROC curve of FVC 2002 Database4 (DB4)

141

LIST OF TABLES
Table No.  Name of the Table  Page No.
2.1  Comparison summary of authentication mechanisms  22
2.2  Comparison of various biometric technologies  32
4.1  Timeline of biometric technologies  68-70
8.1  Scanners/technologies employed for the collection of FVC2002 databases  117


CHAPTER 1
INTRODUCTION: NETWORK SECURITY
"To competently perform rectifying security service, two critical incident response elements are necessary: information and organization." (Robert E. Davis)
A Computer Network is an interconnected collection of autonomous computers which
use a well-defined, mutually agreed set of rules and conventions known as protocols for
interacting with one-another meaningfully in the form of messages and for allowing
resource-sharing preferably in a predictable and controllable manner [1]. Networks
make the information available, from one corner of the world to another, almost
instantaneously. However, the growing use of the Internet, a network of networks, by individuals and organizations has presented formidable problems of identity fraud, organised crime, money laundering, theft of intellectual property and a myriad of other cybercrimes. The world is witnessing attempts at hacking crucial information systems, such as those of defence installations, including the Pentagon in the USA, which may endanger the security of even a nation. Since the incidents of September 11, 2001, and even earlier, security has been at the forefront of American and other nations' concerns. Hence, the study of methods for analysing the security requirements and needs of such systems, and the consequent design, implementation and deployment, is the primary scope of the discipline named Network Security. Network security includes all the issues related to the security of internets, including the Internet, which is a single huge global network of networks.

In this chapter, we briefly discuss network security, security goals and attacks, types of network threats, security mechanisms, security services and security techniques. Thereafter, we discuss identity proof and authentication mechanisms. Finally, we describe the aims and objectives of this thesis, its original contributions, and an outline of the thesis, giving the gist of each chapter very briefly.

1.1 NEED FOR SECURITY OF COMPUTER NETWORKS


The art of war teaches us to rely not on the likelihood of the enemy's not coming, but on our own readiness to face it; we should not depend on the chance of the enemy not attacking us, but rather on the fact that we have made our position unassailable.
In general, security is the condition of being protected against danger or loss; it is a concept similar to safety. Security is "freedom from risk or danger", while computer and data security is "the ability of a system to protect information and system resources with respect to confidentiality and integrity" [59]. In other words, information security is the quality or state of being secure and free from danger. Traditionally, information security was provided primarily by physical and administrative means. Networks and communications facilities require additional measures to protect the security of data during transmission.
Valuable information must be protected through security measures. Security is often viewed as the need to protect one or more aspects of a network's operation and permitted use (including access, behaviour, performance, privacy and confidentiality). Security requirements may be local or global in scope, depending upon the purpose for which the network or internetwork is designed and developed [3].
Information security can be handled at two levels: (i) computer security and (ii) network (Internet) security. In generic terms, computer security is the process of securing a single, standalone computer, while network security is the process of securing an entire network of computers [4].
• Computer Security: a set of technological and managerial procedures applied to a computer system to ensure the availability, integrity and confidentiality of the information managed by the computer.
• Network (Internet) Security: protection of networks and their services from unauthorized modification, destruction or disclosure, and provision of assurance that the network performs its critical functions correctly and has no harmful side effects. Here, security measures are designed to protect data during their transmission.
Securing network infrastructure is like securing the possible entry points of attacks on a country by deploying appropriate defences: attacks can be stopped at their entry points before they spread. Computer security is more like providing the means of self-defence to each individual citizen of the country: the measures taken are focused on securing individual computer hosts.
1.2 SECURITY GOALS
The security goals include protection of information from unwanted access and maintaining Confidentiality, Authentication, Data Integrity, Access Control & Availability, and Non-repudiation [5]. Security goals can be defined depending on the application environment, or in a more general and technical way.
• Confidentiality (Secrecy): ensuring that information is accessible only to those authorized to have access, i.e., only the sender and the intended receiver should understand the message contents: the sender encrypts the message and the receiver decrypts it according to mutually agreed protocols.
• Authentication: the process of verifying the claimed identity of a user, i.e., sender and receiver want to confirm the identity of each other.
• Data Integrity: safeguarding the accuracy and completeness of information and processing methods, i.e., sender and receiver want to ensure that the data or message cannot be altered (in transit, or afterwards) without detection.
• Access Control & Availability: ensuring that authorized users have access to information and associated assets when required, i.e., services must be accessible and available to the intended users.
• Non-repudiation: ensuring that an individual cannot deny the authorization of a transaction, i.e., the sender or receiver should not be able to later disavow the message actually transmitted or received by them.

1.3 SECURITY ATTACKS
The actual realization of a threat is called an attack. In other words, any action that compromises the security of information owned by an organization or individual is called an attack. Attacks on communications at the message level are of two kinds: passive attacks and active attacks [6, 7, 8].
• Passive Attacks: eavesdropping on communications and the release of message contents. These involve simply gaining access to a link or device, and consequently to data, without altering the data. Traffic analysis on the identities, locations, frequency, etc. of communications also belongs here.
• Active Attacks: attempts on security leading to deletion, modification, insertion, redirection, blockage or destruction of data, devices or links.

1.3.1 Types and Sources of Network Threats
A threat in a communication network is any possible event or sequence of actions that might lead to a violation of one or more security goals; security threats have the potential for security violation. There are two basic types of threats: accidental threats and intentional threats.
Accidental threats can lead to the exposure of confidential information, or cause an illegal system state to occur due to modification of information. An intentional threat is an action performed by an entity with the intention of violating security, which includes destruction, modification, fabrication, interruption or interception of data [4].
Threats are broadly classified as: masquerade, eavesdropping, authorization violation, loss or modification of (transmitted) information, denial of communication acts (repudiation), forgery of information, and sabotage [6].
• Masquerade (Spoofing): impersonation, i.e., an entity claiming to be another entity.
• Eavesdropping (Snooping): unauthorized access to, or interception of, data which is not intended to be read.
• Authorization Violation: an entity uses a service or resource it is not intended to use.
• Loss or Modification of (transmitted) Information: data is altered or destroyed.
• Denial of Communication (Repudiation): an entity falsely denies its participation in a communication act.
• Forgery of Information: an entity creates new information in the name of another entity.
• Sabotage: any action that aims to reduce the availability and/or correct functioning of services or systems.

1.4 SECURITY MECHANISMS
A security mechanism is a process, algorithm, protocol or device that is designed to detect, locate, identify, prevent, or recover from security attacks. Our purpose is to provide reliable security services in different environments that may suffer various security attacks, by exploiting a number of security mechanisms. There are two types of security mechanisms: specific and pervasive [6].
• Specific Security Mechanisms: encryption, digital signatures, access controls, data integrity, authentication exchange, traffic padding, routing control, notarization, etc.
• Pervasive Security Mechanisms: trusted functionality, security labels, event detection, security audit trails, security recovery, etc.

1.5 SECURITY SERVICES
A security service is a processing or communication service that enhances the security of data processing and information transmission, and makes use of the security mechanisms. It is an abstract service that seeks to ensure a specific security property. A security service can be realized with the help of cryptographic algorithms and protocols, as well as with conventional means: one may keep an electronic document confidential by storing it on a disk in encrypted form, as well as by locking the disk away in a safe. But to make information more secure, usually a combination of cryptographic and other means is used.
The field of information security has started to evolve in response to the rapid growth of the Internet and the evolving threats to it, and is becoming an important discipline with a sound theoretical basis. The discipline is divided into five supporting pillars [9]:
• Identification and Authentication: the process of verifying the identity of a user through the use of specific credentials.
• Authorization: authorizing access to resources.
• Confidentiality: ensuring that only authorized individuals can view the content of data or software.
• Integrity: ensuring that only authorized individuals can change the content of data or software.
• Non-denial: ensuring that an individual cannot deny the authorization or execution of a transaction, like changing the content of the data.
Identification and authentication is listed first because it is crucial to the entire process and facilitates the other four pillars of security. If an individual's identity is unknown, access cannot be authorized, since system confidentiality cannot be enforced nor integrity safeguarded. Similarly, non-denial is impossible without identification and authentication, since the system is unable to log an identity against specific transactions. Consequently, identification and authentication should always be viewed as the first step in successfully enforcing information security [9].
To conclude, we can say that attacks are the reasons, mechanisms are the tools, and services are our goals.

1.6 SECURITY TECHNIQUES
The mechanisms discussed in the previous section are only theoretical recipes for implementing security. The actual implementation of security goals needs concrete techniques. The following two techniques are the most prevalent:
1. Cryptography: a very general technique.
2. Steganography: a specific technique.
In all these techniques, the following security goals/services are very important:
• Authentication: only the legitimate sender can send the information.
• Integrity: the received and sent message must be the same.
• Confidentiality: only the legitimate receiver can read the information.

1.6.1 Cryptography
Some of the security mechanisms listed in the previous section can be implemented using cryptography. The word cryptography is derived from Greek, meaning secret (crypto) writing (graphy). However, we use the term to refer to the science and art of transforming messages to make them secure and immune to attacks [6]. Cryptography is the science of information and communication security: the science of protecting information against unauthorized parties by preventing unauthorized alteration or use. In the present-day context, it refers to the tools and techniques used to make messages secure for communication between the participants and immune to attacks by hackers. For private communication through a public network, cryptography plays a very crucial role. The role of cryptography can be illustrated with the help of a simple model of cryptography [10] (as shown in figures 1.1 and 1.2). The message to be sent, through a possibly unreliable medium, is known as plaintext; it is encrypted before being sent over the medium. The encrypted message is known as ciphertext, which is received at the other end of the medium and then decrypted to get back the original plaintext message.

Figure 1.1: Block diagram of a generic cryptography model (plaintext is transformed into ciphertext under an encryption key, and back under a decryption key)
Figure 1.2: A simple cryptography model

The following are the cryptographic primitives [2]:
• Components: algorithms, protocols, etc.
• Functionality: possible to use in an honest environment.
• Security: impossible to abuse in a malicious environment.
Although in the past cryptography referred only to the encryption and decryption of messages using secret keys, nowadays it is defined as involving three distinct mechanisms: symmetric-key encipherment, asymmetric-key encipherment, and hashing [6].
1.6.1.1 Symmetric-key Encipherment
It is also called secret-key encipherment or secret-key cryptography. An encryption algorithm is used to convert the plaintext to ciphertext, operating on a key, which is essentially a specially generated number (value). To decrypt a secret message (ciphertext) and get back the original message (plaintext), a decryption algorithm uses a decryption key. In symmetric-key cryptography, a single secret key is shared, i.e., the same key is used for both encryption and decryption [10] (as shown in figures 1.3 and 1.4). Both the sender and receiver must know the key beforehand, or it would have to be sent along with the message.

Figure 1.3: Secret key (symmetric) cryptography
Figure 1.4: A simple symmetric key cryptography model

Symmetric-key Encipherment Primitives:
• Components: algorithms for encryption and decryption, a key, etc.
• Functionality: Decrypt_K(Encrypt_K(x)) = x, for any key K and message x.
• Security: depends on the confidentiality of the key.
In this mechanism, encryption/decryption can be thought of as electronic locking. The algorithm used to decrypt is just the inverse of the algorithm used for encryption. Symmetric-key algorithms are simple and require less execution time; as a consequence, they are commonly used for long messages. However, these algorithms suffer from the following limitations [10]:
• A large number of unique keys is required. For example, for n users the number of distinct keys required is n(n-1)/2.
• Distributing the keys confidentially among the users in a secure manner is difficult.
Well-known examples of symmetric-key encipherment are the Data Encryption Standard (DES), with a 64-bit key (56 effective bits), and the Advanced Encryption Standard (AES), with a 128-, 192- or 256-bit key.
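To make the symmetric model concrete, here is a minimal sketch in Python. It is a toy XOR stream cipher, not DES or AES, and not secure for real use; the keystream is derived from the shared secret key with SHA-256 in a simple counter mode. Because XOR is its own inverse, one function serves as both encryption and decryption, illustrating Decrypt_K(Encrypt_K(x)) = x.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing the shared key
    # together with a running counter (a toy counter mode).
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice with the same key
    # restores the original data.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared secret"
plaintext = b"attack at dawn"
ciphertext = xor_cipher(key, plaintext)
recovered = xor_cipher(key, ciphertext)
print(recovered == plaintext)  # True: the same key restores the plaintext
```

Note how both parties must hold the same key beforehand, which is exactly the key-distribution limitation discussed above.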
1.6.1.2 Asymmetric-key Encipherment
It is also called as public-key encipherment or public-key cryptography. Here, we have
similar situation as the symmetric-key encipherment, but with few exceptions. First, there
are two keys instead of one: one public-key (for encryption) and one private-key (for
decryption). The public key is announced to the public, where as the private-key is kept by
the receiver. To send a secured message to receiver, sender first encrypts the message

using public-key. Then, receiver uses his own private-key to decrypt the message [10] (as
shown in Figure 1.5 and figure 1.6). Any one can encrypt using the public key, but only
the holder of the private key can decrypt. Security depends on the secrecy of the private
key only.

Figure 1.5: Public key (asymmetric) cryptography
Figure 1.6: A simple asymmetric key cryptography model

A large random number is used to generate a public-private key pair. As the public key is known to all, the security of asymmetric-key methods depends on the private key.
Asymmetric-key Encipherment Primitives:
• Components: algorithms for encryption and decryption, keys, etc.
• Functionality: Decrypt_Ks(Encrypt_Kp(x)) = x, for any public-private key pair (Kp, Ks), where Kp is the public key, Ks is the secret (private) key and x is any message.
• Security: depends on the confidentiality of the private key.
Asymmetric-key algorithms are complex and require more execution time than symmetric-key algorithms. For n users, only 2n keys are required in public-key cryptography.
• Advantages:
  o The pair of keys can be used with any other entity.
  o The number of keys required is small.
• Disadvantages:
  o It is not efficient for long messages.
  o The association between an entity and its public key must be verified.
A well-known example of asymmetric-key encipherment is the Rivest-Shamir-Adleman (RSA) algorithm.
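The public/private key round trip can be sketched with toy, textbook RSA in Python. The primes here are deliberately tiny and there is no padding, so this is only an illustration of Decrypt_Ks(Encrypt_Kp(x)) = x; real RSA uses very large primes and padding schemes.

```python
# Toy textbook RSA with tiny primes -- for illustration only.
p, q = 61, 53                 # two (small) primes, kept secret
n = p * q                     # modulus, part of both keys
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: d*e = 1 (mod phi)

def encrypt(m: int) -> int:
    # Anyone who knows the public key (e, n) can encrypt.
    return pow(m, e, n)

def decrypt(c: int) -> int:
    # Only the holder of the private exponent d can decrypt.
    return pow(c, d, n)

m = 42                        # message, encoded as an integer < n
c = encrypt(m)
print(decrypt(c) == m)        # True
```

The design point is the asymmetry: (e, n) can be published freely, while the factorization of n (and hence d) stays secret with the receiver.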
1.6.1.3 Hashing
Hashing is an algorithm that digests data, representing its bits and bit patterns by a numerical equivalent, a hash value. A single-bit change in the data may change half of the bits in the resultant hash value. Hash functions are used for one-way cryptography: they have no key, since the plaintext is not recoverable from the output, as depicted in figure 1.7. In hashing, a fixed-length message digest (MD) is created out of a variable-length message. In other words, a hash function takes a message of any length as input and produces a fixed-length string as output, termed the message digest or a digital fingerprint.

Figure 1.7: A simple hashing model

The digest is normally much smaller than the message. Hashing is a non-reversible algorithm, i.e., the hash value cannot reproduce the data. For the method to be useful, both the message and the digest must be sent to the receiver. A Message Authentication Code (MAC) is a symmetric-key-dependent hash, used for more security. Hashing is mainly used in the digital signature process and for data integrity: it provides check values for verifying data integrity [6].
Hash Function Primitives:
• Components: algorithms for hashing.
• Functionality: implement a deterministic function.
• Security: depends on many factors; it is a hot research topic.

Some examples of hashing are [2]:
• Message Digest (MD), devised by Ronald Rivest
• Secure Hash Algorithm (SHA), standardized by NIST
• MD4 in 1990 (128-bit digest)
• MD5 in 1991 (128-bit digest), published as RFC 1321 in 1992
• SHA in 1993 (160-bit digest), now obsolete
• SHA-1 in 1995 (160-bit digest)
• SHA-256, SHA-384 and SHA-512 in 2002 (256-, 384- and 512-bit digests)
Security Properties of Hash Functions:
• One-wayness: given y, it is hard to find even one x such that y = h(x) (e.g., a stored hash acts as a witness for a password).
• Collision-resistance: it is hard to find x and x' such that h(x) = h(x') and x ≠ x' (e.g., for a digital fingerprint of a bit-string).
• Randomness: given h^1(x), ..., h^n(x), it is hard to predict h^(n+1)(x) (e.g., for secret key generation).
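These properties can be observed with Python's standard hashlib module. The sketch below hashes two messages that differ in a single bit and counts how many of the 256 digest bits differ, illustrating both the fixed-length output and the avalanche effect mentioned above (the exact count varies per input; close to half of the bits is typical).

```python
import hashlib

def sha256_bits(data: bytes) -> str:
    # Hash the input and return the 256-bit digest as a bit string.
    digest = hashlib.sha256(data).digest()
    return "".join(f"{byte:08b}" for byte in digest)

m1 = b"transfer $100 to Alice"
m2 = bytes([m1[0] ^ 0x01]) + m1[1:]   # flip one bit of the first byte

b1, b2 = sha256_bits(m1), sha256_bits(m2)
flipped = sum(x != y for x, y in zip(b1, b2))

print(len(b1))   # 256: the digest length is fixed regardless of input size
print(flipped)   # roughly half of the 256 bits differ (avalanche effect)
```

This is also why a digest serves as a digital fingerprint: any tampering with the message changes the digest in an unpredictable way.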

1.6.2 Steganography
In addition to cryptography, another technique used for secret communication is steganography. The word steganography, also of Greek origin, means "covered writing", in contrast with cryptography, which means "secret writing". Cryptography conceals the contents of a message by enciphering it, whereas steganography conceals the message itself by covering it with something else.
Any form of data, such as text, image, audio or video, can be digitized, and it is possible to insert secret binary information into the data during the digitization process. Such hidden information is used not only for secrecy, but also for protecting copyright, preventing tampering, or adding extra information [6].
The cover of the secret data can be text, in which case it is called a text cover. Secret data can also be hidden in a coloured image, called an image cover. Likewise, we can have audio covers, video covers and more. Both audio and video data can be compressed, and the secret data can be embedded during or before the compression.
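As a minimal illustration of an image cover, the sketch below hides a message in the least-significant bit (LSB) of each byte of a cover buffer. The buffer stands in for raw pixel data; this is an assumed, simplified scheme for illustration, not a specific published algorithm (a real implementation would work on decoded image pixels and also embed the message length).

```python
def embed_lsb(cover: bytes, message: bytes) -> bytes:
    # Write each message bit into the least-significant bit of one
    # cover byte; needs at least 8 cover bytes per message byte.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(cover) >= len(bits), "cover too small for message"
    stego = bytearray(cover)
    for pos, bit in enumerate(bits):
        stego[pos] = (stego[pos] & 0xFE) | bit   # replace only the LSB
    return bytes(stego)

def extract_lsb(stego: bytes, msg_len: int) -> bytes:
    # Read the LSBs back and reassemble them into message bytes.
    out = bytearray()
    for i in range(msg_len):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (stego[8 * i + bit_index] & 1)
        out.append(byte)
    return bytes(out)

cover = bytes(range(256))        # stand-in for raw pixel bytes
stego = embed_lsb(cover, b"hidden")
print(extract_lsb(stego, 6))     # b'hidden'
```

Because each cover byte changes by at most 1, the alteration is imperceptible in a real image, which is exactly the point of a cover.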

1.7 IDENTITY PROOF AND AUTHENTICATION MECHANISMS
Authentication is the process of verifying the claimed identity of a user [58]. The process of identification and authorization provides, in essence, the ability to prove or verify an identity. Today's systems can authenticate an individual's identity [9, 11, 12] and provide the right person with the right privileges and the right access at the right time [13], by any one of the following three mechanisms, or an arbitrary combination of them:
• Knowledge: specific knowledge of a pre-defined secret, such as a password, login ID or Personal Identification Number (PIN) [9], which permits access to a service.
• Possession: possession of a specific item or token, such as a key, member card, smart card [9], identity card, magnetic-stripe card, optical-stripe card, printed barcode [14] or identity document. Most of these tokens are in widespread use for granting access to physical assets (for example, doors into buildings) and logical assets (for example, corporate networks or bank accounts).
• Biometrics: authentication of an individual by a specific characteristic of the individual, called a biological feature or personal trait. These can be either physiological or behavioural characteristics [15] of a person, such as fingerprint, iris, ear, gait, keystroke dynamics, voice, facial properties, signature, knuckle profile, DNA, hand geometry or hand vein pattern.
The combination, i.e. fusion, of two or three of the aforementioned attributes can be used to further increase the security level. All of these attributes have their specific advantages and disadvantages. The use of the third attribute, viz. biometrics, if required in combination with the others, has significant advantages because, without sophisticated means, biometrics are difficult to share, steal or forge, and are generally not easily forgotten or lost.
In this thesis, we mainly focus on identification and authentication using biometric techniques, along with some other security techniques.

1.8 AIMS AND OBJECTIVES OF THESIS


Biometric authentication schemes raise security concerns because biometric data is
permanently associated with its owner and therefore cannot be replaced even if it is
compromised. One of the most promising solutions to this problem is cancellable
biometrics [53], where the system does not store the original biometric data; rather, it stores
only a version transformed by a non-invertible transform using a one-way function [54].
Verification/identification is then performed on this transformed data without any need to
recover or use the original data, keeping the original data safe even if the system is
compromised. This concept ensures that the original biometric template does not exist in
the system database and is therefore not in danger of being exposed, thus largely
eliminating the privacy issue.
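As a toy illustration of this idea (not one of the schemes proposed in this thesis), the sketch below applies a key-dependent, many-to-one transform to a feature vector before storage and performs matching entirely in the transformed domain; the key, the fold modulus and the matching rule are all illustrative assumptions:

```python
import hashlib

def cancellable_transform(features, user_key, fold=16):
    """Toy non-invertible transform: keyed shuffle followed by modular folding.

    Folding (mod) is many-to-one, so the stored template does not reveal
    the original feature values; re-issuing a template after a compromise
    only requires choosing a new user_key.
    """
    # Derive a deterministic permutation from the user-specific key.
    digest = hashlib.sha256(user_key.encode()).digest()
    order = sorted(range(len(features)), key=lambda i: digest[i % len(digest)] ^ i)
    shuffled = [features[i] for i in order]
    # Modular folding deliberately destroys information (the non-invertible step).
    return [v % fold for v in shuffled]

def verify(stored_template, probe_features, user_key, max_mismatches=1):
    """Verification happens in the transformed domain only."""
    probe = cancellable_transform(probe_features, user_key)
    mismatches = sum(a != b for a, b in zip(stored_template, probe))
    return mismatches <= max_mismatches

enrolled = cancellable_transform([37, 12, 95, 60, 21], "key-v1")
print(verify(enrolled, [37, 12, 95, 60, 21], "key-v1"))  # True: same trait, same key
reissued = cancellable_transform([37, 12, 95, 60, 21], "key-v2")  # revocation: new key, new template
```

The essential property is that only `enrolled` is stored, never the raw features, and a compromised template can be cancelled by changing the key.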
Aim of the thesis:
We embarked upon the research work with the following aims:
• Ensuring security against the impostor through a new cancellable biometric
approach.
• Proposing two novel techniques for generating bio-crypto keys from fingerprints
using cancellable templates.
• Proving, through experimental results, the effectiveness of the proposed techniques
in generating an irrevocable cryptographic key, and carrying out a security analysis
to prove non-invertibility and ensure security against impostor attacks.
• Providing biometric-based key generation, through the proposed methods, that
performs better than traditional systems in the usability domain.
Objectives of the thesis:
The above-mentioned aims are realized through the following techniques:
• Integrating fingerprint technology with existing cryptographic methodologies to
generate efficient, sophisticated bio-crypto keys, thereby simplifying the
key-generation and key-release processes.
• Generating a cryptographic key that is irrevocable and unique to a specific
cancellable template, providing better protection and replacement features for
lost or stolen biometrics.
• Implementing and analyzing the techniques proposed in the two pioneering research
papers by Ratha et al. and Tulyakov et al., comparing their results with those of
our proposed methods, and giving a complete analysis of all these algorithms.
1.9 ORIGINAL CONTRIBUTIONS
In the process of meeting the above-mentioned aims and objectives, we made the
following contributions:
• Proposing two efficient techniques, each generating a bio-crypto key from
fingerprints using cancellable templates.
• Generating efficient, irrevocable cryptographic keys from fingerprint biometrics
using cancellable biometric templates, with a method composed of three phases:
1) Minutiae point extraction from the fingerprint image: this phase is the same
in both of our methods, except that the first method uses Gabor filters whereas
the second uses Wiener filters, for better results.
2) Cancellable template generation with added security: proposing two novel
approaches to Secured Feature Matrix generation.
3) Cryptographic key generation from the secured cancellable template:
proposing two novel approaches to generating bio-crypto keys from the
Secured Feature Matrix.
• Studying the comparative analysis of our proposed methods against the algorithms
of previous approaches, and finally discussing the security analysis of the four
methods against impostor attacks.
1.10 ORGANIZATION OF THE THESIS
This thesis is organized into ten chapters, briefly described below. The first three
chapters deal, respectively, with an introduction to network security, biometric system
security and cancellable biometrics. The next three chapters cover the literature review,
theoretical background and motivation for the research. The seventh and eighth chapters
describe, respectively, our two proposed methods and the experimental results with a
comparative analysis against the algorithms of two earlier methods. The ninth chapter
discusses the security analysis of the four methods, including the proposed ones, against
impostor attacks. The last chapter presents the conclusions of the current research and
future work.
Chapter 1: Introduction to Network Security
This chapter briefly discusses the concepts and issues of network security: security
goals and attacks, types of network threats, security mechanisms, security services and
security techniques. Thereafter, we discuss identification and authentication mechanisms.
Finally, we describe the aims and objectives of this thesis, its original contributions and
an outline of the thesis giving the gist of each chapter.
Chapter 2: Biometric System Security
In this chapter, we discuss the need for biometric systems, the problems with existing
traditional systems and a comparison of authentication mechanisms. Then we give a brief
introduction to biometric systems, covering their characteristics, components, modes of
operation and information flow. After that, we discuss biometric technologies and their
classification, biometric modalities, a comparison of biometric technologies and
performance measurements of biometric systems. Later on, we discuss the merging of
biometrics and cryptographic techniques for reliable network security, and finally we
discuss fingerprint technology in detail as one of the biometric technologies.
Chapter 3: Cancellable Biometrics
In this chapter, we briefly discuss the problems with existing biometric technologies
and then show that a cancellable biometric system is one of the solutions to these
problems. Finally, we explore cancellable biometrics in detail.
Chapter 4: Literature Survey
This chapter first briefly discusses the history of biometrics, how it has evolved, and
how it has become, and remains, a major and challenging research topic. Then we give
the time-line of biometrics and the history of research in the field. Later on, we discuss
cancellable biometrics and the relevant research in this area.
Chapter 5: Theoretical Background
This chapter discusses the theoretical background of our proposed cancellable biometric
key generation system. The concept of cancellable biometrics was proposed for the
cancellation and re-issue of biometric templates and for safeguarding privacy in
biometric systems. To safeguard privacy and prevent the disclosure of information saved
in databases for personal identification or verification, cancellable biometric templates
are preferably non-invertible. Here, we also provide background information on
cancellable biometric systems and bio-cryptographic techniques. In addition, the main
concepts used in our proposed systems are discussed concisely in the subsequent
sections: the concepts utilized for pre-processing the input fingerprint image, Region of
Interest (ROI) selection and the extraction methods, together with the minutiae extraction
algorithms and the encryption/decryption techniques.
Chapter 6: Motivation for the Research
This chapter presents the motivation for the research and its significance in the biometric
field, especially in cancellable biometrics. First, the necessity of cancellable biometrics
and its benefits are discussed in detail. Then, the two significant recent contributions by
Ratha et al. and Tulyakov et al. to developing cancellable templates of fingerprint images
are presented, and the detailed steps involved in generating the cancellable templates are
discussed. Finally, the chapter is summarized concisely.
Chapter 7: New Approaches to Cancellable Biometric Based Security
In this chapter, we propose two new algorithms, viz., a New-Fangled Approach for
Cancellable Biometric Key Generation and Development of Bio-Crypto Key from
Fingerprints Using Cancellable Templates, for secure fingerprint biometric systems.
Chapter 8: Experimental Results and Analysis
This chapter describes the experimental results and the performance analysis of our
proposed methods and of the methods proposed by Ratha et al. and Tulyakov et al. This
chapter is significant in that it evaluates the research in terms of its effectiveness and
advantages over previous work. The experimental analyses are carried out on well-known
fingerprint databases. Subsequently, a Receiver Operating Characteristic (ROC) graph of
FNMR versus FMR is plotted to show the relative effectiveness of the various methods.
Finally, the previous methods and our proposed methods are analyzed for their respective
performances, showing that our proposed methods are superior to the previous ones.
Chapter 9: Security Analysis
This chapter discusses the security analysis of the proposed methods along with that of
the previous methods. To show the effectiveness of the proposed methods, the security
analysis is carried out in terms of the different transformations utilized in the various
methods, to prove non-invertibility. Finally, it concludes the discussion with an analysis
of the methods' security against impostor attacks.
Chapter 10: Conclusion and Future Work
This chapter summarizes the research contribution by highlighting the conclusions,
significance and limitations of the present study and by discussing future directions.
CHAPTER 2
BIOMETRIC SYSTEM SECURITY
"Biometrics is certainly the most secure form of authentication. It's the hardest to
imitate and duplicate." (Avivah Litan)
2.1 INTRODUCTION
In this chapter, we discuss the importance of biometric systems for security
authentication, especially in view of the problems with existing traditional systems, and
compare various authentication mechanisms. Then we give a brief introduction to
biometric systems, covering their characteristics, components, modes of operation and
information flow. After that, we describe biometric technologies and their classification,
biometric modalities, a comparison of biometric technologies and performance
measurements of biometric systems. Later on, we discuss the merging of biometrics and
cryptographic techniques for reliable network security, and finally we discuss fingerprint
technology in detail as one of the biometric technologies.
2.2 THE NEED FOR BIOMETRIC SYSTEMS
A wide variety of systems require reliable personal authentication schemes to either
confirm or determine the identity of individuals requesting network services. The
purpose of such schemes is to ensure that the rendered services are accessed only by a
legitimate user. Examples of such systems include secure access to buildings, computer
systems, laptops, cellular phones and ATMs. In the absence of robust authentication
schemes, these systems are vulnerable to the wiles of an impostor [65].
A critical consideration when designing an application is the choice of an appropriate
identification method, or combination of methods, based on the three attributes described
in section 1.7 of chapter 1. Each method has its own pros and cons, and there are many
ways to compare and contrast them. Here, however, the comparison focuses on whether
an identification system fails by granting access to an unauthorized person or by
rejecting a legitimate, authorized user.
Problems with Passwords:
1. Passwords can be obtained or cracked using a variety of techniques, including:
a) Common password usage: many people use common passwords like "guest",
"password", "pword", "help", "aaa", "1234", etc. Similarly, people often create
passwords from pertinent information about themselves, like the name of a child
or pet, which can be easily guessed [16].
b) Exhaustive or brute-force attack [16]: an attack in which all possible passwords
are tried;
c) Dictionary attack: a variant of the brute-force attack that uses words from a
specific list (for example, the English dictionary) [17];
d) Using programs/tools to crack the password: many programs and tools are
available for cracking passwords and gaining access [17].
2. Passwords can be disclosed. If a password is disclosed to an individual, he/she
will be able to gain access to areas, information, etc. which are meant to be
confidential.
3. Passwords can be forgotten. Although not a direct security threat, this places an
additional burden on an organization's administration: an individual who has
forgotten his/her password needs to be issued a new one.
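To make points (a) and (c) concrete, here is a minimal sketch of a dictionary attack against an unsalted password hash; the wordlist, the hash choice and the leaked value are illustrative assumptions, and real attack wordlists contain millions of entries:

```python
import hashlib

# Tiny illustrative wordlist of common passwords.
WORDLIST = ["guest", "password", "pword", "help", "aaa", "1234", "letmein"]

def dictionary_attack(stored_hash):
    """Try each wordlist entry against an unsalted SHA-256 password hash."""
    for candidate in WORDLIST:
        if hashlib.sha256(candidate.encode()).hexdigest() == stored_hash:
            return candidate
    return None

# A common password falls immediately; a random one survives this wordlist.
leaked = hashlib.sha256(b"1234").hexdigest()
print(dictionary_attack(leaked))  # 1234
```

The attack succeeds not by inverting the hash but by guessing candidates, which is why common passwords are weak regardless of how they are stored.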
Problems with Tokens:
1. Tokens can be forged and used without the knowledge of the original bearer. For
example, a forger can steal an identity and create a fake ID document using
another person's information. Armed with the forgery, fraudulent transactions can
be authorized without the original bearer's knowledge.
2. Tokens can be lost, stolen or given to someone else. In any of these cases, an
illegitimate person can fraudulently transact with the system by impersonating
the original bearer [28].
Problems with Biometrics:
1. Biometrics can be forged: for example, a forged signature could be accepted by a
signature recognition system if performed skilfully enough [18].
2. Biometrics can be destroyed: a biometric characteristic's ability to be read by a
system can be reduced. An individual's fingerprints, for example, can be affected
by cuts and bruises [14] and can even be destroyed by excessive rubbing on an
abrasive surface or through exposure to certain chemicals such as acids.
In this section, all three authentication mechanisms have been described briefly along
with their respective drawbacks. They are summarized and contrasted in Table 2.1
below [28, 30]:
S. No.  Criterion               Knowledge   Possession   Biometrics
1       Technology              Trivial     Moderate     Difficult
2       User Friendly           Yes         Yes          Depends
3       Forgery                 Yes         Yes          Yes
4       Can be Stolen           Yes         Yes          No
5       Can be Lost/Damaged     No          Yes          No
6       Can be Forgotten        Yes         Yes          No
7       Transportability        Yes         Yes          No
8       System Price            Marginal    Moderate     High
9       Hygienic Reservation    No          No           Yes

Table 2.1: Comparison summary of authentication mechanisms
On the one hand, knowledge-based and token-based authentication share the distinct
drawback that passwords or tokens can be forgotten, stolen or lost. Furthermore, they are
not capable of telling the difference between a client and an impostor with a stolen
password or token. Conversely, a user's biometric is always present and can neither be
lost nor stolen. But once a biometric is compromised, it becomes a bigger risk for the
security/authentication system.
When deciding which identification method to use in a particular application, the
application's environment is a key consideration. If this basic comparison is applied to a
remote, distributed environment like the Internet, the fact that it is risky to transport an
individual's biometric characteristics emerges as critical. Transportability and forgery
are major concerns in an environment where the user can be located anywhere and enjoys
anonymity. In such circumstances, presenting fraudulent passwords or tokens is both easy
to do and hard to detect. Some biometrics can be forged, but the fact that a measurement
must be made to authorize a transaction ensures that the individual must also be present
(to be measured), which in itself offers a level of security superior to tokens or passwords.
For this reason, and because of the relative complexity required to forge most biometrics,
solutions based on biometric identification appear to offer better security, especially for
transactions in remote, dispersed environments like the Internet and the World Wide
Web [28]. Despite their limitations, biometric systems offer better security than existing
approaches and serve as a deterrent. Table 2.1 shows clearly that biometric identification
systems could be the best solution for network security applications.
2.3 BIOMETRICS: A RELIABLE AUTHENTICATION MECHANISM
Person identification is the process of verifying a person's identity; this process is also
called authentication. Identifying fellow human beings has been crucial to the fabric of
human society. In the early days of civilization, people lived in small communities and
everyone knew each other. With population growth and increased mobility, we started
relying on documents and secrets to establish identity. Person identification is now an
integral part of the infrastructure needed for diverse business sectors such as banking,
border control and law enforcement. It is a fundamental problem in science and
engineering with broad economic and scientific impact.
The term biometrics is derived from the Greek words "bio" (life) and "metric" (to
measure) [19, 20]. Biometry or biometrics means "the statistical analysis of biological
observations and phenomena". A biometric system is a pattern recognition system that
establishes the authenticity of a specific physiological or behavioural characteristic
possessed by a user. Biometric authentication is the automatic identification, or identity
verification, of an individual using either a biological feature he/she possesses
(physiological characteristics/traits, like a fingerprint) or something he/she does
(behavioural characteristics/traits, like a signature) [15]. These traits are called biometric
identifiers or simply biometrics.
Definition: "A biometric system is an automated method for identifying or authenticating
the identity of a living person based on physiological or behavioural characteristics." [31]
Generally, the term biometrics is used in two different senses: to describe a characteristic
and to describe a process.
As a characteristic: a biometric is a measurable biological (anatomical and
physiological) or behavioural characteristic that can be used for automated recognition.
As a process: biometrics encompasses automated methods of recognizing an individual
based on measurable biological (anatomical and physiological) and behavioural
characteristics.
As an introduction to the field, this section focuses on three initial considerations to be
taken into account when assessing biometric systems [28]:
1. Characteristics: the standard means of assessing the usability or usefulness of a
given biometric trait as an authentication mechanism.
2. Commonality: the generic components common to every biometric system.
3. Effectiveness: the way of testing the performance of an individual biometric.
2.3.1 Biometric Characteristics
The main characteristics used to determine whether or not a physiological or behavioural
biological trait can be used as a biometric authentication mechanism are [55]:
• Robustness: measures the stability of the biometric trait, i.e. its ability to stay
constant over time [57]. It becomes important when the biometric trait can be
physically changed, either intentionally or accidentally [56].
• Distinctiveness: measures the complexity or potential differences in a particular
biometric trait's patterns and helps determine how large a population sample can
be used [57].
• Accessibility: measures how easy the particular biometric trait is to reach and
measure. Foot geometry, for example, would not be very accessible, since
individuals would have to remove their shoes first.
• Acceptability: concerns how readily individuals adopt a biometric system, or how
intrusive individuals feel the system is, based on the trait in question.
• Availability: ascertains how many different, independent samples the system could
potentially acquire from an individual.
2.3.2 Simple Biometric System Components
Any biometric system consists of five basic modules: sensor module, feature extraction
module, matching module, decision module and system database [29, 32]. All these
modules are depicted in Figure 2.1.
• Sensor Module: registers an individual into the biometric system database. During
this phase, a biometric reader scans the raw image of the user's biometric trait and
produces its digital representation.
• Feature Extraction Module: processes the raw image data (sample) obtained by the
sensor module and extracts certain features to generate a compact representation of
the biometric trait, called the template or feature set, which is then stored in the
system's central database or on a smartcard issued to the individual.
• Matching Module: compares the current input with the template. If the system
performs identity verification, it compares the extracted feature set of the current
input to the user's master template and produces a score or match value (one-to-one
matching). A system performing identification matches the current characteristics
against the master templates of many users already stored in the database, resulting
in multiple match values (one-to-many matching).
• Decision Module: accepts or rejects the user based on a security threshold and the
matching score.
• System Database: collects and stores all biometric templates (including brief
profiles of users) generated during the enrollment process. This database is also
called the template database. Depending on the application, it may be either a
centralized (physical) database residing in the system or a distributed (virtual)
database, with each individual's record carried on a magnetic card issued to that
individual.
Figure 2.1: Simplified logical block diagram of a biometric system
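The five modules above can be sketched as a minimal end-to-end pipeline; the feature representation, the Jaccard similarity measure and the 0.8 threshold below are illustrative assumptions, not the methods used in this thesis:

```python
# Minimal sketch of the five-module biometric pipeline (all names illustrative).
database = {}  # System Database: user_id -> enrolled template

def sensor(raw_trait):
    """Sensor Module: digitise the presented trait (here: a list of ints)."""
    return list(raw_trait)

def extract_features(sample):
    """Feature Extraction Module: reduce the sample to a compact template."""
    return sorted(set(sample))

def matching_score(template, probe):
    """Matching Module: similarity score in [0, 1] (Jaccard, for the sketch)."""
    a, b = set(template), set(probe)
    return len(a & b) / len(a | b)

def decide(score, threshold=0.8):
    """Decision Module: accept iff the score clears the security threshold."""
    return score >= threshold

def enroll(user_id, raw_trait):
    database[user_id] = extract_features(sensor(raw_trait))

def verify(user_id, raw_trait):
    """1:1 verification against a claimed identity."""
    probe = extract_features(sensor(raw_trait))
    return decide(matching_score(database[user_id], probe))

enroll("alice", [4, 8, 15, 16, 23, 42])
print(verify("alice", [4, 8, 15, 16, 23, 42]))  # True
print(verify("alice", [1, 2, 3]))               # False
```

Real systems replace each stub with substantial signal processing, but the division of responsibility between the five modules is the same.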
2.3.3 Biometric Modes of Operation
A biometric system may operate either in verification mode or in identification mode to
resolve a person's identity. Verification ("Am I who I claim to be?") involves confirming
or denying a person's claimed identity. In identification ("Who am I?"), the system has to
establish a person's identity. But before the system can be put into verification or
identification mode, a system database of biometric templates must be created through
the process of enrollment.
Enrollment: In the enrollment process, a user's initial biometric samples are collected,
assessed, processed, and stored for ongoing use in the biometric system, as depicted in
Figure 2.2.
Figure 2.2: Enrollment process

Figure 2.3: Verification and identification process
Verification is a 1:1 matching process in which the user claims an identity and the system
verifies whether the user is genuine. If the user's input and the template of the claimed
identity have a high degree of similarity, the claim is accepted as genuine; otherwise, the
claim is rejected and the user is considered a fraud, as depicted in Figure 2.3.
Identification is a 1:N matching process in which the user's input is compared with the
templates of all the persons enrolled in the database, and the identity of the person whose
template has the highest degree of similarity with the user's input is output by the
biometric system. If the highest similarity between the input and all the templates is less
than a fixed minimum threshold, the system rejects the input, which implies that the user
presenting the input is not among the enrolled users, as shown in Figure 2.3.
Only biometrics can provide negative identification capability (i.e., establishing that "I
am not he"). Like any security system, biometric systems are not foolproof. However,
biometrics can help protect individual privacy and guard personal and sensitive
information, because biometrics provides stronger identification than passwords.
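The 1:N identification decision described above can be sketched as follows; the set-overlap similarity and the 0.5 threshold are illustrative assumptions:

```python
def overlap(a, b):
    """Toy similarity on integer feature sets (Jaccard index)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def identify(probe, templates, threshold=0.5):
    """1:N identification: return the best-matching enrolled identity,
    or None (reject) when even the best similarity is below the threshold."""
    best_id, best_score = None, 0.0
    for user_id, template in templates.items():
        score = overlap(template, probe)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= threshold else None

enrolled = {"alice": [1, 2, 3, 4], "bob": [5, 6, 7, 8]}
print(identify([1, 2, 3, 9], enrolled))   # alice (similarity 0.6 clears the threshold)
print(identify([10, 11, 12], enrolled))   # None: not among the enrolled users
```

Note how the reject branch implements the thresholding rule from the text: a best match is returned only if its similarity exceeds the fixed minimum.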
2.3.4 Information Flow in Biometric Systems
Figure 2.4 shows the flow of information in verification and identification systems [38].
In verification or authentication, the user claims an identity and the system verifies
whether the claim is genuine. If the user's input and the template of the claimed identity
have a high degree of similarity, the claim is accepted as genuine; otherwise, the claim is
rejected and the user is considered a fraud. As explained in the earlier section, matching
is 1:1 in a verification system. In identification, the user's input is compared with the
templates of all the persons enrolled in the database, and the identity of the person whose
template has the highest degree of similarity with the user's input is output by the
biometric system. Typically, if the highest similarity between the input and all the
templates is less than a fixed minimum threshold, the system outputs a reject decision,
implying that the user presenting the input is not among the enrolled users. The matching
is therefore 1:N in an identification system.
Figure 2.4: Information flow in biometric systems
2.4 BIOMETRIC TECHNOLOGIES AND CLASSIFICATIONS
As discussed in section 2.3, human biometric characteristics can be separated into two
categories: physiological and behavioural traits. Biometric traits are classified according
to whether they relate to the shape of the body or to the behaviour of a person, as shown
in the biometric typology chart, Figure 2.5.
Physiological characteristics, such as a fingerprint, hand silhouette, iris pattern, blood
vessel pattern of the retina, or DNA fingerprint, are relatively stable: they are essentially
fixed and do not change over time. Behavioural characteristics, on the other hand, are
more prone to change depending on factors such as aging, injuries, or even mood. The
most common behavioural characteristic used today is the signature, although not usually
in biometric systems. Other usable behaviours include how one speaks, types on a
keyboard, or walks. Because of the inevitable modest variations of all behavioural traits,
many systems use an adaptation mechanism to update the reference template to
compensate for slight changes of the biometric trait over time. Generally, behavioural
biometrics work best with regular use [30].
[Figure 2.5 typology: physiological biometrics include fingerprint, face, hand, eye, ear
shape and DNA; behavioural biometrics include voice, gait, signature and keystroke;
multimodal biometrics combine modalities.]
Figure 2.5: Typology of biometric mechanisms
The above classification can be further divided into sub-categories. For example: Hand
(palm prints, hand geometry, hand veins), Eye (iris recognition, retina scan), Ear (ear
canal, ear shape recognition), Face (face recognition, facial thermogram), and likewise
Keystroke (keystroke dynamics, keystroke analysis).
There are important differences between physiological and behavioural methods. First,
the degree of intra-personal variation in a physiological characteristic is smaller than in a
behavioural characteristic: apart from injuries, the iris pattern remains the same over
time, whereas speech characteristics change and are influenced by many factors, e.g. the
emotional state of the speaker. Developers of behaviour-based systems therefore have a
harder job compensating for those intra-personal variations. Second, due to the
intra-personal variations of behavioural methods, their discriminatory power (how many
distinguishable persons are there?) is generally smaller than for physiological
methods [30].
2.5 BIOMETRIC MODALITIES
Any physiological or behavioural human characteristic can be used as a biometric as long
as it satisfies the following requirements [33]:
• Universality: every person should have the characteristic.
• Uniqueness: no two persons should be the same in terms of the characteristic.
• Permanence (immutability): the characteristic should be invariant in time.
• Collectability: the characteristic can be measured quantitatively.
In addition, application-related requirements are of utmost importance in practice:
• Non-circumvention: circumvention refers to how easy it is to fool the system by
fraudulent techniques.
• Performance: the achievable identification accuracy, taking into consideration the
resource requirements for acceptable accuracy and the working environmental
factors that affect it (accuracy, speed, and robustness of the technology used).
• Acceptability: the extent to which people are willing to accept the biometric
system.
2.6 COMPARISON OF VARIOUS BIOMETRIC TECHNOLOGIES
Human characteristics can be used for biometrics in terms of the parameters discussed in
the earlier section. Biometric characteristics provide a unique natural signature of a
person and are widely accepted. While some of the requirements described above, like
universality and collectability, are relatively easy to verify for certain human
characteristics, others, like immutability and uniqueness, require extensive tests on a
large number of samples to be verified. Each biometric technology has its advantages
and disadvantages, and the applicability of a specific biometric technique depends
heavily on the application domain. No single biometric can meet all requirements (e.g.
accuracy, cost, practicality), which means no biometric is optimal [34]. Fingerprints have
been used as a biometric characteristic because they offer unique advantages over other
biometrics in terms of acquisition ease, relative temporal invariance, and uniqueness
among different subjects [35].
A brief comparison of biometric techniques based on seven factors is provided in Table
2.2 [35]. In this sense, each biometric technique is admissible. For example, it is well
known that both the fingerprint technique and the iris scan technique perform much
better than the voice print technique in terms of accuracy and speed. As can be seen from
Table 2.2, fingerprint overall performs better than other biometric techniques. The
fingerprint has a distinctiveness that has made it useful for personal identification for
many years.
Table 2.2: Comparison of various biometric technologies
On this basis, biometrics have been applied in many high-end applications, with
governments, defence and airport security being major customers. However, biometric
applications are also moving towards commercial use, namely network/PC login
security, web page security, employee recognition, time and attendance systems, and
voting solutions. While biometric systems have their limitations, they have an edge over
traditional security methods in that they cannot easily be stolen or shared. Besides
bolstering security, biometric systems also enhance user convenience by alleviating the
need to devise and remember passwords. According to A. Jain [36] and U. Uludag [34],
Table 2.2 compares the various biometric technologies, with the perception scores based
on High = 100, Medium = 75, Low = 50.
2.7 PERFORMANCE MEASUREMENTS OF BIOMETRIC SYSTEMS
With the ever-increasing market for biometric devices, there is a growing need for a
consistent way to evaluate biometric systems. A number of documents address the issue
of biometric device testing, but few have developed generic testing protocols for
biometric systems [29]. Performance statistics for identification systems differ
substantially from those for authentication applications [30]. The main performance
measure for an identification system is its ability to identify a biometric signature's
owner; this measure equals the percentage of queries in which the correct answer is the
top match.
To determine the performance of a biometric system, a number of different factors need
to be assessed. One of the key performance measurements is the error rate. The various
types of error rates are defined as follows:
Image Acquisition Errors:

• Failure to Enroll Rate (FTE or FER): FTE is the percentage of time that users
are unable to enroll in the biometric system [52]. In other words, it is the
percentage of data input that is considered invalid and fails to input into the
system. Failure to enroll happens when the data obtained by the sensor is
considered invalid or of poor quality.

• Failure to Capture Rate (FTC or FCR): FTC is the percentage of time the
biometric system is unable to capture a biometric sample when one is presented.


Decision Error Rates [60]:

• False Accept Rate (FAR): the number of times an impostor user is falsely
granted access to the system divided by the total number of trials:

  FAR(λ) = Number of False Accepts / Number of Impostor Accesses

where λ is the decision threshold.

• False Reject Rate (FRR): the number of times genuine users are falsely
rejected divided by the number of trials:

  FRR(λ) = Number of False Rejects / Number of Client Accesses

• Equal Error Rate (EER): the rate at which the accept (FAR) and reject
(FRR) errors are equal. The lower the EER, the more accurate the system is
considered to be.
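As an illustration (not taken from the cited works), these decision error rates can be computed directly from labelled match scores, treating any score at or above the decision threshold as an accept; the score lists and threshold below are made up:

```python
def far(impostor_scores, threshold):
    """False Accept Rate: fraction of impostor attempts falsely accepted."""
    accepts = sum(1 for s in impostor_scores if s >= threshold)
    return accepts / len(impostor_scores)

def frr(genuine_scores, threshold):
    """False Reject Rate: fraction of genuine attempts falsely rejected."""
    rejects = sum(1 for s in genuine_scores if s < threshold)
    return rejects / len(genuine_scores)

# Hypothetical match scores in [0, 1]
impostors = [0.10, 0.25, 0.40, 0.55]
genuines = [0.45, 0.70, 0.85, 0.95]
print(far(impostors, 0.5))  # 1 of 4 impostors accepted -> 0.25
print(frr(genuines, 0.5))   # 1 of 4 genuine users rejected -> 0.25
```

At this particular threshold the two rates happen to coincide, so 0.25 would be the (empirical) equal error rate for these toy scores.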
Matching Errors:

• False Match Rate (FMR): FMR is the rate at which a template is falsely matched
to a template in the database [57].

• False Non-Match Rate (FNMR): FNMR is the rate at which a template is falsely
not matched to a truly matching template in the database [57].
Performance measures:

• Receiver Operating Characteristic (ROC) Curve: the curve relating FAR to
FRR across various thresholds. In biometric systems, the FAR and FRR can
typically be traded off against each other by changing those parameters. ROC
curves are one of the ways to evaluate the performance of a biometric system.

• Detection Error Trade-off (DET): the DET curve is a modified ROC curve.

• D Prime: D prime is a common scalar means of evaluating biometric system
performance. It is the normalized difference between the means of the genuine and
impostor match scores. D prime is also known as a measure of goodness, and
assumes the distributions to be normal [63].

• Template Capacity: the maximum number of sets of data which can be input into
the system.

• Decision errors vs. matching errors: FMR and FNMR are calculated over the
number of comparisons, while FAR and FRR are calculated over the number of
transactions. Another difference is that FAR and FRR also account for FTA
rates [61].

• Systematic and random errors: performance estimates of biometric systems are
affected by both systematic and random errors. Random errors result from the
natural variation in biometric samples or users, for example. Systematic errors
are caused by bias in the testing procedures [61].

Figure 2.6: Error trade-off in a biometric system

The accuracy of a biometric system is only as good as its sensor and the degrees of
freedom of the biometric trait being measured [64]. The accuracy of a biometric system is
represented by its FAR (False Accept Rate) and its FRR (False Reject Rate). The error
rates are a function of the threshold, as shown in Figure 2.6. These two scores can be
plotted against each other throughout all possible threshold values to show performance.
This plot is called the ROC (Receiver Operating Characteristic) curve [62]. The two
errors are complementary in the sense that if one makes an effort to lower one of the
errors by varying the threshold, the other error rate automatically increases [53].
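This trade-off can be made concrete by sweeping the threshold across the observed score range and taking the point where the two empirical error rates are closest, which yields an estimate of the EER; the sketch below uses invented scores:

```python
def eer_estimate(genuine, impostor, steps=1000):
    """Sweep thresholds; return (threshold, EER) where FAR and FRR are closest."""
    lo, hi = min(genuine + impostor), max(genuine + impostor)
    best = None
    for i in range(steps + 1):
        t = lo + (hi - lo) * i / steps
        far_t = sum(1 for s in impostor if s >= t) / len(impostor)
        frr_t = sum(1 for s in genuine if s < t) / len(genuine)
        gap = abs(far_t - frr_t)
        if best is None or gap < best[0]:
            best = (gap, t, (far_t + frr_t) / 2)
    return best[1], best[2]

# Well-separated toy scores: the system can be made error-free, so EER is 0
t, eer = eer_estimate([0.7, 0.8, 0.9], [0.1, 0.2, 0.3])
print(eer)  # 0.0
```

With overlapping genuine and impostor score distributions, as in any real system, the returned EER would be strictly positive.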

2.8 MERGING BIOMETRICS AND CRYPTOGRAPHY FOR RELIABLE NETWORK SECURITY

Biometrics has the potential to identify individuals with a high degree of assurance, thus
providing a foundation for trust. Cryptography, on the other hand, concerns itself with the
projection of trust: taking trust from where it exists to where it is needed.
Cryptography is an important feature of computer and network security [21]. Using
biometrics for security purposes has become popular, but using biometrics by means of
cryptography is a new research topic. Many traditional cryptographic algorithms are
available for securing information, but all of them depend on the secrecy of the
secret or private key. To overcome this dependency, biometric approaches address the
secrecy of both keys and data. There are various methods that can be deployed to secure a
key with a biometric:
1. The first method involves remote template matching and key storage. In this
method a biometric image is captured and compared with a corresponding
template. If the user is verified, the key is released. The main problem here is
the use of an insecure storage medium [21].
2. The second method hides the cryptographic key within the enrolment template
itself via a secret bit-replacement algorithm. When the user is successfully
authenticated, this algorithm extracts the key bits from the appropriate locations
and releases the key [22].
3. The third method uses data derived directly from a biometric fingerprint
image, so that fingerprint templates themselves serve as a cryptographic key [23,
24]. However, sensitivity to environmental and physiological factors, and the
risk of compromise of the cryptographic keys, stand as big obstacles to this
method [25].


Biometrics and cryptography should not be seen as competing technologies; rather, they
are potentially complementary security technologies, and therefore have to be
symbiotic rather than competitive. Fingerprint biometrics is chosen because of
its information strength, namely the uniqueness of the random sequences needed for
cryptographic key generation [37]. This thesis puts forth a fresh methodology for the
secure storage of fingerprint templates by generating a Secured Feature Matrix and keys
for cryptographic techniques applied for data encryption or decryption, with the aid of
cancelable biometric features. If a biometric key is missing or stolen, it is lost
perpetually, and possibly for every application where the biometric is utilized, since a
biometric is permanently linked with a user and cannot be altered. In this thesis, we
propose a technique to produce a cancelable key from a fingerprint so as to surmount these
problems. The flexibility and dependability of cryptography is enhanced by the
utilization of cancelable biometric features. There are several biometric systems in
existence that deal with cryptography, but the proposed cancelable biometric system
introduces a novel method to generate the cryptographic key. We also discuss the
security analysis of the proposed cancelable biometric system.

2.9 FINGERPRINT AS A BIOMETRIC MODALITY

Among all biometric traits, fingerprints have one of the highest levels of reliability [66]
and have been extensively used by forensic experts in criminal investigations [67].
Fingerprint analysis, also known in the US as dactylography, is the science of using
fingerprints to identify a person.

2.9.1 Fingerprint Recognition

Fingerprint recognition represents the oldest method of biometric identification. Its
history goes back at least as far as 2200 BC. The use of fingerprints as a personal code
has a long tradition, having been used by the Assyrians, the Babylonians, the
Chinese and the Japanese. Since 1897, dactyloscopy (a synonym for non-computer-based
fingerprint identification) has been used for criminal identification, and
the matching accuracy using fingerprints has been shown to be very high [39].
Fingerprint identification is well established and a mature science [70].
2.9.2 Fingerprint Uniqueness

A fingerprint refers to the flow of ridge patterns on the tip of the finger. Ridges are the
lines across fingerprints (raised skin), and valleys or furrows are the spaces between
ridges (lowered skin) on the surface of a fingertip. When an inked imprint of a finger is
made, the impression created is of the ridges, while the furrows are the uninked areas
between the ridges. A sample fingerprint is shown in Figure 2.7. For a person, fingerprints
are formed or determined during the first seven months (in the third and fourth months) of
foetal development and are unique. The pattern of the ridges and valleys, called
minutiae, is unique for each individual [40, 47]. These are the basis for most
fingerprint identification and are acceptable even in a court of law. Even identical twins
have differing fingerprint patterns, as do the prints on each finger of the same
person. It has been estimated that two alike fingerprints would be found only once every
10^48 years [78]. That is why a proverb says: "Faces can lie but fingerprints never."

Figure 2.7: A sample fingerprint

2.9.3 Fingerprint Categories

The skin excretes oils and perspiration through sweat glands, flowing along the tops of
the ridges. When a surface is touched the fingerprint is transferred. Smooth, clean
surfaces record better quality fingerprints but fingerprints can also be found on irregular
surfaces such as paper. There are three basic categories of fingerprint [70]:

• Visible prints (also called patent prints), such as those made in oil, ink or blood;

• Latent prints, which are invisible under normal viewing conditions; and

• Plastic prints, which are left in soft surfaces such as newly painted ones.

There are more than forty methods available for collecting fingerprints using powders;
chemicals such as iodine, ninhydrin, and silver nitrate; digital imaging, dye stains and
fumes [70].

2.9.4 Fingerprint Classification (based on Pattern Types)

Sir Edward Henry (1850 - 1931), as the Inspector General of Police for Bengal Province
in India, developed a classification system which was officially adopted by British India
in 1897. In December 1900, Britain's Belper Committee recommended that the
fingerprints of criminals be taken and classified by the Indian System [49]. The Henry
Classification System organizes ten-print fingerprint records by pattern type. Finger
ridges and patterns can be continuous, interrupted, forked, and of other formations.
Fingerprints are classified and identified by the relationship of these formations,
described as minutiae. These patterns are classified into three major categories based on
their central pattern [44]. The patterns are the arch, loop, and whorl, which are shown in
figure 2.8. They are further divided into various subgroups [50].

Figure 2.8: Three major fingerprint classifiers

Major classification based on Patterns:

• Arch: a ridge that runs across the fingertip and curves up in the middle. Tented
arches have a spiked effect.

• Whorl: an oval formation, often making a spiral pattern around a central point.
Principal types are the plain whorl and the central pocket loop whorl.

• Loops: these have a stronger curve than arches, and they exit and enter the print
on the same side. Radial loops slant toward the thumb and ulnar loops away from
the thumb.

Further, we have the following subgroups:

• Composites, which are a mix of the patterns mentioned above; and

• Accidentals, which form an irregular pattern that is not classifiable as an arch,
loop or whorl.
2.9.5 Types of Minutiae
Minutiae features, also known as Galton features, are particular patterns consisting of
ridge endings (terminations) or ridge bifurcations. Minutiae points are local ridge
characteristics that appear as either a ridge bifurcation, a ridge ending, or a local
discontinuity in the fingerprint pattern [40, 46], as shown in Figure 2.9.

Figure 2.9: A fingerprint image with the core and four minutiae points


A total of 150 different types of minutiae have been identified. In practice only ridge
ending and ridge bifurcation minutiae types are used in fingerprint systems [36]. Figure
2.10 depicts some minutiae representations [47, 48]:

• Islands: ridges slightly longer than dots, occupying the space between two
temporarily divergent ridges;

• Ponds or lakes: empty spaces between two temporarily divergent ridges;

• Spurs: a notch protruding from a ridge;

• Bridges: small ridges joining two longer adjacent ridges; and

• Crossovers: two ridges which cross each other.

Figure 2.10: Fingerprint ridge patterns and minutiae examples

A complete fingerprint consists of about 100 minutiae points on average. The measured
fingerprint area contains on average about 30-60 minutiae points, depending on the
finger and on the sensor area. These minutiae points are represented as a cloud of dots in
a coordinate system. They are stored, together with the angle of the tangent at each local
minutia point, in a fingerprint code or directly in a reference template. A template can
consist of more than one fingerprint code to expand the amount of information and to
expand the enrolled fingerprint area. In general this leads to a higher template quality
and therefore to a higher similarity value between the template and the sample. Template
sizes vary from 100 bytes to 1500 bytes, depending on the algorithm and the quality of the
fingerprint. Nevertheless, in rare cases there are fingerprints without any minutiae points,
which contribute to the failure to enroll rate (FER). It is also difficult to extract the
minutiae points accurately when the fingerprint is of low quality [40].

2.9.6 Fingerprint Matching Techniques

The existing popular fingerprint matching techniques can be broadly classified into two
categories: (a) minutiae-based and (b) correlation-based. Minutiae-based matching and
correlation-based matching are also called minutiae matching and pattern matching,
respectively. Minutiae-based techniques attempt to align two sets of minutiae points and
determine the total number of matched minutiae [68, 69, 35]. Correlation-based
techniques, on the other hand, compare the global pattern of ridges and valleys (furrows)
to see if the ridges in the two fingerprints align [71, 72, 73, 74]. The minutiae points
define the local structure, while the ridge pattern, along with the core and delta points,
defines the global structure or global configuration.

In the minutiae-based method, the ridges in the fingerprint are compared through their
unique details. Minutiae points on the individual's finger are located and processed to
extract these points, which are then compared with a registered template. In comparison
to the minutiae matching method, the correlation-based method compares all of the finger's
characteristics. Sub-areas of the ridge thickness, curves, or density are some of the
finger's characteristics. The area around a minutia, with low curvature, or a combination
of ridges is taken from the fingerprint. The extracted area is then processed and compared
with a registered template [41, 45].
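The minutiae comparison just described can be sketched as a greedy pairing of minutiae that agree in position and angle. This is a minimal illustration, not the matcher of the cited works: it assumes the two templates are already aligned (real matchers must also recover rotation and translation), and the tolerance values are invented:

```python
import math

def match_minutiae(probe, template, dist_tol=10.0, ang_tol=0.26):
    """Greedy count of minutiae pairs agreeing in position and angle.
    Each minutia is an (x, y, theta) tuple; templates assumed pre-aligned."""
    used, matched = set(), 0
    for (xa, ya, ta) in probe:
        for j, (xb, yb, tb) in enumerate(template):
            if j in used:
                continue
            close = math.hypot(xa - xb, ya - yb) <= dist_tol
            # Smallest angular difference, wrapped to [0, pi]
            dth = abs((ta - tb + math.pi) % (2 * math.pi) - math.pi)
            if close and dth <= ang_tol:
                used.add(j)
                matched += 1
                break
    return matched

# Hypothetical minutiae sets: two points agree, one does not
probe = [(10, 12, 0.5), (40, 41, 1.6), (80, 15, 3.0)]
template = [(12, 10, 0.55), (43, 44, 1.55), (200, 200, 0.0)]
print(match_minutiae(probe, template))  # 2 of 3 minutiae match
```

A match score can then be derived from the matched count relative to the sizes of the two sets, and compared against a decision threshold.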
A typical minutiae extraction technique performs the following sequential operations on
the fingerprint image: (i) fingerprint image enhancement, (ii) binarization (segmentation
into ridges and valleys), (iii) thinning, and (iv) minutiae detection. Several commercial
[62] and academic [75, 76] algorithms follow these sequential steps for minutiae
detection. On the other hand, the simplest correlation-based technique is to align the two
fingerprint images and subtract the input from the template to see if the ridges
correspond. However, such a simplistic approach suffers from many problems, including
errors in the estimation of alignment, non-linear deformation in fingerprint images, and
noise [77]. Currently, computer-aided fingerprint recognition mostly uses minutiae
matching.
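One common way to implement step (iv) on a thinned, one-pixel-wide binary skeleton is the crossing-number method: a ridge pixel whose crossing number is 1 is a ridge ending, and one whose crossing number is 3 is a bifurcation. The sketch below is illustrative and is not the specific algorithm of the cited works; steps (i) to (iii) are assumed already done:

```python
def crossing_number(img, r, c):
    """Half the number of 0/1 transitions around the 8-neighbourhood."""
    nb = [img[r-1][c-1], img[r-1][c], img[r-1][c+1], img[r][c+1],
          img[r+1][c+1], img[r+1][c], img[r+1][c-1], img[r][c-1]]
    return sum(abs(nb[i] - nb[(i + 1) % 8]) for i in range(8)) // 2

def detect_minutiae(img):
    """Return (row, col, kind) for endings (CN=1) and bifurcations (CN=3)."""
    found = []
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            if img[r][c] == 1:
                cn = crossing_number(img, r, c)
                if cn == 1:
                    found.append((r, c, "ending"))
                elif cn == 3:
                    found.append((r, c, "bifurcation"))
    return found

# Tiny skeleton: a horizontal ridge with a downward branch
skeleton = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
# Endings at (1,1), (1,3), (3,2); bifurcation at (1,2)
print(detect_minutiae(skeleton))
```

Production systems add post-processing to discard spurious minutiae caused by noise, breaks and borders before matching.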
The performance of minutiae-based techniques relies on the accurate detection of minutiae
points and the use of sophisticated matching techniques to compare two minutiae sets
which undergo non-rigid transformations. The performance of correlation-based
techniques is affected by non-linear distortions and noise present in the image. In general,
it has been observed that minutiae-based techniques perform better than correlation-based
ones [65].

In addition to the above techniques, there is another fingerprint matching technique, the
feature-based technique, which captures both the local and the global details in a fingerprint
as a compact fixed-length feature vector. It uses the orientation and frequency of ridges,
ridge shape, texture information, etc. This technique suffers from low discriminative ability.
Challenges in Fingerprint Matching: Fingerprint matching is difficult due to large

intra-class variations, caused by sensor noise, partial overlap and non-linear distortion, and
small inter-class variations (similarities in the global structure and ridge orientations).
The challenge is to handle poor-quality fingerprints and fingerprints having little overlap.
A Fingerprint Pattern Recognition System is shown in Figure 2.11. The user places

his/her finger against a reader. The reader scans the fingerprint, which is sent to a
database. Once in the database, the fingerprint is compared, verified, and identified [42,
78].
Fingerprint Identification: Fingerprint identification is based on two basic premises: (i)
persistence: the basic characteristics of fingerprints do not change with time, i.e.,
fingerprint characteristics are invariant; and (ii) uniqueness or individuality: everybody
has a unique fingerprint. The uniqueness of fingerprints has been accepted over time
because of the lack of contradiction and relentless repetition. As a result, fingerprint-based
identification has been regarded as a perfect system of identification [78].

[The figure shows an enrollment path (preprocessor, feature extractor, template database)
and an authentication path (preprocessor, feature extractor, matcher producing a yes/no
decision against a threshold).]

Figure 2.11: Fingerprint pattern recognition system

2.10 FINGERPRINT SYSTEMS

The principle of fingerprint systems is schematically illustrated in Figure 2.12. In the
enrollment process, the system captures finger data from an enrolee with sensing devices,
extracts features from the finger data, and then records them as a template, together with
personal information, e.g. a personal identification number (PIN), of the enrolee in a
database. Here, "finger data" means not only features of the fingerprint but also other
features of the finger, such as "live and well" features. Biometrics can operate in one of
two modes: the identification mode, in which the identity of an unknown user is
determined, and the verification mode, in which a claimed identity is either accepted or
rejected. In an identification (or verification) process, the system captures finger data
from a finger with sensing devices, extracts features, identifies (or verifies) the features
by comparing them with templates in the database, and then outputs a positive result only
when the features correspond to one of the templates [51].
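The two modes can be sketched as follows, with `m` standing for any matching function returning 1 for a match and 0 otherwise; the structure is generic, and the toy matcher and database are invented for illustration:

```python
def verify(features, claimed_id, db, m):
    """Verification mode: accept or reject a claimed identity (1:1 comparison)."""
    return m(features, db[claimed_id]) == 1

def identify(features, db, m):
    """Identification mode: search the whole database (1:N); None if no match."""
    for user_id, template in db.items():
        if m(features, template) == 1:
            return user_id
    return None

# Toy matcher: templates are feature vectors, matched by exact equality
m = lambda a, b: 1 if a == b else 0
db = {"alice": [1, 2, 3], "bob": [4, 5, 6]}
print(verify([1, 2, 3], "alice", db, m))  # True
print(identify([4, 5, 6], db, m))         # bob
```

In a real system `m` would be a threshold on a similarity score rather than exact equality, but the control flow of the two modes is the same.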
Most fingerprint systems utilize optical or capacitive sensors for capturing
fingerprints. These sensors detect the difference between the ridges and valleys of
fingerprints: optical sensors detect differences in reflection, while capacitive sensors
detect differences in capacitance. Some systems utilize other types of sensors, such as
thermal or ultrasound sensors [51].

[The figure shows a finger presented to the system. In enrollment, high-quality capture
and feature extraction lead to recording in the system (template) database; in verification
or identification, extracted features are compared by the matcher against the database
templates to produce a result.]

Figure 2.12: Typical structure of a fingerprint system

The enrollment module is responsible for enrolling individuals into the biometric system
database. During the enrollment phase, the biometric characteristic of an individual is
first scanned by a biometric reader to produce a digital representation (feature values) of
the characteristic. The data capture during the enrollment process may or may not be
supervised by a human depending on the application. A quality check is generally
performed to ensure that the acquired sample can be reliably processed by successive
stages. In order to facilitate matching, the input digital representation is further processed
by a feature extractor to generate a compact but expressive representation, called a
template. Depending on the application, the template may be stored in the central
database of the biometric system or be recorded on a distributed medium (for example,
a smart card issued to the individual). Usually, multiple templates of an individual are
stored to account for variations observed in the biometric trait, and the templates in the
database may be updated over time [52].
Advantages of Fingerprint Biometrics:
1. Can be placed on a smart card for an added degree of authentication.
2. Low instances of false acceptance (the rate at which fraudulent users are allowed
access to systems or areas as a result of a failure in the biometric device) [43].
3. Low cost.
4. Easier integration.
5. Fingerprint readers are small in size.
Disadvantages of Fingerprint Biometrics:
1. Higher risk of false rejection (the rate at which authentic users are denied or
prevented access to authorized areas as a result of a failure in the biometric
device) [43].
2. Degradation of the fingerprint caused by occupation, age or even trauma.

2.11 SUMMARY

In this chapter, we have discussed the importance of biometric systems for security
authentication, especially in view of the problems with existing traditional systems, and
compared various authentication mechanisms. We then gave a brief introduction to
biometric systems, discussing their characteristics, components, modes of operation and
information flow. After that, we described biometric technologies and their classification,
biometric modalities, a comparison of biometric technologies, and performance
measurements of biometric systems. Later, we presented the merging of biometric and
cryptographic techniques for reliable network security, and finally we discussed in detail
fingerprint technology as one of the biometric technologies.


CHAPTER 3
CANCELLABLE BIOMETRICS
"Cancellable biometrics stores a non-invertible transformed version of the biometric
data, and so if the storage is compromised the biometric data remains safe."
Reihaneh Safavi-Naini
3.1 INTRODUCTION
Nowadays, biometric security systems face a number of problems because the biometric
data of a person is generally stored in the system itself. Problems arise especially when
that data is compromised. Standard password-based security systems have the ability to
cancel a compromised password and reissue another one, but biometrics cannot be
changed or cancelled. Thus, the advantage of biometrics-based security becomes a
disadvantage in this particular situation. The concept of cancellable biometrics can
upgrade an existing biometric security system so that it gains the advantages of
password-based security systems while not losing its inherent superiority [79]. In this
chapter, we briefly discuss the problems with existing biometric technologies, show that
a cancellable biometric system is the solution to these problems, and finally explore
cancellable biometrics in detail.

3.2 PROBLEMS WITH THE EXISTING BIOMETRIC SECURITY SYSTEMS


Information security and ensuring personal privacy are today's growing concerns in
network security systems. One of the emerging technologies for automatic people
recognition, for identification and authentication, is biometrics. The standard biometric
system basically consists of two phases. The first phase is the enrolment phase, in which
the biometric template of the user is acquired. The second phase is the authentication
phase, in which a biometric sample is taken from the user and compared to the biometric
template already stored in the template database. If they match, positive authentication
is achieved [79]. The use of biometric data for authentication results in greater comfort
and ease of use with respect to traditional approaches. Such biometric
implementations necessitate large-scale capture and storage of biometric data [80]. Most
of the existing biometric systems require central biometric template storage. The
motivation for this central storage comes from two different angles [79]:
1. The first motivation is the fact that the cost of the enrollment phase is relatively
high [81]. Every user has to go through this phase, and if templates are stored in
different storages, then the number of systems required to handle these
independent storages is large, and the process may be repeated a number of times.
Obviously, repeating one process many times is inefficient and inconvenient for
the user. That is why a central template storage is a good solution to avoid the
extra cost and inconvenience.
2. The second motivating factor is standardization. A central biometric template
storage would force all users of the biometric authentication system to use the
same, standardized methods. The entire process of authentication would have to be
standardized, from sensors to algorithms to security policies. Standardization
would solve the compatibility problem across different services within the group
and enables the possibility of adding a new service to the group.
However, despite its obvious advantages, the use of biometrics has several potential
problems related to security and privacy [79], because of the fact that the biometric data
of every user in that system is stored in a centralized template database. They are outlined
below:
1. Identity theft: the attacker can steal the biometric data from the central database
and use it to construct an artefact which can then be used to impersonate the
original user. The artefact may be an artificial finger, eye, face mask,
photograph, or something else, depending on the type of biometrics in the
database. In other words, biometrics (even fingerprints) can be recorded and
misused without a user's consent.
2. Irrevocability: the nature of biometric samples is such that they are permanent,
and consequently a user cannot alter the acquired template. For instance, a
fingerprint of the right index finger, once given, cannot be modified.
3. Exposure of personal information: it has been shown that fingerprints, besides
the minutiae information used in the authentication phase, also reveal
information about the genetic origin of a person. Similarly, a retina scan can
reveal the existence of diseases such as diabetes or stroke [82]. All this
information is highly confidential and should not be revealed to anyone
without the consent of the authorized user. Some critics of biometric systems
claim that every biometric sample is personal information by itself, highly
confidential, and should not be used at all [83].
4. Scope of use: any biometric sample should be used only for the purpose it was
given for, and any situation in which that scope is overridden is considered an
invasion of privacy and should be strictly forbidden.
This kind of centralized storage raises many concerns for the user: who has the right
of usage? Who has access? How can the user limit someone's access? How can the
user trust all the member services in the group? There exist a number of solutions to
the above-mentioned problems, all of which rely on hiding the biometric template in
storage. The following approaches are used to solve these problems:
1. A classical approach to protecting sensitive data: securing data using
biometric cryptography.
2. A more sophisticated approach to protecting private data: securing biometric
data using steganography.
3. A third, more secure method to protect biometric data, and one of the
original solutions to the above concerns: securing biometric data using
cancellable biometrics, also known as anonymous or revocable biometrics.
We know clearly that encryption is not the ultimate solution, since the template has to
be decrypted prior to matching with the new sample, and at that moment the template is
exposed in its original form. In this chapter, we explore cancellable biometrics [53] in
detail in the following sections.

3.3 CANCELLABLE BIOMETRICS


As long as the biometric template exists within the centralized template database, it is
exposed to a potential attacker [79]. However, if instead of storing the original biometric
we store the biometric transformed by a one-way function, this problem is solved. The
transformation can be in the signal domain (Figure 3.1) or the feature domain
(Figure 3.2) [53, 84]. In contrast to encrypted templates, transformed templates do not
need to be transformed back into their original form before they can be matched against
new samples for authentication purposes. In fact, the transformation function we choose
is non-invertible, so that the template cannot be transformed back into its original
form even if we want it to be. Matching is performed by transforming the newly acquired
sample with the same transformation, and then making the comparison in the transformed
space [79].

Figure 3.1: Transformation using signal domain


Regarding distortion in the signal domain, consider Figure 3.1, which illustrates
cancellable biometrics for face recognition when the face is distorted. Here, the face is
distorted in the signal domain prior to feature extraction. The distorted version does not
match the original biometric, while two instances of the distorted face match each
other.
Regarding feature-domain distortion, consider Figure 3.2, which shows how each
feature (e.g., a minutia position) is transformed using a non-invertible function
Y = f(X). For instance, the minutia position X0 is mapped to Y0 = f(X0) as shown.
However, if we know Y0, the inverse is ambiguous, since the transformation is
many-to-one: X0, X1, X2, ..., X6, etc. are all valid inverse mappings of Y0. The
complexity of the inverse mapping is exponential in the number of features, making the
transform practically non-invertible.

Figure 3.2: Transformation using feature domain
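As a toy illustration of such a keyed, many-to-one feature-domain transform (the folding scheme and its parameters here are invented for demonstration and are not the scheme proposed in this thesis), minutiae coordinates can be shifted by a secret key and folded modulo a grid size, so that many original positions map to the same transformed position:

```python
def transform_template(minutiae, key, grid=32):
    """Keyed many-to-one transform of (x, y) minutiae positions."""
    out = []
    for (x, y) in minutiae:
        tx = (x + key) % grid        # folding: x and x + grid collide
        ty = (3 * y + key) % grid
        out.append((tx, ty))
    return sorted(out)

enrolled = transform_template([(5, 7), (20, 11)], key=9)
probe    = transform_template([(5, 7), (20, 11)], key=9)
other    = transform_template([(5, 7), (20, 11)], key=23)  # reissued key
print(enrolled == probe)  # True: same key, matching still works
print(enrolled == other)  # False: changing the key cancels the template
# Many-to-one: (5, 7) and (5 + 32, 7) produce the same transformed point
print(transform_template([(5, 7)], 9) == transform_template([(37, 7)], 9))  # True
```

Matching is then performed entirely in the transformed space, and revoking a template amounts to picking a new key.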

This concept ensures that the original biometric template doesn't exist in the system
database; as such, it is not in danger of being exposed, and the privacy issue is
completely nonexistent. Even if an attacker is able to get at a transformed template, it
will be completely useless to him/her: nobody can use it to construct an artefact which
could enable him/her to impersonate the user. Moreover, the template couldn't be used
for identification purposes. The existence of the transformation function also allows
simple control over which services have access and which don't: authorized services
will have knowledge of the transformation function, but others will not [79].
Cancellable biometrics is a relatively new direction of research, spurred by the
privacy-invasion and non-revocability issues of biometrics. To formally define
cancellable biometrics, Maltoni et al. [80, 36, 85] outlined four principal objectives,
as follows:
1. Diversity: the ability to generate multiple templates from the same biometric,
ensuring that the same cancellable template need not be used in two different
applications.
2. Revocability/Reusability: templates are easily revoked and reissued when
compromised, i.e., straightforward revocation and re-issue are allowed in the
event of compromise.
3. Non-invertibility: the original biometric data cannot be recovered from the
transformed or encrypted templates, i.e., a one-way transformation function is
used for template computation to prevent recovery of the biometric data.
4. Performance: the scheme should not noticeably weaken recognition
performance.
But the concept of cancellable biometrics was not created only to address privacy issues.
The fact that the stored biometric templates are created by applying a transformation
function to the original biometric templates enables the creation of new templates by
using a different transformation function on the user's original biometric templates. If one
can generate a new biometric template, the old one can be cancelled. Biometric security
systems which implement the concept of cancellable biometrics can enjoy all the benefits
of classic password-based security systems (revocability and the ability to reissue), while
preserving the benefits of biometric systems. Biometric templates are bound to the user,
so they cannot be given to someone else. They cannot be stolen or forgotten. And they
have greater resilience to brute-force attack, since they carry more information [79].


3.4 HOW CANCELABLE BIOMETRICS WORK

Let us consider a biometric-based identification system. It proceeds through an enrolment
phase and an identification phase which follow this principle: when a user wants to enrol,
one biometric template b1 is captured from him and then b1 is transformed in a specific
form to be stored in a database. When the identification of a new biometric sample b2
is requested, a matching algorithm is run over the whole database to retrieve a
corresponding enrolled record, according to a similarity measure, also known as a matching
function, m, and some recognition threshold. The matching function m takes as input two
biometric templates b1, b2 and outputs m(b1, b2) = 1 if the two templates are similar
enough, i.e. above the threshold, or m(b1, b2) = 0 if the two templates do not match [163].
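As a toy illustration of the thresholded matching function m (not any particular fingerprint matcher; templates are modelled here as fixed-length feature vectors, cosine similarity stands in for a real minutiae-pairing score, and the threshold value is an arbitrary assumption):

```python
import numpy as np

def match(b1, b2, threshold=0.8):
    """Toy matching function m: returns 1 if the two templates are
    similar enough (score above the threshold), 0 otherwise."""
    sim = float(np.dot(b1, b2) / (np.linalg.norm(b1) * np.linalg.norm(b2)))
    return 1 if sim >= threshold else 0

b1 = np.array([1.0, 2.0, 3.0])    # enrolled template
b2 = np.array([1.1, 2.0, 2.9])    # fresh capture, small intra-user variation
b3 = np.array([-3.0, 1.0, 0.0])   # unrelated template
print(match(b1, b2), match(b1, b3))  # -> 1 0
```

Real systems replace the cosine score with a minutiae pairing algorithm, but the threshold-based accept/reject decision has this shape.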

The principle of cancellable biometrics is to replace a biometric template by a revocable
one, through a kind of one-way transformation.
A cancellable biometrics system is defined through a family of distortion/transformation
functions f_t. Each function f_t transforms a biometric template b1 into another biometric
template f_t(b1). The distortion functions f_t and the matching function m
must satisfy the following properties [84]:
Condition 1 (Registration): It should be possible to apply the same transformation f_t to
multiple captures b1, b2 of the same biometric trait.
Condition 2 (Intra-user variability tolerance): Two matching biometric traits should also
match after a distortion f_t, i.e. m(b1, b2) = 1 implies m(f_t(b1), f_t(b2)) = 1.
Condition 3 (Entropy retention): Two non-matching biometric traits should not match
after distortion either, i.e. m(b1, b2) = 0 implies m(f_t(b1), f_t(b2)) = 0.

Condition 4 (Transformation function design): This condition is made of three parts:
1. Distortion: A biometric trait b and its distorted version f_t(b) should not match:
m(b, f_t(b)) = 0.
2. Diversity: Two different distortions of the same biometric trait should not match:
m(f_t(b), f_d(b)) = 0, where t ≠ d.
3. Cancellability: It should be computationally hard to retrieve the original biometric
trait b from one of its distorted versions f_t(b).
The first three conditions make the system practical: identification of a genuine
template succeeds almost all the time, whereas identification of non-registered
biometric data almost always leads to a negative answer. In practice, however, one can
expect the error rates to rise slightly after the distortions [163, 84].
The last condition expresses a security requirement: a distorted template must indeed be
distorted (part 1), and it should not be computationally feasible to revert it to the original
template (part 3). Moreover, it should be possible to derive multiple different distorted
templates from the original one (part 2) [163, 84].
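A minimal sketch of such a family f_t, keyed by an integer t. This hypothetical construction (a key-derived additive offset) illustrates only the tolerance and diversity conditions: a plain additive offset is invertible to anyone who learns the key, so it does NOT satisfy the cancellability condition and is not a secure transform.

```python
import numpy as np

def make_transform(key, dim=6):
    """f_t: add a key-derived pseudo-random displacement to the template.
    Changing `key` revokes the old template and issues a new one."""
    offset = np.random.default_rng(key).normal(0.0, 5.0, size=dim)
    return lambda b: b + offset

f_t = make_transform(key=1)
f_d = make_transform(key=2)
b1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
b2 = b1 + 0.05   # second capture of the same trait, small variation
# Tolerance: the same distortion keeps matching captures close together.
# Diversity: a different key yields a different distorted template.
print(np.allclose(f_t(b1), f_t(b2), atol=0.1), np.allclose(f_t(b1), f_d(b1)))
```

A real scheme would replace the offset with a many-to-one (non-invertible) mapping; the revocation mechanics (change the key, re-enrol) are the same.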

3.5 IMPLEMENTATION OF CANCELABLE FINGERPRINTS

The classical scenario of using biometric security systems which implement cancellable
biometrics is very similar to the usage of classical biometric systems. First, a biometric
sample is taken from the user during the enrolment phase. That sample is transformed by
a chosen non-invertible, one-way transformation function and then stored as a
template in a database. Afterwards, in the authentication phase, once a sample is taken, it is
transformed by the same transformation function. The transformed sample is then
matched to the template. If the template is stolen, it is cancelled and a new one
is enrolled, simply by changing the transformation function used. The transformation
function itself can be stored on a SmartCard or on the server along with the templates.
It can be kept secret or made publicly available, depending on the system implementation. If
the function is non-invertible, then it can be kept together with the templates and does not
need a higher degree of protection. In this section, we discuss the construction of
cancellable fingerprints using one-way transformations in the feature domain. Instead of
storing the original minutiae features, the minutiae locations and orientations are
transformed irreversibly, as shown in figure 3.3 [84].

For the transform to be repeatable, the minutiae positions have to be measured with
respect to the same coordinate system each time. Prior to the transformation, each
fingerprint needs to be registered. One way this can be accomplished is by precisely
estimating the position and orientation of the core and delta and expressing the minutiae
with respect to these points. Though there are several approaches, determining these
singular points is a difficult problem. Another problem, even after registration, is the
intra-user variability of biometric signals: the features after transformation should be
robust to this variation. The transform has to further satisfy the following conditions:
(i) the transformed version of the fingerprint should not match the original, and the
original should not be recoverable from it, which preserves the privacy of the transformed
template; (ii) multiple transforms of the same fingerprint should not match, which
prevents cross-matching between databases.

Figure 3.3: Construction of cancellable fingerprints using feature domain
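The feature-domain transformation can be sketched as follows. This is an illustrative block-permutation in the spirit of the approach above; the grid size, image size and key handling are assumptions, and a practical construction would make the cell mapping many-to-one so that it cannot be inverted.

```python
import numpy as np

def transform_minutia(x, y, theta, key, size=256, grid=16):
    """Move a minutia (x, y, theta) by shuffling the image cells with a
    key-derived permutation and rotating the angle by a key-derived offset.
    Same key -> repeatable; new key -> a fresh, unlinkable template."""
    cells = size // grid                       # cells per image side
    rng = np.random.default_rng(key)
    perm = rng.permutation(cells * cells)      # key-derived cell shuffle
    dtheta = rng.integers(0, 360)              # key-derived angle offset
    cell = (y // grid) * cells + (x // grid)   # which cell the minutia is in
    new = perm[cell]                           # destination cell
    nx = (new % cells) * grid + x % grid       # keep the offset within the cell
    ny = (new // cells) * grid + y % grid
    return nx, ny, (theta + dtheta) % 360

m1 = transform_minutia(37, 120, 45, key=7)
m2 = transform_minutia(37, 120, 45, key=7)  # same key: repeatable
m3 = transform_minutia(37, 120, 45, key=8)  # new key: fresh template
print(m1 == m2)  # -> True
```

To revoke a compromised template, the enrolment is repeated with a new key, exactly as described in the scenario above.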

The important steps that are involved in cancellable transformation are registration,
transformations (on the signal level and on the feature level), and selection of
transformation function. We discuss these steps briefly in the following sections.


3.6 REGISTRATION

The first important step in the application of a cancellable transform is the process of
registering the image. For the transform to be repeatable, the minutiae positions have to
be measured with regard to the same coordinate system. This can be accomplished by
estimating the position and orientation of the singular points (core and delta) and
expressing the minutiae positions and angles with respect to these points. There have
been several approaches for the detection of singular (core and delta) points in the literature.
The most recent approach is based on complex filtering proposed by Nilsson et al. [86].
Their technique relies on detecting the parabolic and triangular symmetry associated with
core and delta points. The filtering is done on complex images associated with the
orientation tensor instead of the gray-scale image [87].

3.7 TRANSFORMATIONS

After global registration, the features can be transformed consistently across multiple
instances. The requirements of cancellability put several constraints on the transformation
[84]:
1. The minutiae position after transformation has to be outside the tolerance box of
the matcher. A minimum amount of translation during the transformation needs to
be ensured.
2. The transformation should be locally smooth to ensure that small changes in the
minutiae position lead to small changes in the minutiae position after
transformation.
3. The transformation should not be globally smooth. Otherwise, the minutiae
positions after transformation are highly correlated with the positions before
transformation and can be inverted.
Such a transform can be implemented in several ways [84].
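As a hypothetical illustration of these three constraints (all constants are made up), consider a 1-D displacement built from a few narrow, key-derived Gaussian bumps: nearby positions move almost identically (locally smooth), while positions far apart fall under different bumps and move independently (not globally smooth).

```python
import numpy as np

def displace(x, key, n_bumps=8, width=3.0, amp=10.0, size=256.0):
    """Keyed displacement field: sum of narrow Gaussian bumps with
    key-derived centres and signs. Deterministic for a given key, so the
    same distortion can be re-applied at authentication time."""
    rng = np.random.default_rng(key)
    centres = rng.uniform(0.0, size, n_bumps)
    signs = rng.choice([-1.0, 1.0], n_bumps)
    bumps = signs * np.exp(-((x - centres) / width) ** 2)
    return x + amp * float(np.sum(bumps))

# Same key and position -> identical displacement (repeatable).
print(displace(100.0, key=3) == displace(100.0, key=3))  # -> True
```

The total displacement is bounded by n_bumps * amp, which gives direct control over how far minutiae can move (constraint 1), while the bump width controls local smoothness (constraint 2).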


3.7.1 Transformation on the Signal Level

The transformation of samples can be performed right after the sensor, on the signal level.
The data on which the transformation is performed can be a picture of the face, a
fingerprint, a picture of the iris, or another kind of biometric sample. An example of
such a transformation is grid morphing. Grid morphing changes the picture, for instance a
picture of a face. First, a grid is positioned on the face so that it is aligned with facial
features such as the eyes, nose and chin. Then the grid is morphed, and the face is
morphed with it. The result is another face that cannot be linked to the original face.
More information on grid morphing can be found in [88, 89].
These kinds of transformations change the original biometric data in such a way that
existing feature-extraction algorithms still function on them after the transformation. It is
in fact very important that they do not diminish the power of existing algorithms. The
result of a signal-level transformation is thus still biometric data, but data that is not
linkable to an actual person. The rest of the biometric security system need not even be
aware that the signal was transformed.
One of the prerequisites for this kind of biometric system to function is that the applied
transformation can be repeated in exactly the same way during the authentication phase;
the problem of repeatability arises. The original biometric data is usually represented by
a picture, but it could be any other human feature, such as scent or sound. No matter
what kind of biometrics is used, in order to apply the transformation repeatably, the
signal has to be normalized. Some features of the biometric have to be located prior to
transformation. For instance, the position of the face in the picture, or the position and
angle of the iris, needs to be found, and the picture has to be normalized so that the
element found is centred and at a consistent rotation. Only after that kind of
pre-processing can the transformation be applied. The grid morphing example mentioned
above has a grid that must be aligned with the features of the face. Only after the eyes,
nose, chin and other relevant features are found can the grid be positioned and the
transformation applied. If the grid is not aligned the same way every time a
transformation is applied, the resulting image will not be comparable to the


stored biometric template of the user, and the authentication will fail. This process can be
very difficult and sometimes impossible.
3.7.2 Transformation on the Feature Level

Besides transformation on the signal level, transformation can be applied on the feature
level. The feature level of the biometric sample is represented by a list of features
describing the biometric sample. It is usually represented by a list of numbers, like
coordinates, angles or sizes. These numbers can represent fingerprint minutiae or sizes of
fingers and palm in hand-geometry biometrics. Transformation on the feature level does not
need the normalization which is crucial for signal-level transformations, since the
sample has already been processed and all the features extracted into a normalized form [79].
Some feature level transformations change the biometric template so that the existing
algorithms for matching still function on them without any need for adapting. One
example of such function would be a transformation of features that simply changes their
position in coordinate space. But some transformations change data into a form
completely different from any known biometric data, like hash functions [90]. Such data
cannot be matched using the same algorithms but require new algorithms, created only
for that purpose. An example of a feature-level transformation is applying a high-order
polynomial function to every minutia in the biometric template [79].
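A minimal sketch of such a feature-level transform (the coefficients are hypothetical secrets, not taken from the source). A non-monotonic polynomial such as an even power is many-to-one over a signed feature range, which is what makes recovering the exact original value ambiguous.

```python
import numpy as np

def poly_transform(features, coeffs):
    """Apply c0 + c1*v + c2*v**2 + ... to every feature value.
    The coefficient vector plays the role of the revocable secret:
    a template is re-issued by choosing new coefficients."""
    v = np.asarray(features, dtype=float)
    return sum(c * v ** i for i, c in enumerate(coeffs))

coeffs = [1.0, 0.0, 3.0]                    # hypothetical secret: 1 + 3*v**2
print(poly_transform([2.0, -2.0], coeffs))  # -> [13. 13.]  (many-to-one)
```

Note that +2.0 and -2.0 collapse to the same transformed value, so even an attacker who knows the coefficients cannot tell which original value produced it.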
One of the main goals of cancellable biometrics is protecting the biometric data of a
person so that it can never be compromised. Two transformation functions may look
similar, but one of them should additionally convert some features to zero or to some
other randomly chosen number [91]. That way, even if an attacker recreates the original
template by inverting the transformed template, he would not obtain the user's true
identity, because some of the features were irreversibly changed.
3.8 SELECTION OF TRANSFORMATION FUNCTION

The function that is used during the transformation phase has to have certain
characteristics, discussed below:
- In order to have the option of cancelling and reissuing the template, we do not
want a limited number of transformation functions which could be applied.
- If we store the transformation function in the same place as the biometric
templates, it can be stolen along with the template. It is necessary that an attacker
holding both the template and the transformation function that created it cannot
get to the original template. The only way to ensure this is for the transformation
function to be non-invertible, or to have a large enough number of pre-images to
discourage a brute-force attack. If the function is invertible, it should be carefully
hidden from the attacker; one way to hide it is to place it on a SmartCard rather
than in shared storage [79].

The transformation function can enlarge the template size in bytes, which is desirable
because the time needed for a brute-force attack on a security system (trying all possible
combinations until the one that allows access is hit) increases exponentially with the
template size.
Transformed biometric templates should not diminish the uniqueness of the biometric
data [59].
To preserve uniqueness, the following three preconditions need to be fulfilled:
- Two different transformation functions applied to the same sample must differ
(return false if compared).
- The result of a transformation T1 applied to a sample S1 should never be the same
as the result of a transformation T2 applied to a sample S2.
- Two different samples transformed by the same transformation function must
differ.
Because the biometric data of different persons are often quite similar, the standard
matching function, which measures the distance between samples, needs to be very
sensitive. The fact that we are no longer comparing original biometric samples, which
are fixed by a person's biometrics, enables us, by using adequate transformation
functions, to ensure even higher uniqueness by making the difference between samples
greater. By increasing the distance between biometric samples we can achieve a lower
false accept rate (FAR) without increasing the false reject rate (FRR) [92].
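The FAR/FRR trade-off is straightforward to compute from score distributions. A minimal sketch (the score lists are made-up illustrations): widening the gap between genuine and impostor similarity scores lets a fixed threshold reject all impostors without rejecting any genuine users.

```python
def far_frr(genuine, impostor, threshold):
    """FAR: fraction of impostor comparisons accepted (score >= threshold).
    FRR: fraction of genuine comparisons rejected (score < threshold)."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

genuine = [0.91, 0.88, 0.95, 0.90]   # same-user scores (made up)
impostor = [0.20, 0.42, 0.55, 0.61]  # different-user scores (made up)
print(far_frr(genuine, impostor, threshold=0.7))  # -> (0.0, 0.0)
```

If a transformation pushes impostor scores further below the threshold while leaving genuine scores untouched, FAR drops with no cost in FRR, which is the effect described above.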
We can conclude that the transformation function actually represents the essence of the
concept of cancellable biometrics. As such it must ensure that it does not diminish the
positive characteristics of biometric security systems. By choosing the right type of
function we can even enhance the system by producing higher uniqueness [79].
3.9 SUMMARY

In this chapter, we have briefly discussed the problems with existing biometric
technologies and explored cancellable biometrics in detail. Biometric authentication
schemes raise security concerns because biometric data is permanently associated with its
owner and therefore cannot be replaced even if it is compromised. One of the most
promising solutions to this problem is cancellable biometrics [53], where the system does
not store the original biometric data; rather, it stores only a version transformed by a
non-invertible, one-way function [54]. Verification/identification is then done on this
transformed data without any need to recover or use the original data, thereby keeping
the original data safe even if the system is compromised. This concept ensures that the
original biometric template does not exist in the system database; as such, it is not in
danger of being exposed, and the privacy issue is in this way largely eliminated.
The presented concept of cancellable biometric templates is a good solution to most of
the perceived problems of today's biometric security solutions. The ability to cancel and
reissue a biometric template is a giant step towards increasing the usability of biometric
security systems.
Because of the nature of the data being transformed, it is probably easier to apply the
transformation on the feature level. Choosing an appropriate transformation function is
the hardest task in the implementation of cancellable biometrics. The transformation
function can ensure greater uniqueness among samples. A large family of functions must
be chosen so that the number of variations is not limited. It must be non-invertible. It
should increase the template size. Finally, every system implementing cancellable
biometrics should be carefully planned and tested to ensure that all of the mentioned
goals are achieved.

CHAPTER 4
LITERATURE SURVEY
"Literature is analysis after the event." Doris Lessing
4.1 INTRODUCTION
In this chapter, we will begin by briefly discussing the history of biometrics, how it has
evolved, and how it became, and remains, a major and challenging research topic. We
will also give a time-line of biometrics and the history of research in biometrics. Later,
we will discuss cancellable biometrics and the relevant research in this area.

4.2 BIOMETRICS IS NOT NEW!


Biometrics is becoming an interesting topic now in regard to computer and network
security, and biometric technology seems to belong to the twenty-first century. However,
the ideas behind biometrics have been around for many years, and the history of biometrics
goes back thousands of years. Biometric measures of one kind or another have been used
to identify people since ancient times, with handwritten signatures, facial features, and
fingerprints being the traditional methods. Using these methods and newer ones, such as
hand geometry, voiceprints, and iris patterns, systems have been built that automate
the task of recognition.
For example read the following verse which is recorded in the Bible:
"And the Gileadites took the passages of Jordan before the Ephraimites: and it was so,
that when those Ephraimites which were escaped said, Let me go over; that the men of
Gilead said unto him, Art thou an Ephraimite? If he said, Nay; then said they unto him,
Say now Shibboleth: and he said Sibboleth: for he could not frame to pronounce it right.
Then they took him, and slew him at the passages of Jordan: and there fell at that
time of the Ephraimites forty and two thousand." JUDGES 12:5-6 [BIBLE-KJV]


The above quotation may be the first recorded military use of a security protocol in which
the authentication relies on a property of the human being, in this case his accent.
There had been less formal uses even before and after this incident.
1. When Isaac tried to identify Esau by his bodily hair, but got deceived by Jacob.

He went to his father and said, "My father." "Yes, my son," he answered. "Who
is it?" Jacob said to his father, "I am Esau your firstborn. I have done as you told
me. Please sit up and eat some of my game so that you may give me your
blessing." Isaac asked his son, "How did you find it so quickly, my son?" "The
LORD your God gave me success," he replied. Then Isaac said to Jacob, "Come
near so I can touch you, my son, to know whether you really are my son Esau or
not." Jacob went close to his father Isaac, who touched him and said, "The voice
is the voice of Jacob, but the hands are the hands of Esau." GENESIS 27:18-22
[BIBLE-NIV]

2. When people identified Peter by his accent and by his face, he tried to deny the fact.

Now Peter was sitting out in the courtyard, and a servant girl came to him. "You
also were with Jesus of Galilee," she said. But he denied it before them all. "I
don't know what you're talking about," he said. Then he went out to the gateway,
where another girl saw him and said to the people there, "This fellow was with
Jesus of Nazareth." He denied it again, with an oath: "I don't know the man!"
After a little while, those standing there went up to Peter and said, "Surely you
are one of them, for your accent gives you away." MATTHEW 26:69-73
[BIBLE-NIV]

3. Again a small girl identified Peter by his voice.


Peter knocked at the outer entrance, and a servant girl named Rhoda came to
answer the door. When she recognized Peter's voice, she was so overjoyed she
ran back without opening it and exclaimed, "Peter is at the door!" ACTS
12:13-14 [BIBLE-NIV]


With some of the above examples, we can conclude that biometrics identify people by
measuring some aspect of individual anatomy or physiology (such as hand geometry or
fingerprint), some deeply ingrained skill, or other behavioural characteristic (such as
handwritten signature), or something that is a combination of the two (such as voice).

4.3 A BRIEF HISTORY OF BIOMETRICS


Measurement of physical features such as height, eye colour and scars as a method of
personal identification is known to date back to the ancient Egyptians. Archaeological
evidence of fingerprints being used to at least associate a person with some event or
transaction is also said to date back to ancient China, Babylonia and Assyria. But it was
not until the end of the 19th century that the study of biometrics entered the realm of
crime detection [93].
Fingerprinting can be traced as far back as the 14th century in China. Possibly the first
known example of biometrics in general practice was a form of finger printing being used,
as reported by explorer Joao de Barros. He wrote that the Chinese merchants were
stamping children's palm prints and footprints on paper with ink to distinguish the young
children from one another. This is one of the earliest known cases of biometrics in use
and that is still being used.
The earliest form of Biometrics in Europe appeared on the scene back in the 1800's.
Fingerprints were first looked at as a form of criminal identification by Dr. Henry Faulds
who noticed fingerprints on ancient pottery while working in Tokyo. He first published
his ideas about using fingerprints as a means of identifying criminals, in the scientific
journal, Nature in 1880. William Herschel, while working in colonial India, also
recognized the unique qualities that fingerprints had to offer as a means of identification
in the late 1870's. He first began using fingerprints as a form of signature on contracts
with locals.
In the 1890s, Alphonse Bertillon, a French police clerk and anthropologist, pioneered a
method of recording multiple body (anthropometric) measurements for criminal
identification purposes, known as Bertillonage and it was adopted by many police

authorities worldwide during the 1890s, but soon became obsolete once it was recognized
that people could indeed share the same physical measurements. Because of the amount
of time and effort that went in to painstakingly collecting measurements and the overall
inaccuracy of the process, Bertillonage was quickly replaced when fingerprinting
emerged on the scene as a more efficient and accurate means of identification.
Fingerprinting, as a means of identification, proved to be infallible. It was accepted that each
individual possessed a uniquely identifiable and unchanging fingerprint. This new system
of identification was accepted as more reliable than Bertillonage.
Meanwhile, the quest for a physical identifier unique to each individual gained
significant ground through the British anthropologist Sir Francis Galton, who had been
privy to Faulds' research through his cousin Charles Darwin, and who would also be
credited with considerable advancement of fingerprint identification. Galton ascertained
that no two fingerprints were alike, not even for a set of identical twins. He worked on
the principle that fingerprints were permanent throughout life and that no two people had
identical fingerprints. Galton calculated the odds of prints from two people being
identical to be 1 in 64 billion, and also identified characteristics, known as minutiae, that
are still used today to demonstrate that two impressions were made by the same finger.
Minutiae are points of interest formed by the endings or forking of the friction-skin
ridges on each finger and are defined as one of the following:
- Ridge ending: the point at which a ridge terminates
- Bifurcation: the point at which a single ridge splits into two ridges

It is the arrangement of all the minutiae in terms of their location, orientation of ridge
flow and type (i.e. ridge ending or bifurcation) that makes an individual's fingerprints
unique. The flow of the friction-skin ridges also forms the patterns (the whorl, arch and
loop of each finger) that were identified by Galton. Galton's patterns provided the basis
of the first fingerprint file, established in 1891 by Juan Vucetich, an Argentine police
officer, who became the first to use a bloody fingerprint to prove the identity of a
murderer during a criminal investigation.
Sir Edward Henry (1850-1931) developed a classification system which was officially
adopted by British India in 1897. That year, Henry, a British police officer serving as
Inspector General of the Bengal Police in India, had developed an interest in the use of
fingerprints for identifying criminals, even though the Bengal Police was at that time
using Bertillonage. Based on Galton's observations, Henry and colleagues established a
modified classification system, based on physiological characteristics, allowing
fingerprints captured on paper forms using an ink pad to be classified, filed and
referenced for comparison against thousands of others [93]. In 1900 Henry presented a
paper entitled "Fingerprints and the Detection of Crime in India". Shortly after, Henry's
book The Classification and Uses of Finger Prints was published [70]. The Henry
Classification System organises ten-print fingerprint records by pattern type. The system
assigns each individual finger a numerical value (starting with the right thumb and ending
with the left pinky) and divides fingerprint records into groupings based on pattern types.
Finger ridges and patterns can be continuous, interrupted, forked, and other formations
[93]. Fingerprints are classified and identified by the relationship of these formations,
described as minutiae. The system makes it possible to search large numbers of
fingerprint records by classifying the prints according to the patterns. These patterns are
divided into five basic groups, with various subgroups [70]: arch, whorl, loops, composites
and accidentals. In December 1900, Britain's Belper Committee recommended that the

fingerprints of criminals be taken and classified by the Indian System. In 1901, Henry
was called back to England and was given the post of Assistant Commissioner of Police
in charge of Criminal Identification at New Scotland Yard. In 1903, Henry became
Commissioner of Police [70].
In 1901, Henry's fingerprinting system was adopted in the UK and introduced
in England by Scotland Yard. In 1902, the New York Civil Service began testing the
Henry method of fingerprinting with the Army, Navy, and Marines, all adopting the
method by 1907. From this point on, the Henry System of fingerprinting became the
system most commonly used in English speaking countries to become a standard method
of identity detection and verification in criminal investigations [93].
With the advent of computers and digital technology in the 1970s, fingerprinting took on
a new dimension. As a result, the UKs fingerprint service now records 120,000 sets of
fingerprints each year, a volume of records that was simply untenable before
computerization. Within a century, biometrics had evolved from tape measure, ink and

pad techniques requiring vast manual filing and archiving resources, to an automated
biometric digital scanning process using computerized storage, automated search and
find/match techniques, plus extensive archiving and access systems with worldwide links.
Such technology now provides for the capture and processing of biometrics information
and has transformed fingerprinting techniques and procedures.
In the past three decades biometrics has moved from a single method (fingerprinting) to
more than ten discrete methods. Companies involved with new methods number in the
hundreds and continue to improve their methods as the technology available to them
advances. Prices for the hardware required continue to fall making systems more feasible
for low and mid-level budgets. However, as the industry grows so does the public
concern over privacy issues. Laws and regulations continue to be drafted and standards
are being developed. While no other biometric has yet reached the level of use of
fingerprinting, some are beginning to be used in both legal and business areas [93].

4.4 BIOMETRIC TECHNOLOGIES FROM THE PAST TO THE PRESENT


The ancient Egyptians and the Chinese played a large role in biometrics' history.
Although biometric technology seems to belong to the twenty-first century, the history of
biometrics goes back thousands of years. Now-a-days, the focus is on using biometric
face recognition and identifying characteristics to stop terrorism and improve security
measures [94].
Major breakthroughs in the field are enumerated below:

European explorer Joao de Barros recorded that the first known example of
biometrics in practice was a form of fingerprinting being used in China during the
14th century. Chinese merchants used ink to take children's fingerprints for
identification purposes.

Elsewhere in the world up until the late 1800s, identification largely relied upon
"photographic memory".

In the 1880s, an anthropologist and police desk clerk in Paris named Alphonse
Bertillon sought to fix the problem of identifying convicted criminals and turned
biometrics into a distinct field of study. He developed a method of multiple body
measurements which was named after him (Bertillonage). The Bertillon system (1882)
took a subject's photograph and recorded height, the length of one foot, an arm
and the index finger. This was the primary system of criminal identification used
during the 19th century. Bertillon's system of identification was not without fault,
so the Bertillonage method was quickly abandoned in favour of fingerprinting,
brought back into use by Edward Richard Henry of Scotland Yard.

Fingerprinting, as a means of identification, proved to be infallible. It was
accepted that each individual possessed a uniquely identifiable and unchanging
fingerprint. This new system of identification was accepted as more reliable than
Bertillonage. The Henry Classification System, named after Edward Henry, who
developed and first implemented the system in 1897 in India, was the first method
of classification for fingerprint identification based on physiological
characteristics.

In 1901 the Henry system was introduced in England.

In 1902 the New York Civil service began testing the Henry method of
fingerprinting with the Army, Navy, and Marines all adopting the method by
1907. From this point on, the Henry System of fingerprinting became the system
most commonly used in English speaking countries.

By the 1920s, fingerprint identification was used by law enforcement, the U.S.
military and the FBI as a form of identification.

Karl Pearson, an applied mathematician, conducted biometric research early in the
20th century at University College London. He made important discoveries in
the field of biometrics through studying statistical history and correlation, which
he applied to animal evolution. His historical work included the method of
moments, the Pearson system of curves, correlation and the chi-squared test.

In the 1960s and '70s, signature biometric authentication procedures were


developed, but the biometric field remained static until the military and security
agencies researched and developed biometric technology beyond fingerprinting.

Although fingerprinting is still in use today, computer-aided techniques began
developing rapidly in the last quarter of the twentieth century. These techniques
sought to measure our voices, hands, fingers, irises and faces. Once ideas
were proposed, development was rapid. In 1985, the idea that irises are unique
was proposed; development of an iris identification system began in 1993; in
1994 the first iris recognition algorithm was patented; and the year after that, a
commercial product measuring irises became available.

At the 2001 Super Bowl in Tampa, Florida, the facial image of each of the 100,000
fans passing through the stadium was recorded via video security cameras and checked
electronically against mug shots from the Tampa police. No felons were identified,
and the video surveillance led many civil liberties advocates to denounce
biometric identification technologies.

In 2001, after the September 11 attacks, authorities installed biometric
technologies in airports to identify suspected terrorists.

In 2005, Rep. Robert Andrews (D-NJ) introduced the Iris Security Scan Security
Act of 2005, intended to give States grants to use iris scan records of convicted
criminals for various purposes.

Since July 7th, 2005, British law enforcement has been using biometric face
recognition technologies and 360-degree "fish-eye" video cameras to identify terrorists.

4.5 CONDENSED TIMELINE OF BIOMETRIC TECHNOLOGIES


The developments in the field of biometrics are given below in the form of a table [95]:
Year    Description
1858    First systematic capture of hand images for identification purposes is recorded
1870    Bertillon develops anthropometrics to identify individuals
1892    Galton develops a classification system for fingerprints
1894    The Tragedy of Pudd'nhead Wilson is published
1896    Henry develops a fingerprint classification system
1903    NY State Prisons begin using fingerprints
1903    Bertillon System collapses
1936    Concept of using the iris pattern for identification is proposed
1960s   Face recognition becomes semi-automated
1960    First model of acoustic speech production is created
1963    Hughes research paper on fingerprint automation is published
1965    Automated signature recognition research begins
1969    FBI pushes to make fingerprint recognition an automated process
1970s   Face recognition takes another step towards automation
1970    Behavioral components of speech are first modeled
1974    First commercial hand geometry systems become available
1976    First prototype system for speaker recognition is developed
1977    Patent is awarded for acquisition of dynamic signature information
1980s   NIST Speech Group is established
1985    Concept that no two irides are alike is proposed
1985    Patent for hand identification is awarded
1986    Exchange of fingerprint minutiae data standard is published
1987    Patent stating that the iris can be used for identification is awarded
1988    First semi-automated facial recognition system is deployed
1988    Eigenface technique is developed for face recognition
1991    Face detection is pioneered, making real-time face recognition possible
1992    Biometric Consortium is established within the US Government
1993    Development of an iris prototype unit begins
1993    FacE REcognition Technology (FERET) program is initiated
1994    First iris recognition algorithm is patented
1994    Integrated Automated Fingerprint Identification System (IAFIS) competition is held
1994    Palm System is benchmarked
1994    INSPASS is implemented
1995    Iris prototype becomes available as a commercial product
1996    Hand geometry is implemented at the Olympic Games
1996    NIST begins hosting annual speaker recognition evaluations
1997    First commercial, generic biometric interoperability standard is published
1998    FBI launches CODIS (DNA forensic database)
1999    Study on the compatibility of biometrics and machine readable travel documents is launched
1999    FBI's IAFIS major components become operational
2000    First Face Recognition Vendor Test (FRVT 2000) is held
2000    First research paper describing the use of vascular patterns for recognition is published
2000    West Virginia University biometrics degree program is established
2001    Face recognition is used at the Super Bowl in Tampa, Florida
2002    ISO/IEC standards subcommittee on biometrics is established
2002    M1 Technical Committee on Biometrics is formed
2002    Palm Print Staff Paper is submitted to Identification Services Committee
2003    Formal US Government coordination of biometric activities begins
2003    ICAO adopts blueprint to integrate biometrics into machine readable travel documents
2003    European Biometrics Forum is established
2004    US-VISIT program becomes operational
2004    DOD implements ABIS
2004    Presidential directive calls for mandatory government-wide personal identification card for all federal employees and contractors
2004    First statewide automated palm print database is deployed in the US
2004    Face Recognition Grand Challenge begins
2005    US patent on iris recognition concept expires
2005    Iris on the Move is announced at Biometrics Consortium Conference

Table 4.1: Timeline of biometric technologies

Looking at the timeline of biometric technology, we can conclude that true
biometric systems began to emerge in the latter half of the twentieth century, coinciding
with the emergence of computer systems. Over the last quarter century or so, a large
number of biometric devices have been developed. But the best established biometric
techniques predate the computer age altogether, namely the use of handwritten
signatures, facial features, and fingerprints.
In recent years, many cases have proved that biological characteristics are the most
powerful tools to authenticate a person's identity. The emphasis now is on automatically
performing reliable identification of persons in unattended mode, often remotely (or at a
distance).

4.6 REVIEW OF CANCELABLE BIOMETRICS


Our research work has been inspired and motivated by a number of contributions related
to cancellable biometrics and cryptographic key generation made by earlier
researchers. A brief review of some noteworthy contributions is given below:
A framework for stable cryptographic key generation from unstable biometric data was
proposed by Chang Yao-Jen et al. [96] in 2004. In other words, they presented a
framework intended to generate a fixed cryptographic key from biometric data that is
liable to change. The main difference between this framework and prior work is that
user-dependent transforms are employed to generate more solid, compact and
distinguishable features, so that a longer and highly stable bit stream can be produced.
Experiments were carried out on a database containing face images to demonstrate the
practicability of the framework.
Cancellable biometrics offers a greater level of privacy by facilitating more than one
template for the same biometric data, thus providing non-linkability of a user's
data stored in diverse databases. The measurement of the success of a particular
transformation and matching algorithm for fingerprints was described by Russell Ang et
al. [97] in 2005. A key-dependent geometric transform was applied to the features
obtained from a fingerprint, so as to produce a key-dependent cancellable template for the
fingerprint. Besides, they also studied the performance of an authentication system
that utilizes the cancellable fingerprint matching algorithm for detection purposes.
Experimental evaluation of the system was carried out, and the results illustrated that
good performance is achievable when the matching algorithm remains unaltered.
A cancellable biometric approach called PalmHashing was proposed by Connie Tee et al.
[98] in 2005, in order to address the non-revocable biometric issue. This technique hashes
palmprint templates with a set of pseudo-random keys to acquire a unique code known as
the palmhash. The palmhash code can be stored in portable devices such as tokens
or smartcards for authentication. Moreover, PalmHashing provides numerous
advantages over other modern approaches, including clear separation of the genuine and
impostor populations and zero Equal Error Rate (EER) occurrences. They outlined the
implementation details, besides emphasizing its capabilities in security-critical applications.
Hao, F. et al. [99] in 2006 presented a realistic and secure way to incorporate the iris
biometric into cryptographic applications. They deliberated on the error patterns within
iris codes and developed a two-layer error correction technique that merges Hadamard
and Reed-Solomon codes. The key was produced from the subject's iris image
through auxiliary error correction data that do not disclose the key and can be saved
in a tamper-resistant token such as a smart card. The evaluation of the methodology was
performed with the aid of samples from 70 different eyes. It was established that an
error-free key can be reproduced reliably from genuine iris codes with a success rate of 99.5%.

It is possible to produce up to 140 bits of biometric key, more than adequate for 128-bit
AES.
On the basis of recent works demonstrating the feasibility of key generation by means of
biometrics, the application of the handwritten signature to cryptography was analyzed by M.
Freire-Santos et al. [100] in 2006. A cryptographic construction called the fuzzy vault
was employed in the signature-based key generation scheme. The usability of distinctive
signature features appropriate for the fuzzy vault was analyzed and evaluated. Results of
the experimental evaluation were reported, including the error rates in releasing the
secret data using both random and skilled forgeries from the MCYT online and offline
signature database.
A fuzzy commitment method working on lattice mapping for cryptographic key
generation from biometric data was proposed by Gang Zheng et al. [101] in 2006. This
method, besides providing high-entropy keys as output, also obscures the original
biometric data, so that it becomes infeasible to recover the biometric data even if the
information stored in the system is open to an attacker. Simulation results illustrated that
the method's authentication accuracy was comparable to the well-known k-nearest
neighbour (KNN) classification.
Biometric characteristics are immutable, and hence their compromise is permanent. To
address this problem, A. T. Beng Jin and Tee Connie [102] in 2006 proposed
cancellable biometrics, and described biometric templates that can be cancelled and
replaced. BioHash is a cancellable biometric that combines a set of user-specific
random vectors with biometric features. The main drawback of BioHash is its
great degradation in performance when the legitimate token is stolen and used by an
impostor to claim to be the legitimate user. They employed a modified probabilistic neural
network as the classifier to alleviate this problem.
Teoh A. B. et al. [103] in 2007 presented a two-factor cancellable formulation that
facilitates data distortion in a revocable but non-reversible manner, by first converting the
raw biometric data into a fixed-length feature vector and then projecting the feature
vector onto a sequence of random subspaces derived from a user-specific
pseudo-random number (PRN). The process is revocable, making the replacement of
biometrics as easy as replacing PRNs. The formulation was confirmed under numerous
scenarios (normal, stolen PRN, and compromised biometrics) with the aid of 2400 face
images using Facial Recognition Technology.
Je-Gyeong Jo et al. [104] in 2007 presented a simple technique for the generation of
digital signatures and cryptographic communication with the aid of biometrics. They state
that it is necessary to generate the signature in such a way that it becomes possible to
verify it with a cryptographic algorithm like RSA/ElGamal without altering its
own security constraints and infrastructure. It was anticipated that the mechanism would be
capable of guaranteeing security in the binding of biometric information in signature
schemes on telecommunication environments.
B. Chen et al. [105] in 2007 presented a technique that makes use of an entropy-based
feature extraction process, together with Reed-Solomon error correcting codes,
capable of generating deterministic bit-sequences from the output of an iterative
one-way transform. The methodology was evaluated on 3D face data and was shown
to be capable of reliably producing keys of suitable length for 128-bit
Advanced Encryption Standard (AES).
Ratha, N. K. et al. [54] in 2007 established several methods to generate multiple
cancellable identifiers from fingerprint images. A user can be provided with as many
biometric identifiers as needed by using a new transformation key, and the identifiers
can be cancelled and replaced if compromised. The performance of several
algorithms, such as Cartesian, polar, and surface-folding transformations of the minutiae
positions, was compared empirically. The transforms were non-invertible, and it was
shown that the original biometric identifier is difficult to recover from a transformed
version by means of random guessing. From the empirical results and theoretical analysis,
it was established that feature-level cancellable biometric construction can be applied in
large biometric deployments.
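To illustrate the general idea of a key-driven, repeatable distortion of minutiae positions (this is a toy sketch, not Ratha et al.'s actual Cartesian transform; the grid size, image size and minutia tuple layout are assumptions made for the example):

```python
import random

def cartesian_transform(minutiae, key, grid=64, size=512):
    """Toy key-driven block transform: divide the image plane into
    grid x grid blocks and shuffle the blocks with a key-seeded
    permutation, moving each minutia with its block.
    Illustrative only; not Ratha et al.'s published algorithm."""
    n = size // grid                       # blocks per side
    blocks = list(range(n * n))
    random.Random(key).shuffle(blocks)     # key-dependent permutation
    out = []
    for (x, y, theta) in minutiae:
        bx, by = x // grid, y // grid      # source block index
        dest = blocks[by * n + bx]         # where this block moves
        nx = (dest % n) * grid + x % grid  # keep offset within block
        ny = (dest // n) * grid + y % grid
        out.append((nx, ny, theta))
    return out
```

The same key always yields the same template, while issuing a new key yields a different one, which is the revocability property the text describes.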
With the wide diffusion of biometric-based authentication systems, the need to
provide security and privacy for the employed biometric templates has become an issue of
paramount importance in the design of user-friendly applications. Unlike passwords or
tokens, a compromised biometric usually cannot be revoked or reissued.

Maiorana, E. et al. [106] in 2008 proposed an on-line signature-based biometric
authentication system, where non-invertible transformations are applied to the acquired
signature functions, making it impossible to derive the original biometrics from the stored
templates while maintaining the same recognition performance as an unprotected
system. Specifically, the possibility of generating cancellable templates from the same
original data, thus providing a solution to privacy concerns and security issues, was
deeply investigated.
Biometric-key generation can be defined as a procedure that transforms a
portion of live biometric data into a key with the help of auxiliary information (a
biometric helper), so that the biometric-key can be regenerated on demand and the
biometric itself need not be stored. Beng, A. et al. [107] in 2008 proposed a
biometric-key generation system based on a randomized biometric helper. The scheme
comprises a randomized feature discretization process and a code redundancy
construction. The discretization process keeps intra-class variation of the biometric data
to a minimal level, and the code redundancy construction further reduces the errors. The
randomized biometric helper ensures that a biometric-key is easily revocable when
the key is compromised.
Sanaul Hoque et al. [108] in 2008 demonstrated the generation of biometric keys directly
from live biometrics, under certain conditions, by partitioning the feature space into
subspaces and these in turn into cells, where each cell subspace contributes to the overall
key that is generated. They evaluated the scheme on real biometric data and discussed its
limitations. Experimental results showed the extent to which the technique can be
implemented reliably in practical conditions.
Andrew B. J. Teoh et al. [109] in 2009 used the notion of cancellable biometrics to
denote biometric templates that can be cancelled and re-established by appending
another independent authentication factor. BioHash is a type of cancellable biometrics
that combines a set of user-specific random vectors with biometric features. Quantized
random projection, based on the Johnson-Lindenstrauss Lemma, is used to establish the
mathematical foundation of BioHash. On the basis of this model, they described the
characteristics of BioHash from the pattern recognition as well as the security
perspective, and offered a few methods to solve the stolen-token issue.
Huijuan Yang et al. [110] in 2009 presented a non-invertible transform that
perpendicularly projects the distances between a pair of minutiae onto a circle to
generate the features. Additional local features, such as relative angles between the
minutiae pair, and global features, such as orientation, ridge frequency and the total
number of minutiae in the randomly sampled blocks around each minutia, were also
employed to obtain better performance. Finally, Bin-based Quantization (BQ) generates
the cancellable templates. The feature extraction and cancellable template generation are
controlled by a secret key to ensure revocability and security. Experimental results on the
FVC2002 data set showed that the scheme provides good performance.
B. Prasanalakshmi and A. Kannammal [111] in 2009 proposed a novel technique to
generate an irrevocable cryptographic key from the biometric template. The biometric
trait considered in their proposal was the palm vein. The proposed technique uses the
minutiae features extracted from the generated pattern, including bifurcation points and
ending points. Since other cryptographic keys are prone to theft or guessing, keys
generated from a biometric entity are preferable, as biometric keys are bound to the user.
Minutiae patterns generated from the palm vein are converted to cancellable templates,
which in turn can be used for irrevocable key generation.
H. A. Garcia-Baleon et al. [112] in 2009 proposed an approach for cryptographic key
generation based on keystroke dynamics and the k-medoids algorithm. The approach
comprises a training-enrollment stage and a user verification stage, and verifies the
identity of individuals off-line without using a centralized database. From the simulation
results, a false acceptance rate (FAR) of 5.26% and a false rejection rate (FRR) of 10%
were obtained. The cryptographic key obtained from the approach may be applied in
diverse encryption algorithms.


4.7 SUMMARY
This chapter covered the history of biometrics: how it has evolved, how it has become a
major and challenging research topic, and how it continues to be so even today. Further,
we have given the time-line of biometrics. Finally, we have discussed the relevant
research publications in the area of cancellable biometrics. In the next chapter, we cover
cancellable biometrics in detail, along with the challenges in key generation and the
related algorithms. The next chapter provides the base and motivation for our proposed
work.


CHAPTER 5
THEORETICAL BACKGROUND
"Theoretical principles must sometimes give way for the sake of practical advantages."
William Pitt
5.1 INTRODUCTION
The theoretical background of our proposed cancellable biometric key generation system
is discussed in this chapter. To make biometric systems more robust, the concept of
cancellable biometrics has been proposed. In order to safeguard privacy and to prevent
disclosure of any information saved in databases for personal identification or
verification, a cancellable biometrics template should be non-invertible. Here, we
provide the background information related to cancellable biometric systems and
bio-cryptographic techniques. In addition, the main concepts we have used in our proposed
systems are discussed concisely in the subsequent sections. This includes the concepts
used for pre-processing of the input fingerprint image, Region of Interest (ROI)
selection and extraction methods, together with the minutiae extraction algorithms and
the encryption/decryption techniques.

5.2 CANCELABLE BIOMETRIC SYSTEMS


Biometric characteristics are immutable: once a biometric is compromised, it cannot
simply be cancelled and reissued like a password, which leaves affected users without
recourse. The increased usage of biometric systems in daily life has put the biometric
security and privacy of users at risk, as a biometric template of the same user may be
stored and shared in several databases. Eight likely and susceptible security weaknesses,
through which a common biometric system can be attacked with ease, are summarized
in [113, 114]. Ratha et al. [53] proposed the idea of cancellable biometrics to safeguard
privacy in biometric authentication systems and overcome these kinds of issues. A
transformed version of the biometric data is stored by cancellable biometrics. This is
accomplished either in the signal domain or in the feature domain by means of deliberate
and repeatable distortions

(or transformations) of the biometrics. Since the transformation is one-way, disclosure of
information about the actual biometric data from knowledge of a transformed biometric
is prevented. Consequently, the cancellable biometrics concept [53] is established as a
method for creating multiple protected biometric templates using deliberate and
repeatable transformations of biometric signals. A cancellable transform should be
non-invertible, cancellable, and capable of creating a huge number of unique protected
templates, and should not cause major deterioration of recognition precision [54].
Further, data pertaining to the same user cannot be inferred using other cancellable
templates [97]. Cancellable biometric distortions are theoretically non-invertible;
however, distortions in realistic use can be invertible. Biometric authentication systems,
particularly those working in unattended and/or networked settings, require
cancellable biometric templates [53, 20]. Recently, cancellable biometrics have been
formed by extending Bio-Hashing [117] variants [98, 115, 116] comprising feature-domain
random transformation and discretization. Just like passwords, cancellation and reissue
are possible in cancellable biometrics, which is nothing but a tailored variety of the
original biometrics. The four objectives [85] that must be commonly satisfied by any
cancellable biometric method have already been discussed in Chapter 3, Section 3.3.
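A minimal sketch of the Bio-Hashing style of feature-domain random transformation and discretization mentioned above, assuming a real-valued feature vector as input. Unlike the published BioHash, the random directions here are not orthonormalized, so this is purely illustrative:

```python
import random

def biohash(features, token_seed, bits=16):
    """Toy BioHash-style template: project the feature vector onto
    token-seeded pseudo-random directions and threshold at zero,
    yielding a revocable binary code. Illustrative only; the real
    scheme orthonormalizes the projection vectors."""
    rng = random.Random(token_seed)
    code = []
    for _ in range(bits):
        # one pseudo-random projection direction per output bit
        direction = [rng.uniform(-1.0, 1.0) for _ in features]
        dot = sum(f * d for f, d in zip(features, direction))
        code.append(1 if dot >= 0 else 0)
    return code
```

Revoking a template amounts to issuing a new token seed: the same biometric features then produce an unrelated binary code, while the original code reveals only sign information about key-dependent projections.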

5.3 BIO-CRYPTO KEY GENERATION


Biometrics is concerned with the recognition of individuals based on biological and
behavioural characteristics. Both biometrics and cryptography individually play a crucial
role in providing security to user information [121]. Compared to knowledge- and
possession-based methods, biometric-based authentication offers numerous benefits, and
hence there has been rapid growth in the use of biometrics for user authentication
applications in recent years. The uniqueness of each human's biometric reflects the
variation and diversity of the biometric trait across the worldwide population.
Cryptography and biometrics are combined into biometric cryptosystems [34] to exploit
the strengths of both fields. In such systems, superior and modifiable security levels
are provided by cryptography, while the necessity of remembering passwords or carrying
tokens is eliminated by the non-repudiation introduced by biometrics [120]. Since the

biological and behavioural features of a user cannot be disclosed by a different,
unauthorized user, user privacy and security are ensured by binding the cryptographic
key with the biometric template [122].
Keys are extremely important in cryptography, because the trustworthiness of the
algorithm is undermined if keys are lost. Secure storage for keys and moderately long
keys are considered essential by all cryptographic algorithms. A biometric system itself
is susceptible to numerous threats, although access to a secret key can be restricted to
legitimate users using biometric authentication [118]. Biometric cryptosystems,
utilizing the biometric template of a user stored in the database, create a cryptographic
key in such a manner that successful biometric authentication is essential for the
disclosure of the key. Uludag et al. [34] used the coupling level of cryptography and
biometrics to distinguish between two common approaches within the so-called
crypto-biometric systems [119]. Utilization of biometric authentication to release an
already stored cryptographic key is termed biometrics-based key release; although
biometric authentication adds convenience, since it is used as a wrapper to conventional
cryptography where the user would otherwise be responsible for remembering his/her
key, the two methods are only loosely coupled. Extracting or generating a cryptographic
key from a biometric template or structure is termed biometrics-based key generation. In
this case, tight coupling exists between biometrics and cryptography: the secret key is
tied to the biometric information, and the biometric template is not stored in plain form.
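The loose versus tight coupling distinction can be illustrated with a small stdlib sketch. Note this is an assumption-laden simplification: real biometric samples are noisy, so the exact-match gating and direct key derivation below stand in for the error-tolerant matchers and fuzzy-extractor constructions used in practice:

```python
import hashlib

# --- Key release (loose coupling): a stored key is handed out only
# after the presented template matches the enrolled one.
def enroll_release(template: bytes, key: bytes) -> dict:
    return {"tmpl_hash": hashlib.sha256(template).digest(), "key": key}

def release(record: dict, presented: bytes):
    if hashlib.sha256(presented).digest() == record["tmpl_hash"]:
        return record["key"]      # the key was stored, merely gated
    return None

# --- Key generation (tight coupling): the key is derived from the
# template itself, so neither the key nor the plain template is stored.
def generate_key(template: bytes, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", template, salt, 100_000)
```

In the first approach the key exists independently of the biometric; in the second, losing the biometric sample means the key cannot be reproduced at all, which is the tight coupling the text describes.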

5.4 CONCEPTS UTILIZED IN THE PROPOSED SYSTEM


We have proposed two major research works for biometric crypto key generation using
fingerprint cancellable templates. In these, the input fingerprint image is pre-processed
with the aid of some standard filtering techniques and morphological operations
before the fingerprint cancellable templates are generated. In addition, security is
enhanced after the generation of the cancellable templates by encrypting the template
using the Advanced Encryption Standard (AES) algorithm. A brief discussion of the
concepts utilized in our proposed system is provided in this section.

5.4.1 Histogram Equalization


Histogram equalization is a technique frequently used in image processing to
improve image contrast and brightness and to optimize the dynamic range of the
greyscale. With a simple procedure, it automatically corrects images which are too
bright, too dark or lacking in contrast. The grey-level values are adjusted within a
certain margin and the image's entropy is maximized. The objective of the non-linear
histogram equalization process is to highlight image brightness in a manner especially
appropriate for human visual analysis. Histogram equalization aims to convert a picture
into one with a flatter histogram, in which the probabilities of all levels are equal. To
develop the operator, the histograms are first inspected; a histogram plots the number of
points per level against the level, for a range of levels. The number of points per level in
the input (old) and the output (new) image are represented as O(l) and N(l) (for
0 \le l \le M), respectively. For square images there are N^2 points in both the input and
the output image, so the sum of points per level in each image should be equal [133]:

\sum_{l=0}^{M} O(l) = \sum_{l=0}^{M} N(l)                                (1)

Also, as the objective is to obtain an output picture with a uniformly flat histogram, this
should hold for an arbitrarily selected level p. So, covering up to the level q in the
new histogram necessitates transforming up to level p in the cumulative histogram:

\sum_{l=0}^{p} O(l) = \sum_{l=0}^{q} N(l)                                (2)

Since the output histogram is uniformly flat, the cumulative histogram up to level p
should be a fraction of the overall sum. Dividing the number of points by the range of
levels in the output image gives the number of points per level in the output picture:

N(l) = \frac{N^2}{N_{max} - N_{min}}                                     (3)

Therefore, the cumulative histogram of the output picture is

\sum_{l=0}^{q} N(l) = q \times \frac{N^2}{N_{max} - N_{min}}             (4)

Equating this to the cumulative histogram of the input image as per Equation (2),

q \times \frac{N^2}{N_{max} - N_{min}} = \sum_{l=0}^{p} O(l)             (5)

we get a mapping for the output pixel level q as

q = \frac{N_{max} - N_{min}}{N^2} \times \sum_{l=0}^{p} O(l)             (6)

An output image having a roughly flat histogram is provided by the mapping function
obtained by phrasing Equation (6) as an equalizing function E of the level q and the
image O:

E(q, O) = \frac{N_{max} - N_{min}}{N^2} \times \sum_{l=0}^{q} O(l)       (7)

Hence, the output image is

N_{x,y} = E(O_{x,y}, O)                                                  (8)
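The mapping of Equations (6)-(8) can be sketched in code. This version assumes N_min = 0 and N_max = n_levels - 1, and uses the total pixel count in place of N^2 so that it also works for non-square images:

```python
def equalize(image, n_levels=256):
    """Histogram equalization: map each input level through the
    scaled cumulative histogram of the input image (Eqs. 6-8)."""
    flat = [p for row in image for p in row]
    n_pixels = len(flat)
    # histogram O(l) of the input image
    hist = [0] * n_levels
    for p in flat:
        hist[p] += 1
    # cumulative histogram scaled to the output range: the
    # equalizing function E(q, O)
    cum, mapping = 0, [0] * n_levels
    for l in range(n_levels):
        cum += hist[l]
        mapping[l] = round((n_levels - 1) * cum / n_pixels)
    # apply E to every pixel: N(x,y) = E(O(x,y), O)
    return [[mapping[p] for p in row] for row in image]
```

A compressed input histogram is stretched across the full output range, flattening the cumulative distribution.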

5.4.2 Filters
It is sometimes desirable to have circuits capable of selectively filtering one frequency or
range of frequencies out of a mix of different frequencies in a circuit. A circuit designed
to perform this frequency selection is called a filter circuit, or simply a filter. A common
need for filter circuits is in high-performance stereo systems, where certain ranges of
audio frequencies need to be amplified or suppressed for best sound quality and power
efficiency. Equalizers allow the amplitudes of several frequency ranges to be adjusted to
suit the listener's taste and the acoustic properties of the listening area, while crossover
networks block certain ranges of frequencies from reaching speakers. A tweeter
(high-frequency speaker) is inefficient at reproducing low-frequency signals such as
drum beats, so a crossover circuit is connected between the tweeter and the stereo's
output terminals to block low-frequency signals, passing only high-frequency signals to
the speaker's connection terminals. This gives better audio system efficiency and thus
better performance. Both equalizers and crossover networks are examples of filters,
designed to accomplish filtering of certain frequencies.
Another practical application of filter circuits is in the conditioning of non-sinusoidal
voltage waveforms in power circuits. Some electronic devices are sensitive to the
presence of harmonics in the power supply voltage, and so require power conditioning for
proper operation. If a distorted sine-wave voltage behaves like a series of harmonic
waveforms added to the fundamental frequency, then it should be possible to construct a
filter circuit that only allows the fundamental waveform frequency to pass through,
blocking all (higher-frequency) harmonics. Elementary filter circuits can be analyzed
with a tool such as SPICE, which displays Bode plots (amplitude versus frequency) for
the various kinds of filters; alternatively, they can be analyzed at several frequency
points by repeated series-parallel analysis, at the cost of considerable manual calculation.
REVIEW:
- A filter is an AC circuit that separates some frequencies from others within
  mixed-frequency signals.
- Audio equalizers and crossover networks are two well-known applications of
  filter circuits.
- A Bode plot is a graph plotting waveform amplitude or phase on one axis and
  frequency on the other.

Filters come in many varieties: low-pass filters, high-pass filters, band-pass filters,
band-stop filters, resonant filters, digital filters, FFT filters, smoothing filters, audio
filters, high-frequency noise reduction filters, lagging-phase filters and more. Filtering
is an operation which removes high-frequency fluctuations from a signal.

Low-pass filtering is another term for the same thing, but is restricted to methods which
are linear: i.e., if you want to filter a signal x(t) + y(t), it does not matter whether you
apply the filter before or after adding the two signals. Such linear operations can be
described by a frequency response. All methods described here are linear, with the
exception of curve fitting. The following sub-sections describe two well-known
filters, viz. the Gabor filter and the Wiener filter.
5.4.2.1 Gabor Filter

Gabor filters have been successfully used for feature extraction in many machine vision applications. They have the ability to perform multi-resolution decomposition owing to their localization in both the spatial and the spatial-frequency domains. Texture segmentation requires simultaneous measurements in both of these domains, and filters with smaller bandwidths in the spatial-frequency domain are more desirable because they allow finer distinctions among different textures. A robust tool for texture analysis in the image processing field is the Gabor transform, whose impulse response is a Gaussian function multiplied by a harmonic function. In the spatial domain, a 2D Gabor filter is represented as a Gaussian kernel function modulated by a sinusoidal plane wave. The kernel function of a Gabor filter is represented as:
ψ(X) = (k_v² / σ²) · exp(−k_v²X² / (2σ²)) · {exp(iKX) − exp(−σ²/2)}        (9)
where the real and imaginary parts of the oscillation function exp(iKX) are a cosine function and a sine function respectively, and exp(−k_v²X² / (2σ²)) is a Gauss function. It limits the oscillation function to a local range by restricting its scope. The direct-current component exp(−σ²/2), also called the direct-current compensation, protects the filter from being influenced by the level of direct current. This component makes the filter insensitive to illumination intensity by preventing the influence of the absolute value of the image grey-level [124].
The kernel function of the 2D Gabor filter is a compound function with two parts, a real part and an imaginary part,

G_k(x, y) = G_r(x, y) + i·G_i(x, y)        (10)

The real component is given by,

G_r(x, y) = (k_v² / σ²) · exp(−k_v²(x² + y²) / (2σ²)) · [cos(k_v cos(φ_u)x + k_v sin(φ_u)y) − exp(−σ²/2)]        (11)

And the imaginary part is given by,

G_i(x, y) = (k_v² / σ²) · exp(−k_v²(x² + y²) / (2σ²)) · sin(k_v cos(φ_u)x + k_v sin(φ_u)y)        (12)
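As an illustration, the real component of Eq. (11) can be sampled on a discrete grid. This is a sketch only: the parameter values for k_v, φ_u and σ below are illustrative assumptions, not values prescribed in the text.

```python
import numpy as np

def gabor_real(size, k_v, phi_u, sigma):
    """Sample the real part of the 2D Gabor kernel, Eq. (11), on a
    size x size grid centred at the origin (illustrative sketch)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    envelope = (k_v**2 / sigma**2) * np.exp(-k_v**2 * (x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(k_v * np.cos(phi_u) * x + k_v * np.sin(phi_u) * y)
    dc = np.exp(-sigma**2 / 2)      # direct-current compensation term
    return envelope * (carrier - dc)

# A 31x31 kernel at frequency k_v = pi/4, orientation phi_u = 0, sigma = 2*pi
kernel = gabor_real(31, np.pi / 4, 0.0, 2 * np.pi)
```

A filter bank for texture analysis would repeat this for several frequencies k_v and orientations φ_u.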

5.4.2.2 Wiener Filter

The Wiener filter can also be constructed in the frequency domain. One method of deriving such a filter uses the overlap-add technique to transform the time-domain Wiener filter into the frequency domain; the result has precisely the same performance as its time-domain counterpart. More commonly, however, the frequency-domain Wiener filter is constructed by estimating the clean speech spectrum directly from the noisy speech spectrum. The filter thus obtained differs from the time-domain Wiener filter in two respects: the frequency-domain Wiener filter is a sub-band technique, in which each sub-band filter is independent of the filters of the other frequency bands and can be non-causal, while the time-domain Wiener filter is a full-band technique and causal [134].
The frequency-domain sub-band Wiener filter can be represented as,

H_o(iω_k) = arg min over H(iω_k) of J_X[H(iω_k)]        (13)

where J_X[H(iω_k)] = E[ |X(n, iω_k) − H(iω_k)·Y(n, iω_k)|² ] is the MSE between the speech spectrum and its estimate at frequency ω_k. The Wiener filter can be straightforwardly realized by equating the derivative of J_X[H(iω_k)] with respect to H(iω_k) to zero:


H_o(iω_k) = E[|X(n, iω_k)|²] / E[|Y(n, iω_k)|²] = P_x(ω_k) / P_y(ω_k)        (14)

Here, the power spectral densities (PSDs) of x(n) and y(n) are represented by P_x(ω_k) = (1/L)·E[|X(n, iω_k)|²] and P_y(ω_k) = (1/L)·E[|Y(n, iω_k)|²] respectively. The nonnegative, real-valued nature of the frequency-domain Wiener filter H_o(iω_k) is clearly evident from this expression. Hence, only the amplitude of the noisy speech spectrum is adjusted, without altering the phase components. The symbol i can be omitted from the expression H_o(iω_k), as it is a real-valued function [135].
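Equation (14) can be sketched numerically by estimating both PSDs from framed signals. The frame sizes and noise level below are illustrative assumptions, and a real enhancer would estimate the clean-speech PSD from the noisy signal alone rather than from clean reference frames:

```python
import numpy as np

def wiener_gain(noisy_frames, clean_frames):
    """Per-bin Wiener gains in the sense of Eq. (14): the ratio of the
    clean-speech PSD to the noisy-speech PSD, each PSD estimated by
    averaging |FFT|^2 over frames (illustrative sketch)."""
    X = np.fft.rfft(clean_frames, axis=1)
    Y = np.fft.rfft(noisy_frames, axis=1)
    Px = np.mean(np.abs(X) ** 2, axis=0)
    Py = np.mean(np.abs(Y) ** 2, axis=0)
    return Px / np.maximum(Py, 1e-12)   # real-valued, nonnegative gains

rng = np.random.default_rng(0)
clean = rng.standard_normal((20, 256))           # 20 frames of 256 samples
noisy = clean + 0.5 * rng.standard_normal((20, 256))
H = wiener_gain(noisy, clean)                    # one gain per frequency bin
```

Because the gains are real and nonnegative, applying H to the noisy spectrum scales magnitudes only, leaving the phase untouched, exactly as the text observes.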

5.4.3 Adaptive Threshold

Thresholding is a non-linear operation that converts a greyscale image into a binary image, where the two levels are assigned to pixels that are below or above the specified threshold value.
threshold value. During the thresholding process, individual pixels in an image are
marked as object pixels if their value is greater than some threshold value (assuming an
object to be brighter than the background) and as background pixels otherwise. This
convention is known as threshold above. Variants include threshold below, which is
opposite of threshold above; threshold inside, where a pixel is labelled "object" if its
value is between two thresholds; and threshold outside, which is the opposite of threshold
inside. Thresholding is called adaptive thresholding when a different threshold is used for
different regions in the image. This may also be known as local or dynamic thresholding.

The background grey-level and the contrast between the objects and the background
frequently differ inside the single image because of irregular illumination and other
reasons. Since a threshold that performs properly in one region of the image might
perform defectively in other regions, achieving acceptable results by means of global
thresholding is unlikely in such cases. This variation can be avoided using an adjusting or
alterable threshold that is a gradually changing function of location in the image.
Adaptive thresholding can be performed by analyzing the grey-level histograms of n×n-pixel non-overlapping blocks obtained by dividing an N×N image (n<N), and then interpolating the threshold values calculated from the blocks to construct a thresholding surface for the whole image. Reliable estimation of the histogram and setting of a threshold necessitate blocks of an appropriate size, so that each block contains an adequate number of background pixels [132].
A two-pass operation can also be used to implement adaptive thresholding [131]. Prior to the first pass, a threshold is calculated from the histogram of each block by selecting, for example, the value positioned midway between the background and object peaks. Blocks with unimodal histograms can be discarded. In the first
pass, a grey-level threshold that is fixed within each block but varies for different blocks
is used to define object boundaries. The interior mean grey-level of each of the objects
thus defined is computed, though the objects are not extracted from the image. In the
second pass, each object is designated with its own threshold that is situated in the middle
of its internal grey-level and the background grey-level of its major block [123].
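A minimal block-wise sketch of the idea follows: one threshold per non-overlapping block, each block binarized with its own threshold. The mid-range threshold used here is a crude stand-in for a histogram-based choice, and the block size is an illustrative assumption (a full implementation would interpolate the block thresholds into a thresholding surface, as described above):

```python
import numpy as np

def adaptive_threshold(img, block=32):
    """Block-wise adaptive thresholding sketch: each non-overlapping block
    is binarized with its own threshold (here the midpoint of the block's
    grey-level range). Illustrative only."""
    out = np.zeros_like(img, dtype=np.uint8)
    h, w = img.shape
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = img[r:r + block, c:c + block]
            t = (int(tile.min()) + int(tile.max())) / 2.0
            out[r:r + block, c:c + block] = (tile > t).astype(np.uint8)
    return out

img = np.tile(np.arange(64, dtype=np.uint8), (64, 1))  # horizontal grey ramp
binary = adaptive_threshold(img, block=32)
```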
5.4.4 Morphological Operations

The principal objective of the morphological operations is the elimination of obstacles and noise from the image. Moreover, the morphological operators remove unnecessary spurs, bridges and line breaks. Subsequently, the thickness of the lines is reduced by the thinning process so that only the specific regions of the image are represented. Morphological operations thus affect the form, structure or shape of an object, and they are applicable only to binary images (two-colour images: black and white). They are used in pre- or post-processing (filtering, thinning and pruning) and for obtaining a representation or description of the shape of objects/regions (boundaries, skeletons, convex hulls, etc.). Dilation and erosion are the two main morphological operations [125]. Dilation virtually fills small holes and joins disjoint objects, as it permits objects to expand. Erosion contracts the size of objects by etching away (eroding) their boundaries. These operations can be tailored to an application through an appropriate choice of the structuring element, which precisely describes the manner in which the objects will be dilated or eroded [126].
Notations:
Black pixel: its value will be 0 for an 8-bits/pixel indexed greyscale image.
White pixel: its value will be 255 for an 8-bits/pixel indexed greyscale image.
The dilation: The dilation process is carried out by placing the structuring element B on the image A and moving it over the image, as is done in convolution, but the operation performed is different and is best explained as a succession of steps:
- If the image pixel under the origin of the structuring element is white, no change is made; skip to the next pixel.
- If the origin of the structuring element falls on a black pixel of the image, every image pixel covered by the structuring element is made black.

The erosion: The process of erosion is the same as dilation except that pixels are changed to white instead of black. The structuring element is moved over the image and the following steps are performed:
- If the image pixel under the origin of the structuring element is white, no change is made; skip to the next pixel.
- If a black pixel occurs at the origin of the structuring element and at least one black pixel of the structuring element lies over a white pixel of the image, the black image pixel at the origin is made white.
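The dilation procedure described above can be sketched directly; here 1 marks black (object) pixels, 0 marks white (background), and the 3×3 square structuring element is a hypothetical choice for illustration:

```python
import numpy as np

def dilate(img, se):
    """Binary dilation following the steps above (1 = black/object pixel,
    0 = white/background). `se` is a binary structuring element whose
    origin is its centre. Illustrative sketch."""
    h, w = img.shape
    sh, sw = se.shape
    oy, ox = sh // 2, sw // 2
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            if img[y, x] == 1:                      # origin on a black pixel
                for dy in range(sh):
                    for dx in range(sw):
                        yy, xx = y + dy - oy, x + dx - ox
                        if se[dy, dx] and 0 <= yy < h and 0 <= xx < w:
                            out[yy, xx] = 1         # paint covered pixels black
    return out

img = np.zeros((5, 5), dtype=int)
img[2, 2] = 1                                       # a single object pixel
se = np.ones((3, 3), dtype=int)                     # 3x3 square element
grown = dilate(img, se)                             # becomes a 3x3 black square
```

Erosion would follow the same scanning pattern with the deletion rule given above instead of the painting rule.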

5.4.5 Minutiae Extraction

This stage utilizes binarization and thinning methods, detecting the ridge information adjacent to each minutia in order to compute the minutiae scores.
5.4.5.1 Binarization

Image binarization is an important process for image analysis. The inherently bi-level nature of the image has led to many image analysis algorithms being designed for use on bi-level images. If the binarization is done improperly, the subsequent steps cannot proceed appropriately, so the binarization process is a necessity. The majority of the available methods require converting the greyscale fingerprint image into a binary image. Certain binarization processes benefit immensely from a priori enhancement, while certain enhancement algorithms directly generate binary output; hence, the difference between enhancement and binarization is often not significant. Normally, a skeleton image is then obtained using a thinning stage that reduces the ridge-line thickness to one pixel, and pixels that correspond to minutiae are identified by a simple image scan. The general problem of image binarization is extensively investigated in the image processing and pattern recognition fields [127]. The easiest approach sets pixels whose grey-level is lower than a global threshold t to 0 and all others to 1. This can be written as

b(x, y) = 0 if g(x, y) < t, and b(x, y) = 1 otherwise.
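A minimal sketch of this global binarization rule:

```python
import numpy as np

def binarize(img, t):
    """Global binarization: pixels with grey-level below threshold t
    become 0; all other pixels become 1."""
    return (img >= t).astype(np.uint8)

img = np.array([[10, 200], [90, 130]], dtype=np.uint8)
b = binarize(img, t=128)   # -> [[0, 1], [0, 1]]
```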

5.4.5.2 Ridge Thinning Algorithm

The ridge thinning process removes redundant pixels until the ridges become one pixel wide. Thinning is considered one of the most significant pre-processing steps in image analysis and understanding. Several thinning methods created for binary images exhibit reasonably good results [128]. Though thinning has a substantial number of specific applications of its own, such as thinning objects of non-uniform brightness or thinned-edge identification, grey-level thinning has not so far been seriously investigated as a generalization of bi-level thinning. A precise standard for evaluating thinning algorithms has not yet been developed in the available literature. Properties such as topology, shape, connectivity and sensitivity to boundary noise are commonly considered essential for a good skeleton. In other words, a good thinning algorithm must possess the following characteristics [129]:
1) The resulting skeleton and the object must be topologically same.
2) It must run close to the medial axes of the object regions.
3) It must have a thickness of either one pixel or the least thickness.
4) It should maintain both foreground and background connectivity.
5) It should be noise insensitive to tiny protrusions and indentations in the borders.
6) It should restrain pervasive erosion and should not perform total deletion.
7) It should not necessitate iteration greater than the least possible amount.
8) It should prevent bias in some directions by symmetrical deletion of pixels.

5.4.6 AES Encryption

Like other algorithms, the Advanced Encryption Standard (AES) may be utilized in diverse ways to perform encryption, and different methods are appropriate for different settings. Though AES is secure, the result may become insecure unless the appropriate method is employed in the appropriate manner for each circumstance. While it is extremely simple to build a system that uses AES as its encryption algorithm, considerably greater skill and expertise are needed to do so properly for a given situation. AES is a symmetric encryption algorithm that processes data in blocks of 128 bits. Unlike a decimal digit, which can take 10 possible values, a binary digit (bit) can take only two possible values: zero or one. Encryption converts a 128-bit block into a new block of the same size under the control of a key. The reverse transformation, decryption, uses the same key that was used for encryption; hence AES is symmetric. The key is the only thing that must be kept secret for security. AES can be configured to use keys of different lengths, and the names of the three commonly used configurations, AES-128, AES-192 and AES-256, signify the length in bits of the key they use. The strength of the algorithm, in terms of the time needed for an attacker to carry out a brute-force attack (i.e., to find the right key by performing a complete search of all possible key combinations), doubles with each extra bit added to the key [130].
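This brute-force scaling is easy to illustrate numerically: each added key bit doubles the key space an attacker must search.

```python
# Size of the brute-force search space for the common AES key lengths.
for bits in (128, 192, 256):
    print(f"AES-{bits}: {2 ** bits} possible keys")

# One extra key bit doubles the work of an exhaustive key search:
assert 2 ** 129 == 2 * 2 ** 128
```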

5.5 EVALUATION SCHEMES FOR BIOMETRIC SYSTEMS

There are mainly three evaluation schemes for biometric systems [120]:
1. Technology evaluation
2. Scenario evaluation
3. Operational evaluation
Technology evaluations compare competing algorithms within a single technology by testing them all on a standardized database collected with a universal sensor. This approach tests on novel data and is done offline. Since the database is fixed, technology test results are repeatable [61].

Two common technology evaluations are the Fingerprint Verification Competition (FVC) and the Fingerprint Vendor Technology Evaluation (FpVTE).
The aim of the Fingerprint Verification Competition is to track recent advances in fingerprint verification, in both academia and industry, and to benchmark the state of the art in fingerprint technology. The competition should not be viewed as an official performance certification of biometric systems, since the databases used in the contest were not necessarily acquired in a real-world application environment and were not collected according to a formal protocol. Only parts of each system's software are evaluated, using images from sensors not native to the system [136].
The Fingerprint Vendor Technology Evaluation is an independently administered technology evaluation of fingerprint matching, identification and verification systems [137].
Scenario evaluations determine the performance of a complete biometric system in an
environment that models a real-world target application. These test results can only be
repeatable if the modelled scenario is controlled [29].
In operational evaluations biometric system performance is determined by testing in a
specific environment and with a specific population. These tests offer limited
repeatability because of many unknown variables in the operational environment [61].

5.6 SUMMARY

In this chapter, we have briefly presented all the well-known existing concepts that we
have utilized in our proposed systems. The major concepts discussed include fingerprint
image pre-processing and the minutiae extraction methods. In addition to this, we have
also briefly discussed the encryption algorithm. This brief introduction is presented to
facilitate easy understanding of the working of our proposed systems.


CHAPTER 6
MOTIVATION FOR THE RESEARCH
"Research is to see what everybody else has seen, and to think what nobody else has thought." (Albert Szent-Gyorgyi)
6.1 INTRODUCTION
This chapter presents the motivation for the research and its significance in the biometric field, especially in cancellable biometrics. First, the necessity of cancellable biometrics and its benefits are discussed in detail. Then, two significant recent contributions to developing cancellable templates of fingerprint images are presented, followed by the detailed steps involved in generating a cancellable template. Finally, the chapter closes with a concise summary.

6.2 MOTIVATING ALGORITHMS


Information security and ensuring the privacy of personal identities are emerging concerns in present-day society. Conventional authentication schemes usually depend on some secret knowledge from the user, or utilize tokens, to check his or her identity. These conventional methods are very popular but have a number of drawbacks. Token-based and knowledge-based authentication schemes cannot distinguish between an authorized user and an impostor who has gained access to the tokens or passwords. In knowledge-based authentication systems, managing different passwords (i.e., identities) presents usability issues. To overcome the limitations of traditional authentication schemes, biometrics-based authentication schemes using fingerprints, face recognition, etc. were introduced.
However, the use of biometrics itself also raises several privacy concerns. A user is invariably associated with his or her biometrics. Consequently, when a biometric identifier is compromised, every application using that biometric may collapse. Likewise, if the same biometric is used in numerous applications, a user can easily be tracked from one application to another by cross-matching biometric databases. Also, in the implementation of biometrics-based authentication systems [113, 138, 141], securing the biometric template itself is a serious task. In general, biometric templates are stored insecurely in a central database. When an encrypted template is stored, matching has to be performed on decrypted templates, and the decryption method itself can be compromised. If a biometric template is compromised, it leads to serious security and privacy threats because, unlike passwords, a legitimate user cannot withdraw his biometric identifiers and switch to another set of uncompromised identifiers. Moreover, if the database of a password-protected system is compromised, a new set of passwords can be issued. Biometric systems, by contrast, are probabilistic: validation is based on match scores ranging from 0% to 100%.
To prevent such attacks, plain-text passwords are hashed, and only the hash values are stored in the database and sent over networks. The hash value of a text password changes entirely when even a single character of the password changes; this is the avalanche effect, a desirable property of hash algorithms. Hashing remains practical for passwords in the authentication paradigm, since access is granted only when the entire password matches exactly [140]. When hashing is combined with biometrics, i.e., when hashed biometric data is considered, a totally different hash value is generated by any slight change in the acquired biometric (a very likely scenario). BioHash is a type of biometric hashing that combines a set of user-specific random vectors with biometric features. In verification mode, BioHash offers very low error rates, comparable to the underlying biometric method, when a genuine token is used [142].
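The avalanche effect is easy to demonstrate with a standard hash function; SHA-256 from Python's hashlib is used here purely for illustration (the text does not prescribe a particular hash):

```python
import hashlib

# Avalanche effect: changing a single character of the input changes the
# hash value almost completely.
h1 = hashlib.sha256(b"password1").hexdigest()
h2 = hashlib.sha256(b"password2").hexdigest()

differing = sum(a != b for a, b in zip(h1, h2))
print(h1)
print(h2)
print(f"{differing} of 64 hex digits differ")
```

This is exactly why naive hashing fails for biometric data: two acquisitions of the same finger never match bit-for-bit, so their hashes share nothing.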
As mentioned earlier, biometric data can be compromised. The theory of cancellable biometrics was introduced to solve the issues mentioned above [53]. Cancellable biometrics means that biometric templates can be cancelled and replaced, through the insertion of an additional, independent authentication factor. The cancellable biometric system keeps a non-invertible transformed form of the biometric data, which keeps the biometric data secure even when the storage is compromised. Different transformed templates can be issued for the same biometric data, and keeping differently transformed user data in different databases leads to a higher level of privacy.
Here, we briefly describe biometrics-based applications, especially fingerprint systems. Biometrics-based authentication applications include workstation and network access, data protection, remote access to resources, transaction security, Web security and more. One of the most viable existing biometric technologies is fingerprint recognition. The inability to normalize fingerprint data is the major difficulty in constructing hash functions for fingerprint minutiae: if the fingerprint data is not normalized, the hash values turn out to be orientation- and position-dependent. The way to avoid this problem is to have both the hash functions and the matching algorithm deal with the transformations of the fingerprint data. It is infeasible to apply hash functions to the minutiae set of the entire fingerprint, because even a minor difference between the minutia sets of two prints of the same finger produces substantial alterations in the hash values, and higher-order hash values are likely to vary greatly with even a minor variation in the positions of the minutia points. Two additional factors govern the security given by a non-invertible transform: the system module where the transformation is applied (e.g., fingerprint scanner, client, server, or third-party certifier) and the location where the fingerprint template is presented (e.g., client, server, third-party certifier, or smartcard). The construction of the hash function guarantees non-invertibility, and so, in principle, this method is very attractive [148].
Next, we briefly discuss security systems based on merging cryptography and biometric techniques, viz., crypto-biometric systems [99, 139, 104, 101], which have been extensively developed for solving the key management problem of cryptographic systems and for securing the stored templates of biometric systems. This research work has been motivated by a significant number of previous researches in the literature on cancellable biometrics and cryptographic key generation. In particular, two important contributions [54, 143] available in the literature have motivated us to continue research on efficient key generation using cancellable fingerprint templates. Radha et al. [54], in order to address the biometric authentication problem, have proposed several techniques based on both cryptographic and biometric techniques. In particular, they have explained the advantages of cancellable biometrics over other approaches, and they present a case study of applying the technique to a fingerprint database. The relative merits of several methods, such as Cartesian, polar and functional transformations, were also studied and compared empirically.
On the other hand, S. Tulyakov et al. [143] have proposed a method that secures fingerprint templates using innovative symmetric hash functions. Since fingerprint minutiae form an unordered feature set, these symmetric functions can be utilized for any biometric modality with unordered features. Their work [54, 143] shows that a highly performing, secure authentication system can be implemented with performance comparable to straight matching systems. In the following sub-sections, we briefly discuss the two important works [54, 143] proposed by Radha et al. and S. Tulyakov et al.

6.2.1 Ratha et al.'s Work on Cancellable Template Generation for Fingerprints


Ratha et al. [53] put forward the model of non-invertible transforms for template protection. They [54, 84] extend this theoretical work by providing three specific non-invertible transforms. Using one-way transformations in the feature domain, they describe several methods for crafting cancellable templates for fingerprint biometrics: instead of storing the original minutiae features, an irreversible transformation of the minutiae locations and orientations is stored. At first, they locate singular points (cores and deltas) in the fingerprint images. The minutiae points are then transformed with respect to the core point using one of three transforms: (i) Cartesian, (ii) polar, and (iii) functional surface folding. These transforms were chosen so that the resulting minutiae (in the transformed space) still form a two-dimensional arrangement of points. Thus, a protected template can be matched against a protected feature set using existing fingerprint comparison algorithms.
In the Cartesian and polar transforms, a small change in the relative locations of two minutiae in the original space can be converted into a much larger change in the transformed space, which increases false rejects and is a limitation. Therefore, the authors endorse the use of a transform that is locally smooth. However, the transform should not be globally smooth, or it would be easy to invert and hence not cryptographically secure. They therefore propose a functional surface folding transform that is locally smooth but not globally smooth. Their proposed function maps many positions in the original space to the same position in the transformed space; these are known as folds [53]. Abstractly, standard hash functions achieve non-invertibility through this same property: it creates ambiguity when reversing the transform, which yields the desired non-invertibility. To achieve non-invertibility, they perform two steps: registration and transformation.
Registration Prior to Transformation: Registering the image is the most vital step in the use of a cancellable transform. The minutiae locations have to be measured with respect to the same coordinate system so that the transform becomes repeatable. This is achieved by estimating the location and orientation of the singular points (core and delta) and representing the minutiae positions and angles with reference to these points. Though there have been various approaches to finding the core and delta [146, 147], their precise estimation is a difficult task. Once global registration has been established using a singular-point position, the minutiae feature points can be transformed reliably across many instances. Though the basic notion of cancellable biometrics is to permanently transform the minutiae feature locations and orientations, the transform itself can be achieved in various ways.
Cartesian Transformation: In the Cartesian transformation, minutiae positions are measured in rectangular coordinates with respect to the position of the singular point, with the x-axis aligned to the orientation of the singular point. The space is divided into cells of equal size, i.e., the coordinate system is separated into fixed-size cells as shown in figure 6.1. The transformation need not be a strict permutation, because the condition of irreversibility requires more than one cell to be mapped to the same cell. In this case, a mapping matrix governs the cell mapping, and the positions of the cells after transformation can be written simply in terms of this matrix.
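The cell mapping can be sketched as follows. The grid size, cell size and random mapping here are illustrative assumptions (in the scheme above, the mapping would be derived from a user-specific key), with collisions permitted for non-invertibility:

```python
import numpy as np

rng = np.random.default_rng(seed=42)       # the seed stands in for the key
GRID, CELL = 8, 32                         # 8x8 grid of 32x32-pixel cells
# Each source cell is sent to a random target cell; several source cells
# may land in the same target cell (collisions), which prevents inversion.
mapping = rng.integers(0, GRID * GRID, size=GRID * GRID)

def transform(x, y):
    """Move a minutia into the cell chosen by the mapping, keeping its
    offset within the cell (illustrative sketch)."""
    cell = (y // CELL) * GRID + (x // CELL)
    target = mapping[cell]
    return (target % GRID) * CELL + x % CELL, (target // GRID) * CELL + y % CELL

tx, ty = transform(40, 100)   # the minutia at (40, 100) moves with its cell
```

Cancelling a template amounts to issuing a new key, which produces a fresh mapping and hence a completely different transformed template.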


Polar Transformation: In this method, minutiae positions are measured in polar coordinates with respect to the core position, and angles are measured relative to the orientation of the core. The feature space is divided into sectors, i.e., the coordinate space is separated into polar sectors (levels and angles) that are numbered in sequence, as shown in figure 6.2.

Figure 6.1: Cartesian transformation which maps each cell to some random
cell with collisions.

Figure 6.2: Polar transformation where each sector is mapped into some other
random sector after transformation.



Unlike the Cartesian transformation, an angular shift becomes a positional shift whose size grows with distance from the core. Thus, unconstrained mapping is not practicable in polar coordinates: minutiae pairs that lie within a tolerance distance of each other before transformation may fail to match after transformation, because of the large divergence that occurs away from the core. Therefore, in the polar transformation, the mapping is governed by a translation key that defines the sector transformation, and the locations of the sectors before and after transformation are related through this key.
6.2.2 S. Tulyakov et al.'s Work on Symmetric Hash Functions for Secure Fingerprint Biometric Systems

Owing to the difficulty of normalizing fingerprint data, it is quite hard to produce hash functions for fingerprint minutiae. If the fingerprint data is not normalized, the hash functions turn out to be orientation- and position-dependent. This difficulty can be managed by having both the hash functions and the matching algorithm deal with the transformations of the fingerprint data. The authors [143] accomplished matching only on localized sets of minutiae in order to overcome these difficulties. Figure 6.3 describes the biometric matching performed on the hashed feature sets proposed in [143].
According to the authors [143], fingerprint matching can be performed in two important steps: (i) forming localized minutiae sets, and (ii) hashing the localized minutiae sets.
Localized minutiae sets: Global matching of two fingerprints is taken as a grouping of localized matchings with analogous transformation parameters (a rotation and a translation). Each localized set is determined by a specific minutia and a few of its neighbours, as in the base fingerprint matcher [145]. In order to evade global alignment, S. Tulyakov et al. [143] used concepts similar to those of Germain et al. [144] and Jea et al. [145] to combine the outcomes of localized matching into the fingerprint recognition algorithm. Localized matching matches minutia triplets using attributes such as the angles and distances between minutia points. For every minutia (an attribute vector of length 3) and its two closest neighbours, a secondary feature vector is produced, which depends on the Euclidean distances and the orientation differences between the central minutia and its nearest neighbours. For localized matching, they record only restricted information about the matched neighbours, and hence minutiae positions cannot be reinstated from the transformed data.

Figure 6.3: S. Tulyakov et al.'s secure fingerprint biometric system using symmetric hash functions
Hashing localized minutiae sets: The localized minutia sets obtained from the previous
step are hashed using the symmetric hash functions. The overall hash data contains a set
of hashes {h(T_1), h(T_2), ..., h(T_N)}, where N is the total number of localized minutiae
sets. A minor alteration in the input, such as lost information, noise or a modification in
the order of the input, can cause a substantial variation in the hash value. Some classes
of hash functions can be generated such that they are invariant to the order in which the
input pattern is offered to the hash function; these are known as order-independent or
symmetric hash functions. Consider an input sequence c_1, c_2, ..., c_n and the following
two hash functions [143]: an order-dependent hash computed over the sequence as given,
and the symmetric power-sum hash

h_m(c_1, c_2, ..., c_n) = c_1^m + c_2^m + ... + c_n^m

If the order of the input is altered to, say, c_n, ..., c_2, c_1, the first function produces a
different hash value while the second is left unaffected. S. Tulyakov et al. represent
minutiae points as complex numbers. Their assumption is that two fingerprints from one
finger, coming from different scanners and different positionings of the finger on the
scanner, will have different position, rotation and scale. A complex function f(c) = rc + t
can be used to represent the transformation of one fingerprint to another. Here c = x + iy
represents the minutiae point located at co-ordinates (x, y), and r and t are used to
characterize the scalar rotation and translation parameters of the accidental shift of points
between the registration and authentication scans. In this approach, the authors build
hash functions and the equivalent matching algorithm such that the accidental shifting is
taken into account.
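The order-invariance at the heart of this scheme can be made concrete with a short sketch. The power-sum hash below follows the construction of [143]; the minutiae values and the contrasting order-dependent hash are invented for illustration:

```python
# Sketch of a symmetric (order-independent) hash for a localized minutiae set.
# Minutiae are complex numbers c = x + iy; the power-sum hash h_m is unchanged
# under any permutation of the input, unlike an order-dependent hash.

def symmetric_hash(minutiae, m):
    """Power-sum hash h_m(c1..cn) = c1^m + ... + cn^m (order independent)."""
    return sum(c ** m for c in minutiae)

def ordered_hash(minutiae):
    """An order-dependent hash for contrast: weights each point by its position."""
    return sum((i + 1) * c for i, c in enumerate(minutiae))

# Hypothetical localized set: a central minutia and its two nearest neighbours.
triplet = [complex(10, 20), complex(14, 25), complex(9, 27)]
shuffled = [triplet[2], triplet[0], triplet[1]]

print(symmetric_hash(triplet, 2) == symmetric_hash(shuffled, 2))  # True
print(ordered_hash(triplet) == ordered_hash(shuffled))            # False
```

Because only the power sums are stored, reordering (and, with the transformation f(c) = rc + t factored in, repositioning) of the localized set does not change the stored hash.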

6.3 SUMMARY
This chapter discusses earlier research work in the biometric field, especially in
cancellable biometrics. Initially, we have presented the advantages of the cancellable
biometrics and their importance. Later, we explained the two significant contributions
made by researchers towards developing cancellable templates from fingerprint images.
This chapter provides the motivation to pursue research work in this field.



CHAPTER 7
SUGGESTED NEW APPROACHES
TO CANCELLABLE BIOMETRIC BASED SECURITY
"The new strategic environment requires new approaches to deterrence and defence."
Peter Flory

7.1 INTRODUCTION
As discussed in previous chapters, Biometric security systems have a number of
problems because of the fact that the biometric data of a person is generally stored in the
system itself. Cancellable Biometrics is one of the solutions for solving these problems.
There are several methods that have been proposed in the literature related to Cancellable
Biometrics, but we consider the two most significant methods, proposed by Ratha et al.
and S. Tulyakov et al. These approaches have already been discussed elaborately in
chapter 6, and that discussion provides the motivation for us to continue the work in this
field. In this chapter, we present our research work, consisting of two new algorithms, viz.,
a New-Fangled Approach for Cancellable Biometric Key Generation and the Development
of a Bio-Crypto Key from Fingerprints Using Cancellable Templates, for secure
fingerprint biometric systems.

7.2 NEW APPROACHES PROPOSED FOR CANCELABLE FINGERPRINTS


In the following sections, we propose efficient approaches for cryptographic key
generation from fingerprint biometrics using cancellable templates. We put forth a new
methodology for the secure storage of fingerprint template by generating Secured Feature
Matrix and keys for cryptographic techniques applied for data Encryption or Decryption
with the aid of cancellable biometric features. The proposed techniques produce a
cancellable key from the fingerprint so as to surmount the problems faced while using
conventional methods, as discussed in earlier chapters. The flexibility and dependability
of cryptography are enhanced with the utilization of cancellable biometric features.
There are several biometric systems in existence that deal with cryptography, but the
proposed cancellable biometric system introduces a novel method to generate the
cryptographic key. We also discuss the security analysis of the proposed cancellable
biometric system. In the following sections, we discuss our proposed algorithms one by
one.

7.3 FIRST PROPOSED METHOD: NEW-FANGLED APPROACH FOR
CANCELABLE BIOMETRIC KEY GENERATION FOR FINGERPRINTS


To overcome the problems with the existing biometric systems mentioned in the previous
chapters, we suggest a method of generating a secured feature matrix from the fingerprint
template, strengthened by the Advanced Encryption Standard (AES) Encryption/Decryption
algorithm. Besides that, this section also discusses key generation methods using
fingerprint images. The process of key generation consists of the following stages:
1. Extracting minutiae points from Fingerprint
2. Secured Feature Matrix generation
3. Key generation from Secured Feature Matrix

7.3.1 Extracting Minutiae Points from Fingerprint


For extracting minutiae points from fingerprint, a three-level approach is broadly used by
researchers. These levels are:
1) Pre-processing
2) Region of Interest (ROI) Selection
3) Minutia extraction
For the fingerprint image pre-processing, Histogram Equalization [149] and Gabor Filters
[150] are used to do image enhancement. Binarization is applied on the fingerprint image.
Locally adaptive threshold method [151] is used for this process. Then morphological
operations [151, 152] are used to extract the Region of Interest (ROI). In a morphological
operation, the value of each pixel in the output image is based on a comparison of the
equivalent pixel in the input image with its neighbours. By selecting the size and shape of
the neighbourhood, we can construct a morphological operation that is sensitive to
specific shapes in the input image.
7.3.1.1 Pre-processing
i) Histogram equalization: This method usually increases the local contrast of many
images, especially when the usable data of the image is represented by close contrast
values. Through this adjustment, the intensities are better distributed over the histogram.
Moreover, histogram equalization increases the perceptual information of the image by
allowing the pixel values to spread over the full distribution of the image.

Figure 7.1: (a) Original fingerprint image (b) Histogram equalized image
The original histogram of a fingerprint image is of bimodal type; after equalization, the
histogram spans the full range of values from 0 to 255 and the visualization effect is
improved. Figure 7.1 depicts the original fingerprint image and its corresponding
histogram equalized image.
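As an illustration of the mechanics (a sketch of the standard cumulative-histogram mapping, not the thesis implementation; the tiny image is invented), the following equalizes a low-contrast image:

```python
# Sketch of classical histogram equalization for an 8-bit greyscale image:
# each intensity is mapped through the normalised cumulative histogram so
# that pixel values spread over the full 0..255 range.

def equalize(image, levels=256):
    """image: 2-D list of ints in [0, levels-1]; returns the equalized image."""
    flat = [p for row in image for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # cumulative distribution function
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # standard equalization lookup table
    lut = [round(max(c - cdf_min, 0) / max(n - cdf_min, 1) * (levels - 1))
           for c in cdf]
    return [[lut[p] for p in row] for row in image]

# A tiny low-contrast image: all values cluster in 100..103.
img = [[100, 100, 101, 101],
       [102, 102, 103, 103],
       [100, 101, 102, 103],
       [103, 102, 101, 100]]
out = equalize(img)
print(min(min(r) for r in out), max(max(r) for r in out))  # 0 255
```

After equalization the clustered intensities 100..103 are stretched across the whole 0..255 range, which is exactly the contrast improvement described above.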
ii) Gabor filtering: The Gabor filter is applied to the fingerprint image obtained in the
previous step by spatially convolving the image with the filter.
A two-dimensional Gabor [150] filter consists of a sinusoidal plane wave of a specific
orientation and frequency, modulated by a Gaussian envelope. Gabor filters are employed
as they have frequency-selective and orientation-selective properties. These properties
permit the filter to be tuned to give a maximal response to ridges at a specific orientation
and frequency in the fingerprint image. So, a properly tuned Gabor filter can be used to
effectively retain the ridge structures while reducing noise. The even-symmetric Gabor
filter is the real part of the Gabor function, which is yielded by a cosine wave modulated
by a Gaussian.
A Gaussian function multiplied by a harmonic function defines the impulse response of
the linear filter, the Gabor filter. Because of the multiplication-convolution property
(Convolution theorem), the Fourier transform of a Gabor filter's impulse response is the
convolution of the Fourier transform of the harmonic function and the Fourier transform
of the Gaussian function. In the spatial domain, the even-symmetric Gabor filter is given
by:

g(x, y; λ, θ, ψ, σ, γ) = exp( −(x′² + γ²y′²) / (2σ²) ) · cos( 2π x′/λ + ψ )

where

x′ = x cos θ + y sin θ  and  y′ = −x sin θ + y cos θ

In this equation, λ represents the wavelength of the cosine factor, θ represents the
orientation of the normal to the parallel stripes of the Gabor function, ψ is the phase
offset, σ is the standard deviation of the Gaussian envelope, and γ is the spatial aspect
ratio, which specifies the ellipticity of the support of the Gabor function.
7.3.1.2 Region of Interest (ROI) Selection
i) Binarization: Nearly all minutiae extraction algorithms function on binary images
where there are only two levels of interest: the black pixels that denote ridges, and the
white pixels that denote valleys. Binarization is the process that translates a grey level
image into a binary image. This improves or enhances the contrast between the ridges
and valleys in a fingerprint image, and consequently makes it possible for effectual
extraction of minutiae points.
One practical property of the Gabor filter is that it has a DC component of zero, which
means the resultant filtered image has a mean pixel value of zero. Hence, straightforward
binarization of the image can be achieved using a global threshold of zero. The
binarization process involves analyzing the grey-level value of each pixel in the enhanced
image, and, if the value is greater than the global threshold, then the pixel value is set to a
binary value one; otherwise, it is set to zero. The result is a binary image holding two
levels of information, the foreground ridges and the background valleys.

Figure 7.2: After binarization
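A minimal sketch of the two steps above, with illustrative parameter values only (the kernel size, wavelength, orientation, sigma and gamma are not taken from the thesis): it builds an even-symmetric Gabor kernel from the equation in section 7.3.1.1, removes the residual DC component so the mean response is zero, and binarizes at the global zero threshold.

```python
# Sketch: even-symmetric Gabor kernel + zero-threshold binarization.
import math

def gabor_kernel(ksize, lam, theta, psi, sigma, gamma):
    """g(x,y) = exp(-(x'^2 + (gamma*y')^2)/(2*sigma^2)) * cos(2*pi*x'/lam + psi)."""
    half = ksize // 2
    kern = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            row.append(math.exp(-(xp ** 2 + (gamma * yp) ** 2) / (2 * sigma ** 2))
                       * math.cos(2 * math.pi * xp / lam + psi))
        kern.append(row)
    mean = sum(map(sum, kern)) / ksize ** 2
    return [[v - mean for v in row] for row in kern]   # remove residual DC

def binarize(img):
    """Global zero threshold: positive (ridge) response -> 1, valley -> 0."""
    return [[1 if v > 0 else 0 for v in row] for row in img]

kern = gabor_kernel(15, 8.0, math.pi / 4, 0.0, 4.0, 0.5)
bw = binarize(kern)   # treating the kernel itself as a tiny "filtered image"
print(len(kern), len(kern[0]), sum(map(sum, bw)))
```

In a real pipeline the kernel would be convolved with the fingerprint image and the binarization applied to the filtered result; the DC removal is what justifies the global zero threshold.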

ii) Adaptive Thresholding: The adaptive thresholding method [157] is based on the
analysis of statistical parameters, including the arithmetic mean, geometric mean and
standard deviation of the sub-band coefficients. The local adaptive thresholding scheme
has been the most commonly used by researchers, owing to the fact that it binarizes and
improves poor quality images for locating meaningful textural information [158].
iii) ROI extraction by morphological operations: For ROI extraction from the binary
fingerprint image, we apply the morphological opening and closing operations on the
greyscale or binary image, using a structuring element; a single structuring element
object is used for both the opening and the closing. As a result, the morphological
operators throw away the leftmost, rightmost, uppermost and bottommost blocks outside
the boundary, so as to obtain the tightly bounded region containing just the boundary
and the inner area.
7.3.1.3 Minutiae Extraction
The last image enhancement step normally performed is thinning. Thinning is a
morphological operation that successively erodes away the foreground pixels until they
are one pixel wide. Ridge thinning eliminates the redundant pixels of ridges till the
ridges are just one pixel wide. In our approach, minutiae points are extracted using the
Ridge Thinning algorithm of [153]. The image is divided into two distinct subfields in a
checkerboard pattern. In the first sub-iteration, pixel p is deleted from the first subfield
if and only if the conditions G1, G2 and G3 (defined below) are all satisfied. In the
second sub-iteration, pixel p is deleted from the second subfield if and only if the
conditions G1, G2 and G3' are all satisfied.

Condition G1:

X_H(p) = 1

where

X_H(p) = Σ_{i=1}^{4} b_i,  with  b_i = 1 if x_{2i−1} = 0 and (x_{2i} = 1 or x_{2i+1} = 1);
b_i = 0 otherwise.

Here x_1, x_2, ..., x_8 are the values of the eight neighbours of p, starting with the east
neighbour and numbered in counter-clockwise order (with x_9 = x_1).

Condition G2:

2 ≤ min{ n_1(p), n_2(p) } ≤ 3

where

n_1(p) = Σ_{k=1}^{4} ( x_{2k−1} ∨ x_{2k} )
n_2(p) = Σ_{k=1}^{4} ( x_{2k} ∨ x_{2k+1} )

Condition G3:

( x_2 ∨ x_3 ∨ ¬x_8 ) ∧ x_1 = 0

Condition G3':

( x_6 ∨ x_7 ∨ ¬x_4 ) ∧ x_5 = 0
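The two sub-iterations can be sketched as follows. The checkerboard subfield assignment via (r + c) mod 2 and the tiny test image are illustrative assumptions, not the thesis implementation:

```python
# Sketch of one iteration of the two-subfield thinning scheme (conditions
# G1, G2 and G3 / G3').  The small binary "image" below is invented.

def neighbours(img, r, c):
    """x1..x8: the eight neighbours of p, starting east, counter-clockwise."""
    return [img[r][c + 1], img[r - 1][c + 1], img[r - 1][c], img[r - 1][c - 1],
            img[r][c - 1], img[r + 1][c - 1], img[r + 1][c], img[r + 1][c + 1]]

def g1(x):
    # X_H(p) = sum of b_i; b_i = 1 iff x_{2i-1} = 0 and (x_{2i} = 1 or x_{2i+1} = 1)
    b = sum(1 for i in range(1, 5)
            if x[2 * i - 2] == 0 and (x[2 * i - 1] == 1 or x[(2 * i) % 8] == 1))
    return b == 1

def g2(x):
    n1 = sum(x[2 * k - 2] | x[2 * k - 1] for k in range(1, 5))
    n2 = sum(x[2 * k - 1] | x[(2 * k) % 8] for k in range(1, 5))
    return 2 <= min(n1, n2) <= 3

def g3(x):   # first sub-iteration:  (x2 v x3 v not-x8) and x1 == 0
    return ((x[1] | x[2] | (1 - x[7])) & x[0]) == 0

def g3p(x):  # second sub-iteration: (x6 v x7 v not-x4) and x5 == 0
    return ((x[5] | x[6] | (1 - x[3])) & x[4]) == 0

def thin_pass(img, first):
    """Delete qualifying foreground pixels of one checkerboard subfield."""
    out = [row[:] for row in img]
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            if img[r][c] == 1 and (r + c) % 2 == (0 if first else 1):
                x = neighbours(img, r, c)
                if g1(x) and g2(x) and (g3(x) if first else g3p(x)):
                    out[r][c] = 0
    return out

ridge = [[0, 0, 0, 0, 0, 0, 0],
         [0, 1, 1, 1, 1, 1, 0],
         [0, 1, 1, 1, 1, 1, 0],
         [0, 0, 0, 0, 0, 0, 0]]
thinned = thin_pass(thin_pass(ridge, True), False)
print(sum(map(sum, ridge)), sum(map(sum, thinned)))
```

Iterating the two sub-passes until no pixel changes reduces each ridge to a one-pixel-wide skeleton while the two-subfield split keeps the deletions from disconnecting it.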

7.3.2 Secured Feature Matrix Generation

The steps involved in the generation of the Secured Feature Matrix are discussed in this
sub-section. We assume that the extracted minutiae point co-ordinates are maintained in
a vector.

M_p → Minutiae point set
N_p → Size of the minutiae point set
V_k → Key vector
L_k → Length of the AES key
p → (x, y) co-ordinate of a minutiae point

The extracted minutiae points are represented as

P_m = { p_i },  i = 1, ..., N_p

Initially, P_m is transformed to a key vector as follows:

V_k = { x_i : p_{x_i} },  i = 1, ..., L_k

where p_{x_i} ∈ P_m, p_{x_i} = P_m[ i mod N_p ], i = 1, ..., L_k.

Then the initial key vector V_k is converted into a matrix BK_m of size
sqrt(L_k) × sqrt(L_k). Further, the resultant matrix BK_m is encrypted with the AES
algorithm to form the Secured Feature Matrix SF_m:

SF_m = E_{V_k}( BK_m )

The key used in the AES encryption is the key generated by the whole process: once the
key has been generated, AES encryption is applied to produce the secured feature
matrix; no encryption takes place during the initial key generation itself.


The generated secured feature matrix is irreversible; moreover, it cannot be recovered by
an attacker, because of the strength of AES and the mathematical operations involved in
its generation.

7.3.3 Key Generation from Secured Feature Matrix (SF_m)

The key is generated as follows. First of all, the Secured Feature Matrix is decrypted by
AES Decryption to form the deciphered matrix. The resultant matrix BK_m is given as

BK_m = D_{V_k}( SF_m )

where BK_m = [ a_{ij} ], of size sqrt(L_k) × sqrt(L_k).

Then an intermediate key vector is generated as follows:

I_v = { K_i : p_k },  i = 1, ..., L_k

where p_k ∈ SM_{ij}, SM_{ij} = BK_m(i, j) : i < size, j < size, 1 ≤ i < sqrt(L_k);
SM_{ij} is an extracted matrix formed from the key matrix. Then the final key vector is
formed as

FBK_v[i] = 1, if I_v[i] > mean(I_v); 0, otherwise

The extracted final key vector is secure and non-reversible; that is, the final key cannot
be traced back to the template. This irreversibility makes the key very hard to break,
because it is derived through the minutiae points and the secured feature matrix.
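A toy end-to-end sketch of sections 7.3.2 and 7.3.3, under loudly simplified assumptions: the minutiae co-ordinates are invented, L_k is a small perfect square rather than a real AES key length, and the AES encrypt/decrypt round-trip is stood in for by an identity placeholder (a real system would call an actual AES library):

```python
# Toy sketch of the feature-matrix / key pipeline (all values invented).
import math

minutiae = [(10, 20), (14, 25), (9, 27), (33, 5), (18, 40)]   # M_p, N_p = 5
L_k = 16                                   # toy key length (a perfect square)

# V_k: sample the minutiae cyclically (index i mod N_p) and fold each
# (x, y) pair into one scalar so the vector has L_k entries
V_k = [sum(minutiae[i % len(minutiae)]) for i in range(L_k)]

# BK_m: reshape V_k into a sqrt(L_k) x sqrt(L_k) matrix
side = math.isqrt(L_k)
BK_m = [V_k[r * side:(r + 1) * side] for r in range(side)]

def aes_round_trip(matrix):
    """Placeholder for SF_m = E(BK_m) followed by BK_m = D(SF_m)."""
    return matrix

# I_v and FBK_v: flatten the recovered matrix and threshold at its mean
recovered = aes_round_trip(BK_m)
I_v = [v for row in recovered for v in row]
mean = sum(I_v) / len(I_v)
FBK_v = [1 if v > mean else 0 for v in I_v]
print(FBK_v)
```

The mean-thresholding in the last step is what makes the bit string non-invertible: many different feature matrices map to the same final key, so the key reveals neither the matrix nor the minutiae.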

7.4 SECOND PROPOSED METHOD: DEVELOPMENT OF BIO-CRYPTO KEY
FROM FINGERPRINTS USING CANCELABLE TEMPLATES

The proposed approach is also composed of three phases, namely: 1) extraction of
minutiae points from the fingerprint image, 2) generation of cancellable biometric
templates with added security, and 3) cryptographic key generation from the secured
cancellable template.
The resultant cryptographic key thus generated is irrevocable and unique to a particular
cancellable template, making the generation of new cancellable templates and
cryptographic keys feasible. The experimental results portray the effectiveness of the
cancellable template and the cryptographic key generated. The various steps and
techniques used in the proposed approach are detailed in this section.

7.4.1 Extraction of Minutiae Points from Fingerprint

In this proposed approach also, the process of extracting the minutiae points from the
fingerprint is composed of three processing steps namely,
1) Pre-processing
2) Region of Interest (ROI) selection
3) Minutiae extraction
Histogram Equalization [154] and Wiener Filters [155] have been made use of to achieve
image enhancement in fingerprint images. Subsequently, the locally adaptive threshold
method [151] is applied to perform binarization on the fingerprint image. Morphological
operations [151, 152] are then utilized to extract the Region of Interest (ROI) from the
fingerprint image. Eventually, minutiae points are extracted using the Ridge Thinning
algorithm [153].
7.4.1.1 Pre-processing

The first level of pre-processing is the same as discussed in section 7.3.1.1. However, in
this method, we have used a Wiener Filter instead of a Gabor Filter.
i) Histogram equalization: It is the same as discussed in section 7.3.1.1.
ii) Wiener Filtering: The Wiener filter can be defined as a Mean Squared Error (MSE)-
optimal stationary linear filter for images degraded by additive noise and blurring. In
order to perform Wiener filtering [156], we assume that the signal and noise processes
are second-order stationary (in the random process sense). Generally, Wiener filters are
applied in the frequency domain. When the stationary nature of the concerned signals is
presumed, the average squared distance between the filter output and a desired signal is
minimized by computing the coefficients of a Wiener filter [155]. This can be
accomplished with ease in the frequency domain. As a consequence, we get the
frequency domain equation:

S(f) = W(f) Y(f)

where S(f) is the Wiener filter output, Y(f) is the Wiener filter input, W(f) is the Wiener
filter coefficient given by W(f) = P_DY(f) / P_YY(f), and P_YY(f), P_DY(f) are the
power spectrum of Y(f) and the cross power spectrum of Y(f) and D(f) (the desired
signal), respectively.
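The frequency-domain relation above can be illustrated with a small one-dimensional sketch. For simplicity it assumes the noise is white and uncorrelated with the signal (so P_DY = P_DD and P_YY = P_DD + P_NN) and, purely for illustration, estimates the signal spectrum from the known clean signal; all numbers are invented:

```python
# Sketch of frequency-domain Wiener filtering: S(f) = W(f) Y(f),
# with W(f) = P_DY(f) / P_YY(f) reduced to the classic smoother for
# additive white noise uncorrelated with the signal.
import cmath, math, random

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * math.pi * f * k / n) for k in range(n))
            for f in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[f] * cmath.exp(2j * math.pi * f * k / n) for f in range(n)).real / n
            for k in range(n)]

random.seed(0)
n = 64
d = [math.sin(2 * math.pi * 5 * k / n) for k in range(n)]   # desired signal D
y = [dk + random.gauss(0, 0.5) for dk in d]                 # observed Y = D + noise

D, Y = dft(d), dft(y)
P_dd = [abs(Df) ** 2 for Df in D]        # signal power spectrum
P_nn = n * 0.25                          # white-noise power (sigma^2 = 0.25)
W = [p / (p + P_nn) for p in P_dd]       # W(f) = P_DY / P_YY under the assumptions
S = [Wf * Yf for Wf, Yf in zip(W, Y)]    # S(f) = W(f) Y(f)
s = idft(S)                              # back to the spatial/time domain

err_before = sum((yk - dk) ** 2 for yk, dk in zip(y, d)) / n
err_after = sum((sk - dk) ** 2 for sk, dk in zip(s, d)) / n
print(err_before > err_after)            # the filter reduces the MSE
```

A real image pipeline works the same way in two dimensions, with the spectra estimated statistically rather than from the clean signal.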


7.4.1.2. Region of Interest (ROI) Selection

The second level of ROI Selection is already discussed in section 7.3.1.2.


7.4.1.3 Minutiae Extraction

The last level of minutiae extraction is similar to the one discussed in section 7.3.1.3.

7.4.2 Secured Cancellable Template Generation

In this sub-section, we have presented the steps involved in the generation of the secured
cancellable template from the extracted minutiae points. The steps involved are as
follows:
The extracted minutiae points form the set P and their corresponding (x, y) co-ordinates
form the set M_p, as represented below:

P = [ P_1 P_2 P_3 ... P_n ]
M_p = [ x_1 y_1 x_2 y_2 ... x_n y_n ]

Subsequently, a set R_N is created with random values, of size |M_p|:

R_N = [ r_1 r_2 r_3 ... r_n ];  where n = |M_p|, r_i = random(), 1 ≤ i ≤ n

Then, exponential values are computed for each individual element in the vector R_N
and stored in ER_N:

ER_N = [ e^{r_1}, e^{r_2}, ..., e^{r_n} ]

For every element in ER_N, a set of x subsequent (very large) prime numbers is chosen
to form a row of the matrix P_N. Every row of the matrix P_N will have a distinct
number of elements; the number of elements x in a row is equal to the corresponding
co-ordinate value in M_p:

P_N = [ ( P_1 P_2 P_3 ... P_{x_1} );
        ( P_1 P_2 P_3 ... P_{y_1} );
        ...
        ( P_1 P_2 P_3 ... P_{x_n} );
        ( P_1 P_2 P_3 ... P_{y_n} ) ]
Subsequently, a prime number pair is selected randomly from two succeeding rows of
P_N, with one prime number taken from each row, and the pair is multiplied to obtain
the transformed point TP. The transformed points are stored in a vector PFV:

PFV = [ TP_1 TP_2 TP_3 ... TP_{n/2} ];  where TP_i = P_l · P_m

Since each transformed point TP is formed by the multiplication of two large prime
numbers P_l and P_m, it is almost computationally infeasible to determine the factors
P_l and P_m from TP, as described in the RSA factoring challenges [159].

The size of the cryptographic key FK_v to be generated is decided beforehand and is set
as a pre-defined key value k_v. From the vector PFV, a transformed point TP is chosen
randomly and its distance with respect to all other transformed points is computed and
stored in a vector DV. The above process is repeated until |DV| = k_v. The distance
between any two transformed points is computed using the following equation:

Distance( TP_i, TP_j ) = sqrt( ( TP_i − TP_j )² )

DV = [ d_1 d_2 d_3 ... d_{k_v} ]


The vector DV is then transformed into a matrix to form the cancellable template TM:

TM = [ DV_{ij} ]_{k_v × k_v}

Henceforth, the cancellable template serves as the source for the generation of the
cryptographic key. This necessitates the secure storage of the cancellable template, such
that it is either un-modifiable or un-accessible to people other than authorized users.
Hence, the resultant cancellable template TM is encrypted with the AES algorithm to
form the encrypted cancellable template CTM, i.e.,

CTM = Enc[ TM ]

The generated cancellable template TM is irreversible; also, the security of the
cancellable template created is increased by the strength of AES.
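A toy sketch of the template construction, with deliberately small numbers: small primes stand in for the "very large" primes of the text, the minutiae are invented, the repetition until |DV| = k_v is collapsed to a single chosen point, and the AES step is omitted:

```python
# Toy sketch of section 7.4.2: primes per coordinate, prime-pair products,
# distance vector, square cancellable template.
import math
import random

def next_primes(start, count):
    """The first `count` primes greater than `start` (naive trial division)."""
    primes, n = [], max(start, 2)
    while len(primes) < count:
        n += 1
        if all(n % k for k in range(2, math.isqrt(n) + 1)):
            primes.append(n)
    return primes

random.seed(1)
M_p = [7, 5, 6, 4]                          # flattened x1 y1 x2 y2 (invented)
ER_N = [math.exp(random.random()) for _ in M_p]
# Row i of P_N: M_p[i] primes following a seed derived from e^{r_i}
P_N = [next_primes(int(100 * e), coord) for e, coord in zip(ER_N, M_p)]

# Pick one prime from each of two succeeding rows and multiply them:
# the product is hard to factor back into (P_l, P_m)
PFV = [random.choice(P_N[i]) * random.choice(P_N[i + 1])
       for i in range(0, len(P_N) - 1, 2)]

# Distance vector from one chosen point to the rest, then a square template
DV = [abs(PFV[0] - tp) for tp in PFV[1:]]
k_v = math.isqrt(len(DV))
TM = [DV[r * k_v:(r + 1) * k_v] for r in range(k_v)]
print(PFV, TM)
```

Changing the random values R_N (or the prime-selection seed) yields a completely new template from the same minutiae, which is exactly the cancellability property claimed above.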

7.4.3 Cryptographic Key Generation from Secured Cancellable Template

The steps involved in the generation of the cryptographic key from the secured
cancellable template are as follows. Initially, the encrypted cancellable template is
decrypted with the AES Decryption algorithm to obtain the cancellable template TM,
i.e.,

TM = Dec( CTM )

An intermediate key vector I_k is then generated from TM by employing a matrix
operation (computing the determinants of 4×4 sub-matrices). Subsequently, a threshold
is determined by computing the mean value of I_k, where

I_k = ( v_i : P(v) ),  i = 1, ..., n

and P(v) = | T_{ij} |_{4×4};  i, j : i < size, j < size; 1 < i, j < n.

Based on the values in I_k and the threshold, the individual values of the final key
vector FK_v are computed. The vector FK_v is created using the following equation:

FK_v(i) = 1, if I_k(i) > mean(I_k); 0, otherwise

The final key FK_v generated is also irrevocable and complex, consisting of 256 bits.
The irreversible property makes the key almost unbreakable, because it is very intricate
to compute the cancellable template from the final cryptographic key FK_v.
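The determinant-and-threshold step can be sketched as follows; the toy template values are invented and the AES round-trip is omitted:

```python
# Sketch of section 7.4.3: 4x4 block determinants of a toy template,
# then binarization against the mean.

def det(m):
    """Determinant by Laplace expansion along the first row (fine for 4x4)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] *
               det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

# Toy 8x8 cancellable template TM (values invented)
TM = [[(3 * r + 7 * c + r * c) % 11 for c in range(8)] for r in range(8)]

# I_k: determinant of each non-overlapping 4x4 block of TM
I_k = [det([row[c:c + 4] for row in TM[r:r + 4]])
       for r in range(0, 8, 4) for c in range(0, 8, 4)]

# FK_v: binarize I_k against its mean, as in the equation above
mean = sum(I_k) / len(I_k)
FK_v = [1 if v > mean else 0 for v in I_k]
print(I_k, FK_v)
```

As with the first method, the determinant computation followed by thresholding is many-to-one, so the 256-bit key of the real system cannot be inverted back to the template.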

Experimental results and security analysis of the above two proposed methods will be
discussed in the next chapter (Chapter No. 8).

7.5 SUMMARY

We have suggested two new algorithms, viz., a New-Fangled Approach for Cancellable
Biometric Key Generation and the Development of a Bio-Crypto Key from Fingerprints
Using Cancellable Templates, for secure fingerprint biometric systems.

Our first proposed method, Biometrics-based Key Generation, will be shown to perform
better than traditional systems in the usability domain. Precisely, it is not possible for a
person to lose his or her biometrics, and the biometric signal is intricate to falsify or
steal. The proposed cancellable biometric crypto system is an all-new technique for
authentication that yields the synergistic control of biometrics. The proposed system
employs intentional distortion of the fingerprint in a repeatable fashion, and the
fingerprint thus obtained is utilized in the cryptographic key generation. When the old
transformed fingerprint template is stolen, it is possible to obtain a new fingerprint
template just by altering the parameters of the distortion process. Consequently,
enhanced privacy results for the user, as his true fingerprint is never used. A notable
enhancement, in terms of a decrease in the time consumed, is attained with the
elimination of some redundant steps in the proposed methodology. Integration of the
proposed technique with the existing cryptographic methodologies is uncomplicated,
and it simplifies key-generation and key-release issues in a remarkable manner. This
methodology can be made further efficient and sophisticated in combination with any
of the evolving cryptographic systems.

Our second proposed method, viz., Biometrics-based Key Generation using cancellable
templates, also outperforms traditional cryptographic systems, mainly because it is
impossible for a person to lose his or her biometrics, and the biometrics are intricate to
falsify or steal. We have also presented an efficient approach for the generation of
irrevocable cryptographic keys from fingerprint biometrics using cancellable biometric
templates.

Each of the two methods is composed of three phases, namely: 1) minutiae points
extraction from the fingerprint image, 2) cancellable template generation with added
security, and 3) cryptographic key generation from the secured cancellable template.
However, in the first of these methods Gabor filters are used, whereas in the second
method Wiener filters are used. The resultant cryptographic key thus generated is
irrevocable and unique to a specific cancellable template, providing better protection
and replacement features for lost or stolen biometrics. The experimental results, to be
discussed in later chapters, portray the effectiveness of the proposed methods in
generating an irrevocable cryptographic key.


CHAPTER 8
EXPERIMENTAL RESULTS AND ANALYSIS
"It is quite possible to work without results, but never will there be results without
work." Sunil V. K. Gaddam
8.1 INTRODUCTION
This chapter describes the experimental results and the performance analysis of the
proposed methods vis-à-vis the previous methods. This chapter is significant in the sense
that it concludes the research in terms of effectiveness and advantages of proposed
methods over the previous ones. The results and discussion are carried out using the well
known fingerprint databases to clearly evaluate the performance of the methods. In
addition to this, the well-accepted evaluation metrics are employed here to directly
compare with the previous ones.

8.2 TECHNOLOGY EVALUATION SCHEME OF BIOMETRIC SYSTEMS


Evaluation schemes of biometric systems have already been explained in Chapter 5,
section 5.5. In order to prove the effectiveness, we initially set up the experimental
environment and collect the various databases that are most appropriate for conducting
an experimental study. When conducting such a study, there should be both real and
synthetic fingerprint databases, so that the effectiveness can be evaluated in a complete
manner according to the experts' view. For this reason, we have collected the synthetic
as well as real fingerprint databases obtained from the FVC 2002 open competition.
After collecting the fingerprint databases, a suitable evaluation metric is essential for
comparing the effectiveness of the various methods. Here, we have employed the
metrics used in the FVC 2002 competition so as to compare the performance of the
approaches effectively. These metrics are used to compute the effectiveness of each
method used for comparison by extensively performing the experimentation on the
various databases employed. For easy understanding, we have plotted graphs obtained
from the experimentation with the various databases, and the performances are analyzed
with the help of these plotted graphs. For the comparison, we have drawn a ROC graph
of FNMR vs. FMR, the two chief metrics used in this experimentation. From the graphs,
we draw a conclusion that signifies the relative effectiveness of the various methods
used for computing the secure cryptographic key. Furthermore, since the idea of
cancellable biometrics is the main concern of the proposed research, the effectiveness
should additionally be analyzed in terms of non-invertibility. Thus, the non-invertibility
in securing the biometric key is extensively analyzed with the help of the different
transformations carried out within the approaches, and the conclusion is drawn from
this extensive analysis.

8.3 EXPERIMENTAL ENVIRONMENT AND DATASETS


The proposed approaches for secure key generation from the fingerprint template have
been implemented in MATLAB (Matlab 7.10). The experimentation has been carried out
on a 3.20 GHz i5 PC with 8 GB main memory running a 64-bit version of Windows 7.
The experimental analysis of the approaches has been done on four fingerprint
databases, of which three are real fingerprint databases and one is synthetic. The four
databases are obtained from FVC 2002 [162], the Second International Competition for
Fingerprint Verification Algorithms. The main objective of this competition is to trace
the modern progress in fingerprint verification and to provide the state-of-the-art
technologies in fingerprint technology for both academia and industry. This competition
should not be visualized as an official performance certification of biometric systems,
since the databases used in this contest have not necessarily been acquired in a real
environment and according to a formal protocol. In any event, the acquired results give
a practical summary of the state-of-the-art in this field and provide assistance to the
participants for improving their algorithms.

FVC 2002 contains four different databases (DB1, DB2, DB3 and DB4), which were
collected with the following sensors/technologies (given in Table 8.1): DB1: optical
sensor "TouchView II" by Identix, DB2: optical sensor "FX2000" by Biometrika, DB3:
capacitive sensor "100 SC" by Precise Biometrics, and DB4: synthetic fingerprint
generation. Each database is 110 fingers wide (w) and 8 impressions per finger deep (d),
so, in total, it contains 880 fingerprints. These fingerprints are partitioned into two sets,
set A and set B. Set A contains fingers 1 to 100, and set B is representative of the whole
database: the 110 collected fingers were ordered by quality, and the 8 images from every
tenth finger were included in set B, numbered from 101 to 110. Here, we make use of
set B from each dataset to conduct the experimental study. The ultimate aim is to
directly compare the performance of the proposed approaches with the previous methods
over these well-accepted fingerprint databases. Samples of fingerprint images taken
from the four fingerprint databases are shown in Figure 8.1.





Figure 8.1: Sample fingerprint images taken from the four fingerprint databases:
(a) DB1 (b) DB2 (c) DB3 (d) DB4


      Sensor Type        Image Size             Set A (w×d)  Set B (w×d)  Resolution
DB1   Optical sensor     388×374 (142 Kpixels)  100×8        10×8         500 dpi
DB2   Optical sensor     296×560 (162 Kpixels)  100×8        10×8         569 dpi
DB3   Capacitive sensor  300×300 (88 Kpixels)   100×8        10×8         500 dpi
DB4   SFinGe v2.51       288×384 (108 Kpixels)  100×8        10×8         500 dpi

Table 8.1: Scanners/technologies employed for the collection of FVC2002 databases.

8.4 EVALUATION METRICS


The evaluation metrics are indispensable for evaluating the effectiveness of the
approaches, and the right choice of evaluation metrics is very important for comparing
their performance. On this basis, we have chosen the evaluation metrics used in the
FVC 2002 competition: the False Match Rate (FMR) and the False Non-Match Rate
(FNMR). Initially, for each database and for each algorithm, we have calculated the
biometric template, and then the biometric keys K_ij, i = 1, 2, ..., 10; j = 1, 2, ..., 8, are
generated from the corresponding templates to compute the FMR and FNMR.


After that, each fingerprint key K_ij generated in the earlier step is matched against the
fingerprint images F_ik (j < k ≤ 8) and the Genuine Matching Score (gms) is obtained
from them. The number of matches (indicated as NGRA, Number of Genuine
Recognition Attempts) is ((8 × 7)/2) × 10 = 280, only if REJ_ENROLL = 0. In general,
three types of rejection may happen for each fingerprint F_ij; these rejections are
summed and their total is stored in REJ_ENROLL: (1) F (Fail): enrolment is not possible
for the algorithm, (2) T (Timeout): enrolment exceeds the maximum allowed time, and
(3) C (Crash): the algorithm crashes during fingerprint matching.

FNMR(t) = ( card{ gms_ijk | gms_ijk < t } + REJ_NGRA ) / NGRA


Again, each fingerprint key K_1i, i = 1, 2, ..., 10, is matched with the first fingerprint
image from different fingers F_1k (i < k ≤ 10) and the corresponding Impostor Matching
Score (ims) is computed. The number of matches (denoted as NIRA, Number of
Impostor Recognition Attempts) is (10 × 9)/2 = 45, only if REJ_ENROLL = 0.

FMR(t) = card{ ims_ik | ims_ik ≥ t } / NIRA

Furthermore, FMR(t) (False Match Rate) and FNMR(t) (False Non-Match Rate) are
calculated from the above distributions for t ranging from 0 to 1. Then, the ROC curve
of FMR vs. FNMR is plotted for varying threshold t. The plotted ROC curve is
extensively used in the contest to compare the performance of different algorithms. One
more parameter used for comparison is the Equal Error Rate (EER), which is computed
as the point where FNMR(t) = FMR(t).
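The metric computations above can be sketched with invented matching scores (real scores would come from the matcher; the REJ terms are taken as zero):

```python
# Sketch of FMR(t), FNMR(t) and the EER over a threshold sweep.
# The genuine (gms) and impostor (ims) scores below are invented.

genuine = [0.90, 0.85, 0.80, 0.75, 0.70, 0.40]
impostor = [0.10, 0.15, 0.20, 0.30, 0.60]

def fnmr(t, rej=0):
    """FNMR(t) = (card{gms < t} + REJ_NGRA) / NGRA."""
    return (sum(1 for s in genuine if s < t) + rej) / len(genuine)

def fmr(t):
    """FMR(t) = card{ims >= t} / NIRA."""
    return sum(1 for s in impostor if s >= t) / len(impostor)

# Sweep the threshold t over [0, 1]; the EER is where FNMR(t) ~ FMR(t)
ts = [i / 100 for i in range(101)]
eer_t = min(ts, key=lambda t: abs(fnmr(t) - fmr(t)))
eer = (fnmr(eer_t) + fmr(eer_t)) / 2
print(eer_t, round(eer, 3))
```

Plotting fmr(t) against fnmr(t) for the swept thresholds gives exactly the ROC curve used for the comparisons in this chapter.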

8.5 EXPERIMENTAL RESULTS


This section depicts the experimental results obtained from the proposed approaches as
well as the motivating approaches. For experimentation, the fingerprint image is given as
an input and the resultant biometric key is generated from it. In this sub-section, the
intermediate results employed for generating the biometric key are clearly mentioned. For
depicting the various results, we execute the proposed approaches as well as the
motivating approaches on different fingerprint images that are obtained from four
different sensors. The obtained results are given in the following sub-sections.

8.5.1 Experimental Results of the Proposed Efficient Cancellable Biometric Key


Generation Scheme (New-Fangled Approach)

The experimental results of the proposed scheme for efficient cancellable biometric key
generation [160] are presented in this sub-section. The proposed approach is
implemented in Matlab (Matlab 7.10). At first, the fingerprint images are pre-processed
using histogram equalization and Gabor filtering, which enhance the fingerprint images
so that the minutiae points can be extracted easily. Subsequently, binarization is applied
on the enhanced fingerprint images and then the region of interest is determined. After
that, the minutiae points are extracted by applying the ridge thinning algorithm. Based
on the co-ordinates of the minutiae points, the secured feature matrix is computed and,
eventually, the 256-bit key is generated from the secured feature matrix. The
intermediate results of the proposed scheme for the sample images from DB1 to DB4
are clearly depicted in Figures 8.2 to 8.5, respectively.

Figure 8.2: Intermediate results of the New-Fangled Approach from DB1


(a) Input Fingerprint Image (b) Histogram equalized image (c) Gabor Filtered Image
(d) Binarized Image (e) Region of Interest (ROI) (f) Fingerprint Image with
minutiae points (g) Generated 256-bit key

Figure 8.3: Intermediate results of New-Fangled Approach from DB2


(a) Input Fingerprint Image (b) Histogram equalized image (c) Gabor Filtered Image (d)
Binarized Image (e) Region of Interest (ROI) (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key


Figure 8.4: Intermediate results of New-Fangled Approach from DB3


(a) Input Fingerprint Image (b) Histogram equalized image (c) Gabor Filtered Image (d)
Binarized Image (e) Region of Interest (ROI) (f) Fingerprint Image with minutiae points (g)
Generated 256-bit key

Figure 8.5: Intermediate results of New-Fangled Approach from DB4


(a) Input Fingerprint Image (b) Histogram equalized image (c) Gabor Filtered Image (d)
Binarized Image (e) Region of Interest (ROI) (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key



8.5.2 Experimental Results of the Proposed Efficient Approach for Cryptographic Key
Generation from Fingerprint (Bio-Crypto Key)

This section presents the experimental results of the proposed efficient approach for
cryptographic key generation from fingerprint [161]. The proposed approach is
programmed in Matlab (Matlab 7.10). Initially, the fingerprint images obtained from the
FVC 2002 dataset are pre-processed using histogram equalization and Wiener filtering to
achieve image enhancement. Then, the Region of Interest is selected from the enhanced
fingerprint images through binarization, adaptive thresholding and morphological
operations. Afterward, the locations of the minutiae points are extracted after applying
the ridge thinning algorithm. Subsequently, the Secured Cancellable Template is
computed based on the co-ordinates of the minutiae points and the 256-bit key is
generated from the Secured Cancellable Template. The intermediate results of the
proposed approach for the sample images from DB1 to DB4 are clearly depicted in
figures 8.6 to 8.9 respectively.
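The adaptive-thresholding step used for ROI selection is not spelled out in the text; a generic local-mean rule can serve as a sketch. The window size and offset below are arbitrary illustrative choices, not the thesis's parameters:

```python
import numpy as np

def adaptive_threshold(img, block=15, c=2):
    """Binarize a greyscale image with a local (adaptive) threshold: a pixel is
    foreground (1) when it is darker than the mean of its block x block
    neighbourhood minus an offset c (fingerprint ridges are dark)."""
    img = img.astype(float)
    pad = block // 2
    padded = np.pad(img, pad, mode='edge')
    # integral image (zero row/column prepended) for O(1) window sums
    ii = np.pad(padded.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            y1, x1 = y + block, x + block
            s = ii[y1, x1] - ii[y, x1] - ii[y1, x] + ii[y, x]
            mean = s / (block * block)
            out[y, x] = 1 if img[y, x] < mean - c else 0
    return out
```

A pixel is marked as foreground only when it is noticeably darker than its own neighbourhood, which keeps uneven illumination from being thresholded as ridge area.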

Figure 8.6: Intermediate results of Bio-Crypto Key Generation Approach from DB1
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae
points (g) Generated 256-bit key



Figure 8.7: Intermediate results of Bio-Crypto Key Generation Approach from DB2
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key

Figure 8.8: Intermediate results of Bio-Crypto Key Generation Approach from DB3
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key



Figure 8.9: Intermediate results of Bio-Crypto Key Generation Approach from DB4
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key
8.5.3 Experimental Results of the Method Introduced by Nalini K. Ratha et al.

This section presents the experimental results of the method proposed by Nalini K. Ratha
et al. [54] to generate cancellable fingerprint templates. This approach has been
implemented using Matlab (Matlab 7.10). Initially, the minutiae points are extracted from
the fingerprint template and a new transformation key is generated from the minutiae
points. The first important step is the process of registering the image. Then, the minutiae
positions and the orientations of the singular points (core and delta) are measured with
regard to the same coordinate system, and the minutiae are expressed by positions and
angles with respect to these points. In the Cartesian transformation, the minutiae
positions are measured in rectangular coordinates with reference to the position of the
singular point. In the polar transformation, the minutiae positions are measured in polar
coordinates with reference to the core position, and the angles are measured with
reference to the orientation of the core. Figures 8.10 to 8.13 present the intermediate
results of this method for the sample images from DB1 to DB4.
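The two transformations can be sketched as follows. The cell size, grid dimensions, sector count and the key-derived mapping here are illustrative assumptions; the actual parameters of [54] are not reproduced in this text:

```python
import math, random

def cartesian_transform(minutiae, cell=32, grid=(8, 8), seed=7):
    """Ratha-style Cartesian transformation (sketch): the plane is split into
    fixed-size cells and the cells are remapped with a key-derived mapping, so
    several source cells may land on one target cell (non-invertible)."""
    rng = random.Random(seed)                 # 'seed' plays the role of the user key
    n_cells = grid[0] * grid[1]
    mapping = [rng.randrange(n_cells) for _ in range(n_cells)]  # many-to-one on purpose
    out = []
    for (x, y, theta) in minutiae:
        cx = min(int(x) // cell, grid[0] - 1)
        cy = min(int(y) // cell, grid[1] - 1)
        t = mapping[cy * grid[0] + cx]
        tx, ty = t % grid[0], t // grid[0]
        out.append((tx * cell + int(x) % cell, ty * cell + int(y) % cell, theta))
    return out

def polar_transform(minutiae, core=(128.0, 128.0), a_sectors=12, shift=5):
    """Polar transformation (sketch): positions are expressed relative to the
    core and each angular sector is shifted by a key-derived amount."""
    out = []
    for (x, y, theta) in minutiae:
        r = math.hypot(x - core[0], y - core[1])
        a = math.atan2(y - core[1], x - core[0]) % (2 * math.pi)
        sector = int(a / (2 * math.pi / a_sectors))
        new_sector = (sector + shift) % a_sectors          # key-driven sector shift
        a_new = a + (new_sector - sector) * (2 * math.pi / a_sectors)
        out.append((core[0] + r * math.cos(a_new),
                    core[1] + r * math.sin(a_new), theta))
    return out
```

Note that the Cartesian mapping is deliberately many-to-one, which is what makes the transformed template non-invertible, while the polar version preserves each minutia's distance from the core.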



Figure 8.10: Intermediate results of Ratha et al.'s Approach from DB1


(a) Input Fingerprint Image, (b) Fingerprint image with orientation field,
(c) Fingerprint Image with minutiae points, (d) Minutiae points after applying
Cartesian transformation, (e) Minutiae points after applying Polar transformation.

Figure 8.11: Intermediate results of Ratha et al.'s Approach from DB2


(a) Input Fingerprint Image, (b) Fingerprint image with orientation field, (c)
Fingerprint Image with minutiae points, (d) Minutiae points after applying Cartesian
transformation, (e) Minutiae points after applying Polar transformation.



Figure 8.12: Intermediate results of Ratha et al.'s Approach from DB3


(a) Input Fingerprint Image, (b) Fingerprint image with orientation field, (c)
Fingerprint Image with minutiae points, (d) Minutiae points after applying Cartesian
transformation, (e) Minutiae points after applying Polar transformation.

Figure 8.13: Intermediate results of Ratha et al.'s Approach from DB4


(a) Input Fingerprint Image, (b) Fingerprint image with orientation field, (c)
Fingerprint Image with minutiae points, (d) Minutiae points after applying Cartesian
transformation, (e) Minutiae points after applying Polar transformation.



8.5.4 Experimental Results of the Approach Introduced by S. Tulyakov et al.


This section presents the experimental results of the approach proposed by S. Tulyakov et
al. [143] to secure and personalize the hash of the fingerprint data. This approach has
been implemented using Matlab (Matlab 7.10). At first, minutiae features are extracted
from the fingerprint images obtained from an online scanner. For each minutia point, the
nearest neighbours are identified to constitute minutiae subsets, and the hashes of the
minutiae subsets are obtained using symmetric hash functions. The computed hash
values for the minutiae subsets are stored in the database. During verification, new hash
values are generated and matched with those stored in the database. Figures 8.14 to 8.17
show the intermediate results of this approach for the sample images from
DB1 to DB4.
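The subset construction and the symmetric hashing can be sketched in a few lines; encoding minutiae as complex numbers and hashing with power sums follows the construction described in [143], while n = 5 matches the figures below. The function names are hypothetical:

```python
import math

def power_sum_hashes(points, max_order=3):
    """Symmetric (order-independent) hash of a minutiae subset: each point is
    treated as a complex number c = x + iy and the hash is the vector of power
    sums h_m = sum(c**m), which does not depend on the ordering of the points."""
    cs = [complex(x, y) for (x, y) in points]
    return [sum(c ** m for c in cs) for m in range(1, max_order + 1)]

def local_subsets(minutiae, n=5):
    """For every minutia, collect it together with its n nearest neighbours:
    these localized subsets are what gets hashed and stored, not raw minutiae."""
    subsets = []
    for i, (x, y) in enumerate(minutiae):
        others = sorted((m for j, m in enumerate(minutiae) if j != i),
                        key=lambda m: math.hypot(m[0] - x, m[1] - y))
        subsets.append([(x, y)] + others[:n])
    return subsets
```

Because the power sums are symmetric, the hash of a subset does not depend on the order in which its minutiae are listed.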

Figure 8.14: Intermediate results of S. Tulyakov et al.'s Approach from DB1


(a) Input Fingerprint Image (b) Fingerprint Image with minutiae points (c) Minutiae
points with their nearest neighbours (n=5) (d) Fingerprint hash value

Figure 8.15: Intermediate results of S. Tulyakov et al.'s Approach from DB2


(a) Input Fingerprint Image (b) Fingerprint Image with minutiae points (c) Minutiae
points with their nearest neighbours (n=5) (d) Fingerprint hash value


Figure 8.16: Intermediate results of S. Tulyakov et al.'s Approach from DB3


(a) Input Fingerprint Image (b) Fingerprint Image with minutiae points (c) Minutiae
points with their nearest neighbours (n=5) (d) Fingerprint hash value

Figure 8.17: Intermediate results of S. Tulyakov et al.'s Approach from DB4


(a) Input Fingerprint Image (b) Fingerprint Image with minutiae points (c) Minutiae
points with their nearest neighbours (n=5) (d) Fingerprint hash value

8.6 PERFORMANCE ANALYSIS OF THE ALGORITHMS


The experimental analysis of the proposed approaches as well as the motivating
approaches is presented in this sub-section. The extensive analysis of the different
algorithms is carried out on four fingerprint databases using FMR and FNMR values for
various thresholds.



8.6.1 Performance Analysis of the Proposed Efficient Cancellable Biometric Key


Generation Scheme (algorithm 1)
The performance of the proposed efficient cancellable biometric key generation scheme
[160] is extensively analyzed with the help of the FMR and FNMR computed for
different thresholds. In order to compute these values, at first, the feature vector is
extracted from the fingerprint images and stored in the database. For matching, the
feature vector (key) of a fingerprint image is computed and matched with the features
stored in the database. From the matching result, the genuine and impostor matching
scores are computed to find the FMR and FNMR of the proposed algorithm. Then, by
varying the threshold provided for matching, the FMR and FNMR are computed from
the genuine and impostor matching scores, and the computed values are plotted as a
graph (FMR/FNMR vs. t) to signify the performance of the approaches.
The graph (FMR/FNMR vs. t) plotted for the results obtained from DB1 is shown in
figure 8.18. The graph clearly shows that when the threshold is raised, the genuine
acceptance score increases while, at the same time, the FMR decreases for higher
thresholds. The ultimate aim here is to find the exact threshold at which the FMR is
minimal while the genuine acceptance remains high. Using the FMR and FNMR values,
the EER is computed to find the accuracy of the proposed approach in the verification
task. From the graph, the equal error rate of the proposed algorithm on DB1 is 0.5.
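The FMR/FNMR computation and the EER read-off described above can be sketched as follows, assuming similarity scores in [0, 1]; the score lists here are synthetic examples, not the thesis's measured scores:

```python
def fmr_fnmr(genuine, impostor, t):
    """FMR(t): fraction of impostor scores accepted at threshold t;
    FNMR(t): fraction of genuine scores rejected. Scores are similarities,
    so a comparison is accepted when its score >= t."""
    fmr = sum(s >= t for s in impostor) / len(impostor)
    fnmr = sum(s < t for s in genuine) / len(genuine)
    return fmr, fnmr

def equal_error_rate(genuine, impostor, steps=1000):
    """Scan thresholds in [0, 1] and return (EER, t) at the operating point
    where FMR(t) and FNMR(t) are closest, i.e. where FMR(t) = FNMR(t)."""
    best = (1.0, 0.0, 0.0)   # (|FMR - FNMR|, rate, threshold)
    for k in range(steps + 1):
        t = k / steps
        fmr, fnmr = fmr_fnmr(genuine, impostor, t)
        gap = abs(fmr - fnmr)
        if gap < best[0]:
            best = (gap, (fmr + fnmr) / 2, t)
    return best[1], best[2]
```

The EER is simply the common error rate at the threshold where the two curves meet.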

Figure 8.18: Performance graph (FMR (t) and FNMR (t)) of


New-Fangled Approach on DB1


In the same way, the verification task is performed on the other databases, DB2, DB3
and DB4. The corresponding graphs, plotted from the values obtained after matching the
features of the fingerprint images, are shown in figures 8.19, 8.20 and 8.21. Analyzing
the graph plotted for DB2, the FMR falls to zero once the threshold rises above 0.6. At
the same time, the FNMR increases significantly from its initial value and stabilizes once
the threshold reaches 0.6. For DB3 and DB4, the corresponding threshold at which the
FMR falls to zero is 0.7. Furthermore, the equal error rate of the proposed algorithm is
computed for all the databases. The corresponding values obtained from the graphs are:
EER = 0.5 (DB2), EER = 0.6 (DB3) and EER = 0.6 (DB4).

Figure 8.19: Performance graph (FMR (t) and FNMR (t)) of


New-Fangled Approach on DB2

Figure 8.20: Performance graph (FMR (t) and FNMR (t)) of


New-Fangled Approach on DB3


Figure 8.21: Performance graph (FMR (t) and FNMR (t)) of


New-Fangled Approach on DB4

8.6.2 Performance Analysis of the Proposed Efficient Approach for Cryptographic Key
Generation from Fingerprint (algorithm 2)

This section presents the performance analysis of the proposed efficient approach for
cryptographic key generation from fingerprint [161].

Figure 8.22: Performance graph (FMR (t) and FNMR (t)) of


Bio-Crypto Key Generation Approach on DB1



Initially, the features are extracted from the fingerprint images using the proposed
algorithm and the matching process is carried out by varying the threshold values. For
different thresholds, the FMR and FNMR are computed from the genuine and impostor
matching scores obtained after matching. The graphs plotted for the corresponding
values of the different databases are shown in the following figures (8.22 to 8.25), which
provide the EER for all the databases. The values obtained are: EER = 0.542 (DB1),
EER = 0.45 (DB2), EER = 0.6 (DB3) and EER = 0.55 (DB4).

Figure 8.23: Performance graph (FMR (t) and FNMR (t)) of


Bio-Crypto Key Generation Approach on DB2

Figure 8.24: Performance graph (FMR (t) and FNMR (t)) of


Bio-Crypto Key Generation Approach on DB3


Figure 8.25: Performance graph (FMR (t) and FNMR (t)) of


Bio-Crypto Key Generation Approach on DB4

8.6.3 Performance Analysis of Nalini K. Ratha et al.'s Algorithm

The overall fingerprint matching performance of the approach proposed by Nalini K.
Ratha et al. [54] is given in this sub-section. To measure the efficiency of this approach,
the genuine and impostor matching scores are computed for the different fingerprint
databases. These values are then used for finding the FMR and FNMR, which are
computed by matching the features of the fingerprint images with the features stored in
the database. The graph is plotted for the computed values to assess how well genuine
users are accepted and impostor users are rejected at different threshold levels. From the
graphs (shown in figures 8.26 to 8.29), genuine acceptance increases as the threshold is
increased, but impostor matching also increases when the threshold is high. The problem
here is to reduce the impostor acceptance so that secure recognition is possible. In
addition, the threshold value that balances genuine acceptance and impostor rejection
should be identified. From the graphs, the EER identified for the algorithm on the
different databases is: EER = 0.5 (DB1), EER = 0.5 (DB2), EER = 0.5 (DB3) and
EER = 0.6 (DB4).



Figure 8.26: Performance graph (FMR (t) and FNMR (t)) of


Ratha et al.'s Approach on DB1

Figure 8.27: Performance graph (FMR (t) and FNMR (t)) of


Ratha et al.'s Approach on DB2



Figure 8.28: Performance graph (FMR (t) and FNMR (t)) of


Ratha et al.'s Approach on DB3

Figure 8.29: Performance graph (FMR (t) and FNMR (t)) of


Ratha et al.'s Approach on DB4



8.6.4 Performance Analysis of S. Tulyakov et al.'s Algorithm

The recognition performance of the approach proposed by S. Tulyakov et al. [143] is
discussed in this sub-section. By applying the procedure, the cancellable template is
constructed and then used to generate the key vector of the fingerprint images in the
fingerprint database. Then, matching against the genuine and impostor fingerprints is
carried out to find the FMR and FNMR of the approach in fingerprint recognition. The
graph is drawn from the obtained values to find the efficiency of the approach on the
different databases. From the graphs plotted (shown in figures 8.30 to 8.33), the equal
error rate of the approach on the various databases is determined to assess the accuracy
in fingerprint recognition. The values obtained are: EER = 0.5 (DB1), EER = 0.6 (DB2),
EER = 0.5 (DB3) and EER = 0.6 (DB4).

Figure 8.30: Performance graph (FMR (t) and FNMR (t)) of


S. Tulyakov et al.'s Approach on DB1



Figure 8.31: Performance graph (FMR (t) and FNMR (t)) of


S. Tulyakov et al.'s Approach on DB2

Figure 8.32: Performance graph (FMR (t) and FNMR (t)) of


S. Tulyakov et al.'s Approach on DB3



Figure 8.33: Performance graph (FMR (t) and FNMR (t)) of


S. Tulyakov et al.'s Approach on DB4
8.7 COMPARATIVE ANALYSIS OF THE PROPOSED METHODS WITH THE
PREVIOUS APPROACHES
This section presents the comparative analysis of the proposed methods with the previous
approaches. The comparison is performed by plotting the ROC curves of the different
methods on the various fingerprint databases. A ROC (Receiver Operating
Characteristic) curve is illustrated in log-log scale for enhanced comprehension, where
FNMR is plotted as a function of FMR. In every biometric system, there is a strong
trade-off between FMR and FNMR. In reality, both FMR and FNMR are functions of
the biometric system threshold t, and hence we refer to them as FMR(t) and FNMR(t)
respectively. The value of FMR(t) increases as the threshold t is decreased to make the
system more tolerant to input variations and noise. On the other hand, if t is raised to
make the system more protected, then FNMR(t) increases. Therefore, reporting the
system performance at all operating points (thresholds t) is more desirable. This can be
achieved by plotting a Receiver Operating Characteristic (ROC) curve, a plot of FMR(t)
against (1 - FNMR(t)) for diverse decision thresholds t.



8.7.1 Comparison of the Proposed Methods with the Previous Approaches Over FVC
2002 Database 1 (DB1)
The performance of the proposed approaches is extensively compared with the previous
approaches using ROC curves. For DB1, the ROC curves of the different approaches are
plotted in log-log scale, as shown in figure 8.34. The graph clearly shows that the
proposed approaches have a lower FNMR, which signifies the better security of the
proposed system. Compared with the previous approaches, the two proposed approaches
can provide better security against impostor attacks due to their lower FNMR. On the
other hand, the two previous approaches provide a lower FMR compared with the
proposed approaches. Hence, although the secure system is slightly worse in terms of
FMR near the point of equal error, it is significantly better in terms of FNMR.

Figure 8.34: ROC curve of DB1

8.7.2 Comparison of the Proposed Methods with the Previous Approaches Over FVC
2002 Database 2 (DB2)
This section presents the comparative analysis of the proposed approaches with the
previous approaches on DB2. The approaches are tested on DB2 and the ROC curve is
plotted. From the graph (shown in figure 8.35), all the algorithms provide nearly
identical results except the approach proposed by S. Tulyakov et al., which maintains a
lower FMR. On the other hand, the proposed algorithms proved to be very accurate and
exhibited a good trade-off between FMR and FNMR. This ensures that the security of
the proposed approaches is good compared with the previous approaches.

Figure 8.35: ROC curve of DB2


8.7.3 Comparison of the Proposed Methods with the Previous Approaches Over FVC
2002 Database 3 (DB3)
Figure 8.36 compares the performance of the two proposed techniques with the previous
approaches through ROC curves. The comparison is performed using the fingerprint
images available in DB3. On DB3, the second approach performs better than all the
other methods, whereas the first approach provides less security and poorer recognition
on the fingerprint images of DB3. The two previous approaches behave similarly in
fingerprint recognition, and their security is also considerably good. Even though the
two previous approaches are fair in fingerprint recognition on DB3, the second approach
may overcome them in terms of providing better recognition against impostor attacks
when using the reliability threshold.



Figure 8.36: ROC curve of DB3

8.7.4 Comparison of the Proposed Methods with the Previous Approaches Over FVC
2002 Database 4 (DB4)

The comparison of the particular approaches requires analysis of the receiver operating
characteristic (ROC) curve, which can be developed by varying the threshold between 0
and 1. Figure 8.37 shows the performance of the two proposed approaches in the
verification task. From the figure, a bigger range of threshold values yields better
performance, as a large range of operating points t with zero errors can be obtained.
When the threshold is fixed to a suitable range, the two proposed approaches perform
well in the verification task. In addition, the FNMR and FMR can be improved further
by securely recognizing genuine users and correctly rejecting impostor attacks.
Compared with the two previous approaches, the proposed approaches are more
appropriate for providing security against the anticipated attacks.



Figure 8.37: ROC curve of DB4

8.8 SUMMARY

This chapter presented the experimental results and the analysis of the proposed methods
against the previous methods. The experimental analyses were carried out using the
well-known fingerprint databases to clearly evaluate the performance of the approaches.
Subsequently, ROC graphs of FNMR versus FMR were plotted to signify the relative
effectiveness of each approach among the various methods. Finally, this chapter
concludes that the recognition performance is significantly improved by using the
proposed methods.




CHAPTER 9
SECURITY ANALYSIS
Systems methods will neither be trustworthy nor successful unless the general
research regarding systems methodology incorporates security analysis design as an
explicit objective. – Richard Baskerville

9.1 INTRODUCTION
The main consideration of the research work is to ensure security against impostor
attacks. Hence, to prove the security, the analysis should be carried out with respect to
the non-invertibility of the transformations used. For this purpose, a number of
transformation functions have been developed for building revocable or non-invertible
biometric templates. In the following sub-sections, we discuss the different
transformations used by the methods to provide security in fingerprint matching,
together with their security analysis.

9.2 SECURITY ANALYSIS OF Ratha et al.'s WORK


The strength and security analysis of the approach given by Ratha et al. [54] concerns
the Cartesian and the polar transformations, explained below:
Cartesian Transformation: The binary representation of the exchange matrix is
appropriate from a storage viewpoint and also gives a first-order approximation of the
information embedded in the key. Here, a fixed number of bits is encoded in each
column of the matrix; therefore, the complete information content has a corresponding
upper bound in bits. Considering the approximate strength of the transformation
process, each resultant cell after the transformation could have originated from any of
several possible source cells. Therefore, a brute-force attack would have to attempt a
correspondingly large number of possibilities.



Polar Transformation: In a first-order approximation, the authors assign every minutia a
number of bits of data determined by the number of discrete minutiae positions and the
number of unique directions. For an effective brute-force attack, the attacker has to
match only a fraction of the minutiae existing in the reference print, enough to make the
matching score surpass the threshold. Here, it is assumed that an average fingerprint has
about 35-40 minutiae.

9.3 SECURITY ANALYSIS OF S.Tulyakov et al.s WORK


S. Tulyakov et al. [143] proposed an algorithm for hashing fingerprint templates that
removes the likelihood of an attacker learning the original minutia positions, thereby
making the fingerprint hashes cancellable. This is attained by re-enrolling people with a
distinct set of hash functions. Such systems are frequently implemented with two-level
authentication so as to enrich the security: the user makes use of both the biometric and
a key that is stored on a card or entered on a keypad, and this key can be re-issued in
case of a successful attack. The authors proposed ways to intensify the security of the
hashing process by an exponential factor by embedding a secret key into the hashing
method. The key can be based on a token that the user carries, a password that the user
recalls, or even another biometric modality, which in turn makes the key personal. In
order to attain a cancellable biometric algorithm, a way is required to automatically
build and use arbitrarily produced hash functions. The offered set of hash functions
forms an algebraic basis in the set of symmetric polynomial functions: a random
algebraic basis of symmetric polynomials of degree less than or equal to a chosen order
is taken. Representing each minutia as a complex number c_k, the basis hash functions
can be taken as the symmetric power sums
h_m(c_1, ..., c_n) = c_1^m + c_2^m + ... + c_n^m. The hash functions of the
transformed minutiae r*c_k + t, where r and t are scalar rotation and translation
parameters, will still be symmetric functions of the same degree in the variables
c_1, ..., c_n. Accordingly, hashes of the transformed minutiae can be expressed using
the original hashes, h_m(r*c_1 + t, ..., r*c_n + t) = f_m(h_1, ..., h_m; r, t), for some
polynomial functions f_m. These equations permit matching localized minutia sets and
obtaining the equivalent transformation attributes.
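The invariance stated above can be checked numerically for the first two power sums; the sample minutiae and the transform parameters below are arbitrary values chosen for this sketch:

```python
# Numerical check of the invariance used above: for minutiae encoded as complex
# numbers c_k and the transform c -> r*c + t, the hashes of the transformed
# minutiae are polynomials in the original hashes h1, h2 and (r, t):
#   h1' = r*h1 + n*t
#   h2' = r**2 * h2 + 2*r*t*h1 + n * t**2
import cmath

cs = [complex(1, 2), complex(3, -1), complex(-2, 4)]   # sample minutiae
r = cmath.exp(1j * 0.6)                                # rotation parameter
t = complex(5, -3)                                     # translation parameter
n = len(cs)

h1 = sum(cs)
h2 = sum(c ** 2 for c in cs)
transformed = [r * c + t for c in cs]

assert abs(sum(transformed) - (r * h1 + n * t)) < 1e-9
assert abs(sum(c ** 2 for c in transformed)
           - (r ** 2 * h2 + 2 * r * t * h1 + n * t ** 2)) < 1e-9
```

These are the identities that let a matcher recover the rotation and translation between two hashed minutiae sets without ever seeing the raw minutiae.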

9.4 SECURITY ANALYSIS OF THE PROPOSED FIRST METHOD (CANCELABLE
BIOMETRIC KEY GENERATION SCHEME FOR CRYPTOGRAPHY)
The main aim of the cancellable transformation is to deliver cancellable ability through
a "non-invertible" transform. Usually, this lowers the discriminative power of the
original template. Thus, the cancellable templates and the secure templates of an
individual in diverse applications will be different. In the proposed algorithm [160],
security is further strengthened by AES encryption. Once the template is formed and the
minutiae points are acquired, a feature matrix is generated by a sequence of steps. The
feature matrix is then encrypted using AES. Reinforced by AES, the feasibility of
decrypting the ciphered feature matrix is almost negligible. Anticipating a worst-case
scenario, even if a hacker succeeds in decrypting the AES-encrypted data with the
intention of obtaining the feature matrix, the chances of reconstructing the minutiae
points and the templates are almost nil. Furthermore, there is no possibility of
conjecturing the steps used to generate the feature matrix, and there is little chance of
reconstructing the template by any means. The key thus formed cannot be traced back to
its origin, i.e. to the template, and moreover the key itself cannot be regenerated falsely
using the template. This irreversible aspect makes the key armoured, reliable and even
resistant to brute-force attacks. This shatter-proof property emanates from the very
essence of preserving the confidentiality of the battery of operations in transforming
minutiae points into a feature matrix.



9.5 SECURITY ANALYSIS OF THE PROPOSED SECOND METHOD (BIO-CRYPTO
KEY FROM FINGERPRINTS USING CANCELABLE TEMPLATES)
In the second Bio-Crypto Key Generation Algorithm [161], an efficient approach for
irrevocable cryptographic key generation is designed using a secured cancellable
template obtained from fingerprint biometrics. Here, the security of the proposed
cancellable template generation is enhanced by utilizing the well-known RSA
factorization concept along with exponentiation. The RSA idea is applied by finding the
product of two distinct prime numbers derived from the minutiae point set; it is
computationally infeasible to determine the prime factors of this product. Moreover, the
security is further enhanced by computing the exponential value of the prime factors.
The complexity of inverting this operation is exponential and hence computationally
infeasible. Henceforth, the cancellable template, even though irrevocable, serves as the
source for the generation of the cryptographic key. The resultant cryptographic key is
irrevocable and unique to a specific cancellable template, providing better protection
and replacement features for lost or stolen biometrics.
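As a toy sketch of this idea (two minutiae-seeded primes multiplied into an RSA-style modulus, followed by modular exponentiation), one might write the following; the seeding offsets, the exponent and the function names are illustrative assumptions, not the actual construction of [161]:

```python
def next_prime(k):
    """Smallest prime >= k (trial division; fine for the small sketch values)."""
    def is_prime(m):
        if m < 2:
            return False
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return True
    while not is_prime(k):
        k += 1
    return k

def cancellable_value(x, y, e=65537):
    """Derive a non-invertible value from one minutia: take two distinct primes
    seeded by its coordinates, multiply them into an RSA-style modulus n = p*q,
    and return pow(x + y, e, n). At realistic prime sizes, inverting this would
    require factoring n."""
    p = next_prime(1000 + x)
    q = next_prime(2000 + y)
    if p == q:                 # keep the two primes distinct
        q = next_prime(q + 1)
    return pow(x + y, e, p * q)
```

The one-wayness rests on the usual RSA arguments: forming the product and computing the modular power are cheap, while inverting either step is believed infeasible at realistic key sizes (the tiny values above are for illustration only).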

9.6 SUMMARY
This chapter discussed the security analysis of the proposed methods along with the
existing methods. In order to signify the effectiveness of the proposed methods, the
security analysis was carried out in terms of different transformations utilized in the
various methods to prove the non-invertibility. Finally, the conclusion was made from the
extensive analysis for ensuring the security against the impostor attacks.



CHAPTER 10
CONCLUSION AND SCOPE FOR FUTURE WORK
I am turned into a sort of machine for observing facts and grinding out
conclusions. – Charles Darwin
"Research is the process of going up alleys to see if they are blind."
– Marston Bates

10.1 CONCLUSION

Biometric-based authentication can reasonably guarantee a well-built security assurance
system regarding the identity of users. The significant concern in the case of biometric
data is the security of the data itself, since a compromise of the data is permanent.
Cancellable biometrics stores a non-invertible transformed version of the biometric
data, so the data remains safe even if the storage is compromised. The transformation of
the data is one-way only, and a transformed biometric does not leak information about
the actual biometric data. Also, cancellable biometrics offers an advanced level of
privacy and non-linkability of users' data stored in databases by allowing numerous
templates for the same biometric data.
In this research, we have presented efficient techniques to generate a bio-crypto key
from fingerprints using cancellable templates. The proposed cancellable biometric
crypto system is an efficient technique for authentication. The proposed system distorts
the fingerprint intentionally in a repeatable manner, and the fingerprint thus obtained is
exploited in the cryptographic key generation. The proposed approach is composed of
three phases, namely: 1) extraction of minutiae points from the fingerprint image,
2) generation of cancellable biometric templates with added security, and
3) cryptographic key generation from the secured cancellable template. As a result, the
generated cryptographic key is irrevocable and unique to a particular cancellable
template.
The experimental analysis of the proposed techniques has been carried out on four
fingerprint databases, of which three are real fingerprint databases and one is a synthetic
database. The databases have been obtained from FVC 2002, the Second International
Competition for Fingerprint Verification Algorithms. We have evaluated our methods
using two evaluation metrics, the False Match Rate (FMR) and the False Non-Match
Rate (FNMR), against two standard existing research works. Then, a comparison is
carried out using the ROC curve, which signifies the efficiency of the proposed
methods; the ROC curve also provides a good trade-off between accuracy and
efficiency. In addition, the EER obtained by the proposed approaches is significantly
lower than that of the previous approaches over the four fingerprint databases. Finally,
the security of the approaches is extensively discussed in terms of non-invertibility and
the binding of the bio-crypto key in ensuring security preservation.

10.2 SCOPE FOR FUTURE WORK

This work opens up new avenues for future work. This research can be extended in
various directions, some of which are summarized below:
- Since the proposed approach achieved a good EER, the present work can be
  extended further to improve the accuracy of biometric-based security systems by
  designing more reliable matching strategies.
- Even though the minutiae points are extracted efficiently, a further extension can
  employ feature extraction methods suitable for noisy fingerprint images.
- Another direction is to extend the approach by speeding up the feature extraction
  as well as the matching processes.
- The methodology can be made more efficient and sophisticated in combination
  with some of the evolving cryptographic systems.
- The proposed work can be improved by stabilizing the bio-crypto key via error
  correction methods.



GLOSSARY
TECHNICAL TERMINOLOGY

Acceptability: It is about how readily individuals adopt a biometric system, or how
intrusive the individual feels the system is, based on the trait in question.
Access Control & Availability: Ensuring that authorized users have access to information
and associated assets when required i.e., services must be accessible and available to
intended users.
Accessibility: It measures how easy the particular biometric trait is to get to and measure.
Foot geometry, for example, would not be very accessible since individuals would have
to remove their shoes first.
Active Attacks: Active attacks involve attempts on security leading to deletion,
modification, insertion, redirection, blockage or destruction of data, device or links.
Adaptive Thresholding: Thresholding is called adaptive when a different threshold is
used for different regions of the image. This may also be known as local or dynamic
thresholding.
Arch: a ridge that runs across the fingertip and curves up in the middle. Tented arches
have a spiked effect.
Authentication: It is the process of verifying the claimed identity of a user, i.e., the
sender and receiver want to confirm the identity of each other.
Authorisation: authorizing access to resources.
Authorization Violation: An entity uses a service or resource it is not intended to use.
Availability: It ascertains how many different/unique, independent samples the system
could potentially acquire from an individual.

Biometric System: A biometric system is an automated method for identifying or
authenticating the identity of a living person based on physiological or behavioural
characteristics/traits.
Biometric Template is the digital representation of a biometric trait, which is further
processed by a feature extractor to generate a compact but expressive representation.
Black Pixel: its value is 0 in an 8 bits/pixel indexed greyscale image.
Bridges: small ridges joining two longer adjacent ridges.
Cancellable Biometrics: the system does not store the original biometric data; it stores
only a version transformed through a non-invertible, one-way function, keeping the
original data safe even if the system is compromised. Using this concept, a biometric
template can be cancelled and re-issued, which increases the usability of biometric
security systems.
Cartesian Transformation: In this method, rectangular coordinates referenced to the
position of the singular point are used to measure minutiae positions. The x-axis is
aligned with respect to the reference point of the singular point. The coordinate space is
divided into cells of equal, fixed size.
Circumvention refers to how easy it is to fool the system by fraudulent techniques.
Collectability of a Trait is the property that the characteristic can be measured quantitatively.
Computer Network is an interconnected collection of autonomous computers which use a
well-defined, mutually agreed set of rules and conventions known as protocols for
interacting with one-another meaningfully in the form of messages and for allowing
resource-sharing preferably in a predictable and controllable manner.
Computer Security can be defined as a set of technological and managerial procedures
applied to a computer system to ensure the availability, integrity and confidentiality of
the information managed by the computer.
Confidentiality (Secrecy): Ensuring that information (either data or software) is available
and accessible only to those authorized to have access, i.e., only the sender and intended
receiver should understand message contents; the sender encrypts the message and the
receiver decrypts it according to mutually agreed protocols.
Cracker: a person who breaks into a computer system, changing or damaging some type
of information or element. Their motivation is usually financial gain.
Crossovers: two ridges which cross each other.
Cryptography: the science and art of transforming messages to make them secure and
immune to attacks i.e., Cryptography means concealing the contents of a message by
enciphering.
D Prime: D prime is a common scalar means of evaluating biometric system
performance. It is the normalized difference between the means of genuine and impostor
match scores. D prime is also known as a measure of goodness, and assumes
distributions to be normal.
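The definition above can be computed directly from the two score sets. A minimal standard-library sketch, where d' is the absolute difference of the means over the root-mean of the two variances (the sample scores in the test are hypothetical):

```python
import statistics as st

def d_prime(genuine, impostor):
    """Normalized separation between genuine and impostor match scores.

    d' = |mean_g - mean_i| / sqrt((var_g + var_i) / 2)
    Larger values indicate better-separated, easier-to-threshold scores.
    """
    mg, mi = st.mean(genuine), st.mean(impostor)
    vg, vi = st.pvariance(genuine), st.pvariance(impostor)
    return abs(mg - mi) / ((vg + vi) / 2) ** 0.5
```

As the definition notes, this single scalar is only meaningful under the assumption that both score distributions are approximately normal.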
Data Integrity: Safeguarding the accuracy and completeness of information and
processing methods, i.e., sender and receiver want to ensure that data or messages are not
altered (in transit, or afterwards) without detection.
Denial of Communication (Repudiation): An entity falsely denies its participation in a
communication act.
Detection Error Trade-off (DET): DET curve is a modified ROC curve.
Dilation: The dilation process is carried out by placing the structuring element B on the
image A and moving it over the image as is done for convolution.
Distinctiveness: It measures the complexity or potential differences in a particular
biometric trait's patterns and helps determine how large a population sample can be used.
Distortion or Transformation: A biometric trait b and its distorted version f (b) should
not match.
Diversity: The ability to generate multiple templates from the same biometric, ensuring
that the same cancellable template need not be used in two different applications.


Eavesdropping or Snooping: It refers to unauthorized access or interception of data that
is not intended to be read.
Enrollment: In the enrollment process, users' initial biometric samples are collected,
assessed, processed, and stored for ongoing use in a biometric system.
Entropy Retention: Two non-matching biometric traits should not match after distortion.
Equal Error Rate (EER): It is the rate at which both accept (FAR) and reject (FRR)
errors are equal. The lower the EER, the more accurate the system is considered to be.
Erosion: The process of erosion is the same as the dilation process, except that pixels are
changed to 'white' instead of 'black'.
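The two dual operations defined above can be sketched with plain NumPy for binary images; a square structuring element is assumed here for simplicity, and the edge-padding convention is an illustrative choice.

```python
import numpy as np

def dilate(img, size=3):
    """Binary dilation: a pixel becomes 1 if ANY pixel under the
    (size x size) structuring element is 1 -- foreground grows."""
    pad = size // 2
    p = np.pad(img, pad)  # background (0) beyond the border
    h, w = img.shape
    shifts = [p[i:i + h, j:j + w] for i in range(size) for j in range(size)]
    return np.max(shifts, axis=0)

def erode(img, size=3):
    """Binary erosion: a pixel stays 1 only if ALL pixels under the
    structuring element are 1 -- foreground shrinks (dual of dilation)."""
    pad = size // 2
    p = np.pad(img, pad, constant_values=1)  # border not eroded by the edge
    h, w = img.shape
    shifts = [p[i:i + h, j:j + w] for i in range(size) for j in range(size)]
    return np.min(shifts, axis=0)
```

Eroding a dilated image (a morphological "closing" followed back down) illustrates the duality: a single foreground pixel dilates to a 3x3 patch, and eroding that patch recovers only the original pixel.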
Failure to Capture Rate (FTC or FCR): FTC is the percentage of time the biometric
system is unable to capture a biometric sample when one is presented. In other words, it
is the probability that the system fails to detect a biometric characteristic when presented
correctly.
Failure to Enroll Rate (FTE or FER): FTE is the percentage of time that users are
unable to enroll in the biometric system. In other words, it is the percentage of data input
that is considered invalid and fails to input into the system. Failure to enroll happens
when the data obtained by the sensor are considered invalid or of poor quality.
False Accept Rate (FAR): It is the number of times an impostor user is falsely granted
access to the system divided by the total number of trials.
False Match Rate (FMR): FMR is the rate at which a template is falsely matched to a
template in a database.
False Non-Match Rate (FNMR): FNMR is the rate at which a template is falsely not
matched to a truly matching template in the database.
False Reject Rate (FRR): It is the number of times genuine users are falsely rejected
divided by the number of trials.
Filter or Filter Circuit is a circuit designed to perform frequency selection like
selectively filtering one frequency or range of frequencies out of a mix of different
frequencies in a circuit.


Forgery of Information: An entity creates new information in the name of another entity.
Gabor filter is a linear filter whose impulse response is defined by a harmonic function
multiplied by a Gaussian function. Because of the multiplication-convolution property,
the Fourier transform of a Gabor filter's impulse response is the convolution of the
Fourier transform of the harmonic function and the Fourier transform of the Gaussian
function. Gabor filters have the ability to perform multi-resolution decomposition due to
their localization in both the spatial and spatial-frequency domains. They are used for
feature extraction in many machine vision applications.
Hacker: a person specialized in a topic and enjoys exploring it for the sake of learning
and overcoming barriers. Applied to IT, the term refers to a person whose ability to
understand computer systems, their design and programming, allows him/her to master
the systems for a particular use.
Histogram Equalization is a technique frequently used in Image Processing in order to
improve the image contrast and brightness and to optimize the dynamic range of the
greyscale.
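A minimal sketch of the classic CDF-based equalization mapping described above; it assumes an 8-bit greyscale image with more than one grey level present.

```python
import numpy as np

def equalize_histogram(img):
    """Spread grey levels over the full 0-255 range via the CDF.

    Each pixel is remapped through the normalized cumulative histogram,
    which flattens the intensity distribution and improves contrast.
    Assumes an 8-bit image containing at least two distinct grey levels.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # first non-zero CDF value
    total = img.size
    # Classic equalization lookup table, scaled to 0..255.
    lut = np.round((cdf - cdf_min) / (total - cdf_min) * 255).astype(np.uint8)
    return lut[img]
```

A low-contrast image whose values occupy only a narrow band (say 100-119) is stretched so that its darkest level maps to 0 and its brightest to 255, which is exactly the dynamic-range optimization the definition refers to.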
Identification and Authentication is the process of verifying the identity of a user
through the use of specific credentials (e.g., passwords, tokens, biometrics), as a
prerequisite for granting access to resources in a network system.
Identification is a 1:N matching process, where the user's input is compared with the
templates of all the persons enrolled in the database, and the identity of the person whose
template has the highest degree of similarity with the user's input is returned by the
biometric system. If the highest similarity between the input and all the templates is less
than a fixed minimum threshold, the system rejects the input, which implies that the user
presenting the input is not among the enrolled users.
Image Binarization is an important process for image analysis. The inherently bi-level
nature of the image has led to many of the image analysis algorithms being designed for
use on bi-level images.
Information Security is the quality or state of being secure, i.e., to be free from danger.


Internet is not a single network; rather, it is a conglomeration of several networks, which
implies that it is an interconnected collection of heterogeneous networks, i.e., a single
huge global Network of Networks.
Intra-user Variability Tolerance: Two matching biometric traits should also match after
a distortion.
Islands: ridges slightly longer than dots, occupying space between two temporarily
divergent ridges;
Latent Fingerprints are invisible under normal viewing conditions.
Loops: These have a stronger curve than arches, and they exit and enter the print on the
same side. Radial loops slant toward the thumb and lunar loops away from the thumb.
Loss or Modification of (transmitted) Information: The alteration or destruction of data.
Masquerade or Spoofing: It refers to impersonation, i.e., an entity claiming to be another
entity.
Minutia is the pattern of the ridges and valleys, which is unique for each individual.
Minutiae Extraction utilizes binarization and thinning methods, detecting the adjacent
ridge information of minutiae to compute the minutiae scores.
Minutiae points are local ridge characteristics that appear as either a ridge ending or a
ridge bifurcation.
Network (Internet) Security is the ability of a network system to protect information and
system resources with respect to confidentiality and integrity. In detail, it may be
described as Protection of networks and their services from unauthorized modification,
destruction or disclosure and provision of assurance that the network performs its critical
functions correctly and there are no harmful side effects. Here, security measures are
designed to protect data during their transmission over a collection of interconnected
computers and also interconnected networks (like Internet).
Non-invertibility: Original biometric data cannot be recovered from the transformed or
encrypted templates, i.e., a one-way transformation function is used for template
computation to prevent recovery of the biometric data.


Non-repudiation or Non-denial: Ensuring an individual cannot deny the authorization
of a transaction, i.e., sender or receiver should not be able to disavow later on the
message actually transmitted or received by them.
Passive Attacks: Passive attacks involve eavesdropping on communications and the
release of message contents. They involve simply gaining access to a link or device, and
consequently to data, without altering the data; they require only traffic analysis of the
identities, locations, frequency, etc. of communications.
Performance refers to the achievable identification accuracies, taking into consideration,
the resource requirements for acceptable identification accuracy, and the working
environmental factors that affect the identification accuracy (accuracy, speed, and
robustness of technology used).
Permanence or Immutability: A characteristic being invariant in time.
Personal Identification is the process of verifying a person's identity. This process is
also called authentication.
Pervasive Security Mechanisms include trusted functionality, security label, event
detection, security audit trails, security recovery etc.
Plastic Fingerprints are left in soft surfaces, such as newly painted ones.
Polar Transformation: In this method, minutiae positions are measured in polar
coordinates with respect to the core position; angles are measured in relation to the
reference point of the core. The feature space is divided into sectors, i.e., the coordinate
space is separated into polar sectors (radial levels and angles) that are numbered in
sequence.
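A possible sketch of such a sector numbering; the ring and wedge counts and the maximum radius are illustrative assumptions, not values used in this thesis.

```python
import math

def polar_sector(x, y, core_x, core_y,
                 num_rings=4, num_angles=8, max_radius=200.0):
    """Map a minutia at (x, y) to a numbered polar sector around the core.

    The plane around the core is split into num_rings radial levels and
    num_angles angular wedges; sectors are numbered ring-major, so the
    returned index lies in [0, num_rings * num_angles).
    """
    dx, dy = x - core_x, y - core_y
    r = math.hypot(dx, dy)
    theta = math.atan2(dy, dx) % (2 * math.pi)
    ring = min(int(r / (max_radius / num_rings)), num_rings - 1)
    wedge = min(int(theta / (2 * math.pi / num_angles)), num_angles - 1)
    return ring * num_angles + wedge
```

Because the numbering is relative to the core, the same minutia receives the same sector index regardless of where the finger lands on the sensor, which is what makes this quantization useful for template construction.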

Ponds or lakes: empty spaces between two temporarily divergent ridges;


Receiver Operating Characteristic (ROC) Curve: It is the curve relating FAR to FRR
across various thresholds. In biometric systems, the FAR and FRR can typically be traded
off against each other by changing those parameters. ROC curves are one of the ways to
evaluate the performance of a biometric system.


Registration: It should be possible to apply the same transformation function f to
multiple captures of the same biometric trait, b1, b2.


Revocability/Reusability: Templates are easily revoked and reissued when compromised,
i.e., straightforward revocation and re-issue are allowed in the event of compromise.
Ridge thinning is the process of removing redundant pixels until the ridges become
one pixel wide.


Robustness: It measures the stability of the biometric trait, i.e., the ability of the
biometric to stay constant or unchangeable over time. It becomes important when a
biometric trait can be physically changed, either intentionally or accidentally.
Sabotage: Any action that aims to reduce the availability and/or correct functioning of
services or systems.
Scenario evaluations determine the performance of a complete biometric system in an
environment that models a real-world target application. These test results are repeatable
only if the modelled scenario is controlled.
Security is freedom from risk or danger.
Security Attack: the actual realization of a threat, i.e., any action that compromises the
security of information owned by an organization or individual.


Security Goals include protection of information from unwanted access and maintaining
Confidentiality, Authentication, Data Integrity, Access Control & Availability, and
Non-repudiation. Security goals can be defined depending on the application environment,
or in a more general and technical way.
Security Mechanism is a process (algorithm, protocol or device) that is designed to
detect, locate, identify, prevent, or recover from security attacks.


Security Service: A Security Service is a processing or communication service that
enhances the security of data processing and information transmission, and makes use of
the security mechanisms.
Security Threat in a communication network is any possible event or sequence of actions
that might lead to a violation of one or more security goals.



Specific Security Mechanisms include encryption, digital signature, access controls, data
integrity, authentication exchange, traffic padding, routing control, notarization, etc.


Steganography means concealing the message itself by covering it with something else.
Technology Evaluations compare competing algorithms of a single technology by testing
all algorithms on a standardized database collected with a universal sensor. Two common
technology evaluations are the Fingerprint Verification Competition (FVC) and the
Fingerprint Vendor Technology Evaluation (FpVTE).
Template Capacity: the maximum number of sets of data which can be input into the
system.
Thresholding is a non-linear operation that converts a grey-scale image into a binary
image, where the two levels are assigned to pixels that are below or above the specified
threshold value.
Uniqueness: no two persons should be the same in terms of the characteristic.
Universality: every person should have the characteristic.
Verification is a 1:1 matching process, where the user claims an identity and the system
verifies whether the user is genuine or not. If the user's input and the template of the
claimed identity have a high degree of similarity, the claim is accepted as genuine;
otherwise, the claim is rejected and the user is considered an impostor.
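A toy sketch of the accept/reject decision in 1:1 verification; the "fraction of agreeing positions" similarity and the 0.7 threshold are illustrative stand-ins for a real minutiae matcher, not the matcher used in this work.

```python
def verify(stored, live, threshold=0.7):
    """1:1 verification on toy feature vectors.

    Similarity is the fraction of positions where the two equal-length
    templates agree; the identity claim is accepted when the similarity
    reaches the decision threshold, and rejected otherwise.
    """
    matches = sum(a == b for a, b in zip(stored, live))
    return matches / len(stored) >= threshold
```

Raising the threshold lowers FAR at the cost of a higher FRR, which is the trade-off captured by the ROC curve defined above.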
White Pixel: its value is 255 in an 8 bits/pixel indexed greyscale image.
Whorl: an oval formation, often making a spiral pattern around a central point. Principal
types are the plain whorl and the central pocket loop whorl.


Wiener filter is a filter which converts a known input signal into an output signal that,
according to a least-squares criterion, is the one most similar to a desired form of signal
output. The purpose of the Wiener filter is to reduce the amount of noise present in a
signal by comparison with an estimate of the desired noiseless signal. It is based on a
statistical approach.


BIBLIOGRAPHY
REFERENCES

[1]

Andrew S. Tanenbaum, Computer Networks, Third Edition, PHI Publications,


2002.

[2]

(n.d) Serge Vaudenay, Basic Cryptography Application to Machine Readable


Travel Documents, cole Polytechnique Fdrale De Lausanne,
http://lasecwww.epfl.ch/.

[3]

Rahul Benerajee, Introduction to Network Security, Self Instruction Material


(SIM): Module 01, Course Number: SSZG 513(2005-2006), netsec-sim-2006-01dr-rahul-banerjee-bits-pilani-secure.pdf.

[4]

IGNOU Material on Security and Management, Book No. 4 of Operating System


Concepts & Networking Management (MCS-022).

[5]

Christof Paar, Lecture Notes on Applied Cryptography and Data Security,


(version 2.5 | January 2005), Department of Electrical Engineering and Information
Sciences, Ruhr-University at Bochum, Germany, 2005, http:// www.crypto.rub.de.

[6]

Behrouz A. Forouzan, Cryptography & Network Security, Special Indian Edition,


TMH Publications, 2007.

[7]

William Stallings, Data and Computer Communications, Sixth Edition, Pearson


Education Asia Publications, 2001.

[8]
[9]

Atul Kahate, Cryptography and Network Security, TMH Publications, 2003.


Von Solms S.H., Eloff J.H.P., Eloff M., Smith E., Information Security, First
Edition, First Impression 2000/01, ISBN 1-919774-39-4, Amabhuku Publications
(Pty) Ltd 2000.

157

[10] (n.d.) Material on Network Security, Module 8, Version 2 CSE IIT, Kharagpur,
http://nptel.iitm.ac.in/courses/Webcoursecontents/IIT%20Kharagpur/Computer%20networks/pdf/M8L1.pdf
[11] B. Miller, Vital signs of identity, IEEE Spectrum, 31(2):2230, 1994.
[12] Jain, A., Bolle, R. M. and Pankati, S. Introduction to Biometrics, In Biometrics Personal Identification in Networked Society, Kluwer Academic Publishers
Boston/Dordrecht/London, Ch. 1, pp. 1-41, 1999.
[13] John D. Woodward, Christopher Horn, Julius Gatune, Aryn Thomas, Biometrics:
A look at facial recognition, Documented briefing, RAND, 2003.
[14] Davis D., High-Tech passport spark stiff competition, Card Technology, 8(4)
page 22-24, April, 2003.
[15] Wayman J.L., Alyea L., Picking the Best Biometric for Your Applications,
National Biometric Test Centre Collected Works, vol. 1, J.L. Wayman, Ed. San
Jose, CA: National Biometric Test Centre, 200, page 269-275.
[16] Pfleeger C.P., Security in Computing, second edition, ISBN 0-13-337486-6
Prentice Hall, PTR.
[17] Tiwana A., Web Security, ISBN 1-55558-210-9 Digital Press An imprint of
Butterwoth-Heinemann
[18] Wayman J.L., Biometric Identification Technologies in Election, ProcessSummary Report, National Biometric Test Centre Collected Works, vol. 1, J.L.
Wayman, Ed. San Jose, CA: National Biometric Test Centre, 200, page 269-275.
[19] (n.d.) Material on Biometrics, Wikipedia, http://en.wikipedia.org/wiki/Biometric,
accessed 2 October, 2005
[20] Schneier, Bruce, Inside risks: the uses and abuses of biometrics, Communications
of the ACM, Volume 42, Issue 8, pp. 136, August, 1999.
[21] P. Reid, Biometrics and Network Security, Prentice Hall, PTR, 2003.

158

[22] C. Soutar, D. Roberge, S. A. Stojanov, R. Gilroy, B. V. Kumar, "Biometric


encryption using image processing", SPIE, Optical Security and Counterfeit
Deterrence Techniques II, vol. 3314, pp. 178-188, 1998.
[23] M. S. Al-Tarawneh, L. C. Khor, W. L. Woo, S. S. Dlay, "Crypto key generation
using contour graph algorithm", in Proceedings of the 24th IASTED international
conference on Signal processing, pattern recognition, and applications, Innsbruck,
Austria ACTA Press, pp. 95-98, 2006.
[24] M. S. Altarawneh, L. C. Khor, W. L. Woo, S. S. Dlay, "Crypto Key Generation
Using Slicing Window Algorithm", presented at 5th IEEE Symposium on
Communication Systems, Networks and Digital Signal Processing (CSNDSP06),
Patras, Greece, July 19-21, 2006.
[25] C. Soutar, D. Roberge, A. Stoianov, R. Gilroy, B.V.K. Vijaya Kumar, "Biometric
Encryption", in ICSA Guide to Cryptography, McGrow-Hill, 1999.
[26] (n.d.) http://www.biometricscatalog.org/biometrics/GlossaryDec2005.pdf
[27] Ross, A., & Jain, A., Biometric Sensor Interoperability: A Case Study in
Fingerprints, Paper presented at the meeting of the International ECCV Workshop
on Biometric Authentication (BioAW). Prague, Czech Republic, 2004.
[28] Morne Breedt, Integrating Biometric Authentication into Multiple Applications,
Faculty of Engineering, Built Environment and Information Technology, University
of Pretoria, October, 2005.
[29] Travis W. Rosiek, Fingerprint Testing Protocols for Optical Sensors, Electrical
Engineering, West Virginia University, Morgantown, WV, 2005.
[30] Philippe C. Cattin, Biometric Authentication System Using Human Gait, Swiss
Federal Institute of Technology, Zurich, 2002.
[31] Document on Biometrics, US Department of Defense, Biometric Management
Office (BMO).
[32] Richard Youmaran, Algorithms to Process and Measure Biometric Information
Content in Low Quality Face and Iris Images, Electrical and Computer

159

[33] D. D. Zhang, Automated Biometrics: Technologies and Systems, Kluwer


Academic Publishers, 2000.
[34] U. Uludag, S. Pankanti, S. Prabhakar, A. K. Jain, "Biometric cryptosystems: issues
and challenges", in the Proceedings of the IEEE, vol. 92, No. 6, pp. 948-960, 2004.
[35] A. Jain, L. Hong, R. Bolle, On-Line Fingerprint Verification, IEEE Trans. Pattern
Anal. Mach. Intell., vol. 19, pp. 302-314, 1997.
[36] D Maltoni, D Maio, AK Jain and S Prabhakar, Handbook of Fingerprint
Recognition, New York, Springer, pp. 301-307, 2003.
[37] A. K. Jain and U. Uludag, "Hiding biometric data", IEEE Transactions on Pattern
Analysis and Machine Intelligence, vol. 25, pp. 1494-1498, 2003.
[38] Karthik Nandakumar, Integration of Multiple Cues in Biometric Systems,
Computer Science and Engineering, Michigan State University, 2005.
[39] D. Maio, D. Maltoni, R. Cappelli, J. L. Wayman, and A. K. Jain, FVC 2002:
Fingerprint Verification Competition, Proc. International Conference on Pattern
Recognition (ICPR), pp. 744-747, Quebec City, Canada, Augus,t 2002.
[40] Gerik Alexander von Graevenitz, Introduction to Fingerprint Technology,
published in A&S International, Volume 53, Taipei, 2003, pp. 84-86
[41] (n.d.). Retrieved July 12, 2007, from Biometric Newsportal.Com,
http://www.biometricnewsportal.com.
[42] (n.d.). Material on Fingerprint, Retrieved June 28, 2007, from Individual
Biometrics, http://ctl.ncsc.dni.us/biomet%20web/BMFingerprint.html.
[43] M. E. Whitman, & H. J. Mattord, Management of Information Security, Boston,
MA: Thompson Course Technology, pp. 372, 2004.
[44] Dileep Kumar, Dr.Yeonseung Ryu, Dr.Dongseop Kwon, A Survey on Biometric
Fingerprints: The Cardless Payment System, IEEE ISBAST, April, 2008.

160

[45] Salil Prabhakar, Anil Jain, Fingerprint Identification, Material.


[46] (n.d.) New Zealand Police Fingerprint Sections,
http://www.police.govt.nz/service/fingerprint/, accessed 02 December, 2005.
[47] Dileep Kumar, Yeonseung Ryu, A Brief Introduction of Biometrics and
Fingerprint Payment Technology, International Journal of Advanced Science and
Technology, Vol. 4, March, 2009.
[48] (n.d) Biometric Education/Fingerprints, Rosistem Romania, accessed 02 December,
2005, http://www.barcode.ro/tutorials/biometrics/fingerprint.html.
[49] (n.d.) The Henry Classification System, International Biometric Group, 2003,
accessed 02 December 2005,
http://www.biometricgroup.com/Henry%20Fingerprint%20Classification.pdf.
[50] (n.d.) All About Fingerprints, Chapter 4 - The Techniques, accessed 02 December,
2005, http://www.crimelibrary.com/criminal_mind/forensics/fingerprints/4.html.
[51] Tsutomu Matsumoto, Hiroyuki Matsumoto, Koji Yamada, Satoshi Hoshino,
Impact of Artificial "Gummy" Fingers on Fingerprint Systems, Prepared for
Proceedings of SPIE Vol. #4677, Optical Security and Counterfeit Deterrence
Techniques IV, Thursday-Friday 24-25 January, 2002,
http://www.spie.org/Conferences/Programs/02/pw/confs/4677.html.
[52] Anil K. Jain, Arun Ross and Salil Prabhakar, An Introduction to Biometric
Recognition, in IEEE Transactions on Circuits and Systems for Video Technology,
Special Issue on Image-and Video-Based Biometrics, Vol. 14, No. 1, January, 2004.
[53] Radha, N.K., Connell, J.H., and Bolle, R.M., Enhancing Security and Privacy in
Biometric-Based Authentication System, IBM Systems Journal, vol. 40, no. 3, pp.
614-634, 2001.
[54] Ratha N.K., Chikkerur S., Connell J.H. and Bolle R.M., Generating cancellable
fingerprint templates, IEEE Transactions on Pattern Analysis Machine
Intelligence, vol. 29, no. 4, pp. 561572, 2007.

161

[55] Wayman, J.L., Fundamentals of Biometric Authentication Technologies, Proc. Of


CardTech/SecurTech Conference, CardTech/SecurTech, Chicago, USA, page 81101, 1999.
[56] (n.d.) Jain A.K., Prabhakar S., Ross A., Fingerprint Matching: Data Acquisition
and Performance Evaluation, MSU Technical Report, TR99-14, 1999.
http://web.cps.msu.edu/TR/MSUCPS:TR99-14.
[57] Woodward J.D. Jr., Webb K.W., Newton E.M., Bradley M., Rubenson D., Army
Biometric Applications: Identifying and Addressing Sociocultural Concerns,
ISBN: 0-8330-2985-1 MR-1237-A, @2001 RAND, 2001.
[58] Jing Yan, Continuous Authentication Based on Computer Security, Computer and
Systems Science, Department of Business Administration and Social Sciences,
Division of Information System Sciences, Lulea University of Technology, ISSN:
1653-0187, ISRN: LTU-PB-EX, 2009.
[59] Andy Adler, Biometric System Security, Handbook of Biometrics, Systems and
Computer Engineering, Carleton University, Ottawa, Springer, 2007.
[60] Woodward, J. D., Orlans, N. M., Higgins, P. T., Biometrics: Identity Assurance in
the Information Age, pp. 187, Osborne, Mcgraw Hill, 2003.
[61] Mansfield, A. J., & Wayman, J. L. Best Practices in Testing and Reporting
Performances of Biometric Devices, Version 2.01 (NPL Report CMSC 14/02). UK
:National Physical Laboratory, August 2002.
[62] L. OGorman, Fingerprint Verification, in Biometrics: Personal Identification in
a Networked Society, A. K. Jain, R. Bolle, and S. Pankanti (editors), Kluwer
Academic Publishers, pp. 43-64, 1999.
[63] Bolle, R. M., Pankanti, S., & Ratha, N. K., Evaluation Techniques for BiometricsBased Authentication systems (FRR), Yorktown Heights, NY, IBM, 2000.
[64] Ashbourn, J., Biometrics: Advanced Identity Verification, Springer, London,
2000.

162

[65] Arun A. R., Information Fusion in Fingerprint Authentication, Department of


Computer Science & Engineering, Michigan State University, 2003.
[66] J. Berry and D. A. Stoney, The history and development of fingerprinting, in
Advances in Fingerprint Technology (H. C. Lee and R. Gaensslen, eds.), pp. 140,
Florida: CRC Press, 2nd ed., 2001.
[67] Federal Bureau of Investigation, The Science of Fingerprints: Classification and
Uses, Washington, D.C., 1984.
[68] F. Pernus, S. Kovacic, and L. Gyergyek, Minutiae-based fingerprint recognition,
in Proceedings of the Fifth International Conference on Pattern Recognition, pp.
13801382, 1980.
[69] B. Mehtre and M. Murthy, A minutiae based fingerprint identification system, in
Proceedings of the Second International Conference on Advances in Pattern
Recognition and Digital Techniques, (Calcutta, India), January 1986.
[70] (n.d) Chris Roberts, Biometric Technologies Fingerprints, February 2006.
[71] A. Sibbald, Method and apparatus for fingerprint characterization and
recognition using auto-correlation pattern, US Patent 5633947, 1994.
[72] A. Stoianaov, C. Soutar, and A. Graham, High-speed fingerprint verification using
an optical correlator, in Proceedings SPIE, vol. 3386, pp. 242252, 1998.
[73] D. Roberge, C. Soutar, and B. Vijaya Kumar, High-speed fingerprint verification
using an optical correlator, in Proceedings SPIE, vol. 3386, pp. 123133, 1998.
[74] A. M. Bazen, G. T. B. Verwaaijen, S. H. Gerez, L. P. J. Veelenturf, and B. J. Van
der Zwaag, A correlation-based fingerprint verification system, in Proceedings of
the ProRISC 2000 Workshop on Circuits, Systems and Signal Processing,
(Veldhoven, Netherlands), Nov 2000.
[75] N. Ratha, K. Karu, S. Chen, and A. K. Jain, A Real-Time Matching System for
Large fingerprint Databases, IEEE Trans. Pattern Anal. and Machine Intell., Vol.
18, No. 8, pp. 799-813, 1996.

163

[76] A. K. Jain, L. Hong, S. Pankanti, and Ruud Bolle, An Identity Authentication


System Using Fingerprints, Proceedings of the IEEE, Vol. 85, No. 9, pp. 13651388, 1997.
[77] Salil Prabhakar, Fingerprint Classification and Matching Using a Filterbank,
Computer Science & Engineering, Michigan State University, 2001.
[78] Anil K. Jain, Biometric Authentication: How do I know who you are?, Computer
Science & Engineering, Michigan State University, http://biometrics/cse/msu.edu.
[79] (n.d.) Miroslav Baca, Marko Antoni, Upgrading Existing Biometric Security
Systems by Implementing the Concept of Cancellable Biometrics, Retrieved July
15, 2007.
[80] Andrew Teoh Beng Jin, Cancellable Biometrics and Multispace Random
Projections, Proceedings of the 2006 Conference on Computer Vision and Pattern
Recognition Workshop (CVPRW06), published by IEEE Computer Society, 07695-2646-2/06, 2006.
[81] Michael Braithwaite, Ulf Cahn von Seelen, James Cambier, John Daugman, Randy
Glass, Russ Moore, Ian Scott, Application-Specific Biometric Templates, Auto
ID, 2002.
[82] Pim Tuyls, Jasper Goseling, Capacity and Examples of Template-Protecting
Biometric Authentication Systems, ECCV Workshop BioAW, Springer, 2004.
[83] Terrance E. Boult, Robert Woodworth, Privacy and Security Enhancements in
Biometrics, Advances in Biometrics Sensors, Algorithms and Systems, Springer,
2007.
[84] Ratha N., Connell J., Bolle R.M. and Chikkerur S., Cancellable Biometrics: A
Case Study in Fingerprints, in Proceedings of International Conference on Pattern
Recognition (18th), vol. 4, pp. 370373, IEEE Computer Society, 2006.
[85] Lu Leng, Jiashu Zhang, Muhammad Khurram Khan, Xi Chen1, Ming Ji, Khaled
Alghathbar, "Cancellable PalmCode generated from randomized Gabor filters for
palmprint template protection", Scientific Research and Essays Vol. 6, No: 4, pp.
784-792, 2011.
164

[86] K. Nilsson, Symmetry Filters Applied to Fingerprints, Chalmers University of


Technology, Sweden, 2005.
[87] S.Chikkerur, N. K. Ratha, Impact of singular point detection on fingerprint
matching performance, In IEEE AUTOID 05, 2005.
[88] G. Wolberg, "Image Morphing: A Survey", The Visual Computer, 14, pp. 360372,
1998.
[89] T. Beier, S. Neely, "Feature-Based Image Metamorphosis", Proceedings of
SIGGRAPH, ACM, New York, pp. 3542, 1992.
[90] Sergey Tulyakov, Faisal Farooq, Venu Govindaraju, Symmetric Hash Functions
for Fingerprint Minutiae, Springer, 2005.
[91] Kar-Ann Toh, Chulhan Lee, Jeung-Yoon Choi, Jaihie Kim, Performance Based
Revocable Biometrics, Biometrics Engineering Research Center, School of
Electrical and Electronic Engineering, Yonsei University, Seoul, Korea, 2007.
[92] Ying-Han Pang, Andrew Teoh Beng Jin, David Ngo Chek Ling, Palmprint based
Cancellable Biometric Authentication System, International Journal of Signal
Processing, Volume 1, Number 2, 2004.
[93] (n.d) White Paper by Motorola on An introduction to biometrics, August 2006,
http://www.motorola.com/Biometrics
[94] (n.d.) Alice Osborn, Biometrics history -- looking at biometric technologies from
the past to the present, 2005,
http://www.video-surveillance-guide.com/biometrics-history.htm
[95] (n.d.) National Science and Technology Council (NSTC) Subcommittee on
Biometrics, 2006, http://www.biometricscatalog.org/NSTCSubcommittee.

[96] Y. J. Chang, Z. Wende, and T. Chen, "Biometrics-based cryptographic key
generation", IEEE International Conference on Multimedia and Expo, vol. 3, pp.
2203-2206, 2004.
[97] R. Ang, R. Safavi-Naini, L. McAven, "Cancellable key-based fingerprint
templates", in Proceedings of ACISP, pp. 242-252, 2005.

[98] T. Connie, A. Teoh, M. Goh, and D. Ngo, "Palmhashing: A novel approach for
cancellable biometrics", Information Processing Letters, vol. 93, no. 1, pp. 1-5,
2005.
[99] F. Hao, R. Anderson, and J. Daugman, "Combining Crypto with Biometrics
Effectively", IEEE Transactions on Computers, vol. 55, pp. 1081-1088, 2006.
[100] M. F. Santos, J. F. Aguilar, and J. O. Garcia, "Cryptographic key generation using
handwritten signature", Proceedings of SPIE, vol. 6202, pp. 225-231, Orlando, Fla,
USA, Apr. 2006.
[101] G. Zheng, W. Li, and C. Zhan, "Cryptographic key generation from biometric data
using lattice mapping", Proceedings of the 18th International Conference on Pattern
Recognition, vol. 4, pp. 513-516, 2006.
[102] Andrew Teoh Beng Jin, Tee Connie, "Remarks on Bio-Hashing based cancellable
biometrics in verification system", Neurocomputing, Vol. 69, No. 16-18, pp.
2461-2464, 2006.
[103] A. B. Teoh, and C. T. Yuang, "Cancellable biometrics realization with multispace
random projections", IEEE Transactions on Systems, Man, and Cybernetics, vol. 37,
no. 5, pp. 1096-1106, 2007.
[104] J. G. Jo, J. W. Seo, and H. W. Lee, "Biometric digital signature key generation and
cryptography communication based on fingerprint", First Annual International
Workshop 2007, LNCS 4613, pp. 38-49, Springer-Verlag, 2007.
[105] B. Chen and V. Chandran, "Biometric Based Cryptographic Key Generation from
Faces", Proceedings of the 9th Biennial Conference of the Australian Pattern
Recognition Society on Digital Image Computing: Techniques and Applications, pp.
394-401, 2007.
[106] E. Maiorana, P. Campisi, J. O. Garcia, and A. Neri, "Cancellable biometrics for
HMM-based signature recognition", 2nd IEEE International Conference on
Biometrics: Theory, Applications and Systems, pp. 1-6, 2008.

[107] A. Beng Jin Teoh, Kar-Ann Toh, "Secure biometric-key generation with
biometric helper", 3rd IEEE Conference on Industrial Electronics and Applications,
pp. 2145-2150, 2008.
[108] Sanaul Hoque, Michael Fairhurst, Gareth Howells, "Evaluating Biometric
Encryption Key Generation Using Handwritten Signatures", Bio-inspired, Learning
and Intelligent Systems for Security, pp. 17-22, 2008.
[109] Andrew B. J. Teoh, Yip Wai Kuan, Sangyoun Lee, "Cancellable biometrics and
annotations on BioHash", Pattern Recognition, Vol. 41, No. 6, pp. 2034-2044,
2008.
[110]Huijuan Yang, Xudong Jiang, Alex C. Kot, "Generating secure cancellable
fingerprint templates using local and global features", 2nd IEEE International
Conference on Computer Science and Information Technology, 2009.
[111] B. Prasanalakshmi, A. Kannammal, "A secure cryptosystem from palm vein
biometrics", ACM International Conference Proceeding Series, Seoul, Korea, Vol.
403, pp. 1401-1405, 2009.
[112] H. A. Garcia-Baleon, V. Alarcon-Aquino, O. Starostenko, "K-Medoids-Based
Random Biometric Pattern for Cryptographic Key Generation", Proceedings of the
14th Iberoamerican Conference on Pattern Recognition: Progress in Pattern
Recognition, Image Analysis, Computer Vision, and Applications, Vol. 5856, pp.
85-94, 2009.
[113]A. K. Jain, K. Nandakumar and A. Nagar, "Biometric Template Security",
EURASIP Journal on Advances in Signal Processing, Special Issue on Biometrics
(ASPPRMB), vol. 2008, pp.1-20, Jan. 2008.
[114] Khan M.K., Xie L., Zhang J.S., "Chaos and NDFT-based concealing of
fingerprint-biometric data into audio signals for trustworthy person authentication",
Digital Signal Processing, Vol. 20, No. 1, pp. 179-190, 2010.
[115] Pang, Y.H., Teoh, A.B.J., Ngo, D.C.L., "Palmprint based cancellable biometric
authentication system", International Journal of Signal Processing, Vol. 1, pp.
98-104, 2005.

[116] Teoh, A.B.J., Ngo, D.C.L., "Cancellable biometrics featuring with tokenized
random number", Pattern Recognition Letters, 2004.
[117] Teoh, A.B.J., Ngo, D.C.L., Goh, A., "BioHashing: two factor authentication
featuring fingerprint data and tokenised random number", Pattern Recognition,
Vol. 37, pp. 2245-2255, 2004.
[118] A. Juels and M. Sudan, "A fuzzy vault scheme", in Proc. IEEE Int. Symp. Information
Theory, Lausanne, Switzerland, p. 408, 2002.
[119] E. Srinivasa Reddy, Ramesh Babu, "Performance of Iris Based Hard Fuzzy Vault",
IJCSNS International Journal of Computer Science and Network Security, Vol. 8,
No. 1, January 2008.
[120] R. Seshadri, T. Raghu Trivedi, "Efficient Cryptographic Key Generation using
Biometrics", Int. J. Comp. Tech. Appl., Vol. 2, No. 1, pp. 183-187, 2011.
[121]F. Hao, R. Anderson, J. Daugman, "Combining Cryptography and Biometrics",
Technical Report No. 640, University of Cambridge Computer Laboratory, 2000.
[122] K. Saraswathi and R. Balasubramaniam, "Bio-Cryptosystems for Authentication and
Network Security - A Survey", Global Journal of Computer Science and Technology,
Vol. 10, No. 3, p. 12, 2010.
[123]Qiang Wu, Fatima Merchant, Kenneth Castleman, "Microscope image processing",
2008.
[124]Jianfeng Li, Jinhuan Shi, Hongzhi Zhang, Yanlai Li, Naimin Li and Changming
Liu, "Tongue Image Texture Segmentation Based on Gabor Filter Plus Normalized
Cut", Medical Biometrics, Lecture Notes in Computer Science, Volume 6165, pp:
115-125, 2010.
[125] Scot E. Umbaugh, "Computer Vision and Image Processing", Prentice Hall, NJ,
ISBN 0-13-264599-8, 1998.
[126] R. C. Gonzalez, R. E. Woods, "Digital Image Processing", 2nd Edition, Prentice
Hall, 2002.

[127] Trier O. and Jain A.K., "Goal-directed evaluation of binarization methods", IEEE
Transactions on Pattern Analysis and Machine Intelligence, Vol. 17, No. 12, pp.
1191-1201, 1995.
[128] Luping Ji, Zhang Yi, Lifeng Shang, Xiaorong Pu, "Binary fingerprint image
thinning using template-based PCNNs", IEEE Transactions on Systems, Man, and
Cybernetics, Part B, Vol. 37, No. 5, pp. 1407-1413, 2007.
[129]Kyoung Min Kim, Buhm Lee, Nam Sup Choi, Gwan Hee Kang, Joong Jo Park and
Ching Y. Suen, "Gray-Scale Thinning Algorithm Using Local Min/Max
Operations", Document Analysis Systems VII Lecture Notes in Computer Science,
Volume 3872, pp: 62-70, 2006.
[130] Svante Seleborg, "About AES - Advanced Encryption Standard", 2007.
[131]Banshidhar Majhi, Pankaj Kumar Sa, "FLANN-based adaptive threshold selection
for detection of impulsive noise in images", AEU - International Journal of
Electronics and Communications, Volume 61, Issue 7, pp: 478-484, 2007.
[132] Derek Bradley, Gerhard Roth, "Adaptive Thresholding using the Integral Image",
Journal of Graphics, GPU, and Game Tools, Vol. 12, Issue 2, pp. 13-21, 2008.
[133] Mark Nixon, Alberto S. Aguado, "Feature Extraction & Image Processing", 424
pages, 2008.
[134] Benesty, Jacob; Sondhi, M. M.; Huang, Yiteng (Eds.), "Springer Handbook of
Speech Processing", 456 pages, 2008.
[135] Jonathan Le Roux, Emmanuel Vincent, Yuu Mizuno, Hirokazu Kameoka,
Nobutaka Ono, Shigeki Sagayama, "Consistent Wiener filtering: generalized
time-frequency masking respecting spectrogram consistency", in Proceedings of the
9th International Conference on Latent Variable Analysis and Signal Separation, 2010.
[136](n.d) Material on Fingerprint Verification Competition, accessed in 2005, from,
http://bias.csr.unibo.it/fvc2004/default.asp
[137](n.d) Material on The Fingerprint Vendor Technology Evaluation (FpVTE),
accessed in 2005, from, http://fpvte.nist.gov

[138] R.M. Bolle, J.H. Connel, N.K. Ratha, "Biometrics perils and patches", Pattern
Recognition, Vol. 35, pp. 2727-2738, 2002.
[139] Uludag U., Pankanti S., Prabhakar S. and Jain A.K., "Biometric cryptosystems:
Issues and challenges", in Proceedings of the IEEE, Vol. 92, No. 6, pp. 948-960,
2004.
[140] Kong A., Cheung K., Zhang D., Kamel M. and You J., "An analysis of Bio-Hashing
and its variants", Pattern Recognition, Vol. 39, No. 7, pp. 1359-1368, 2006.
[141] Sakata K., Maeda T., Matsushita M., Sasakawa K. and Tamaki H., "Fingerprint
Authentication Based on Matching Scores with Other Data", in Proc. Int. Conf. on
Biometrics, LNCS 3832, pp. 280-286, 2006.
[142] Andrew B. J. Teoh, Yip Wai Kuan, Sangyoun Lee, "Cancellable biometrics and
annotations on Bio-Hash", Pattern Recognition, Vol. 41, No. 6, June 2008.
[143] Sergey Tulyakov, Faisal Farooq, Praveer Mansukhani, Venu Govindaraju,
"Symmetric hash functions for secure fingerprint biometric systems", Pattern
Recognition Letters, Vol. 28, pp. 2427-2436, 2007.
[144] Germain, R., Califano, A., Colville, S., "Fingerprint matching using transformation
parameter clustering", IEEE Computational Science and Engineering, Vol. 4, No. 4,
pp. 42-49, 1997.
[145]Jea, T.-Y., Chavan, V.S., Govindaraju, V., Schneider, J.K., "Security and matching
of partial fingerprint recognition systems", In: SPIE Defense and Security
Symposium, 2004.
[146] K. Nilsson and J. Bigun, "Localization of Corresponding Points in Fingerprints by
Complex Filtering", Pattern Recognition Letters, Vol. 24, No. 13, pp. 2135-2144,
Sept. 2003.
[147] P. Ramo, M. Tico, V. Onnia, and J. Saarinen, "Optimized Singular Point Detection
Algorithm for Fingerprint Images", IEEE Transactions on Image Processing, Vol. 3,
pp. 242-245, 2001.

[148] Davide Maltoni, Dario Maio, Anil K. Jain, Salil Prabhakar, "Handbook of
Fingerprint Recognition", 496 pages, Springer, 2009.
[149] (n.d.) Thomas Yeo, Wee Peng Tay, Ying Yu Tai, Image Systems Engineering
Program, Student project, Stanford University,
http://scien.stanford.edu/class/ee368/projects2001/
[150] (n.d.) Gabor Filter, from http://en.wikipedia.org/wiki/Gabor_filter.
[151] L.C. Jain, U. Halici, I. Hayashi, S.B. Lee and S. Tsutsui, "Intelligent biometric
techniques in fingerprint and face recognition", The CRC International Series on
Computational Intelligence, CRC Press, 1999.
[152] D. Maio, D. Maltoni, "Direct gray-scale minutiae detection in fingerprints", IEEE
Trans. Pattern Anal. and Machine Intell., Vol. 19, No. 1, pp. 27-40, 1997.
[153] Lam, L., Seong-Whan Lee, and Ching Y. Suen, "Thinning Methodologies - A
Comprehensive Survey", IEEE Transactions on Pattern Analysis and Machine
Intelligence, Vol. 14, No. 9, pp. 869-885, September 1992.
[154]Joseph W. Goodman, Brian A. Wandell, Image Systems Engineering Program,
Stanford University, International Conference on Image Processing (ICIP), Vol: 1,
pp: 435-438, 1996.
[155] Amir Hussain, Stefano Squartini, and Francesco Piazza, "Novel Sub-band Adaptive
Systems incorporating Wiener Filtering for Binaural Speech Enhancement", ISCA
Tutorial and Research Workshop on Non-Linear Speech Processing (NOLISP),
Barcelona, April 19-22, 2005.
[156]Saeed V. Vaseghi, "Advanced signal processing and digital noise reduction
(Paperback)", John Wiley & Sons Inc, pp: 416, July 1996.
[157] D. Gnanadurai and V. Sadasivam, "An Efficient Adaptive Thresholding Technique
for Wavelet Based Image Denoising", International Journal of Signal Processing,
Vol. 2, No. 2, 2005.

[158]Yahia S. Halabi, Zaid SASA, Faris Hamdan, Khaled Haj Yousef, "Modelling
Adaptive Degraded Document Image Binarization and Optical Character System",
European Journal of Scientific Research, Vol. 28, No.1, pp.14-32, 2009.
[159] (n.d.) Material on RSA Factoring Challenge,
http://en.wikipedia.org/wiki/RSA_Factoring_Challenge.
[160] Sunil V. K. Gaddam and Manohar Lal, "Efficient Cancellable Biometric Key
Generation Scheme for Cryptography", International Journal of Network Security,
Vol. 10, No. 3, pp. 223-231, 2010.
[161]Sunil V. K. Gaddam and Manohar Lal, "Development of Bio-Crypto Key from
Fingerprints Using Cancellable Templates", International Journal on Computer
Science and Engineering (IJCSE), Vol. 3, No. 2, pp. 775-783, 2011.
[162] D. Maio, D. Maltoni, R. Cappelli, J.L. Wayman, A.K. Jain, "FVC2002: Second
Fingerprint Verification Competition", in Proceedings of the 16th International
Conference on Pattern Recognition, vol. 3, pp. 811-814, 2002.
[163] Julien Bringer, Herve Chabanne, Bruno Kindarji, "Anonymous Identification with
Cancellable Biometrics", in the Proceedings of the 6th International Symposium on
Image and Signal Processing and Analysis, 2009.
