DEDICATION
This is dedicated to my son, Samuel Abhishek, who is now with the Lord and
whom I loved so much,
and also to my Mom, Dad & Wife,
who offered me unconditional love and support throughout the course of this thesis.
Sunil V. K. Gaddam
ACKNOWLEDGEMENTS
For the LORD gives wisdom,
and from his mouth come knowledge and understanding.
Proverbs 2:6
Trust in the Lord with all your heart, and lean not on your own understanding; In
all your ways acknowledge Him, And He shall direct your paths.
Proverbs 3:5-6
To our God and Father be glory for ever and ever. Amen.
Philippians 4:20
In the last few years, I have learnt the importance of relying on Jesus
completely. I thank you, Lord, for showing me the way.
Time flies, and I shall soon leave Delhi, a place I shall cherish long after my
postgraduation. This year has been pretty tough, and I would not have survived with
what little sanity I possess intact without the help of the many people who have been so
kind and helpful to me during all these years; all of you have made a mark on my life!
First and foremost, I want to express my greatest gratitude to my thesis supervisor
Prof. Dr. Manohar Lal, Director, School of Computer and Information Sciences
(SOCIS), IGNOU, for accepting me as a doctoral student. He is such a wonderful
advisor, mentor, and motivator. Under his guidance, I have learned a lot in different
aspects of conducting research, including finding a good research problem, writing a
convincing technical paper, and prioritizing different research tasks, to name a few.
I really thank him for his able guidance and supervision, for his valuable
suggestions, comments and corrections of the manuscript, and also for his constant
support and recognition.
I am also grateful to Prof. Dr. V. N. Rajasekharan Pillai, Vice-Chancellor, Indira
Gandhi National Open University (IGNOU), for his support and encouragement in
submission of this thesis.
Many thanks go also to the members of the trust of Bharat Institute of Technology
(BIT), Meerut, for their support and encouragement.
I would also like to thank all of the staff members at the SOCIS office of IGNOU
for all of their help and support.
Further, I would like to thank my friends Ms. J. Smita and Mr. Binu D. for their
technical support and interesting discussions. I also acknowledge the moral support
of my friends & colleagues, especially Mr. Kshitiz Saxena and Ms. Mahima Jain.
Finally, I owe a thousand thanks to my parents, Staya Raj & Saramma, uncle
Prasanna Kumar, brother Rajesh, sister Baby, wife Prabha, and children
Benny, Vinny & Blessy, who accompanied me warmly through all these years, for
their incredible love, prayers, enthusiasm and encouragement; and to fellow believers
for their timely counsel and prayer.
Sunil V. K. Gaddam
TABLE OF CONTENTS

Declaration                                    (ii)
Certificate                                    (iii)
Dedication                                     (iv)
Acknowledgements                               (v)
Table of Contents                              (vi)-(xi)
Abstract                                       (xii)-(xiii)
List of Publications                           (xiv)-(xvii)
List of Abbreviations/Acronyms                 (xviii)
List of Figures                                (xix)-(xxv)
List of Tables                                 (xxv)

Chapter 1: Introduction: Network Security      1-19
    1.6.1 Cryptography                         10
    1.6.1.3 Hashing                            11
    1.6.2 Steganography                        13
Chapter 2: Biometric System Security           20-46
    2.1 Introduction                           20
    2.11 Summary                               46
Chapter 3                                      47-60
    3.1 Introduction                           47
    3.6 Registration                           56
    3.7 Transformations                        56
    3.9 Summary                                60
Chapter 4                                      61-76
    4.1 Introduction                           61
    4.7 Summary                                76
Chapter 5                                      77-90
    5.1 Introduction                           77
    5.4.2 Filters                              81
    5.4.5.1 Binarization                       87
    5.6 Summary                                90
Chapter 6                                      91-99
    6.1 Introduction                           91
Chapter 7                                      100-113
    7.1 Introduction                           100
    7.3.1.1 Pre-processing                     102
    7.4.1.1 Pre-processing                     108
Chapter 8                                      114-141
    8.1 Introduction                           114
Chapter 9                                      142-145
    9.1 Introduction                           142
Chapter 10                                     146-147
    10.1 Conclusion                            146
Bibliography (References)                      157-172
ABSTRACT
Networks make information available from one corner of the world to
another almost instantaneously. However, the growing use of the
Internet, a network of networks, by individuals and organizations has
presented formidable problems of identity fraud, organised crime,
money laundering, theft of intellectual property and a myriad of cyber
crimes. The world is witnessing attempts at hacking of crucial
information systems, such as those of defence installations, including the
Pentagon of the USA, which may endanger the security of even a nation.
Since the incidents of September 11, 2001, and even earlier, security has
been at the forefront of American and other nations' concerns, and with
the growing importance of corporate data privacy, the need and demand
for biometric physical security solutions has never been higher. Hence,
the study of methods of analysis of the security requirements and needs
of such systems, and their consequent design, implementation and
deployment, is the primary scope of the discipline named Network Security.
Biometric security systems have a number of problems because
the biometric data of a person is generally stored in the
system itself. The problems arise especially when that data is
compromised. Standard password-based security systems have the
ability to cancel a compromised password and reissue a new
one, but biometrics cannot be changed or cancelled. Thus, the
advantage of biometrics-based security becomes a disadvantage in
such situations. The concept of cancellable biometrics can upgrade
an existing biometric security system so that it gains the advantages
of password-based security systems while not losing its inherent
superiority. In this thesis, we discuss the problems with existing
biometric technologies and then show that a cancellable biometric
system is one of the important solutions for the security of computing
and information systems.
non-invertibility. Finally, the conclusion is made through
LIST OF PUBLICATIONS
Publications in Peer Reviewed International Journals
1. Sunil V.K. Gaddam, Manohar Lal, Development of Bio-Crypto Key from
Fingerprints Using Cancellable Templates, accepted for publication (ISSN: 0975-3397)
in the International Journal on Computer Science and Engineering
(IJCSE), Vol. 3, No. 2, February 2011, pp. 797-805.
2. Sunil V.K. Gaddam, Manohar Lal, Efficient Cancellable Biometric Key Generation
Scheme for Cryptography, published in the International Journal of
Network Security (IJNS), Vol. 11, No. 2, pp. 61-69, September 2010.
3. Surendra Rahamatkar, Sunil Vijaya K Gaddam & S. Qamar, Mechanism for
Termination Detection in Wireless Mobile Adhoc Network, published in
International Journal of Hybrid Computational Intelligence, 1(1), January
2008, pp. 79-89.
4. Surendra Rahamatkar, Sunil Vijaya K Gaddam & S. Qamar, Application and Use
of Object Model for Version & Configuration Control in Distributed S.D.Es,
published in International Journal of Hybrid Computational Intelligence,
1(1), January 2008, pp. 91-102.
Publications in the Proceedings of International Conferences
5. Sunil V.K. Gaddam, Dr. Manohar Lal, A New Approach for Formulating
Randomized Cryptographic Key Generation Using Cancellable
Biometrics, published in the proceedings of the 2010 International
Conference on Security and Management (SAM'10), a part of
WORLDCOMP'10, held during 12-15 July 2010 in Las Vegas,
Nevada, USA.
6. Sunil V.K. Gaddam, Dr. Manohar Lal, An Effective Method for Revocable
Biometric Key Generation, published in the proceedings of The 2009
International Conference on Security and Management (SAM'09), a part
of WORLDCOMP'09, held during 13-16 July 2009 in Las Vegas,
Nevada, USA.
7. Sunil V.K. Gaddam, Dr. Manohar Lal, A Review on Next Generation
Networks: Convergence and QoS, in the proceedings of The 2009
International Conference on Wireless Networks (ICWN'09), a part of
WORLDCOMP'09, held during 13-16 July 2009 in Las Vegas, Nevada,
USA.
8. Sunil V.K. Gaddam, Dr. Manohar Lal, Dr. Rajesh C. Phoha, New-Fangled
Approach for Cancelable Biometric Key Generation, published in the
proceedings of the International Conference on Computing,
Communicating and Networking (ICCCN-2008), during 18-20 December
2008, organised by Chettinad College of Engineering & Technology, Karur,
sponsored by IEEE ED Society India, Tamilnadu, India.
9. Surendra Rahamatkar, Sunil Vijaya K Gaddam, Samuel Qamar, WiFi and NET
SPOT: Effect of Wireless LAN Technology in MIET Campus, in the proceedings of
International Conference on Emerging Technologies & Applications in
19. Sunil V.K. Gaddam, Mohit Kumar, Re-conceptualisation of the Teaching and
Learning Process in the Contemporary Digital Age, in the proceedings of
National Seminar on Total Quality Management in Pedagogy (TQM_P), Sponsored
by AICTE on 27th May 2008 at Meerut Institute of Engineering & Technology
(MIET), Meerut 250 005, UP.
20. Sunil V.K. Gaddam, Surendra Rahamatkar, Samuel Qamar, Comparison of
Termination Detection Scheme in Mobile Distributed Network, in the proceedings
of National Conference on Methods and Models in Computing (NCM2C 2007),
during 13-14 December 2007, conducted by SC&SS, Jawaharlal Nehru
University, New Delhi.
21. Sunil V.K. Gaddam, Surendra Rahamatkar, Dharmendra Sharma, Pradeep Pant,
Miet-Net-Spot Wifi: Emerging Wlan Technology In Miet Campus, in the
proceedings of the National Conference on "Emerging Technologies in Computer
Science (ETCS 2007)", during September 22-23, 2007 conducted by Meerut
Institute of Engineering & Technology, Meerut, Uttar Pradesh.
22. Sunil V.K. Gaddam, Surendra Rahamatkar, P.K. Bharti, Futuristic Developments
in Communication Paradigm in Distributed and Ubiquitous Computing, in the
proceedings of the National Conference on "Emerging Technologies in Computer
Science (ETCS 2007)", during September 22-23, 2007 conducted by Meerut
Institute of Engineering & Technology, Meerut, Uttar Pradesh.
23. Sunil V.K. Gaddam, Vijaya Lakshmi, Design of Framework to Prevent the
Unauthorized Administrative or Super User Transactions, in the proceedings of
the National Conference on "Emerging Technologies MKCE-Confluence '07", held on
March 15, 2007, conducted by M. Kumarsamy College of Engineering,
Thalavapalayam, Karur, Tamil Nadu.
24. Sunil V.K. Gaddam, K. Sreekanth, Implementation of Personal Number Service
using VoIP, in the proceedings of the National Conference on "TechnoZion 07",
during January 26-27, 2007, conducted by National Institute of Technology (NIT),
Warangal, A.P.
25. Sunil V.K. Gaddam, Arun Kumar, Data Security and Authentication, in the
proceedings of the National Conference on "Recent Trends in Electronics and
Communications - NCRTEC-2007", on January 25, 2007, conducted by G. Pulla
Reddy Engineering College, Kurnool, A.P.
26. Sunil Vijaya Kumar G., Prof. Manohar Lal, Dr. Rajesh C. Phoha, "Network
Management - A New Paradigm: Part I", in the proceedings of IEEE ACE 2002,
organised by IEEE Calcutta Section at Science City, Kolkata, in December 2002.
27. Sunil Vijaya Kumar G., Prof. Manohar Lal, Dr. Rajesh C. Phoha, "Network
Management - A New Paradigm: Part II", in the proceedings of IEEE ACE
2002, organised by IEEE Calcutta Section at Science City, Kolkata, in December
2002.
28. G. Sunil Vijaya Kumar, Dr. Rajesh C. Phoha, Prof. Manohar Lal, "Internet
Management", in the proceedings of the 15th National Convention of Computer
Engineers on E-Governance: Challenges and Prospects (ego 2000), organised by
The Institution of Engineers (India), Kerala State Centre, Trivandrum, Kerala,
October 2000.
29. G. Sunil Vijaya Kumar, Prof. Manohar Lal, "OSI Management", in the proceedings
of the 15th National Convention of Computer Engineers on E-Governance: Challenges
and Prospects (ego 2000), organised by The Institution of Engineers (India),
Kerala State Centre, Trivandrum, Kerala, October 2000.
30. G. Sunil Vijaya Kumar, Prof. P.C. Saxena, "Management of Telecommunications
Management (TMN)", in the proceedings of the 15th National Convention of Computer
Engineers on E-Governance: Challenges and Prospects (ego 2000), organised by
The Institution of Engineers (India), Kerala State Centre, Trivandrum, Kerala,
October 2000.
31. G. Sunil Vijaya Kumar, Dr. Rajesh C. Phoha, Prof. Manohar Lal, "Integrated
Network Management Architecture", in the proceedings of the National
Conference on Quality, Reliability and Management (NCQRM 2000), September
2000
32. G. Sunil Vijaya Kumar, Dr. Rajesh C. Phoha, Prof. Manohar Lal, "Integrated Web-Based
Network Management Architecture", in the proceedings of the National
Conference on Quality, Reliability and Management (NCQRM 2000), conducted
by Priyadarshini Engineering College, Vaniyambadi, T.N., September 2000.
33. G. Sunil Vijay Kumar, "INMP: A new paradigm proposal to avoid loopholes in
SNMP Client Server", in the Proceedings of an all India seminar on IT Application
in Engineering and Technology, conducted by Institution of Engineers (India) &
KSRM College of Engineering, Cuddapah, A.P., March 2000.
34. G. Sunil Vijay Kumar, "Web-Based Network Management: An Integrated
Management Solution", in the Proceedings of an all India seminar on IT
Application in Engineering and Technology, conducted by Institution of Engineers
(India) & KSRM College of Engineering, Cuddapah, A.P., March 2000.
35. G. Sunil Vijay Kumar, M. Srinivasulu, "Design of Data warehousing for Business
Applications", in the Proceedings of an all India seminar on IT Application in
Engineering and Technology, conducted by Institution of Engineers (India) &
KSRM College of Engineering, Cuddapah, A.P., March 2000.
LIST OF FIGURES

Figure No.                                     Page No.
1.1                                            8
1.2                                            8
1.3                                            9
1.4                                            9
1.5                                            10
1.6                                            10
1.7                                            12
2.1                                            26
2.2   Enrollment process                       27
2.3                                            27
2.4                                            29
2.5                                            30
2.6                                            35
2.7   A sample fingerprint                     38
2.8                                            39
2.9                                            40
2.10                                           41
2.11                                           44
2.12                                           45
3.1                                            50
3.2                                            51
3.3                                            55
6.1, 6.2, 6.3
7.1 (a)-(b)                                    102
7.2                                            104
8.1 (a)-(d)                                    116
8.2 (a)-(g), 8.3 (a)-(g)                       119
8.4 (a)-(g), 8.5 (a)-(g)                       120
8.6 (a)-(g)                                    121
8.7 (a)-(g), 8.8 (a)-(g)                       122
8.9 (a)-(g)                                    123
8.10 (a)-(e), 8.11 (a)-(e)                     124
8.12 (a)-(e), 8.13 (a)-(e)                     125
8.14 (a)-(d), 8.15 (a)-(d)                     126
8.16 (a)-(d), 8.17 (a)-(d)                     127
8.18                                           128
8.19, 8.20                                     129
8.21, 8.22                                     130
8.23, 8.24                                     131
8.25                                           132
8.26, 8.27                                     133
8.28, 8.29                                     134
8.30                                           135
8.31, 8.32                                     136
8.33                                           137
8.34                                           138
8.35                                           139
8.36                                           140
8.37                                           141
LIST OF TABLES

Table No.        Page No.
2.1              22
2.2              32
4.1              68-70
8.1              117
CHAPTER 1
INTRODUCTION: NETWORK SECURITY
"To competently perform rectifying security service, two critical incident response
elements are necessary: information and organization." - Robert E. Davis
A Computer Network is an interconnected collection of autonomous computers which
use a well-defined, mutually agreed set of rules and conventions known as protocols for
interacting with one another meaningfully in the form of messages and for allowing
resource-sharing, preferably in a predictable and controllable manner [1]. Networks
make information available from one corner of the world to another almost
instantaneously. However, the growing use of the Internet, a network of networks, by
individuals and organizations has presented formidable problems of identity fraud,
organised crime, money laundering, theft of intellectual property and a myriad of
cybercrimes. The world is witnessing attempts at hacking of crucial information systems,
such as those of defence installations, including the Pentagon of the USA, which may endanger the
security of even a nation. Since the incidents of September 11, 2001, and even earlier,
security has been at the forefront of American and other nations' concerns. Hence, the
study of methods of analysis of the security requirements and needs of such systems, and their
consequent design, implementation and deployment, is the primary scope of the discipline
named Network Security. Network security includes all the issues related to the
security of internets, including the Internet, which is a single huge global network of
networks.
In this chapter, we briefly discuss network security, security goals and attacks,
types of network threats, security mechanisms, security services and security techniques.
Thereafter, we discuss identity proof and authentication mechanisms. Finally, we
describe the aims and objectives of this thesis, its original contributions, and the
outline of the thesis, in which we give the gist of each chapter very briefly.
Access Control & Availability: Ensuring that authorized users have access to
information and associated assets when required, i.e., services must be accessible
and available to intended users.
Sabotage: Any action that aims to reduce the availability and/or correct
functioning of services or systems.
service can be realized with the help of cryptographic algorithms and protocols as well
as with conventional means: one may keep an electronic document confidential by
storing it on the disk in an encrypted format, as well as by locking away the disk in a safe.
But to make information more secure, usually a combination of cryptographic and other
means is used.
The field of information security has started to evolve in response to the rapid growth of
the Internet and the evolving threats to it, and is becoming an important discipline with a sound
theoretical basis. The discipline is divided into five supporting pillars [9]:
Identification and authentication is listed first because it is crucial to the entire
process and facilitates the other four pillars of security. If an individual's identity is
unknown, access cannot be authorized, since system confidentiality can be neither enforced
nor integrity safeguarded. Similarly, non-denial is impossible without identification and
authentication, since the system is unable to log an identity against specific transactions.
Consequently, identification and authentication should always be viewed as the first step
to successfully enforcing information security [9].
To conclude, we can say that attacks are the reasons, mechanisms are the tools, and
services are our goals.
1.6.1 Cryptography
Some security mechanisms listed in the previous section can be implemented using
cryptography. The word cryptography is derived from Greek and means secret
(crypto) writing (graphy). However, we use the term to refer to the science and art of
transforming messages to make them secure and immune to attacks [6]. Cryptography is
the science of information and communication security; in other words, it is the science
of protecting information against unauthorized parties by preventing its unauthorized
alteration or use. In the present-day context, it refers to the tools and techniques used to
make messages secure for communication between the participants and
immune to attacks by hackers. For private communication through a public network,
cryptography plays a very crucial role. The role of cryptography can be illustrated with
the help of a simple model of cryptography [10] (as shown in figures 1.1 and 1.2).
The message to be sent, through possibly an unreliable medium, is known as plaintext,
which is encrypted before sending over the medium. The encrypted message is known as
ciphertext, which is received at the other end of the medium and then it is decrypted to
get back the original plaintext message.
(Figures 1.1 and 1.2: the Plain Text is transformed into Cipher Text using the Encryption Key, and back into Plain Text using the Decryption Key.)
Although in the past cryptography referred only to the encryption and decryption of
messages using secret keys, nowadays it is defined as involving three distinct
mechanisms: symmetric-key encipherment, asymmetric-key encipherment, and hashing
[6].
1.6.1.1 Symmetric-key Encipherment
It is also called Secret-Key Encipherment or Secret-Key Cryptography. An encryption
algorithm is used for converting the plaintext to ciphertext, operating on a key, which is
essentially a specially generated number (value). To decrypt a secret message (ciphertext)
and get back the original message (plaintext), a decryption algorithm uses a decryption key. In
symmetric-key cryptography, a single secret key is shared, i.e., the same key is used
for both encryption and decryption [10] (as shown in figures 1.3 and 1.4). Both the
sender and the receiver would have to know the key beforehand, or it would have to be sent
along with the message.
(Figures 1.3 and 1.4: the Plain Text is encrypted to Cipher Text and decrypted back to Plain Text using the same shared secret key.)
Requirement of a large number of unique keys: for example, for n users the
number of distinct keys required is n(n-1)/2.
using the public key. Then, the receiver uses his own private key to decrypt the message [10] (as
shown in figures 1.5 and 1.6). Anyone can encrypt using the public key, but only
the holder of the private key can decrypt. Security depends on the secrecy of the private
key only.
(Figures 1.5 and 1.6: the Plain Text is encrypted to Cipher Text with the receiver's public key and decrypted back to Plain Text with the receiver's private key.)
A big random number is used to make a public-private key pair. As the public key is known
to all, the security of asymmetric-key methods depends on the private key.
Asymmetric-key Encipherment Primitives:
- Functionality: DecryptKs(EncryptKp(x)) = x, for any public-private key pair (Kp,
Ks), where Kp is the public key, Ks is the secret (private) key and x is any message.
Asymmetric-key algorithms are complex and require more execution time than
symmetric-key algorithms. For n users, only 2n keys are required in public-key
cryptography.
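The key-count arithmetic above, n(n-1)/2 pairwise shared keys for symmetric schemes versus 2n keys for public-key schemes, is easy to verify numerically (an illustrative sketch, with our own function names):

```python
def symmetric_keys(n: int) -> int:
    # one shared secret per unordered pair of users: n choose 2
    return n * (n - 1) // 2

def asymmetric_keys(n: int) -> int:
    # one (public, private) key pair per user
    return 2 * n

for n in (10, 100, 1000):
    print(n, symmetric_keys(n), asymmetric_keys(n))
# for n = 1000: 499500 distinct symmetric keys vs only 2000 asymmetric keys
```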
- Advantages:
  o The pair of keys can be used with any other entity.
  o The number of keys required is small.
- Disadvantages:
  o It is not efficient for long messages.
  o The association between an entity and its public key must be verified.
A well-known example of asymmetric-key encipherment is the Rivest-Shamir-Adleman
(RSA) algorithm.
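The functionality primitive DecryptKs(EncryptKp(x)) = x can be checked with textbook RSA on toy primes; the numbers below are illustrative only and far too small (and unpadded) for any real use:

```python
# Textbook RSA with toy primes, showing Decrypt_Ks(Encrypt_Kp(x)) = x.
# Real RSA uses primes of >= 1024 bits and padding (e.g. OAEP).
p, q = 61, 53
n = p * q                  # modulus, part of both keys
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent; Kp = (e, n), gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent; Ks = (d, n)  (Python 3.8+)

def encrypt(x: int) -> int:
    return pow(x, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

x = 42                     # any message with 0 <= x < n
assert decrypt(encrypt(x)) == x
```

Only d must stay secret; e and n can be published, which is precisely why the pair of keys "can be used with any other entity".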
1.6.1.3 Hashing
Hashing is an algorithm that digests data and represents its bits and bit patterns by a
numerical equivalent, a hash value. A single-bit change in the data may change half of
the bits in the resultant hash value. Hash functions are used for one-way cryptography.
A hash function has no key, since the plaintext is not recoverable from the ciphertext, as
depicted in figure 1.7. In hashing, a fixed-length message digest (MD) is created out of a
variable-length message. In other words, a hash function takes a message of any length as
input and produces a fixed-length string as output, termed the message digest or a digital
fingerprint.
(Figure 1.7: the Plain Text passes through the Hash Function to produce the Cipher Text, i.e., the hash value.)
The digest is normally much smaller than the message. Hashing is a non-reversible
algorithm, i.e., the data cannot be reproduced from the hash value. For the method to be useful, both the message and
the digest must be sent to the receiver. A Message Authentication Code (MAC) is a
symmetric key-dependent hash algorithm used for more security. Hashing is mainly used in the digital
signature process and for data integrity; it provides check values for
ensuring data integrity [6].
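The avalanche behaviour described above, where a tiny change in the message flips roughly half of the digest bits, can be observed with a standard hash such as SHA-256; the helper below is our own illustrative code:

```python
import hashlib

def bits_changed(a: bytes, b: bytes) -> int:
    """Hamming distance between the SHA-256 digests of two messages."""
    da = hashlib.sha256(a).digest()
    db = hashlib.sha256(b).digest()
    return sum(bin(x ^ y).count("1") for x, y in zip(da, db))

m1 = b"transfer 100 rupees"
m2 = b"transfer 101 rupees"   # one character (a few bits) changed
print(bits_changed(m1, m2), "of 256 digest bits differ")

# The digest is fixed-length (32 bytes = 256 bits) regardless of input size.
assert len(hashlib.sha256(b"x" * 10_000).digest()) == 32
```

This is why a digest works as a check value for data integrity: any tampering with the message yields a visibly different digest.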
Hash Function Primitives:
1.6.2 Steganography
In addition to cryptography, another technique used for secret communication
is steganography. The word steganography, also of Greek origin, means covered writing,
in contrast with cryptography, which means secret writing. Cryptography means
concealing the contents of a message by enciphering, whereas steganography means
concealing the message itself by covering it with something else.
Any form of data, such as text, image, audio or video, can be digitized, and it is possible
to insert secret binary information into the data during the digitization process. Such hidden
information is not only used for secrecy; it can also be used for protecting copyright,
preventing tampering, or adding extra information [6].
The cover of the secret data can be text, in which case it is called a Text Cover. Secret data can also
be covered under a coloured image, and then it is called an Image Cover. Likewise, we
can have an Audio Cover, a Video Cover and more. Both audio and video data can be
compressed, and the secret data can be embedded during or before the compression.
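As a hedged sketch of how secret bits can be inserted into digitized cover data, the classic least-significant-bit (LSB) technique hides one bit per cover byte (pixel or audio sample); the function names are ours, and real steganographic tools are considerably more sophisticated:

```python
def embed(cover: bytearray, secret_bits: str) -> bytearray:
    """Hide one secret bit in the least significant bit of each cover byte.
    The change to any byte is at most 1, so the cover looks unchanged."""
    stego = bytearray(cover)
    for i, bit in enumerate(secret_bits):
        stego[i] = (stego[i] & 0xFE) | int(bit)
    return stego

def extract(stego: bytearray, n_bits: int) -> str:
    """Read the hidden bits back out of the low bit of each byte."""
    return "".join(str(b & 1) for b in stego[:n_bits])

cover = bytearray(range(10, 30))         # stand-in for image/audio samples
secret = "10110011"
stego = embed(cover, secret)
assert extract(stego, len(secret)) == secret
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))  # cover barely changes
```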
Possession: The possession of a specific item or token, such as a key, member card,
smart card [9], identity card, magnetic-stripe card, optical-stripe card, printed
barcode [14], identity document, etc. Most of the tokens mentioned above are in
widespread use for granting access to physical assets (for example, doors into
buildings) and logical assets (for example, corporate networks or bank accounts).
The combination, i.e., fusion, of two or three of the aforementioned attributes can be used
to further increase the security level. All of these attributes have their specific advantages
and disadvantages. The use of the third attribute, viz. biometrics, if required in
combination with the others, has significant advantages because, without sophisticated
means, biometrics is difficult to share, steal or forge and is generally not easily forgotten
or lost.
In this thesis, we mainly focus on identification and authentication using biometric
techniques, along with some other security techniques.
compromised. This concept ensures that the original biometric template doesn't exist in
the system database. As such, it is not in danger of being exposed, and thus the privacy issue
is completely nonexistent.
Aim of the thesis:
We embarked upon the research work with the following aims:
2)
3) Studying the comparative analysis of our proposed methods with the algorithms
of previous approaches, and finally discussing the security analysis of the four
methods against impostor attacks.
against the impostor attacks. The last chapter presents the conclusion of the current
research work and future work.
CHAPTER 2
BIOMETRIC SYSTEM SECURITY
"Biometrics is certainly the most secure form of authentication. It's the hardest to
imitate and duplicate." - Avivah Litan
2.1 INTRODUCTION
In this chapter, we discuss the importance of biometric systems in respect of security
authentication, especially in view of the problems with existing traditional systems, and
compare various authentication mechanisms. Then we give a brief introduction to
biometric systems, where we discuss the characteristics of biometric systems, biometric
system components, modes of operation and, finally, the information flow in the system.
After that, we describe biometric technologies and their classification, biometric modalities,
a comparison of biometric technologies, and performance measurements of biometric
systems. Later on, we discuss the merging of biometrics and cryptographic techniques to
achieve reliable network security, and finally we discuss fingerprint
technology in detail as one of the biometric technologies.
are many ways to compare and contrast them. However, the comparison here has been focused
on the criteria of an identification system failing either in granting access to
an unauthorized person or in rejecting a legitimate authorized user.
Problems with Passwords:
1. Passwords can be obtained or cracked using a variety of techniques, including:
a) Common password usage: a lot of people use common passwords like
"guest", "password", "pword", "help", "aaa", "1234", etc. Similarly, people
often create passwords from pertinent information about themselves, like the
name of a child or pet, which might be easily guessed [16].
b) Exhaustive or brute-force attack [16]: an attack in which all possible
passwords are tried;
c) Dictionary attack: a variant of the brute-force attack that uses words from a
specific list (for example, the English dictionary) [17];
d) Using programs/tools to crack the password: a lot of programs and tools are
available to crack and access passwords [17].
2. Passwords can be disclosed. If a password is disclosed to an individual, he/she
will be able to gain access to areas, information, etc., which are meant to be
confidential.
3. Passwords can be forgotten. Although this is not a security threat directly, it does
place an additional burden upon an organization's administration in respect of
retrieving the information. If an individual has forgotten his/her password, he/she
needs to be issued a new one.
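The password attacks listed in item 1 can be sketched concretely; the word list and helper below are illustrative only, assuming the attacker has somehow obtained an unsalted SHA-256 password hash:

```python
import hashlib

# Dictionary attack: hash each candidate word and compare with the stolen hash.
wordlist = ["guest", "password", "pword", "help", "aaa", "1234", "letmein"]

def crack(stolen_hash):
    for candidate in wordlist:
        if hashlib.sha256(candidate.encode()).hexdigest() == stolen_hash:
            return candidate
    return None

leaked = hashlib.sha256(b"1234").hexdigest()
assert crack(leaked) == "1234"   # a common password falls immediately

# A brute-force attack instead enumerates the whole keyspace: for an
# 8-character lowercase password that is 26**8 (about 2.1e11) candidates.
print(f"{26**8:,} candidates for an 8-letter lowercase password")
```

This is why common or short passwords are weak regardless of how securely they are stored; the keyspace or the dictionary is simply too small.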
Problems with Tokens:
1. Tokens can be forged and used without the knowledge of the original bearer. For
example, a forger can steal an identity and create a fake ID document using
another person's information. Armed with the forgery, fraudulent transactions can
be authorized without the original bearer's knowledge.
2. Tokens can be lost, stolen or given to someone else. In any of these instances, an
illegitimate person will be able to fraudulently transact with the system by
impersonating the original bearer [28].
Problems with Biometrics:
1. Biometrics can be forged: for example, a forged signature could be accepted by a
signature recognition system if performed skillfully enough [18].
2. Biometrics can be destroyed: a biometric characteristic's ability to be read by a
system can be reduced. An individual's fingerprints, for example, can be affected
by cuts and bruises [14] and can even be destroyed by excessive rubbing on an
abrasive surface or through exposure to certain chemicals such as acids.
In this section, all three authentication mechanisms have been briefly described with their
respective drawbacks. All these mechanisms are summarized and contrasted in
Table 2.1 below [28, 30]:
                                        Identification Mechanisms
S. No.  Criterion               Knowledge   Possession   Biometrics
1.      Technology              Trivial     Moderate     Difficult
2.      User Friendly           Yes         Yes          Depends
3.      Forgery                 Yes         Yes          Yes
4.      Can be Stolen           Yes         Yes          No
5.      Can be Lost/Damaged     No          Yes          No
6.      Can be Forgotten        Yes         Yes          No
7.      Transportability        Yes         Yes          No
8.      System Price            Marginal    Moderate     High
9.      Hygienic Reservation    No          No           Yes
Accessibility: It measures how easy the particular biometric trait is to access and
measure. Foot geometry, for example, would not be very accessible, since
individuals would have to remove their shoes first.
Sensor Module: The sensor module registers an individual into the biometric
system database. During this phase, a biometric reader scans the raw image of the
user's biometric trait and produces its digital representation.
Feature Extraction Module: This module processes the raw image data (sample)
obtained by the sensor module and extracts certain features to generate a compact
representation of the biometric trait called the template or a feature set, which is
then stored in the system's central database or on a smartcard issued to the individual.
Matching Module: This module compares the current input with the template. If
the system performs identity verification, it compares an extracted feature set or
the characteristics of the current input to the user's master template and produces
a score or match value (one-to-one matching). A system performing identification
matches the current characteristics against the master templates of many users
already stored in the database, resulting in multiple match values (one-to-many
matching).
Decision Maker: This module accepts or rejects the user based on a security
threshold and matching score.
System Database: This module collects and stores all biometric templates
(including brief profiles of users) obtained during the enrollment process.
This database is also called the template database.
Depending on the application, the system database may be either a centralized
database (physical database) that resides in the system or a distributed database
(virtual database), with the record of each individual carried on a magnetic
card issued to the individual.
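As a rough illustration of how these modules fit together (this is not the thesis's algorithm; the toy normalization, the Euclidean-distance matcher, and the threshold are all assumptions), the pipeline can be sketched as:

```python
import math

def sensor_module(raw_scan):
    """Digitize the presented trait; here the 'scan' is already numeric."""
    return list(raw_scan)

def feature_extraction_module(sample):
    """Reduce the sample to a compact feature-set (the template)."""
    mean = sum(sample) / len(sample)
    return [value - mean for value in sample]  # toy normalization

def matching_module(features, template):
    """One-to-one matching: map a Euclidean distance to a score in (0, 1]."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(features, template)))
    return 1.0 / (1.0 + dist)

def decision_maker(score, threshold=0.5):
    """Accept or reject based on the security threshold."""
    return score >= threshold

# Enrollment: store the template in the system database.
database = {"alice": feature_extraction_module(sensor_module([3.0, 5.0, 7.0]))}

# Verification: a fresh, slightly noisy sample of the same trait.
probe = feature_extraction_module(sensor_module([3.1, 5.0, 6.9]))
print(decision_maker(matching_module(probe, database["alice"])))  # -> True
```

A real system would replace the toy matcher with a trait-specific algorithm, but the module boundaries remain the same.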
Enrollment: In the enrollment process, users' initial biometric samples are collected,
assessed, processed, and stored for ongoing use in a biometric system, as depicted in
figure 2.2.
Verification is a 1:1 matching process, where the user claims an identity and the system
verifies whether the user is genuine or not. If the user's input and the template of the
claimed identity have a high degree of similarity, the claim is accepted as genuine;
otherwise, the claim is rejected and the user is considered fraudulent, as depicted in
figure 2.3.
Identification is a 1:N matching process, where the user's input is compared with the
templates of all the persons enrolled in the database, and the identity of the person whose
template has the highest degree of similarity with the user's input is returned by the
biometric system. If the highest similarity between the input and all the templates is less
than a fixed minimum threshold, the system rejects the input, which implies that the user
presenting the input is not among the enrolled users, as shown in figure 2.3.
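The 1:N decision rule just described can be sketched as follows; the scalar "features" and the similarity function are toy assumptions purely for illustration:

```python
def identify(probe, templates, similarity, threshold):
    """1:N identification: return the best-matching enrolled identity,
    or None when even the best score falls below the fixed threshold."""
    best_id, best_score = None, float("-inf")
    for identity, template in templates.items():
        score = similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# Toy similarity on scalar "features": closer means a higher (less negative) score.
similarity = lambda a, b: -abs(a - b)
templates = {"alice": 0.10, "bob": 0.55, "carol": 0.90}

print(identify(0.52, templates, similarity, threshold=-0.05))  # -> bob
print(identify(0.30, templates, similarity, threshold=-0.05))  # -> None
```

The second call shows the rejection path: the best match is still below the threshold, so the probe is declared "not enrolled".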
Only biometrics can provide negative identification (i.e., "I am not he") capability. Like
any security system, biometric systems are not foolproof. However, biometrics can help
in protecting individual privacy and can guard personal and sensitive information,
because biometrics provides stronger identification than passwords.
today is the signature, although not in biometric systems. Other possible behaviors that
can be used are how one speaks, types on a keyboard, or walks. Because of the inevitable
modest variations of all behavioral traits, many systems use an adaptation mechanism to
update the reference template in order to compensate for slight changes of the biometric
trait over time. Generally, behavioural biometrics work best with regular use [30].
Figure 2.5: Typology of biometric mechanisms. Biometrics is divided into
physiological biometrics (fingerprint, face, hand, eye, ear shape, DNA),
behavioural biometrics (voice, gait, signature, keystroke), and multimodal
biometrics (a combination of these).
The above classification can be further divided into sub-categories. For example,
Hand: palm prints, hand geometry, hand veins, etc.; Eye: iris recognition, retina scan,
etc.; Ear: ear canal, ear shape recognition, etc.; Face: face recognition, facial
thermogram, etc.; and likewise, Keystroke: keystroke dynamics, keystroke analysis, etc.
There are important differences between physiological and behavioral methods. First, the
degree of intra-personal variation in a physiological characteristic is smaller than in a
behavioral characteristic. Apart from injuries, the iris pattern remains the same over time,
whereas speech characteristics change and are influenced by many factors, e.g. the
emotional state of the speaker. Developers of behavior based systems, therefore, have a
harder job in compensating for those intra-personal variations.
Acceptability refers to the extent to which people are willing to accept the biometric
system.
On this basis, biometrics has been applied in many high-end applications, with governments,
defence and airport security being major customers. However, there are some areas in
which biometric applications are moving towards commercial use, namely,
network/PC login security, web page security, employee recognition, time and attendance
systems, and voting solutions. While biometric systems have their limitations, they have
an edge over traditional security methods in that they cannot be easily stolen or shared.
Besides bolstering security, biometric systems also enhance user convenience by
alleviating the need to create and remember passwords. According to A. Jain [36] and U.
Uludag [34], Table 2.2 shows a comparison of various biometric technologies, with
perception scores based on High = 100, Medium = 75, Low = 50.
Failure to Enroll Rate (FTE or FER): FTE is the percentage of time that users
are unable to enroll in the biometric system [52]. In other words, it is the
percentage of input data that is considered invalid and fails to enter the
system. Failure to enroll happens when the data obtained by the sensor are
considered invalid or of poor quality.
Failure to Capture Rate (FTC or FCR): FTC is the percentage of time the
biometric system is unable to capture a biometric sample when one is presented.
False Accept Rate (FAR): It is the number of times an impostor user is falsely
granted access to the system divided by the total number of impostor trials:

FAR(θ) = (number of false acceptances) / (total number of impostor trials)
False Reject Rate (FRR): It is the number of times genuine users are falsely
rejected divided by the total number of genuine trials:

FRR(θ) = (number of false rejections) / (total number of genuine trials)
Equal Error Rate (EER): It is the rate at which both accept (FAR) and reject
(FRR) errors are equal. The lower the EER, the more accurate the system is
considered to be.
Matching Errors:
False Match Rate (FMR): FMR is the rate at which a template is falsely matched to a sample from a different user.
False Non-Match Rate (FNMR): FNMR is the rate at which a template is falsely not matched to a sample from its own user.
Receiver Operating Characteristic (ROC) Curve: a plot of FAR against
FRR across various thresholds. In biometric systems, the FAR and FRR can
typically be traded off against each other by changing the threshold. ROC
curves are one of the ways to evaluate the performance of a biometric system.
Template Capacity: the maximum number of sets of data which can be input into
the system.
FMR and FNMR are calculated over the number of comparisons, while FAR and FRR are
calculated over the number of transactions. Another difference is that FAR and FRR also
account for failure-to-acquire (FTA) rates [61].
The accuracy of a biometric system is only as good as its sensor and the degrees of
freedom of the biometric trait being measured [64]. The accuracy of a biometric system is
represented by its FAR (False Accept Rate) and its FRR (False Reject Rate). The error
rates are a function of the threshold, as shown in Figure 2.6. These two scores can be
plotted against each other throughout all possible threshold values to show performance.
This plot is called the ROC (Receiver Operating Characteristic) curve [62]. The two
errors are complementary in the sense that if one makes an effort to lower one of the
errors by varying the threshold, the other error rate automatically increases [53].
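The FAR/FRR trade-off and the EER point can be illustrated with a small threshold sweep; the score lists below are invented toy data, not measurements from any real system:

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FAR: fraction of impostor trials falsely accepted (score >= threshold).
    FRR: fraction of genuine trials falsely rejected (score < threshold)."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep candidate thresholds; the EER sits where FAR and FRR meet."""
    best = None
    for t in sorted(set(genuine_scores) | set(impostor_scores)):
        far, frr = far_frr(genuine_scores, impostor_scores, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2, t)
    return best[1], best[2]  # (approximate EER, operating threshold)

genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]   # invented match scores
impostor = [0.2, 0.3, 0.45, 0.1, 0.65, 0.25]
eer, threshold = equal_error_rate(genuine, impostor)
print(round(eer, 3), threshold)  # -> 0.167 0.65
```

Raising the threshold drives FAR down and FRR up, and vice versa, which is exactly the complementarity described above.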
Biometrics has the potential to identify individuals with a high degree of assurance, thus
providing a foundation for trust. Cryptography, on the other hand, concerns itself with the
projection of trust: it takes trust from where it exists to where it is needed.
Cryptography is an important feature of computer and network security [21]. Using
biometrics for security purposes has become popular, but combining biometrics with
cryptography is a new research topic. Many traditional cryptographic algorithms are
available for securing information, but all of them depend on the secrecy of the
secret or private key. To overcome this dependency, biometric features can be used to
protect the secrecy of both keys and data. There are various methods that can be deployed
to secure a key with a biometric.
1. The first method involves remote template matching and key storage. In this
method a biometric image is captured and compared with a corresponding
template. If the user is verified, the key is released. The main problem here is
the use of an insecure storage medium [21].
2. The second method hides the cryptographic key within the enrolment template
itself via a secret bit-replacement algorithm. When the user is successfully
authenticated, this algorithm extracts the key bits from the appropriate locations
and releases the key [22].
3. The third method uses data derived directly from a biometric fingerprint
image. In this manner, fingerprint templates are used as a cryptographic key [23].
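The first method, releasing a stored key upon a successful template match, might be sketched as below. The byte-level matcher and the in-memory store are illustrative assumptions, and the sketch deliberately inherits the insecure-storage weakness noted above:

```python
import os

# Hypothetical key store: the enrolled template and the key to be released
# live together, which is exactly the insecure-storage weakness noted above.
STORE = {"template": b"\x12\x34\x56\x78", "key": os.urandom(32)}

def match(sample: bytes, template: bytes, max_mismatch: int = 1) -> bool:
    """Toy matcher: accept when at most max_mismatch bytes differ."""
    if len(sample) != len(template):
        return False
    return sum(a != b for a, b in zip(sample, template)) <= max_mismatch

def release_key(sample: bytes):
    """Method 1: compare the captured sample with the stored template and
    release the cryptographic key only on a successful verification."""
    return STORE["key"] if match(sample, STORE["template"]) else None

assert release_key(b"\x12\x34\x56\x79") == STORE["key"]  # 1 byte off: accepted
assert release_key(b"\x00\x00\x00\x00") is None          # impostor: rejected
```

Anyone who can read the store obtains both the template and the key, which is why the later methods bind the key to the biometric instead of merely gating it.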
Biometrics and cryptography should not be seen as competing technologies; rather, they
are potentially complementary and have to be symbiotic rather than competitive.
Fingerprint biometrics is chosen because of its information strength, namely the
uniqueness and randomness needed for cryptographic key generation [37]. This thesis
puts forth a fresh methodology for the secure storage of fingerprint templates by
generating a Secured Feature Matrix and keys for cryptographic techniques applied for
data encryption or decryption with the aid of cancelable biometric features. If a
biometric key is missing or stolen, it is lost perpetually, and possibly for every
application where the biometric is utilized, since a biometric is permanently linked
with a user and cannot be altered. In this thesis, we propose a technique to produce a
cancelable key from a fingerprint so as to surmount these problems. The flexibility and
dependability of cryptography are enhanced by the utilization of cancelable biometric
features. Several biometric systems in existence deal with cryptography, but the
proposed cancelable biometric system introduces a novel method to generate a
cryptographic key. We also discuss the security analysis of the proposed cancelable
biometric system.
Among all biometric traits, fingerprints have one of the highest levels of reliability [66]
and have been extensively used by forensic experts in criminal investigations [67].
Fingerprint analysis, also known in the US as dactylography, is the science of using
fingerprints to identify a person.
The matching accuracy using fingerprints has been shown to be very high [39].
Fingerprint identification is well established and a mature science [70].
2.9.2 Fingerprint Uniqueness
A fingerprint refers to the flow of ridge patterns in the tip of the finger. Ridges are the
lines across fingerprints (raised skin) and valleys or furrows are the spaces between
ridges (lowered skin) on the surface of a fingertip. When an inked imprint of a finger is
made, the impression created is of the ridges, while the furrows are the uninked areas
between the ridges. A sample fingerprint is shown in figure 2.7. For a person, fingerprints
are formed or determined during the first seven months (in the third and fourth month) of
foetal development and are unique. The pattern of the ridges and valleys, called
minutiae, is unique for each individual [40, 47]. These are the basis for most
fingerprint identification and are acceptable even in a court of law. Even identical twins
have differing fingerprint patterns, as do the prints on each finger of the same
person. Two alike fingerprints would be found only once every 10^48 years [78]. That is
why a proverb says: "Faces can lie but fingerprints never."
The skin excretes oils and perspiration through sweat glands, flowing along the tops of
the ridges. When a surface is touched, the fingerprint is transferred. Smooth, clean
surfaces record better-quality fingerprints, but fingerprints can also be found on irregular
surfaces such as paper. There are three basic categories of fingerprints [70]:
Visible prints (also called patent prints), such as those made in oil, ink or blood;
Latent prints, which are invisible under normal viewing conditions; and
Plastic prints, which are left in soft surfaces such as newly painted ones.
There are more than forty methods available for collecting fingerprints using powders;
chemicals such as iodine, ninhydrin, and silver nitrate; digital imaging, dye stains and
fumes [70].
Sir Edward Henry (1850 - 1931), as the Inspector General of Police for Bengal Province
in India, developed a classification system which was officially adopted by British India
in 1897. In December 1900, Britain's Belper Committee recommended that the
fingerprints of criminals be taken and classified by the Indian System [49]. The Henry
Classification System organizes ten-print fingerprint records by pattern type. Finger
ridges and patterns can be continuous, interrupted, forked, and of other formations.
Fingerprints are classified and identified by the relationship of these formations,
described as minutiae. These patterns are classified into three major categories based on
their central pattern [44]. The patterns are the arch, loop, and whorl, which are shown in
figure 2.8. They are further divided into various subgroups [50].
Arch: a ridge that runs across the fingertip and curves up in the middle; the tented
arch is a principal subtype.
Whorl: an oval formation, often making a spiral pattern around a central point.
Principal types are the plain whorl and the central pocket loop whorl.
Loop: loops have a stronger curve than arches, and they exit and enter the print
on the same side. Radial loops slant toward the thumb and ulnar loops away from
the thumb.
2.9.5 Types of Minutiae
Minutiae features, also known as Galton features, are particular patterns consisting of
ridge endings (terminations) or ridge bifurcations. Minutiae points are local ridge
characteristics that appear as a ridge bifurcation, a ridge ending, or other local
discontinuities in the fingerprint pattern [40, 46], as shown in Figure 2.9.
Figure 2.9: A fingerprint image with the core and four minutiae points
A total of 150 different types of minutiae have been identified. In practice only ridge
ending and ridge bifurcation minutiae types are used in fingerprint systems [36]. Figure
2.10 depicts some minutiae representations [47, 48].
Islands: ridges slightly longer than dots, occupying the space between two ridges.
A complete fingerprint consists of about 100 minutiae points on average. The measured
fingerprint area contains on average about 30-60 minutiae points, depending on the
finger and on the sensor area. These minutiae points are represented by a cloud of dots in
a coordinate system. They are stored, together with the angle of the tangent of the local
minutia point, in a fingerprint code or directly in a reference template. A template can
consist of more than one fingerprint code, to expand the amount of information and the
enrolled fingerprint area. In general this leads to a higher template quality and therefore
to a higher similarity value between the template and the sample. Template sizes vary
from 100 bytes to 1500 bytes, depending on the algorithm and the quality of the
fingerprint. Nevertheless, in rare cases there are fingerprints without any minutiae
points, which contributes to the failure-to-enroll rate (FER). It is also difficult to extract
the minutiae points accurately when the fingerprint is of low quality [40].
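A minutiae template of the kind described, a cloud of points stored with their tangent angles, might be represented as follows; the field names and the 2-byte quantization are assumptions made for illustration, not the thesis's encoding:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Minutia:
    x: float      # position in the fingerprint coordinate system
    y: float
    angle: float  # tangent angle of the local ridge, in degrees
    kind: str     # "ending" or "bifurcation", the two types used in practice

# The "cloud of dots" stored with their tangent angles:
template: List[Minutia] = [
    Minutia(102.0, 55.0, 30.0, "ending"),
    Minutia(88.0, 120.0, 275.0, "bifurcation"),
]

# Rough size estimate: 4 quantized values per minutia at 2 bytes each (an
# assumed quantization); with 30-60 minutiae per captured area this lands
# in the 100-1500 byte range quoted above.
approx_bytes = len(template) * 4 * 2
print(approx_bytes)  # -> 16 for this two-minutiae toy template
```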
The existing popular fingerprint matching techniques can be broadly classified into two
categories: (a) minutiae-based and (b) correlation-based. Minutiae-based matching and
correlation-based matching are also called minutiae matching and pattern matching,
respectively. Minutiae-based techniques attempt to align two sets of minutiae points and
determine the total number of matched minutiae [68, 69, 35]. Correlation-based
techniques, on the other hand, compare the global pattern of ridges and valleys (furrows)
to see if the ridges in the two fingerprints align [71, 72, 73, and 74]. The minutiae points
define the local structure, while the ridge pattern along with the core and delta points
define the global structure or global configuration.
In the minutiae-based method, the ridges in the fingerprint are compared by their
unique details. Minutiae points on the individual's finger are located and extracted,
and then compared with a registered template. In contrast to the minutiae matching
method, the correlation-based method compares all of the finger's characteristics, such
as sub-areas of ridge thickness, curvature, or density. The area around the minutiae,
with low curvature, or a combination of ridges, is taken from the fingerprint. The
extracted area is then processed and compared with a registered template [41, 45].
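A minimal sketch of minutiae-based matching is shown below, assuming the two minutiae sets are already aligned (real matchers first estimate an alignment, which is omitted here; the tolerance values are arbitrary):

```python
import math

def count_matched_minutiae(set_a, set_b, pos_tol=10.0, ang_tol=15.0):
    """Greedy minutiae matching: a pair matches when the positions fall
    within a tolerance box and the ridge angles agree within ang_tol
    degrees. Each minutia in set_b may be claimed at most once."""
    matched, used = 0, set()
    for (xa, ya, ta) in set_a:
        for j, (xb, yb, tb) in enumerate(set_b):
            if j in used:
                continue
            dist = math.hypot(xa - xb, ya - yb)
            dang = min(abs(ta - tb), 360 - abs(ta - tb))  # wrap-around angle
            if dist <= pos_tol and dang <= ang_tol:
                matched += 1
                used.add(j)
                break
    return matched

a = [(10, 10, 90), (50, 50, 180), (80, 20, 350)]   # (x, y, angle) triples
b = [(12, 9, 85), (49, 52, 178), (200, 200, 10)]
print(count_matched_minutiae(a, b))  # -> 2
```

The final match score is typically the matched count normalized by the sizes of the two sets, then compared against the decision threshold.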
A typical minutiae extraction technique performs the following sequential operations on
the fingerprint image: (i) fingerprint image enhancement, (ii) binarization (segmentation
into ridges and valleys), (iii) thinning, and (iv) minutiae detection. Several commercial
[62] and academic [75, 76] algorithms follow these sequential steps for minutiae
detection. On the other hand, the simplest correlation-based technique is to align the two
fingerprint images and subtract the input from the template to see if the ridges
correspond. However, such a simplistic approach suffers from many problems including
the errors in estimation of alignment, non-linear deformation in fingerprint images, and
noise [77]. Currently, computer-aided fingerprint recognition mostly uses minutiae
matching.
The performance of minutiae-based techniques relies on the accurate detection of minutiae
points and the use of sophisticated matching techniques to compare two minutiae sets
which undergo non-rigid transformations. The performance of correlation-based
techniques is affected by non-linear distortions and noise present in the image. In general,
it has been observed that minutiae-based techniques perform better than correlation-based
ones [65].
In addition to the above techniques, there is another fingerprint matching technique,
called the feature-based technique, which captures both the local and the global details in
a fingerprint as a compact fixed-length feature vector. It uses the orientation and
frequency of ridges, ridge shape, texture information, etc. This technique suffers from
low discriminative ability.
Challenges in Fingerprint Matching: Fingerprint matching is difficult due to large
intra-class variations caused by sensor noise, partial overlap, non-linear distortion, and
small inter-class variations (similarities in the global structure and ridge orientations).
The challenge is to handle poor-quality fingerprints and fingerprints having little overlap.
Fingerprint Pattern Recognition System is shown in figure 2.11. The user places
his/her finger against a reader. The reader then scans the fingerprint and it is sent into a
database. Once in the database, the fingerprint is compared, verified, and identified [42,
78].
Fingerprint Identification: Fingerprint identification is based on two basic premises: (i)
persistence: the basic characteristics of fingerprints do not change with time, i.e., they
are permanent; and (ii) individuality: the fingerprint is unique to an individual.
[Figure 2.11: block diagram of a fingerprint recognition system, with an enrollment
path (preprocessor, feature extractor, template database) and an authentication path
(preprocessor, matcher with a threshold, Yes/No decision).]
During the identification (or verification) process, the system captures finger data from a finger with
sensing devices, extracts features, identifies (or verifies) the features by comparing with
templates in the database, and then outputs a result as Yes or True, only when the
features correspond to one of the templates [51].
Most fingerprint systems utilize optical or capacitive sensors for capturing
fingerprints. These sensors detect the difference between ridges and valleys of
fingerprints. Optical sensors detect differences in reflection. Capacitive sensors, by
contrast, detect differences in capacitance. Some systems utilize other types of sensors,
such as thermal sensors and ultrasound sensors [51].
[Figure: fingerprint system data flow. During enrollment, the presented finger is
captured with high quality, features are extracted, and the template is recorded in the
system database (template database). During verification or identification, the finger is
captured, features are extracted, and the matcher compares them against the referred
templates to produce a result.]
The enrollment module is responsible for enrolling individuals into the biometric system
database. During the enrollment phase, the biometric characteristic of an individual is
first scanned by a biometric reader to produce a digital representation (feature values) of
the characteristic. The data capture during the enrollment process may or may not be
supervised by a human depending on the application. A quality check is generally
performed to ensure that the acquired sample can be reliably processed by successive
stages. In order to facilitate matching, the input digital representation is further processed
by a feature extractor to generate a compact but expressive representation, called a
template. Depending on the application, the template may be stored in the central
database of the biometric system or be recorded on a distributed database (for example,
a smart card issued to the individual). Usually, multiple templates of an individual are
stored to account for variations observed in the biometric trait and the templates in the
database may be updated over time [52].
Advantages of Fingerprint Biometrics:
1. Can be placed on a smart card for an added degree of authentication
2. Low instances of false acceptance (the rate at which fraudulent users are allowed access)
3. Low cost
4. Integration is easier
5. Fingerprint readers are small in size.
Disadvantages of Fingerprint Biometrics:
1. Higher risk of false rejection (the rate at which authentic users are denied or prevented access)
2.11 SUMMARY
CHAPTER 3
CANCELLABLE BIOMETRICS
Cancellable biometrics stores a non-invertible transformed version of the biometric
data and so if the storage is compromised the biometric data remains safe.
Reihaneh Safavi-Naini
3.1 INTRODUCTION
Nowadays, biometric security systems face a number of problems arising from the fact
that the biometric data of a person is generally stored in the system itself. Problems
arise especially when that data is compromised. Standard password-based security
systems have the ability to cancel a compromised password and reissue another one, but
a biometric cannot be changed or cancelled. Thus, the advantage of biometrics-based
security also becomes a disadvantage in this particular situation. The concept of
cancellable biometrics can upgrade an existing biometric security system so that it gains
the advantages of password-based security systems while not losing its inherent
superiority [79]. In this chapter, we briefly discuss problems with existing biometric
technologies and then show that cancellable biometric systems are a solution to these.
Finally, we explore cancellable biometrics in detail.
Biometric implementations necessitate large-scale capture and storage of biometric data [80]. Most
of the existing biometric systems require central biometric template storages. Motivation
for this central storage comes from two different angles [79]:
1. The first motivation is the fact that the cost of the enrollment phase is relatively high
[81]. Every user has to go through this phase, and if templates are stored in
different storages, the number of systems required to handle these independent
storages is large, and the process may be repeated a number of times. Obviously,
repeating one process many times is inefficient and inconvenient for the user.
That is why a central template storage is a good solution to avoid the extra cost
and inconvenience.
2. The second motivating factor is standardization. A central biometric template
storage would force all users of the biometric authentication system to use the
same, standardized methods. The entire process of authentication would have to be
standardized, from sensors to algorithms to security policies. Standardization
would solve the compatibility problem across different services within the group
and enable the addition of a new service to the group.
However, despite its obvious advantages, the use of biometrics has several potential
problems related to security and privacy [79], because the biometric data of every user
in the system is stored in a centralized template database. These problems are outlined
below:
1. Identity theft: the attacker can steal the biometric data from the central database
and use the same data to construct an artefact which can then be used to
impersonate the original user. The artefact may be an artificial finger, eye, face
mask, photograph, or something else, depending on the type of the biometrics in
the database. In other words, biometrics (even fingerprints) can be recorded and
misused without a user's consent.
2. Irrevocability: biometric samples are by nature permanent, and consequently a
user cannot alter the acquired template. For instance, a fingerprint of the right
index finger, once given, cannot be modified.
exposed in its original form. In this chapter, we explore cancellable biometrics [53] in
detail in the following sections.
In signal-domain distortion, the biometric signal is distorted prior to feature
extraction. The distorted version does not match the original biometric, while two
instances of the distorted face match among themselves.
In respect of feature-domain distortion, consider figure 3.2, which shows how each
feature (e.g., a minutia position) is transformed using a non-invertible function
Y = f(X).
This concept ensures that the original biometric template doesn't exist in the system
database. As such, it is not in danger of being exposed, and the privacy issue is
essentially eliminated. Even if an attacker is able to get at a transformed template, it
will be completely useless to him/her: nobody can use it to construct an artefact which
could enable impersonation of the user. Moreover, the template cannot be used for
identification purposes. The existence of the transformation function allows simple
control over which services have access and which do not: the authorized services will
have knowledge of the transformation function, but the others
will not [79]. Cancellable biometrics is a relatively new direction of research, spurred
by privacy-invasion and non-revocability concerns. To formally define cancellable
biometrics, Maltoni et al. [80, 36, 85] outlined four principal objectives, as follows:
1. Diversity: The same cancellable template should not be used in two different
applications; it must be possible to generate multiple templates from the same
biometric.
2. Revocability/Reusability: Templates are easily revoked and reissued when
compromised, i.e., straightforward revocation and re-issue are allowed in the
event of compromise.
3. Non-invertibility: The original biometric data cannot be recovered from the
transformed or encrypted templates, i.e., a one-way transformation function is
used for template computation to prevent recovery of the biometric data.
4. Performance: The scheme should not noticeably weaken recognition
performance, i.e., the formulation should not deteriorate the recognition
accuracy.
But the concept of cancellable biometrics was not created only to address privacy
issues. The fact that the stored biometric templates are created by applying a
transformation function to the original biometric template enables the creation of new
templates by applying a different transformation function to the same original. If one
can generate a new biometric template, the old one can be cancelled. Biometric security
systems which implement the concept of cancellable biometrics enjoy all the benefits
we are used to in classic password-based security systems (revocability and the ability
to reissue), while preserving the benefits of biometric systems. Biometric templates are
bound to the user, so they cannot be given to someone else. They cannot be stolen or
forgotten. And they have greater resilience to brute-force attack, since they have greater
information content [79].
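Revocability can be illustrated with a toy sketch in which an HMAC stands in for the one-way, parameterized transform and the key selects the distortion. This is only conceptual: a plain keyed hash requires an exactly reproducible feature encoding, which real fingerprints do not provide without additional error-tolerant machinery (e.g., fuzzy commitment schemes):

```python
import hashlib
import hmac

def cancellable_template(features: bytes, transform_key: bytes) -> bytes:
    """Parameterized one-way transform: an HMAC stands in for the
    non-invertible function, and the key selects the distortion."""
    return hmac.new(transform_key, features, hashlib.sha256).digest()

# A (hypothetical) exactly-reproducible feature encoding of a fingerprint.
features = b"stable-feature-encoding-of-the-fingerprint"

t_app_a = cancellable_template(features, b"key-for-application-A")
t_app_b = cancellable_template(features, b"key-for-application-B")
assert t_app_a != t_app_b  # diversity: templates in two apps are unlinkable

# Revocation: discard the compromised key and re-enroll with a fresh one;
# the same finger yields a brand-new template.
t_new = cancellable_template(features, b"key-issued-after-compromise")
assert t_new != t_app_a    # the stolen template is now useless
```

The transform key thus behaves like a revocable password layered on top of an unchangeable biometric.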
1. Repeatability: Two distorted versions of the same biometric trait should match:
m(f_t(b1), f_t(b2)) ≈ 1.
2. Diversity: Two different distortions of the same biometric trait should not match:
m(f_t(b), f_d(b)) ≈ 0, where t ≠ d.
For the transform to be repeatable, the minutiae positions have to be measured w.r.t. the
same coordinate system each time. Prior to the transformation, each fingerprint needs to
be registered. One way this can be accomplished is by precisely estimating the position
and orientation of the core and delta and expressing the minutiae with respect to these
points. Though there are several approaches, determining these singular points is a
difficult problem. Another problem, even after registration, is the intra-user variability
of biometric signals. The features after transformation should be robust with respect to
this variation. The transform has to further satisfy the following conditions: (i) the
transformed version of the fingerprint should not match the original, and the original
should not be recoverable from it (this preserves the privacy of the transformed
template); (ii) multiple transforms of the same fingerprint should not match, which
prevents cross-matching between databases.
The important steps that are involved in cancellable transformation are registration,
transformations (on the signal level and on the feature level), and selection of
transformation function. We discuss these steps briefly in the following sections.
3.6 REGISTRATION
The first important step in the application of a cancellable transform is the process of
registering the image. For the transform to be repeatable, the minutiae positions have to
be measured with regard to the same coordinate system. This can be accomplished by
estimating the position and orientation of the singular points (core and delta) and
expressing the minutiae positions and angles with respect to these points. There have
been several approaches for the detection of singular (core and delta) points in literature.
The most recent approach is based on complex filtering proposed by Nilsson et al. [86].
Their technique relies on detecting the parabolic and triangular symmetry associated with
core and delta points. The filtering is done on complex images associated with the
orientation tensor instead of the gray-scale image [87].
3.7 TRANSFORMATIONS
After global registration, the features can be transformed consistently across multiple
instances. The requirements of cancelability put several constraints on the transformation
[84]:
1. The minutiae position after transformation has to be outside the tolerance box of
the matcher. A minimum amount of translation during the transformation needs to
be ensured.
2. The transformation should be locally smooth to ensure that small changes in the
minutiae position lead to small changes in the minutiae position after
transformation.
3. The transformation should not be globally smooth. Otherwise, the minutiae
positions after transformation are highly correlated with the positions before
transformation and can be inverted.
Such a transform can be implemented in several ways [84].
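One way to meet the three constraints above, sketched here with assumed parameters, is a per-cell translation: minutiae in the same grid cell move together (locally smooth), neighbouring cells get unrelated offsets with a modular wrap (not globally smooth), and a minimum shift keeps every minutia outside the matcher's tolerance box:

```python
import random

def region_offsets(key, grid=4, min_shift=15, max_shift=40):
    """Derive a pseudo-random translation for every grid cell from a key.
    min_shift keeps each shifted minutia outside the matcher's tolerance box."""
    rng = random.Random(key)  # the key plays the role of the transform parameter
    offsets = {}
    for gx in range(grid):
        for gy in range(grid):
            dx = rng.choice([-1, 1]) * rng.randint(min_shift, max_shift)
            dy = rng.choice([-1, 1]) * rng.randint(min_shift, max_shift)
            offsets[(gx, gy)] = (dx, dy)
    return offsets

def transform(minutiae, offsets, cell=64, size=256):
    """Shift each (x, y, angle) minutia by its cell's offset, wrapping modulo
    the image size: locally smooth (one cell, one shift) but globally
    discontinuous, so positions are hard to invert without the key."""
    out = []
    for x, y, angle in minutiae:
        dx, dy = offsets[(int(x) // cell, int(y) // cell)]
        out.append(((x + dx) % size, (y + dy) % size, angle))
    return out

offsets = region_offsets(key=7)
transformed = transform([(30.0, 30.0, 90.0), (35.0, 32.0, 95.0)], offsets)
# Both minutiae sit in the same cell, so their relative geometry survives.
```

Because the offsets are derived deterministically from the key, the same key reproduces the same transform at authentication time, while a new key yields a fresh, unlinkable template.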
3.7.1 Transformation on the Signal Level
The transformation of samples can be performed right after the sensor, on the signal
level. The data on which the transformation is performed can be a picture of a face, a
fingerprint, a picture of the iris, or another kind of biometric sample. An example of
such a transformation is grid morphing. Grid morphing changes the picture, for
instance a picture of a face. First, a grid is positioned on the face so that it is aligned
with facial features such as the eyes, nose and chin. Then the grid is morphed so that
the face is morphed with it. The result is another face that cannot be linked to the
original face. More information on grid morphing can be found in [88, 89].
These kinds of transformations change the original biometric data in such a way that
existing algorithms for feature extraction still function on them after the
transformation. It is very important that they do not diminish the power of existing
algorithms. The result of a signal-level transformation is another piece of biometric
data, but one not linkable to an actual person. The rest of the biometric security system
is not even aware of the transformation of the signal.
One prerequisite for this kind of biometric system to function is that the applied
transformation can be repeated in exactly the same way on the signal during the
authentication phase; this raises the problem of repeatability. The original biometric data
is usually represented by a picture, but it could be any other human feature, such as scent
or sound. No matter what kind of biometrics is used, in order to apply the transformation
repeatably, the signal has to be normalized: some features of the biometric have to be
found prior to transformation. For instance, the position of the face in the picture, or the
position and angle of the iris, must be found, and the picture has to be normalized so that
the detected element is centered and at a consistent rotation. Only after that kind of
pre-processing can the transformation be applied. The grid morphing example mentioned
above uses a grid that has to be aligned with the features of the face: only after the eyes,
nose, chin and other relevant features are found can the grid be positioned and the
transformation applied. If the grid is not aligned the same way every time the
transformation is applied, the resulting image will not be comparable to the stored
biometric template of the user, and authentication will fail. This process can be very
difficult and sometimes impossible.
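As an illustration of the normalization step only, the following sketch computes the similarity transform that maps detected eye positions onto fixed canonical positions (it assumes the eye coordinates have already been found by some face-detection step; the canonical coordinates are arbitrary choices, not values from the text):

```python
import math

# Illustrative canonical eye positions in the normalized image
CANON_LEFT, CANON_RIGHT = (30.0, 40.0), (70.0, 40.0)

def normalizing_transform(left_eye, right_eye):
    """Return a function applying the rotation, scale and translation that map
    the detected eye positions onto the canonical ones, so that the same morph
    grid can be reapplied identically at every authentication attempt."""
    dx, dy = right_eye[0] - left_eye[0], right_eye[1] - left_eye[1]
    angle = -math.atan2(dy, dx)                        # rotate the eye line to horizontal
    scale = (CANON_RIGHT[0] - CANON_LEFT[0]) / math.hypot(dx, dy)
    def apply(p):
        x, y = p[0] - left_eye[0], p[1] - left_eye[1]  # translate left eye to origin
        xr = x * math.cos(angle) - y * math.sin(angle) # rotate
        yr = x * math.sin(angle) + y * math.cos(angle)
        return (CANON_LEFT[0] + scale * xr, CANON_LEFT[1] + scale * yr)
    return apply
```

Applying this transform to every pixel coordinate (or to the morph-grid anchor points) centers the face and fixes its rotation and scale, after which the grid morphing can be applied repeatably.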
3.7.2 Transformation on the Feature Level
Besides transformation on the signal level, the transformation can be applied on the
feature level. At the feature level, the biometric sample is represented by a list of features
describing it, usually a list of numbers such as coordinates, angles or sizes. These
numbers can represent fingerprint minutiae, or the sizes of fingers and palm in
hand-geometry biometrics. Transformation on the feature level does not need the
normalization that is crucial for transformations on the signal level, since the sample has
already been processed and all the features extracted into a normalized form [79].
Some feature-level transformations change the biometric template so that the existing
matching algorithms still function on it without any need for adaptation; one example
would be a transformation that simply changes the features' positions in coordinate
space. But some transformations change the data into a form completely different from
any known biometric data, for example hash functions [90]. Such data cannot be matched
using the same algorithms and requires new algorithms created for that purpose. An
example of a feature-level transformation is applying a high-order polynomial function
to every minutia in the biometric template [79].
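A minimal sketch of such a polynomial feature-level transform follows. The coefficients here are illustrative placeholders (in practice they would be derived from a user-specific key), and this is not the exact construction of [79]:

```python
def keyed_polynomial(coeffs):
    """Return a transform applying a fixed polynomial (coefficients assumed to be
    derived from the user's key) to every minutia coordinate."""
    def poly(v):
        acc = 0.0
        for c in coeffs:            # Horner evaluation: c0*v^n + ... + cn
            acc = acc * v + c
        return acc
    def transform(minutiae):
        return [(poly(x), poly(y)) for x, y in minutiae]
    return transform
```

If the chosen polynomial is non-monotonic over the coordinate range, distinct inputs can map to the same output, which is what makes this style of transform hard to invert; reissuing the template amounts to choosing new coefficients.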
One of the main goals of cancellable biometrics is securing the biometric data of the
person so that it can never be compromised. Two transformation functions may look
similar, but one of them should additionally convert some features to zero or to another
randomly chosen number [91]. That way, even if an attacker recreates the original
template by inverting the transformed template, he would not obtain the user's true
identity, because some of the features were irreversibly changed.
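The idea of irreversibly converting a key-selected subset of features to zero can be sketched as follows (a toy illustration; the hash-based selection of which features to erase is an assumption, not the scheme of [91]):

```python
import hashlib

def zero_features(key: bytes, features):
    """Irreversibly zero a key-selected subset of features, so that even
    inverting the rest of the transform cannot recover the full template."""
    h = hashlib.sha256(key).digest()
    out = []
    for i, f in enumerate(features):
        # use one hash bit per feature index to decide whether to erase it
        bit = (h[(i // 8) % len(h)] >> (i % 8)) & 1
        out.append(0 if bit else f)
    return out
```

Because the erased values are simply gone from the stored template, no amount of inversion can restore them; the key only determines which positions were erased, not what they contained.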
3.8 SELECTION OF TRANSFORMATION FUNCTION
The function that is used during the transformation phase has to have certain
characteristics discussed below:
• In order to have the option of cancelling and reissuing the template, we do not
want a limited number of transformation functions which could be applied.
• If we store the transformation function in the same place where we store the
biometric templates, then it can be stolen along with the template. It is necessary
that an attacker who has both the template and the transformation function that
created it cannot get to the original template. The only way to ensure this is for the
transformation function to be non-invertible, or to have a large enough number of
pre-images to discourage a brute-force attack. If the function is invertible, it
should be carefully hidden from the attacker; one way to hide it is to place it on a
SmartCard rather than in shared storage [79].
• The transformation function can enlarge the template size in bytes, which is
desirable because the time needed for a brute-force attack on a security system
(trying all possible combinations until hitting the one that allows access)
increases exponentially with the template size.
• Transformed biometric templates should not diminish the uniqueness of the
biometric data [59].
• Two different transformation functions applied on the same sample must produce
templates that differ (return false if compared).
• The transformation function should make the distance between samples greater.
By increasing the distance between biometric samples we can achieve a lower
false accept rate (FAR) without increasing the false reject rate (FRR) [92].
We can conclude that the transformation function represents the essence of the concept
of cancellable biometrics. As such, it must not diminish the positive characteristics of
biometric security systems; by choosing the right type of function we can even enhance
the system by producing higher uniqueness [79].
3.9 SUMMARY
In this chapter, we have briefly discussed the problems with existing biometric
technologies and explored cancellable biometrics in detail. Biometric authentication
schemes raise security concerns because biometric data is permanently associated with its
owner and therefore cannot be replaced if it is compromised. One of the most promising
solutions to this problem is cancellable biometrics [53], where the system does not store
the original biometric data; rather, it stores only a version transformed by a
non-invertible, one-way function [54]. Verification or identification is then done on this
transformed data without any need to recover or use the original data, thereby keeping
the original data safe even if the system is compromised. This concept ensures that the
original biometric template does not exist in the system database; as such, it is not in
danger of being exposed, and the privacy issue is in this way eliminated.
The presented concept of cancellable biometric templates is a good solution to most of
the perceived problems of today's biometric security solutions. The ability to cancel and
reissue a biometric template is a giant step towards increasing the usability of biometric
security systems.
Because of the nature of the data being transformed, it is probably easier to apply the
transformation on the feature level. Choosing an appropriate transformation function is
the hardest task in implementing cancellable biometrics. The transformation function
can ensure greater uniqueness among samples. A large family of functions must be
chosen so that the number of variations is not limited. The function must be
non-invertible, and it should increase the template size. Finally, every system
implementing cancellable biometrics should be carefully planned and tested to ensure
that all of the mentioned goals are achieved.
CHAPTER 4
LITERATURE SURVEY
"Literature is analysis after the event." Doris Lessing
4.1 INTRODUCTION
In this chapter we begin by briefly discussing the history of biometrics: how it has
evolved, and how it has become, and remains, a major and challenging research topic.
We also give a time-line of biometrics and of research in the field. Later on, we discuss
cancellable biometrics and the relevant research in this area.
The biblical account of the Gileadites, who identified fleeing Ephraimites by their
inability to pronounce the word "Shibboleth" (Judges 12:5-6), may be the first recorded
military use of a security protocol in which authentication relies on a property of the
human being, in this case his accent. There have been less formal uses before and after
this incident.
1. When Isaac tried to identify Esau by his bodily hair, but was deceived by Jacob:
He went to his father and said, "My father." "Yes, my son," he answered. "Who
is it?" Jacob said to his father, "I am Esau your firstborn. I have done as you told
me. Please sit up and eat some of my game so that you may give me your
blessing." Isaac asked his son, "How did you find it so quickly, my son?" "The
LORD your God gave me success," he replied. Then Isaac said to Jacob, "Come
near so I can touch you, my son, to know whether you really are my son Esau or
not." Jacob went close to his father Isaac, who touched him and said, "The voice
is the voice of Jacob, but the hands are the hands of Esau." GENESIS 27:18-22
[BIBLE-NIV]
2. When people identified Peter by his accent and by his face, he tried to deny the fact:
Now Peter was sitting out in the courtyard, and a servant girl came to him. "You
also were with Jesus of Galilee," she said. But he denied it before them all. "I
don't know what you're talking about," he said. Then he went out to the gateway,
where another girl saw him and said to the people there, "This fellow was with
Jesus of Nazareth." He denied it again, with an oath: "I don't know the man!"
After a little while, those standing there went up to Peter and said, "Surely you
are one of them, for your accent gives you away." MATTHEW 26:69-73
[BIBLE-NIV]
Peter knocked at the outer entrance, and a servant girl named Rhoda came to
answer the door. When she recognized Peter's voice, she was so overjoyed she
ran back without opening it and exclaimed, "Peter is at the door!" ACTS
12:13-14 [BIBLE-NIV]
From the above examples, we can conclude that biometrics identify people by measuring
some aspect of individual anatomy or physiology (such as hand geometry or a
fingerprint), some deeply ingrained skill or other behavioural characteristic (such as a
handwritten signature), or something that is a combination of the two (such as voice).
Bertillonage, Alphonse Bertillon's system of identifying people by body measurements,
was adopted by police authorities worldwide during the 1890s, but soon became obsolete
once it was recognized that people could indeed share the same physical measurements.
Because of the amount of time and effort that went into painstakingly collecting
measurements, and the overall inaccuracy of the process, Bertillonage was quickly
replaced when fingerprinting emerged on the scene as a more efficient and accurate
means of identification.
Fingerprint, as a means of identification, proved to be infallible. It was accepted that each
individual possessed a uniquely identifiable and unchanging fingerprint. This new system
of identification was accepted as more reliable than Bertillonage.
Meanwhile, the quest for a physical identifier unique to each individual gained
significant ground when the British anthropologist Sir Francis Galton, who had become
aware of Faulds' research through his cousin Charles Darwin, made considerable
advances in fingerprint identification. Galton ascertained that no two fingerprints were
alike, not even for a set of identical twins. He worked on the principle
that fingerprints were permanent throughout life, and that no two people had identical
fingerprints. Galton calculated the odds of prints from two people being identical to be 1
in 64 billion, and also identified characteristics, known as minutiae, that are still used
today to demonstrate that two impressions were made by the same finger. Minutiae are
points of interest formed by the ending or forking of the friction-skin ridges on each
finger and are defined as one of the following:
• Ridge ending: the point where a friction ridge terminates.
• Ridge bifurcation: the point where a single ridge forks into two.
It is the arrangement of all the minutiae, in terms of their location, orientation of ridge
flow and type (i.e. ridge ending or bifurcation), that makes an individual's fingerprints
unique. The flow of the friction-skin ridges also forms the patterns, the whorl, arch and
loop of each finger, that were identified by Galton. Galton's patterns provided the basis
of the first fingerprint file, established in 1891 by Juan Vucetich, an Argentine police
officer, who became the first to use a bloody fingerprint to prove the identity of a
murderer during a criminal investigation.
Sir Edward Henry (1850 - 1931), a British police officer serving as Inspector General of
the Bengal Police in India, developed a classification system which was officially
adopted by British India in 1897. Henry had developed an interest in the use of
fingerprints for identifying criminals, even though the Bengal Police was at that time
using Bertillonage. Based on Galton's observations, Henry and colleagues
established a modified classification system, based on physiological characteristics,
allowing fingerprints captured on paper forms using an ink pad to be classified, filed and
referenced for comparison against thousands of others [93]. In 1900 Henry presented a
paper entitled "Fingerprints and the Detection of Crime in India". Shortly after, Henry's
book The Classification and Uses of Finger Prints was published [70]. The Henry
Classification System organises ten-print fingerprint records by pattern type. The system
assigns each individual finger a numerical value (starting with the right thumb and ending
with the left pinky) and divides fingerprint records into groupings based on pattern types.
Finger ridges and patterns can be continuous, interrupted, forked, and other formations
[93]. Fingerprints are classified and identified by the relationship of these formations,
described as minutiae. The system makes it possible to search large numbers of
fingerprint records by classifying the prints according to the patterns. These patterns are
divided into five basic groups, with various subgroups [70]: arches, whorls, loops,
composites and accidentals. In December 1900, Britain's Belper Committee
recommended that the fingerprints of criminals be taken and classified by the Indian
system. In 1901, Henry
was called back to England and was given the post of Assistant Commissioner of Police
in charge of Criminal Identification at New Scotland Yard. In 1903, Henry became
Commissioner of Police [70].
In 1901, Henry's fingerprinting system was adopted in the UK and introduced in
England by Scotland Yard. In 1902, the New York Civil Service began testing the Henry
method of fingerprinting, with the Army, Navy and Marines all adopting the method by
1907. From this point on, the Henry System of fingerprinting became the system most
commonly used in English-speaking countries and a standard method of identity
detection and verification in criminal investigations [93].
With the advent of computers and digital technology in the 1970s, fingerprinting took on
a new dimension. As a result, the UK's fingerprint service now records 120,000 sets of
fingerprints each year, a volume of records that was simply untenable before
computerization. Within a century, biometrics had evolved from tape-measure,
ink-and-pad techniques requiring vast manual filing and archiving resources, to an
automated biometric digital scanning process using computerized storage, automated
search and match techniques, and extensive archiving and access systems with
worldwide links. Such technology now provides for the capture and processing of
biometric information and has transformed fingerprinting techniques and procedures.
In the past three decades biometrics has moved from a single method (fingerprinting) to
more than ten discrete methods. Companies involved with the new methods number in
the hundreds and continue to improve them as the available technology advances. Prices
for the required hardware continue to fall, making systems feasible for low- and
mid-level budgets. However, as the industry grows, so does public concern over privacy
issues. Laws and regulations continue to be drafted and standards are being developed.
While no other biometric has yet reached fingerprinting's level of use, some are
beginning to be used in both legal and business areas [93].
The European explorer Joao de Barros recorded that the first known example of
biometrics in practice was a form of fingerprinting used in China during the 14th
century: Chinese merchants used ink to take children's fingerprints for identification
purposes.
Elsewhere in the world, up until the late 1800s, identification largely relied upon
"photographic memory".
In the 1880s, an anthropologist and police desk clerk in Paris named Alphonse
Bertillon sought to fix the problem of identifying convicted criminals, developing the
system of body measurements that came to be known as Bertillonage.
In 1897, Sir Edward Henry developed a classification for fingerprint identification
based on physiological characteristics.
In 1902, the New York Civil Service began testing the Henry method of
fingerprinting, with the Army, Navy and Marines all adopting the method by
1907. From this point on, the Henry System of fingerprinting became the system
most commonly used in English-speaking countries.
By the 1920s, fingerprint identification was used by law enforcement, the U.S.
military and the FBI as a form of identification.
Although fingerprinting is still in use today, computer-aided techniques began
developing rapidly in the last quarter of the twentieth century. These techniques
sought to measure our voices, hands, fingers, irises and faces. Once ideas were
proposed, development was rapid: in 1985 the idea that irises are unique was
proposed; development of an iris identification system began in 1993; in 1994 the
first iris recognition algorithm was patented; and the year after that, a commercial
product measuring irises became available.
At the 2001 Super Bowl in Tampa, Florida, the facial image of each of the 100,000
fans passing through the stadium was recorded via video security cameras and
checked electronically against mug shots from the Tampa police. No felons were
identified, and the video surveillance led many civil liberties advocates to denounce
biometric identification technologies.
In 2005, Rep. Robert Andrews (D-NJ) introduced the Iris Security Scan Security
Act of 2005, intended to give States grants to use iris scan records of convicted
criminals for various purposes.
Since July 7th, 2005, British law enforcement has been using biometric face
recognition technologies and 360-degree "fish-eye" video cameras to identify terrorists.
The time-line of biometrics includes, among others, the following early milestones:
1858 - First systematic capture of hand images for identification purposes is recorded
1870s - Bertillon develops anthropometrics to identify individuals
1892 - Galton develops a classification system for fingerprints
1894 - The Tragedy of Pudd'nhead Wilson is published
1896 - Henry develops a fingerprint classification system
1903 - NY State Prisons begin using fingerprints
1903 - Bertillon System collapses
1936 - Concept of using the iris pattern for identification is proposed
By looking at the time-line of biometric technology, we can conclude that true biometric
systems began to emerge in the latter half of the twentieth century, coinciding with the
emergence of computer systems. Over the last quarter century or so, a large number of
biometric devices have been developed. But the best-established biometric techniques
predate the computer age altogether: namely, the use of handwritten signatures, facial
features, and fingerprints.
In recent years, many cases have demonstrated that biological characteristics are
powerful tools for authenticating a person's identity. The emphasis now is on
automatically performing reliable identification of persons in unattended mode, often
remotely (or at a distance).
compact and distinguishable features; thus a longer and highly stable bit stream can
probably be produced. Experiments were carried out on a database containing face
images to demonstrate the practicability of the framework.
Cancellable biometrics offers a greater level of privacy by allowing more than one
template for the same biometric data, and thus provides for the non-linkability of users'
data stored in diverse databases. The measurement of the success of a particular
transformation and matching algorithm for fingerprints was described by Russell Ang et
al. [97] in 2005. A key-dependent geometric transform was applied to the features
extracted from a fingerprint, so as to produce a key-dependent cancellable template for
the fingerprint. They also studied the performance of an authentication system that uses
the cancellable fingerprint matching algorithm. Experimental evaluation showed that
good performance is achievable while the matching algorithm remains unaltered.
A cancellable biometric approach called PalmHashing was proposed by Connie Tee et al.
[98] in 2005 to address the non-revocable biometric issue. This technique hashes
palmprint templates with a set of pseudo-random keys to obtain a unique code called the
palmhash. The palmhash code can be stored in portable devices such as tokens or
smartcards for authentication. Moreover, PalmHashing provides several advantages over
other modern approaches, including clear separation of the genuine and impostor
populations and zero Equal Error Rate (EER) occurrences. They outlined the
implementation details and emphasized its capabilities in security-critical applications.
Hao, F. et al. [99] in 2006 presented a practical and secure way to incorporate the iris
biometric into cryptographic applications. They studied the error patterns within iris
codes and developed a two-layer error correction technique that combines Hadamard
and Reed-Solomon codes. The key is produced from the subject's iris image with the aid
of auxiliary error correction data that do not disclose the key and can be saved in a
tamper-resistant token such as a smart card. The methodology was evaluated using
samples from 70 different eyes, and it was established that an error-free key can be
reproduced reliably from genuine iris codes with a success rate of 99.5%. Up to 140 bits
of biometric key can be produced, more than adequate for 128-bit AES.
On the basis of recent works showing the likelihood of key generation by means of
biometrics, the application of handwritten signatures to cryptography was analyzed by
M. Freire-Santos et al. [100] in 2006. A cryptographic construction called the fuzzy
vault was employed in the signature-based key generation scheme. They analyzed and
evaluated the usability of distinctive signature features appropriate for the fuzzy vault,
and reported experimental results, including the error rates for releasing the secret data
using both random and skilled forgeries from the MCYT on-line and off-line signature
database.
A fuzzy commitment method based on lattice mapping for cryptographic key generation
from biometric data was proposed by Gang Zheng et al. [101] in 2006. This method,
besides providing high-entropy keys as output, also obscures the original biometric data,
so that it becomes infeasible to recover the biometric data even if the information stored
in the system is exposed to an attacker. Simulation results showed that the method's
authentication accuracy was comparable to that of the well-known k-nearest neighbour
(KNN) classification.
Biometric characteristics are immutable, and hence their compromise is permanent. To
address this problem, A.T. Beng Jin and Tee Connie [102] in 2006 proposed cancellable
biometrics, describing biometric templates that can be cancelled and replaced. BioHash
is a cancellable biometric that combines a set of user-specific random vectors with
biometric features. The main drawback of BioHash is its severe degradation in
performance when a legitimate token is stolen and used by an impostor claiming to be
the legitimate user. They employed a modified probabilistic neural network as the
classifier to alleviate this problem.
Teoh AB et al. [103] in 2007 presented a two-factor cancellable formulation that
facilitates data distortion in a revocable but non-reversible manner: the raw biometric
data is first converted into a fixed-length feature vector, which is then projected onto a
sequence of random subspaces obtained from a user-specific pseudo-random number
(PRN). The process was revocable, making the
and is used to establish the mathematical foundation of BioHash. On the basis of this
model, they described the characteristics of BioHash from both pattern recognition and
security perspectives, and offered some methods to solve the stolen-token issue.
Huijuan Yang et al. [110] in 2009 presented a non-invertible transform that
perpendicularly projects the distances between a pair of minutiae onto a circle to
generate the characteristics. Additional local features, such as the relative angles between
the minutiae pair, and global features, such as orientation, ridge frequency and the total
number of minutiae in randomly sampled blocks around each minutia, were also
employed to obtain better performance. Finally, Bin-based Quantization (BQ) generates
the cancellable templates. Feature extraction and cancellable template generation are
controlled by a secret key to ensure revocability and security. Experimental results on the
FVC2002 data set show that the scheme provides good performance.
B. Prasanalakshmi and A. Kannammal [111] in 2009 proposed a novel technique to
generate an irrevocable cryptographic key from a biometric template. The biometric
trait considered in their proposal was the palm vein. The technique uses minutiae
features extracted from the generated pattern, including bifurcation points and ending
points. Since other cryptographic keys are liable to be stolen or guessed, keys generated
from a biometric entity are preferable, as biometric keys are bound to the user. Minutiae
patterns generated from the palm vein are converted into cancellable templates, which in
turn can be used for irrevocable key generation.
H. A. Garcia-Baleon et al. [112] in 2009 proposed an approach for cryptographic key
generation based on keystroke dynamics and the k-medoids algorithm. The approach has
two stages, training-enrollment and user verification, and checks the identity of
individuals off-line without using a centralized database. Simulation results show a false
acceptance rate (FAR) of 5.26% and a false rejection rate (FRR) of 10%. The
cryptographic key obtained from the approach may be applied in diverse encryption
algorithms.
4.7 SUMMARY
This chapter covered the history of biometricshow it has evolved, how it has become a
major challenging research topic and continues to be so even today. Further we have
given the time-line of biometrics. Finally, we have discussed the relevant research
publications in the area of cancellable biometrics. In the next chapter, we are going to
cover elaborately the cancellable biometrics and challenges in generating key generation
and their algorithms. The next chapter provides the base and motivation for our proposed
work.
CHAPTER 5
THEORETICAL BACKGROUND
"Theoretical principles must sometimes give way for the sake of practical advantages."
William Pitt
5.1 INTRODUCTION
The theoretical background of our proposed cancellable biometric key generation system
is discussed in this chapter. To make biometric systems more robust, the concept of
cancellable biometrics has been proposed. In order to safeguard privacy and to prevent
disclosure of any information saved in databases for personal identification or
verification, the cancellable biometric template is preferred to be non-invertible. Here,
we provide the background information related to cancellable biometric systems and
bio-cryptographic techniques. In addition, the main concepts used in our proposed
systems are discussed concisely in the subsequent sections. These include the concepts
used for pre-processing the input fingerprint image, Region of Interest (ROI) selection,
the extraction methods together with the minutiae extraction algorithms, and the
encryption/decryption techniques.
Histogram equalization redistributes the brightness levels of an N×N image O (with
levels ranging from N_min to N_max) so that the output image N has a uniformly flat
histogram. Since equalization does not create or destroy points, the input histogram O(l)
and the output histogram N(l) have the same total number of points:

\sum_{l=0}^{N_{max}} O(l) = \sum_{l=0}^{N_{max}} N(l) = N^2    (1)

Also, as the objective is to obtain an output picture with a uniformly flat histogram, this
should hold for a randomly selected level p. So, covering up to the level q in the new
histogram necessitates transforming up to the level p in the cumulative histogram,

\sum_{l=0}^{q} N(l) = \sum_{l=0}^{p} O(l)    (2)

The cumulative histogram up to level p should be a fraction of the overall sum, as the
output histogram is uniformly flat. Dividing the number of points by the range of levels
in the output image gives the number of points per level in the output picture,

N(l) = \frac{N^2}{N_{max} - N_{min}}    (3)

so the cumulative histogram of the output image up to level q is

\sum_{l=0}^{q} N(l) = q \, \frac{N^2}{N_{max} - N_{min}}    (4)

Equating this to the cumulative histogram of the input image as per Equation 2,

q \, \frac{N^2}{N_{max} - N_{min}} = \sum_{l=0}^{p} O(l)    (5)

q = \frac{N_{max} - N_{min}}{N^2} \sum_{l=0}^{p} O(l)    (6)

An output image having a roughly flat histogram is provided by the mapping function
obtained by phrasing Equation 6 as an equalizing function (E) of the level (q) and the
image (O),

E(q, O) = \frac{N_{max} - N_{min}}{N^2} \sum_{l=0}^{q} O(l)    (7)

so that each pixel of the output image is obtained from the corresponding input pixel as

N_{x,y} = E(O_{x,y}, O)    (8)
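The mapping of Equations 7 and 8 can be sketched in code (a toy illustration assuming a small grayscale image stored as a list of lists; the rounding and the level offset are implementation choices, not specified in the text):

```python
def equalize(image, n_min=0, n_max=255):
    """Histogram equalization per Equations 7 and 8: each output pixel is
    E(O_xy, O) = (N_max - N_min) / N^2 * (cumulative histogram up to level O_xy)."""
    levels = n_max - n_min + 1
    hist = [0] * levels
    total = 0
    for row in image:
        for p in row:
            hist[p - n_min] += 1
            total += 1                      # total number of pixels (N^2)
    # cumulative histogram of the input image
    cum = [0] * levels
    running = 0
    for l in range(levels):
        running += hist[l]
        cum[l] = running
    # equalizing function E(q, O), rounded to the nearest integer level
    def E(q):
        return round(n_min + (n_max - n_min) * cum[q - n_min] / total)
    return [[E(p) for p in row] for row in image]
```

Because E is built from the cumulative histogram, it is monotonic, and dense brightness levels in the input are spread out across the output range.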
5.4.2 Filters
It is sometimes desirable to have circuits capable of selectively filtering one frequency
or range of frequencies out of a mix of different frequencies. A circuit designed to
perform this frequency selection is called a filter circuit, or simply a filter. A common
need for filter circuits is in high-performance stereo systems, where certain ranges of
audio frequencies need to be amplified or suppressed for best sound quality and power
efficiency. You may be familiar with equalizers, which allow the amplitudes of several
frequency ranges to be adjusted to suit the listener's taste and the acoustic properties of
the listening area. You may also be familiar with crossover networks, which block
certain ranges of frequencies from reaching speakers. A tweeter (high-frequency
speaker) is inefficient at reproducing low-frequency signals such as drum beats, so a
crossover circuit is connected between the tweeter and the stereo's output terminals to
block low-frequency signals, passing only high-frequency signals to the speaker's
connection terminals. This gives better audio system efficiency and thus better
performance. Both equalizers and crossover networks are examples of filters, designed
to accomplish filtering of certain frequencies.
Another practical application of filter circuits is in the conditioning of non-sinusoidal
voltage waveforms in power circuits. Some electronic devices are sensitive to the
presence of harmonics in the power supply voltage, and so require power conditioning
for proper operation. If a distorted sine-wave voltage behaves like a series of harmonic
waveforms added to the fundamental frequency, then it should be possible to construct a
filter circuit that only allows the fundamental waveform frequency to pass through,
blocking all higher-frequency harmonics. Elementary filter circuits of this kind are
commonly analyzed with tools such as SPICE, which can display Bode plots (amplitude
versus frequency) for the various kinds of filters; they can also be analyzed at several
frequency points by repeated series-parallel analysis, at the cost of re-working the circuit
calculations for each frequency.
REVIEW:
• A Bode plot is a graph plotting waveform amplitude or phase on one axis and
frequency on the other.
Filters come in many varieties: low-pass, high-pass, band-pass, band-stop, resonant,
digital, FFT, smoothing, audio, high-frequency noise reduction, lagging-phase filters,
and more. Filtering is an operation which removes high-frequency fluctuations from a
signal.
82
Low-pass filtering is another term for the same thing, but is restricted to methods which
are linear: i.e. if you want to filter a signal x(t ) y (t ) , it does not matter whether you
apply the filter before or after adding the two signals. Such linear operations can be
described by a frequency response. All methods described here are linear, with the
exception of curve fitting. The following sub-sections describe the two well-known
filtering mechanisms viz., Gabor filter and Wiener filter respectively.
5.4.2.1 Gabor Filter
Gabor filters have been successfully used for feature extraction in many machine vision
applications. Gabor filters have the ability to perform multi-resolution decomposition due
to their localization in both the spatial and the spatial-frequency domains. Texture segmentation
requires simultaneous measurements in both the spatial and the spatial-frequency
domains. Filters with smaller bandwidths in the spatial-frequency domain are more
desirable because they allow us to make finer distinctions among different textures. A
robust tool for texture analysis in the image processing field is the Gabor transform, the
impulse response of which is a Gaussian function multiplied by a harmonic function. In the
spatial domain, a 2D Gabor filter is represented as a Gaussian kernel function modulated
by a sinusoidal plane wave. The kernel function of a Gabor filter is represented as:
ψ(X) = (k_v² / σ²) · exp(−k_v² X² / (2σ²)) · { exp(iKX) − exp(−σ² / 2) }        (9)
Here the oscillation function exp(iKX) supplies the real and imaginary parts of the
kernel, while exp(−k_v² X² / (2σ²)) is a Gauss function: it limits the oscillation function
to a local range by restricting its scope. The direct-current component exp(−σ² / 2), also
called direct-current compensation, protects the filter from being influenced by the extent
of direct current. This component makes the filter insensitive to illumination intensity by
preventing the influence of the absolute value of the image grey-level [124].
The kernel function of the 2D Gabor filter is a compound function with two parts, namely
a real and an imaginary part:

G_k(x, y) = G_r(x, y) + i G_i(x, y)        (10)

G_r(x, y) = (k_v² / σ²) · exp(−k_v²(x² + y²) / (2σ²)) · [cos(k_v cos(φ_u) x + k_v sin(φ_u) y) − exp(−σ² / 2)]        (11)

G_i(x, y) = (k_v² / σ²) · exp(−k_v²(x² + y²) / (2σ²)) · [sin(k_v cos(φ_u) x + k_v sin(φ_u) y) − exp(−σ² / 2)]        (12)
5.4.2.2 Wiener Filter
Construction of the Wiener filter is also possible in the frequency domain. One method of
deriving such a filter uses the overlap-add technique to transform the time-domain Wiener
filter into the frequency domain, and it has precisely the same performance as its
time-domain counterpart. More commonly, however, the clean speech spectrum is estimated
directly from the noisy speech spectrum to construct the frequency-domain Wiener filter.
The filter thus obtained differs from the time-domain Wiener filter in two respects: the
time-domain Wiener filter is a sub-band technique in which the sub-band filters are
independent of the other frequency-band filters and can be non-causal, while the
frequency-domain Wiener filter is a full-band technique and causal [134].
The frequency-domain sub-band Wiener filter can be represented as

H_o(iω_k) = arg min J_X[H(iω_k)]        (13)

where J_X[H(iω_k)] is the mean-square error between the clean speech spectrum and its
estimate at frequency ω_k. The Wiener filter can be straightforwardly realized by equating
the result of the differentiation of J_X[H(iω_k)] with respect to H(iω_k) to zero:

H_o(iω_k) = E[|X(n, iω_k)|²] / E[|Y(n, iω_k)|²] = P_x(ω_k) / P_y(ω_k)        (14)

Here, the power spectral densities (PSDs) of x(n) and y(n) are represented by
P_x(ω_k) = (1/L) E[|X(n, iω_k)|²] and P_y(ω_k) = (1/L) E[|Y(n, iω_k)|²] respectively.
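Equation (14) can be sketched numerically by estimating both PSDs as averages of squared spectral magnitudes over L frames (a minimal illustration; the function and variable names are ours, not from [134]):

```python
import numpy as np

def subband_wiener_gains(clean_frames, noisy_frames):
    """Per-frequency Wiener gains H_o(w_k) = Px(w_k) / Py(w_k), with the PSDs
    estimated by averaging |FFT|^2 over the L frames (Eq. (14))."""
    X = np.fft.rfft(clean_frames, axis=1)   # clean speech spectra X(n, iw_k)
    Y = np.fft.rfft(noisy_frames, axis=1)   # noisy speech spectra Y(n, iw_k)
    px = np.mean(np.abs(X) ** 2, axis=0)    # Px(w_k) estimate
    py = np.mean(np.abs(Y) ** 2, axis=0)    # Py(w_k) estimate
    return px / np.maximum(py, 1e-12)       # guard against empty bands
```

With identical clean and noisy inputs the gains are exactly 1; as noise energy is added to a band, that band's gain falls below 1, attenuating the noisy components.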
5.4.3 Adaptive Thresholding
The background grey-level and the contrast between the objects and the background
frequently differ inside the single image because of irregular illumination and other
reasons. Since a threshold that performs properly in one region of the image might
perform defectively in other regions, achieving acceptable results by means of global
thresholding is unlikely in such cases. This variation can be avoided using an adjusting or
alterable threshold that is a gradually changing function of location in the image.
Adaptive thresholding can be performed by analyzing the grey-level histograms of
non-overlapping pixel blocks obtained by dividing up the image, and then
interpolating the resulting threshold values calculated from the blocks to construct a
thresholding surface for the whole image. Reliable estimation of the histogram and
thresholding surface for the whole image. Reliable estimation of the histogram and
setting of a threshold necessitates that the blocks should be of appropriate size to contain
an adequate number of background pixels in each block [132].
A two-pass operation can also be used for implementing Adaptive thresholding [131].
Based on the histogram of each block a threshold is calculated prior to the first pass by
selecting, for example, the value positioned in the middle of the background and the
object peaks. The unimodal histograms containing blocks can be discarded. In the first
pass, a grey-level threshold that is fixed within each block but varies for different blocks
is used to define object boundaries. The interior mean grey-level of each of the objects
thus defined is computed, though the objects are not extracted from the image. In the
second pass, each object is designated with its own threshold that is situated in the middle
of its internal grey-level and the background grey-level of its major block [123].
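The block-wise scheme described above can be sketched as follows. For brevity this sketch uses the midpoint between each block's minimum and maximum as a stand-in for the histogram-peak analysis, and assigns each block's threshold directly rather than interpolating a smooth surface:

```python
import numpy as np

def adaptive_threshold(image, block):
    """Block-wise adaptive thresholding: divide the image into non-overlapping
    blocks, choose a threshold per block, and binarize against the resulting
    (piecewise-constant) threshold surface."""
    surface = np.zeros(image.shape, dtype=float)
    h, w = image.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            blk = image[i:i + block, j:j + block]
            # midpoint of the block extremes, standing in for the value
            # "positioned in the middle of the background and object peaks"
            surface[i:i + block, j:j + block] = (int(blk.min()) + int(blk.max())) / 2.0
    return image > surface

# A bright region and a dark region each get their own threshold:
img = np.array([[10, 50, 100, 200],
                [10, 50, 100, 200]], dtype=np.uint8)
mask = adaptive_threshold(img, block=2)
```

A single global threshold would misclassify one side of this image; the per-block thresholds separate objects from background in both regions.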
5.4.4 Morphological Operations
Black pixel: its value will be 0 for an 8 bits/pixel indexed greyscale image.
White pixel: its value will be 255 for an 8 bits/pixel indexed greyscale image.
The dilation: The dilation process is carried out by placing the structuring element B on
the image A and moving it over the image as is done for convolution. But the operation
performed is different, and it is best explained as a succession of steps:
- If the image pixel at the location of the origin of the structuring element is white, no change is made; skip to the next pixel.
- If the origin of the structuring element coincides with a black pixel of the image, each image pixel that comes under the structuring element is made black.
The erosion: The process of erosion is the same as the dilation process except that pixels
are changed to 'white' instead of 'black'. The structuring element is moved over the image
and the following steps are performed:
- If the image pixel at the location of the origin of the structuring element is white, no change is made; skip to the next pixel.
- If a black pixel occurs at the location of the origin of the structuring element and at least one black pixel of the structuring element lies upon a white pixel in the image, the black image pixel under the origin of the structuring element is made white.
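The two operations can be sketched directly from these steps, using boolean arrays where True marks a black (object) pixel; taking the structuring-element origin at its centre is an assumption of this sketch:

```python
import numpy as np

def dilate(img, se):
    """Binary dilation: where the SE origin sits on a black pixel, every image
    pixel covered by a black SE element is made black."""
    h, w = img.shape
    sh, sw = se.shape
    oy, ox = sh // 2, sw // 2          # SE origin at its centre
    out = img.copy()
    for y in range(h):
        for x in range(w):
            if img[y, x]:              # origin on a black pixel
                for dy in range(sh):
                    for dx in range(sw):
                        if se[dy, dx]:
                            yy, xx = y + dy - oy, x + dx - ox
                            if 0 <= yy < h and 0 <= xx < w:
                                out[yy, xx] = True
    return out

def erode(img, se):
    """Binary erosion: a black pixel under the SE origin is made white if any
    black SE element lies over a white image pixel (or off the image)."""
    h, w = img.shape
    sh, sw = se.shape
    oy, ox = sh // 2, sw // 2
    out = img.copy()
    for y in range(h):
        for x in range(w):
            if img[y, x]:
                for dy in range(sh):
                    for dx in range(sw):
                        if se[dy, dx]:
                            yy, xx = y + dy - oy, x + dx - ox
                            if not (0 <= yy < h and 0 <= xx < w) or not img[yy, xx]:
                                out[y, x] = False
    return out
```

Dilating a single black pixel with a 3x3 structuring element grows it into a 3x3 block; eroding that block with the same element shrinks it back to the single centre pixel.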
This utilizes the binarization and thinning methods, detecting the ridge information
adjacent to minutiae in order to compute the minutiae scores.
5.4.5.1 Binarization
Image binarization is an important process for image analysis. The inherently bi-level
nature of the image has led to many of the image analysis algorithms being designed for
use on bi-level images. If the image binarization is improperly done, then the follow-on
steps cannot proceed appropriately, so the binarization process is a necessity.
Converting the greyscale fingerprint image into a binary image is essential in the majority of
the available methods. A priori enhancements immensely benefit certain binarization methods.
5.4.5.2 Ridge Thinning
The ridge thinning process is utilized to remove the redundant pixels until the ridges
become one pixel wide. In image analysis and understanding, thinning is considered to be
one of the most significant pre-processing steps. Several thinning methods that have been
created for binary images exhibit reasonably good results [128]. Though thinning has a
substantial number of its own specific applications, like thinning objects of non-uniform
brightness or thinned-edge identification, grey-level thinning has not so far been seriously
investigated as the generalization of bi-level thinning. A precise standard for the evaluation of
thinning algorithms has not yet been developed in the available literature. Properties such
as topology, shape, connectivity, and sensitivity to boundary noise are commonly
considered essential for a good skeleton. In other words, the following characteristics
must be present in a good thinning algorithm [129]:
1) The resulting skeleton and the object must be topologically same.
2) It must run close to the medial axes of the object regions.
3) It must have a thickness of either one pixel or the least thickness.
4) It should maintain both foreground and background connectivity.
5) It should be insensitive to noise such as tiny protrusions and indentations in the borders.
6) It should restrain pervasive erosion and should not perform total deletion.
7) It should not necessitate iteration greater than the least possible amount.
8) It should prevent bias in some directions by symmetrical deletion of pixels.
5.5 ADVANCED ENCRYPTION STANDARD (AES)
Similar to other algorithms, the Advanced Encryption Standard (AES) may be utilized in
diverse ways to perform encryption, and different methods are appropriate for different
settings. Though AES itself is secure, the result may become insecure unless the appropriate
method is employed in the appropriate manner for each circumstance. Though
it is extremely simple to employ a system that uses AES as its encryption algorithm, a
far greater amount of skill and expertise is needed to apply it properly in a
specified situation. AES is a symmetric encryption algorithm that processes data in blocks
of 128 bits. Unlike decimal digits, which can take 10 possible values, a binary digit or
bit can take only two possible values, zero or one. Encryption converts a
128-bit block into a new block of the same size under the influence of a key.
The reverse transformation, namely decryption, uses the same key that is
used for encryption, and hence AES is symmetric. The key is the only thing that must be kept
secret for security. AES can be configured to use keys of different lengths, and the names
of the three commonly used configurations, AES-128, AES-192 and AES-256, signify the
length in bits of the key which they use. The strength of the algorithm, in terms of the
time needed for an attacker to carry out a brute-force attack, i.e., to find the right key by
performing a complete search of all possible key combinations, doubles with each
extra bit in the key [130].
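The doubling claim is simple arithmetic on the size of the keyspace; as a rough sketch:

```python
# Keyspace size for a k-bit key: an exhaustive search must try up to 2**k keys,
# so each extra key bit doubles the worst-case brute-force effort.
def keyspace(bits):
    return 2 ** bits

aes_configs = {"AES-128": 128, "AES-192": 192, "AES-256": 256}
sizes = {name: keyspace(bits) for name, bits in aes_configs.items()}

# One extra bit doubles the search space:
assert keyspace(129) == 2 * keyspace(128)
# AES-256's keyspace is 2**128 times larger than AES-128's:
assert sizes["AES-256"] // sizes["AES-128"] == keyspace(128)
```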
5.6 SUMMARY
In this chapter, we have briefly presented all the well-known existing concepts that we
have utilized in our proposed systems. The major concepts discussed include fingerprint
image pre-processing and the minutiae extraction methods. In addition to this, we have
also briefly discussed the encryption algorithm. This brief introduction is presented to
facilitate easy understanding of the working of our proposed systems.
CHAPTER 6
MOTIVATION FOR THE RESEARCH
Research is to see what everybody else has seen, and to think what nobody else has
thought. Albert Szent-Gyorgyi
6.1 INTRODUCTION
This chapter presents the motivation for the research and its significance in the biometric
field, especially in the field of cancellable biometrics. At first, the necessity of
cancellable biometrics and its benefits are discussed in detail. Then, two significant
recent contributions to the development of cancellable templates of fingerprint
images are presented. Further, the detailed steps involved in generating a cancellable
template are discussed. Finally, the chapter is concluded with a brief summary.
the same biometric data leading to higher level of privacy by keeping transformed user
data in different databases.
Here, we briefly describe biometric-based applications, especially the fingerprint system.
Biometric-based authentication applications comprise workstation and network
access, data protection, remote access to resources, transaction security, Web security and
more. One of the most viable existing biometric technologies is fingerprint recognition.
The incapability to normalize the fingerprint data is the major concern in generating
hash functions for fingerprint minutiae. The values of the hash functions tend to be
orientation/position-dependent when the fingerprint data is not normalized. The
way to avoid this problem is to have both the hash functions and the matching algorithm
deal with the transformations of the fingerprint data. It is infeasible to apply hash
functions in regard to the minutiae set of the entire fingerprint. Substantial alterations in
hash values are produced even with the minor difference in minutia sets of two prints of
the same finger. Further, the higher order hash values are likely to vary in a large measure
with even a minor variation in positions of the minutia points. The two additional factors
that govern the security given by a non-invertible transform are the system module where
the transformation is applied (e.g., fingerprint scanner, client, server, or third-party certifier)
and the location where the fingerprint template is presented (e.g., client, server, third-party
certifier, or smartcard). The construction of the hash function guarantees non-invertibility,
and so in principle this method is very attractive [148].
Next, we briefly discuss security systems based on the merging of cryptography and
biometric techniques, viz., crypto-biometric systems [99, 101, 104, 139], that have
been extensively developed for solving the key management problem of cryptographic
systems and providing security for the stored templates in biometric systems. This
research work has been motivated by a significant number of previous studies
presented in the literature regarding cancellable biometrics and cryptographic key
generation. Among these, two important contributions [54, 143] available in the
literature have motivated us to continue the research work in efficient key generation
using cancellable fingerprint templates. Ratha et al. [54], in order to address the biometric
authentication problem, have proposed several techniques based on both cryptographic
and biometric techniques. In particular, they have explained the advantages of cancellable
biometrics over other approaches, and have made a case study of applying the technique to a
fingerprint database. The relative merits of several other methods, such as the
Cartesian, polar, and functional transformations, were also studied and compared empirically.
On the other hand, S. Tulyakov et al. [143] have proposed a method which uses
innovative symmetric hash functions to secure fingerprint templates. Fingerprint minutiae
features are unordered, and such symmetric functions can in principle be utilized for
any biometric modality. Their descriptions [54, 143] indicate that such implementations
can achieve performance comparable to straightforward matching systems while adding
security. In the following sub-sections, we briefly discuss the two important works
[54, 143] that were proposed by Ratha et al. and S. Tulyakov et al.
be globally smooth to make it cryptographically secure, else it would be easy to invert it.
Hence, they propose a functional surface-folding transform that is locally smooth but not
globally smooth. Their proposed function maps many positions in the original space
to the same position in the transformed space; these are known as folds [53].
Abstractly, standard hash functions achieve non-invertibility through this same property;
it creates ambiguity in reversing the transform, which results in the
desired non-invertibility. To achieve this, they perform two steps:
registration and transformation.
Registration Prior to Transformation: Registering the image is the most vital step
in the use of a cancellable transform. The minutiae locations have to
be measured with respect to the same coordinate system so that the transform becomes
repeatable. This is achieved by estimating the location and orientation of the
singular points (core and delta) and representing the minutiae positions and angles with
reference to these points. Though there have been various approaches to finding the core
and delta [146, 147], their precise estimation is a tedious task. Once global registration
has been established through the use of a singular-point position, the minutiae feature
points are transformed reliably across many instances. Though the basic
notion of cancellable biometrics is to permanently transform the minutiae feature
locations and orientations, the transform itself can be achieved in various ways.
Cartesian Transformation: In the Cartesian transformation, minutiae positions are
measured in rectangular coordinates with respect to the position of the singular point.
The x-axis is aligned with the reference orientation of the singular point. The space is
divided into cells of equal size, i.e., this coordinate system is separated into fixed-size
cells as shown in figure 6.1. The transformation need not be a strict permutation, because
the irreversibility condition requires that more than one cell may be mapped to the same
cell. In this case, a mapping matrix governs the cell mapping, relating the positions of
the cells after applying the transformation to their original positions.
Figure 6.1: Cartesian transformation which maps each cell to some random
cell with collisions.
Figure 6.2: Polar transformation where each sector is mapped into some other
random sector after transformation.
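The cell mapping of figure 6.1 can be sketched as follows; the grid size, cell size and the collision-permitting mapping dictionary here are illustrative stand-ins for the mapping matrix of the text:

```python
def cartesian_transform(minutiae, cell, mapping, grid_w):
    """Cartesian cancellable-transform sketch: each minutia's cell is sent to
    another cell by `mapping` (which may send several cells to the same cell,
    giving the required many-to-one collisions); the minutia keeps its offset
    inside the cell."""
    out = []
    for x, y in minutiae:
        cx, cy = int(x // cell), int(y // cell)   # cell containing the minutia
        idx = cy * grid_w + cx                    # linear cell index
        nidx = mapping[idx]                       # key-driven cell relocation
        ncx, ncy = nidx % grid_w, nidx // grid_w
        out.append((ncx * cell + x % cell, ncy * cell + y % cell))
    return out

# Cells 0 and 1 are both mapped to cell 5 -> a collision, hence non-invertible:
mapping = {0: 5, 1: 5, 2: 7, 3: 0}
transformed = cartesian_transform([(3, 4), (13, 4)], cell=10,
                                  mapping=mapping, grid_w=4)
```

Here two minutiae from different cells land on the same transformed point, which is exactly the many-to-one behaviour that makes the template non-invertible yet re-issuable with a new mapping.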
at a distance from the core. Thus, unconstrained mapping is not practicable in polar
coordinates: minutiae pairs lying within a tolerance distance of each other before
transformation may fail to match after transformation, because of the large divergence
that occurs away from the core. Therefore, in the polar transformation, the mapping is
governed by a translation key which defines the cell transformation. The locations of the
sectors before and after transformation are related as,
6.2.2 S. Tulyakov et al.'s Work on Symmetric Hash Functions for Secure Fingerprint
Biometric Systems
Owing to the difficulty of normalizing fingerprint data, it is quite hard to produce hash
functions for fingerprint minutiae. The hash functions tend to be
orientation/position-dependent if the fingerprint data is not normalized. This difficulty
can be managed by having both the hash functions and the matching algorithm
deal with transformations of the fingerprint data. The authors [143] accomplished
matching only on localized sets of minutiae in order to overcome these difficulties. Figure
6.3 describes the biometric matching performed on the hashed feature sets proposed in
[143].
From the authors [143], the task of fingerprint matching can be done using the two
important steps, (i) Localized minutiae sets (ii) Hashing localized minutiae sets.
Localized minutiae sets: Global matching of two fingerprints is taken as a grouping of
localized matchings with analogous transformation parameters (θ, t), where θ is the rotation
and t is the translation. A localized set is formed by a specific minutia and a few of
its neighbours, as in the base fingerprint matcher [145]. In order to evade the global
alignment, S. Tulyakov et al. [143] used concepts similar to those of Germain et al. [144] and Jea et
al. [145] in order to combine the outcomes of localized matching into the fingerprint
recognition algorithm. Localized matching matches minutia triplets using attributes such as
angles and distances between minutia points. For every
and the orientation difference between the central minutia and its nearest neighbours. For
localized matching, they record only restricted information about the matched
neighbours, and hence, minutiae positions cannot be reinstated from the transformed data.
A minor alteration in the input, such as lost information, noise, or a modification in
the order of the input, can cause a substantial variation in the hash value. Some
classes of hash functions can be generated such that they are invariant to the order in
which the input pattern is offered to the hash function; these are known as
order-independent or symmetric hash functions. For an input sequence, changing the order
of the elements causes an ordinary hash function to produce a different hash value, while
a symmetric one is left unaffected. S. Tulyakov et al. represent
minutiae points as complex numbers. Two prints of one finger coming from different
scanners, or from different positionings of the finger on the scanner, will have different
position, rotation and scale. A complex function can be used to represent the
transformation of one fingerprint to another: a complex number represents a minutiae
point located at co-ordinates (x, y), and complex parameters are used to characterize
the rotation and translation of the accidental shift of points between the registration
and authentication scans. In this approach, the authors build hash functions and the
equivalent matching algorithm such that this accidental shifting is accounted for.
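The order-independence property can be sketched with power-sum hash functions over minutiae represented as complex numbers (h_m = sum of c_i^m, a well-known family of symmetric functions; a simplified stand-in for the full construction in [143]):

```python
def symmetric_hashes(points, m_max=3):
    """Symmetric (order-independent) hash functions of a minutiae set:
    h_m(c_1, ..., c_n) = c_1**m + ... + c_n**m for m = 1..m_max.
    Permuting the points leaves every h_m unchanged."""
    return [sum(c ** m for c in points) for m in range(1, m_max + 1)]

minutiae = [1 + 2j, 3 - 1j, 2 + 0j]   # minutiae as complex numbers x + iy
shuffled = [3 - 1j, 2 + 0j, 1 + 2j]   # same set, different order
```

Because each h_m is a sum over the set, re-ordering the minutiae does not change the hashes, while altering any minutia's position does.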
6.3 SUMMARY
This chapter discusses earlier research work in the biometric field, especially in
cancellable biometrics. Initially, we presented the advantages of cancellable
biometrics and their importance. Later, we explained the two significant
contributions made by researchers in respect of developing cancellable templates
of fingerprint images. This chapter provides the motivation to pursue research work in
this field.
CHAPTER 7
SUGGESTED NEW APPROACHES
TO CANCELLABLE BIOMETRIC BASED SECURITY
"The new strategic environment requires new approaches to deterrence and defence."
Peter Flory
7.1 INTRODUCTION
As discussed in previous chapters, biometric security systems have a number of
problems arising from the fact that the biometric data of a person is generally stored in the
system itself. Cancellable biometrics is one of the solutions to these problems.
Several methods related to cancellable biometrics have been proposed in the literature,
but we consider the two most significant, proposed by Ratha
et al. and S. Tulyakov et al. These approaches have already been discussed in chapter 6
elaborately, and that discussion provides the motivation for us to continue work in this
field. In this chapter, we present our research work, consisting of two new algorithms, viz., a
New-Fangled Approach for Cancellable Biometric Key Generation and the Development of a
Bio-Crypto Key from Fingerprints Using Cancellable Templates, for secure
fingerprint biometric systems.
biometric features. There are several biometric systems in existence that deal with
cryptography, but the proposed cancellable biometric system introduces a novel method
to generate the cryptographic key. We also discuss the security analysis of the proposed
cancellable biometric system. In the following sections, we discuss our proposed
algorithms one by one.
7.3 FIRST PROPOSED METHOD: NEW-FANGLED APPROACH FOR CANCELLABLE BIOMETRIC KEY GENERATION
operations [151, 152] are used to extract Region of Interest (ROI). In a morphological
operation, the value of each pixel in the output image is based on a comparison of the
equivalent pixel in the input image with its neighbours. By selecting the size and shape of
the neighbourhood, we can construct a morphological operation that is sensitive to
specific shapes in the input image.
7.3.1.1 Pre-processing
i) Histogram equalization: This method usually increases the local contrast of many
images, especially when the usable data of the image is represented by close contrast
values. Through this adjustment, the intensities can be better distributed on the histogram.
Moreover, histogram equalization increases the perceptual information of the image by
allowing the pixel values to spread across the full intensity distribution of the image.
Figure 7.1: (a) Original fingerprint image (b) Histogram equalized image
The original histogram of a fingerprint image is of a bimodal type; after equalization
the histogram occupies the full range of values from 0 to 255 and the visualization
effect is improved. Figure 7.1 depicts the original fingerprint image and its
corresponding histogram-equalized image.
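A minimal sketch of histogram equalization for an 8-bit image, using the standard CDF-remapping formulation (not code from the thesis):

```python
import numpy as np

def equalize_histogram(img):
    """Map grey levels through the normalized cumulative histogram so that
    intensities spread over the full 0-255 range (8-bit greyscale input)."""
    hist = np.bincount(img.ravel(), minlength=256)   # grey-level histogram
    cdf = hist.cumsum()                              # cumulative distribution
    cdf_min = cdf[cdf > 0][0]                        # first nonzero CDF value
    scale = max(int(cdf[-1] - cdf_min), 1)           # guard for flat images
    lut = np.clip(np.round((cdf - cdf_min) / scale * 255.0),
                  0, 255).astype(np.uint8)
    return lut[img]                                  # lookup-table remap
```

For a low-contrast image whose grey levels cluster in a narrow band, this mapping stretches them toward the extremes of the 0-255 range.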
ii) Gabor filtering: The Gabor filter is applied to the fingerprint image obtained in the
previous step by spatially convolving the image with the filter.
A two-dimensional Gabor [150] filter consists of a sinusoidal plane wave of a specific
orientation and frequency, modulated by a Gaussian envelope. Gabor filters are employed
as they have frequency-selective and orientation-selective properties. These properties
permit the filter to be tuned to give maximal response to ridges at a specific orientation
and frequency in the fingerprint image. So, a properly tuned Gabor filter can be used to
effectively retain the ridge structures while reducing noise. The even-symmetric Gabor
filter is the real part of the Gabor function, which is yielded by a cosine wave modulated
by a Gaussian.
A Gaussian function multiplied by a harmonic function defines the impulse response of
the linear filter, the Gabor filter. Because of the multiplication-convolution property
(Convolution theorem), the Fourier transform of a Gabor filter's impulse response is the
convolution of the Fourier transform of the harmonic function and the Fourier transform
of the Gaussian function as given below:
g(x, y; λ, θ, ψ, σ, γ) = exp(−(x′² + γ² y′²) / (2σ²)) · cos(2π x′/λ + ψ)

where

x′ = x cos θ + y sin θ   and   y′ = −x sin θ + y cos θ
In this equation, λ represents the wavelength of the cosine factor, θ represents the
orientation of the normal to the parallel stripes of the Gabor function, ψ is the phase offset,
σ is the spread of the Gaussian envelope, and γ is the spatial aspect ratio, which specifies
the ellipticity of the support of the Gabor function.
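The even-symmetric form above can be sketched directly (the parameter values below are illustrative, not taken from the thesis):

```python
import numpy as np

def even_gabor(size, lam, theta, psi, sigma, gamma):
    """Even-symmetric 2D Gabor filter: a cosine wave of wavelength `lam` and
    orientation `theta`, modulated by a Gaussian of spread `sigma` and aspect
    ratio `gamma`, with phase offset `psi`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xp = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinate x'
    yp = -x * np.sin(theta) + y * np.cos(theta)   # rotated coordinate y'
    gauss = np.exp(-(xp ** 2 + gamma ** 2 * yp ** 2) / (2 * sigma ** 2))
    return gauss * np.cos(2 * np.pi * xp / lam + psi)

g = even_gabor(size=9, lam=4.0, theta=0.0, psi=0.0, sigma=2.0, gamma=1.0)
```

Convolving the fingerprint image with such a kernel, tuned to the local ridge orientation and frequency, retains ridge structure while suppressing noise.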
7.3.1.2 Region of Interest (ROI) Selection
i) Binarization: Nearly all minutiae extraction algorithms function on binary images
where there are only two levels of interest: the black pixels that denote ridges, and the
white pixels that denote valleys. Binarization is the process that translates a grey level
image into a binary image. This improves or enhances the contrast between the ridges
and valleys in a fingerprint image, and consequently makes it possible for effectual
extraction of minutiae points.
One practical property of the Gabor filter is that it has a DC component of zero, which
means the resultant filtered image has a mean pixel value of zero. Hence, straightforward
binarization of the image can be achieved using a global threshold of zero. The
binarization process involves analyzing the grey-level value of each pixel in the enhanced
image, and, if the value is greater than the global threshold, then the pixel value is set to a
binary value one; otherwise, it is set to zero. The result is a binary image holding two
levels of information, the foreground ridges and the background valleys.
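Given the zero-mean property of the Gabor-filtered image, the binarization step reduces to a one-line comparison (sketch):

```python
import numpy as np

def binarize(filtered):
    """Global zero-threshold binarization of a zero-mean filtered image:
    pixels above the threshold become 1 (ridges), the rest 0 (valleys)."""
    return (filtered > 0.0).astype(np.uint8)

ridge_map = binarize(np.array([[-0.5, 0.2], [0.0, 1.3]]))
```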
ii) Adaptive Thresholding: The adaptive thresholding method [157] is based on the
analysis of statistical parameters, including the arithmetic mean, geometric mean
and standard deviation of the sub-band coefficients. The local adaptive thresholding scheme
has been the one most commonly used by researchers, due to the fact that it binarizes and
improves poor-quality images for locating the meaningful textual information [158].
iii) ROI extraction by morphological operations: For ROI extraction from the binary
fingerprint image, we apply the morphological opening and closing operations on the
greyscale or binary image, using a structuring element. The structuring element is a
single structuring-element object, as opposed to an array of objects, for both open and
close. As a result, the morphological operators throw away the leftmost,
rightmost, uppermost and bottommost blocks outside the bound, so as to get the tightly
bounded region containing just the bound and inner area.
7.3.1.3 Minutiae Extraction
The last image enhancement step normally performed is thinning. Thinning is a
morphological operation that successively erodes away the foreground pixels until they
are one pixel wide. Ridge thinning eliminates the redundant pixels of the ridges till the
ridges are just one pixel wide. Our approach uses the ridge thinning algorithm of [153] for
minutiae point extraction. The image is divided into two distinct
subfields in a checkerboard pattern. In the first sub-iteration, pixel p is deleted from the first
subfield if and only if the conditions G1, G2, and G3 (defined below) are all satisfied. In
the second sub-iteration, pixel p is deleted from the second subfield if and only if the
conditions G1, G2, and G3' (defined below) are all satisfied.
Condition G1:   X_H(p) = 1

where   X_H(p) = Σ_{i=1..4} b_i,   with

b_i = 1 if x_{2i−1} = 0 and (x_{2i} = 1 or x_{2i+1} = 1);   b_i = 0 otherwise,

and x_1, x_2, ..., x_8 are the values of the eight neighbours of p, starting with the east
neighbour and numbered in counter-clockwise order (indices taken cyclically, so x_9 = x_1).

Condition G2:   2 ≤ min{n_1(p), n_2(p)} ≤ 3

where

n_1(p) = Σ_{k=1..4} (x_{2k−1} ∨ x_{2k}),   n_2(p) = Σ_{k=1..4} (x_{2k} ∨ x_{2k+1})

Condition G3:   (x_2 ∨ x_3 ∨ x̄_8) ∧ x_1 = 0

Condition G3':   (x_6 ∨ x_7 ∨ x̄_4) ∧ x_5 = 0

Here ∨ denotes logical OR, ∧ logical AND, and x̄ the complement of x.
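The four conditions can be sketched as a predicate over a pixel's eight neighbours; `nb` lists x1..x8 starting at the east neighbour, counter-clockwise, with 1 marking a black (object) pixel:

```python
def ghall_conditions(nb):
    """Evaluate the thinning conditions G1, G2, G3, G3' for one pixel,
    given its eight neighbours nb = [x1..x8] (east first, counter-clockwise)."""
    x = [0] + list(nb)                   # 1-based indexing

    def xi(i):                           # cyclic neighbour access: x9 == x1
        return x[(i - 1) % 8 + 1]

    # G1: crossing number X_H(p) == 1
    xh = sum(1 for i in range(1, 5)
             if xi(2 * i - 1) == 0 and (xi(2 * i) == 1 or xi(2 * i + 1) == 1))
    g1 = xh == 1
    # G2: 2 <= min(n1, n2) <= 3
    n1 = sum(xi(2 * k - 1) | xi(2 * k) for k in range(1, 5))
    n2 = sum(xi(2 * k) | xi(2 * k + 1) for k in range(1, 5))
    g2 = 2 <= min(n1, n2) <= 3
    # G3 (first sub-iteration) and G3' (second sub-iteration)
    g3 = ((xi(2) | xi(3) | (1 - xi(8))) & xi(1)) == 0
    g3p = ((xi(6) | xi(7) | (1 - xi(4))) & xi(5)) == 0
    return g1, g2, g3, g3p
```

For a boundary pixel whose western neighbours are black, G1, G2 and G3 hold while G3' does not, so the pixel is deleted only in the first sub-iteration; iterating the two sub-iterations over the checkerboard subfields until no pixel changes yields the one-pixel-wide skeleton.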
The steps involved in the generation of the Secured Feature Matrix are discussed in this
sub-section. We assume that the extracted minutiae point co-ordinates are maintained in a
vector {p_i}, i = 1, ..., N_p, with co-ordinates {x_i : p_i}. From these an initial key
vector V_K = {k_i}, i = 1, ..., L_k is derived. The initial key vector V_K is then converted
into a matrix BK_m, and the Secured Feature Matrix is obtained by AES encryption:

SF_m = E_vk(BK_m)

The key used in the AES encryption is the key generated by the whole process. Once the
key is generated, AES encryption is applied further to generate the Secured Feature
Matrix; initially, however, the encryption step is not part of the key generation process.
The key is generated as follows. First of all, the Secured Feature Matrix is decrypted by
AES decryption to form the deciphered matrix. The resultant matrix BK_m is given as

BK_m = D_vk(SF_m)

where BK_m = [a_ij], i, j = 1, ..., sqrt(L_k). The key elements are {K_i : p_k},
i = 1, ..., L_k, where p_k ∈ SM_ij, with SM_ij ⊂ BK_m(i, j) : i ≤ size, j ≤ size,
1 ≤ i ≤ sqrt(L_k); SM_ij is an extracted matrix formed from the key matrix. Then the final
key vector FBK_v is formed element-wise from the extracted values, taking the value
0 otherwise.
The extracted final key vector is highly secure and non-reversible; that is, the
final key cannot be traced back from the template. This irreversibility property makes the
key very hard to break, because it is derived through the minutiae points and the Secured
Feature Matrix.
templates with added security and 3) Cryptographic key generation from the Secured
cancellable template.
The resultant cryptographic key thus generated is irrevocable and unique to a particular
cancellable template, making the generation of new cancellable templates and
cryptographic keys feasible. The experimental results portray the effectiveness of the
cancellable template and the cryptographic key generated. The various steps and
techniques used in the proposed approach are detailed in this section.
In this proposed approach also, the process of extracting the minutiae points from the
fingerprint is composed of three processing steps namely,
1) Pre-processing
2) Region of Interest (ROI) selection
3) Minutiae extraction
Histogram Equalization [154] and Wiener filters [155] have been made use of to achieve
image enhancement in fingerprint images. Subsequently, the locally adaptive threshold
method [151] is applied to perform binarization on the fingerprint image. Morphological
operations [151, 152] are then utilized to extract the Region of Interest (ROI) from the
fingerprint image. Eventually, minutiae points are extracted using the Ridge Thinning
algorithm [153].
7.4.1.1 Pre-processing
The first level of pre-processing is the same as that discussed in section 7.3.1.1, except
that in this method we have used a Wiener filter instead of a Gabor filter.
i) Histogram equalization: This is the same as discussed in section 7.3.1.1.
ii) Wiener Filtering: The Wiener filter can be defined as the Mean Squared Error (MSE)-optimal stationary linear filter for images degraded by additive noise and blurring. In order to perform Wiener filtering [156], we assume that the signal and noise processes are second-order stationary (in the random-process sense). Generally, Wiener filters are applied in the frequency domain. Under this stationarity assumption, the coefficients of the Wiener filter are computed so as to minimize the average squared distance between the filter output and the desired signal [155]. This can be accomplished with ease in the frequency domain. As a consequence, we get the frequency-domain equation:
    S(f) = W(f) Y(f)

where S(f) is the Wiener filter output, Y(f) is the Wiener filter input, and W(f) is the Wiener filter coefficient, given by

    W(f) = P_DY(f) / P_YY(f)

in which P_DY(f) is the cross-power spectrum of the desired signal and the input, and P_YY(f) is the power spectrum of the input.
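The frequency-domain relation above can be sketched as follows. This is a hedged 1-D illustration, assuming additive noise that is independent of the signal (so that P_DY = P_SS and P_YY = P_SS + P_NN); the signal and the noise spectrum are synthetic stand-ins, not the thesis's image pipeline, and in practice the spectra would be estimated rather than taken from an oracle:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D example: a clean signal s corrupted by additive noise.
n = 1024
t = np.arange(n)
s = np.sin(2 * np.pi * 5 * t / n) + 0.5 * np.sin(2 * np.pi * 12 * t / n)
noise = rng.normal(0.0, 0.4, n)
y = s + noise                                  # Wiener filter input

S, Y, N = np.fft.fft(s), np.fft.fft(y), np.fft.fft(noise)

# For additive noise independent of the signal, P_DY = P_SS and
# P_YY = P_SS + P_NN, so W(f) = P_DY(f) / P_YY(f) reduces to the form below.
P_ss = np.abs(S) ** 2
P_nn = np.abs(N) ** 2
W = P_ss / (P_ss + P_nn + 1e-12)               # 0 <= W(f) <= 1

s_hat = np.real(np.fft.ifft(W * Y))            # S(f) = W(f) Y(f), back in time domain

mse_before = np.mean((y - s) ** 2)
mse_after = np.mean((s_hat - s) ** 2)
print(mse_after < mse_before)                  # filtering reduces the MSE
```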
The last level of minutiae extraction is similar to the one discussed in section 7.3.1.3.
In this sub-section, we have presented the steps involved in the generation of the secured
cancellable template from the extracted minutiae points. The steps involved are as
follows:
The extracted minutiae points form the set P and their corresponding (x, y) co-ordinates form the set M_p, as represented below:

    P   = [P_1, P_2, P_3, ..., P_n]
    M_p = [x_1 y_1, x_2 y_2, ..., x_n y_n]

A random vector R_N is then generated:

    R_N = [r_1, r_2, r_3, ..., r_n]; where n = |M_p|, r_i = random(), 1 <= i <= n
Then, exponential values are computed for each individual element of the vector R_N and stored in ER_N:
    ER_N = [e^{r_1}, e^{r_2}, ..., e^{r_n}]
For every element in ER_N, choose a set of x subsequent (very large) prime numbers to form a row of the matrix P_N. Every row of the matrix P_N will have a distinct number of elements; the number of elements x in a row is equal to the corresponding coordinate value in M_p.
    P_N = [ (P_1, P_2, P_3, ..., P_{x_1})
            (P_1, P_2, P_3, ..., P_{y_1})
            ...
            (P_1, P_2, P_3, ..., P_{x_n})
            (P_1, P_2, P_3, ..., P_{y_n}) ]

where each row contains as many consecutive primes as the value of the corresponding coordinate.
Subsequently, a prime number pair is selected randomly from two succeeding rows of P_N, with one prime from each row, and the pair is multiplied to obtain the transformed point TP. The transformed points are stored in a vector PFV:

    PFV = [TP_1, TP_2, ...], where TP = P_l * P_m
Since each transformed point TP is formed by the multiplication of two large prime numbers P_l and P_m, it is computationally infeasible to determine the factors P_l and P_m from TP. A distance vector DV is then computed from the squared differences of the transformed points:

    DV = [d_1, d_2, d_3, ..., d_{kv}], where d = (TP_i - TP_j)^2
The vector DV is then transformed into a matrix to form the cancellable template TM .
    TM = [DV_ij], a matrix of size kv × kv
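The template-construction steps above can be sketched end to end. This is a hedged toy sketch: the minutiae list, the prime base `BASE`, and the helper functions are all hypothetical, and the primes are kept small so the demo runs quickly, whereas the text calls for very large primes:

```python
import math
import random

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    return all(n % p for p in range(2, math.isqrt(n) + 1))

def next_primes(start: int, count: int) -> list:
    """Return `count` consecutive primes at or above `start`."""
    primes, n = [], start
    while len(primes) < count:
        if is_prime(n):
            primes.append(n)
        n += 1
    return primes

random.seed(42)

# Hypothetical minutiae co-ordinates standing in for a real extraction.
minutiae = [(23, 41), (57, 12), (34, 66), (18, 29)]
coords = [c for pt in minutiae for c in pt]        # flattened M_p

# ER_N: exponentials of random values, here used to offset the prime search
# (a toy reading of the ER_N step in the text).
ER_N = [math.exp(random.uniform(0, 5)) for _ in coords]

# P_N: one row of consecutive primes per co-ordinate; the row length equals
# the co-ordinate value.
BASE = 10_000
P_N = [next_primes(BASE + int(e), c) for e, c in zip(ER_N, coords)]

# Transformed points: multiply one prime from each of two succeeding rows;
# recovering Pl and Pm from TP = Pl * Pm is a factoring problem.
PFV = [random.choice(P_N[i]) * random.choice(P_N[i + 1]) for i in range(len(P_N) - 1)]

# Distance vector from squared differences of succeeding transformed points,
# reshaped into a kv x kv cancellable template TM (leading kv*kv entries).
DV = [(PFV[i] - PFV[i + 1]) ** 2 for i in range(len(PFV) - 1)]
kv = math.isqrt(len(DV))
TM = [DV[r * kv:(r + 1) * kv] for r in range(kv)]
print(len(PFV), kv)   # → 7 2
```

With 10,000-range primes the products are trivially factorable; the non-invertibility argued in the text rests on using primes large enough that factoring TP is infeasible.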
Henceforth, the cancellable template serves as the source for the generation of the cryptographic key. This necessitates the secure storage of the cancellable template, such that it is neither modifiable by, nor accessible to, anyone other than authorized users. Hence, the resultant cancellable template TM is encrypted with the AES algorithm to form the encrypted cancellable template CTM, i.e.,

    CTM = Enc[TM]
The generated cancellable template TM is irreversible; moreover, the security of the created cancellable template is reinforced by the strength of AES.
The steps involved in the generation of the cryptographic key from the secured
cancellable template are as follows: Initially, the encrypted cancellable template is
decrypted with the AES Decryption algorithm to obtain the cancellable template TM , i.e.,
    TM = Dec(CTM)

From TM, 4 × 4 sub-matrices T_ij (i, j <= size; 1 <= i, j <= n) are extracted, and the value set (v_i : P(v)), i = 1, ..., n, is computed from them to form the intermediate vector I_k.
Based on the values in I_k and the threshold, the individual values of the final key vector FK_v are computed. The vector FK_v is created using the following equation:

    FK_v(i) = 1, if I_k(i) > mean(I_k)
    FK_v(i) = 0, else
The final key FK_v generated is also irrevocable and complex, consisting of 256 bits. The irreversible property makes the key almost unbreakable, because it is computationally very hard to recover the cancellable template from the final cryptographic key FK_v.
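The mean-threshold rule for FK_v can be sketched in a few lines; the vector `I_k` below is a synthetic stand-in for the values actually derived from the cancellable template:

```python
import random

random.seed(7)

# Hypothetical intermediate vector I_k (256 entries -> a 256-bit key);
# in the text these values are derived from the cancellable template.
I_k = [random.uniform(0.0, 1000.0) for _ in range(256)]

mean_Ik = sum(I_k) / len(I_k)

# FK_v(i) = 1 if I_k(i) > mean(I_k), else 0 -- the thresholding rule above.
FK_v = [1 if v > mean_Ik else 0 for v in I_k]

# Pack the bit vector into a 64-hex-digit (256-bit) key string.
key_int = int("".join(map(str, FK_v)), 2)
key_hex = f"{key_int:064x}"
print(len(FK_v), len(key_hex))   # → 256 64
```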
Experimental results and security analysis of the two proposed methods will be discussed in the next chapter (Chapter 8).
7.5 SUMMARY
Our first proposed method, Biometrics-based Key Generation, will be shown to perform better than traditional systems in the usability domain. Precisely, it is not possible for a person to lose his/her biometrics, and the biometric signal is hard to falsify or steal. The proposed cancellable biometric crypto system is a new authentication technique that yields synergistic control over biometrics. The proposed system employs intentional distortion of the fingerprint in a repeatable fashion, and the fingerprint thus obtained is utilized in cryptographic key generation. If the old transformed fingerprint template is stolen, a new fingerprint template can be obtained simply by altering the parameters of the distortion process. This also enhances the privacy of the user, as his true fingerprint is never used. A notable reduction in the time consumed is attained by eliminating steps that become redundant when the proposed methodology is integrated with existing ones.
Integration of the proposed technique with the existing cryptographic methodologies is
uncomplicated and it simplifies key-generation and key-release issues in a remarkable
manner. This methodology can be further made efficient and sophisticated with the
combination of any of the evolving cryptographic systems.
Our second proposed method, Bio-Crypto Key Generation, also outperforms traditional cryptographic systems, mainly because it is impossible for a person to lose his/her biometrics, and biometrics are hard to falsify or steal. We have also presented an efficient approach for the generation of irrevocable cryptographic keys from fingerprint biometrics using cancellable biometric templates.
Each of the two methods is composed of three phases namely: 1) Minutiae points
extraction from the fingerprint image, 2) Cancelable template generation with added
security and 3) Cryptographic key generation from Secured Cancelable template.
However, in the first of these methods, Gabor filters are used, whereas in the second method Wiener filters are used. The resultant cryptographic key thus generated is
irrevocable and unique to a specific cancelable template, availing better protection and
replacement features for lost or stolen biometrics. The experimental results, to be
discussed in later chapters have portrayed the effectiveness of the proposed method in
generating an irrevocable cryptographic key.
CHAPTER 8
EXPERIMENTAL RESULTS AND ANALYSIS
"It is quite possible to work without results, but never will there be results without
work." Sunil V. K. Gaddam
8.1 INTRODUCTION
This chapter describes the experimental results and the performance analysis of the proposed methods vis-à-vis the previous methods. This chapter is significant in the sense that it concludes the research in terms of the effectiveness and advantages of the proposed methods over the previous ones. The results and discussion are carried out using well-known fingerprint databases to clearly evaluate the performance of the methods. In addition, well-accepted evaluation metrics are employed here to enable direct comparison with the previous methods.
An ROC graph is plotted between FNMR and FMR, the two chief metrics used in this experimentation. From the graphs, we draw conclusions about the relative effectiveness of the approach among the various methods used for computing the secure cryptographic key. Furthermore, since the idea of cancellable biometrics is the main concern of the proposed research, the effectiveness is additionally analyzed in terms of non-invertibility. Thus, the non-invertibility in securing the biometric key is extensively analyzed with the help of the different transformations carried out within the approaches, and conclusions are drawn from this extensive analysis.
FVC 2002 contains four different databases (DB1, DB2, DB3 and DB4), which were
collected by the following sensors/technologies (given in table 8.1): DB1: optical sensor
"TouchView II" by Identix, DB2: optical sensor "FX2000" by Biometrika, DB3:
capacitive sensor "100 SC" by Precise Biometrics, and DB4: synthetic fingerprint
generation. Here, each database is 110 fingers wide (w) and 8 impressions per finger deep
(d) so, in total, it contains 880 fingerprints. These fingerprints are stored in two sets, set A and set B. Set A contains fingers 1 to 100, while set B is representative of the whole database: the 110 collected fingers were ordered by quality, and the 8 images from every tenth finger were included in set B, numbered 101 to 110.
Here, we make use of the set B from each dataset to conduct an experimental study. The
ultimate aim is to directly compare the performance of the proposed approaches with the
previous methods over these well-accepted fingerprint databases. The sample of
fingerprint images taken from the four fingerprint databases is shown in figure 8.1.
Figure 8.1: Sample fingerprint images taken from the four fingerprint databases: (a) DB1, (b) DB2, (c) DB3, (d) DB4
Table 8.1: Sensors and image characteristics of the FVC 2002 databases

    Database | Sensor Type                                         | Set Size (w × d) | Resolution
    DB1      | Optical sensor ("TouchView II" by Identix)          | 10 × 8           | 500 dpi
    DB2      | Optical sensor ("FX2000" by Biometrika)             | 10 × 8           | 569 dpi
    DB3      | Capacitive sensor ("100 SC" by Precise Biometrics)  | 10 × 8           | 500 dpi
    DB4      | Synthetic generation (SFinGe v2.51)                 | 10 × 8           | 500 dpi

(The set sizes above are for set B; set A of each database is 100 × 8.)
Each fingerprint F_ij (i = 1, 2, ..., 10; j = 1, ..., 8) is enrolled, with the rejection count REJ_ENROLL initialized to 0. In general, three types of rejection may happen for each fingerprint F_ij; these rejections are summed and their total is stored in REJ_ENROLL: (1) F (Fail): the algorithm cannot complete the enrollment, (2) T (Timeout): the enrollment exceeds the maximum allowed time, and (3) C (Crash): the algorithm crashes during enrollment.
Again, each fingerprint key K_1i, i = 1, 2, ..., 10, is matched against the first fingerprint image from the other fingers F_1k (i < k <= 10), and the corresponding Impostor Matching Score (ims) is computed. The number of matches (denoted as NIRA - Number of Impostor Recognition Attempts) is (10 × 9)/2 = 45, and FMR(t) is computed from these impostor matching scores.
Furthermore, the FMR(t) (False Match Rate) and FNMR(t) (False Non-Match Rate) are calculated from the above distributions for t ranging from 0 to 1. Then, the ROC curve of FMR vs. FNMR is plotted for varying threshold t. The plotted ROC curve is extensively used in the contest to compare the performance of different algorithms. One more parameter used for comparison is the Equal Error Rate (EER), computed as the operating point where FNMR(t) = FMR(t).
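The FMR/FNMR/EER computation just described can be sketched as follows; the genuine and impostor score lists are synthetic stand-ins for the gms/ims values produced by a real matcher, and the EER is taken at the grid threshold where the two error rates are closest, since FNMR(t) = FMR(t) rarely holds exactly on a grid:

```python
# Synthetic matching scores (higher = better match).
genuine = [0.9, 0.8, 0.75, 0.7, 0.55, 0.45, 0.4, 0.3]    # gms
impostor = [0.6, 0.5, 0.42, 0.35, 0.3, 0.2, 0.15, 0.05]  # ims

def fmr(t, ims):   # fraction of impostor scores accepted at threshold t
    return sum(s >= t for s in ims) / len(ims)

def fnmr(t, gms):  # fraction of genuine scores rejected at threshold t
    return sum(s < t for s in gms) / len(gms)

# Sweep t over [0, 1] and pick the threshold where |FMR - FNMR| is smallest.
thresholds = [i / 1000 for i in range(1001)]
t_eer = min(thresholds, key=lambda t: abs(fmr(t, impostor) - fnmr(t, genuine)))
eer = (fmr(t_eer, impostor) + fnmr(t_eer, genuine)) / 2
print(t_eer, eer)
```

For these toy scores both error rates equal 0.25 at the crossing, so the EER is 0.25; with real matcher output the same sweep also yields the points of the ROC curve.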
The experimental results of the proposed scheme for efficient cancellable biometric key generation [160] are presented in this sub-section. The proposed approach is implemented in Matlab (Matlab 7.10). At first, the fingerprint images are pre-processed using histogram equalization and Gabor filtering, which enhance the fingerprint images so that the minutiae points can be extracted easily. Subsequently, binarization is applied on the
enhanced fingerprint images and then, the region of interest is determined. After that, the
minutiae points are extracted after applying the ridge thinning algorithm. Based on the
co-ordinates of the minutiae points, the secured feature matrix is computed and, eventually, the 256-bit key is generated from the secured feature matrix. The intermediate results of the proposed scheme for the sample images from DB1 to DB4 are depicted in figures 8.2 to 8.5 respectively.
8.5.2 Experimental Results of the Proposed Efficient Approach for Cryptographic Key
Generation from Fingerprint (Bio-Crypto Key)
This section presents the experimental results of the proposed efficient approach for
cryptographic key generation from fingerprint [161]. The proposed approach is
programmed in Matlab (Matlab7.10). Initially, the fingerprint images obtained from the
FVC 2002 dataset are pre-processed using histogram equalization and Wiener filtering to
achieve image enhancement. Then, the Region of Interest is selected from the enhanced
fingerprint images through the use of binarization, Adaptive Thresholding and
morphological operations. Afterward, the location of the minutiae points are extracted
after applying the ridge thinning algorithm. Subsequently, Secured Cancellable Template
is computed based on the co-ordinates of minutiae points and the 256- bit key is
generated from the secured Cancellable Template. The intermediate results of the
proposed approach for the sample images from DB1 to DB4 are clearly depicted in the
figure 8.6 to figure 8.9 respectively.
Figure 8.6: Intermediate results of Bio-Crypto Key Generation Approach from DB1
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae
points (g) Generated 256-bit key
Figure 8.7: Intermediate results of Bio-Crypto Key Generation Approach from DB2
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key
Figure 8.8: Intermediate results of Bio-Crypto Key Generation Approach from DB3
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key
Figure 8.9: Intermediate results of Bio-Crypto Key Generation Approach from DB4
(a) Input Fingerprint Image (b) Histogram equalized image (c) Wiener Filtered Image
(d) Region of Interest (ROI) (e) thinned image (f) Fingerprint Image with minutiae points
(g) Generated 256-bit key
8.5.3 Experimental Results of the Method Introduced by Nalini K. Ratha et al.
This section presents the experimental results of the method proposed by Nalini K. Ratha
et al. [54] to generate cancellable fingerprint template. This approach has been
implemented using Matlab (Matlab7.10). Initially, the minutiae points are extracted from
the fingerprint template and the new transformation key is generated from the minutiae
points. The first important step is the process of registering the image. Then, the minutiae
positions and the orientations of the singular points (core and delta) are measured with respect to the same coordinate system, and the minutiae are expressed by positions and angles with respect to these points. In the Cartesian transformation, the minutiae positions are
measured in rectangular coordinates with reference to the position of the singular point.
In the polar transformation, the minutiae positions are measured in polar coordinates with
reference to the core position and the angles that are measured with reference to the
orientation of the core. Figure 8.10 to figure 8.13 represent the intermediate results of this
method for the sample images from DB1 to DB4.
This section presents the experimental results of the method introduced by S. Tulyakov et al. [143] to secure and personalize the hash for the fingerprint data. This approach has
been implemented using Matlab (Matlab7.10). At first, minutiae features are extracted
from the fingerprint images obtained from an online scanner. For each minutiae point,
nearest neighbour is identified to constitute minutiae subsets and the hashes of the
minutiae subsets are obtained using Symmetric hash functions. The computed hash
values for the minutiae subsets are stored in the database. During verification, new hash
values are generated and are matched with those stored in the database. Figure 8.14 to
figure 8.17 indicate the intermediate results of this approach for the sample images from
DB1 to DB4.
In the same way, the verification task is performed on the other databases, DB2, DB3 and DB4. The corresponding graphs, plotted from the values obtained after matching the features of the fingerprint images, are shown in figures 8.19, 8.20 and 8.21. Analyzing the graph plotted for DB2, the FMR falls to zero once the threshold rises above 0.6. At the same time, the FNMR increases significantly from its initial value and stabilizes once the threshold reaches 0.6. For DB3 and DB4, the corresponding threshold at which the FMR falls to zero is 0.7. Furthermore, the equal error rate of the proposed algorithm is computed for all the databases. The corresponding values obtained from the graphs are EER = 0.5 (DB2), EER = 0.6 (DB3) and EER = 0.6 (DB4).
8.6.2 Performance Analysis of the Proposed Efficient Approach for Cryptographic Key
Generation from Fingerprint (algorithm 2)
This section presents the performance analysis of the proposed efficient cancellable
biometric key generation scheme [161].
Initially, the features are extracted from the fingerprint images using the proposed algorithm, and the matching process is carried out by varying the threshold values. For different thresholds, the FMR and FNMR are computed from the genuine and impostor matching scores obtained after matching. The graphs plotted for the corresponding values of the different databases are shown in the following figures (8.22 to 8.25), which provide the EER for all the databases. The values obtained are EER = 0.542 (DB1), EER = 0.45 (DB2), EER = 0.6 (DB3) and EER = 0.55 (DB4).
The recognition performance of the approach proposed by S. Tulyakov et al. [143] is discussed in this sub-section. By applying the procedure, the cancellable template is constructed and then used to generate the key vector for the fingerprint images in the fingerprint database. Then, matching against the genuine fingerprint and impostor fingerprints is carried out to find the FMR and FNMR of the approach in fingerprint recognition. Graphs are drawn from the values obtained to assess the efficiency of the approach on the different databases. From the graphs plotted (shown in figures 8.30 to 8.33), the equal error rate of the approach on the various databases is found, to assess the accuracy of fingerprint recognition. The values obtained are EER = 0.5 (DB1), EER = 0.6 (DB2), EER = 0.5 (DB3) and EER = 0.6 (DB4).
The system becomes more tolerant of input variations and noise when the threshold t is decreased, but FMR(t) then increases. On the other hand, if t is raised to make the system more protected, then FNMR(t) increases. Therefore, reporting system performance at all operating points (threshold t) is more desirable. This can be achieved by plotting a Receiver Operating Characteristic (ROC) curve. The ROC curve is a plot of FMR(t) against (1 - FNMR(t)) for diverse decision thresholds t.
8.7.1 Comparison of the Proposed Methods with the Previous Approaches Over FVC 2002 Database 1 (DB1)
The performance of the proposed approaches is extensively compared with the previous
approaches using ROC curve. For DB1, the ROC curve of the different approaches is
plotted in log-log scale. The ROC curve plotted for the DB1 is given in the figure 8.34.
The graph clearly shows that the proposed approaches have a lower FNMR, which signifies the better security of the proposed system. Compared with the previous approaches, the two proposed approaches provide better security against impostor attacks due to their lower FNMR. On the other hand, the previous two approaches provide a lower FMR than the proposed approaches. Hence, although the secure system is slightly worse in terms of FMR near the equal-error point, it is significantly better in terms of FNMR.
8.7.2 Comparison of the Proposed Methods with the Previous Approaches Over FVC 2002 Database 2 (DB2)
This section presents the comparative analysis of the proposed approaches with the
previous approaches in DB2. The approaches are tested on DB2 and the ROC curve is
plotted. From the graph (shown in figure 8.35), all the algorithms provide almost identical results, except the approach proposed by S. Tulyakov et al., which keeps a lower FMR. On the other hand, the proposed algorithms proved to be very accurate and exhibited a good trade-off between FMR and FNMR. This shows that the security of the proposed approaches is good compared with the previous approaches.
8.7.4 Comparison of the Proposed Methods with the Previous Approaches Over FVC 2002 Database 4 (DB4)
The comparison of the approaches requires analysis of the Receiver Operating Characteristic (ROC) curve, which can be developed by varying the threshold value between 0 and 1. Figure 8.37 shows the performance of the two proposed approaches in the verification task. From the figure, a wider range of threshold values yields better performance, as a large range of operating points t with zero errors can be obtained. When the threshold is fixed within a suitable range, the two proposed approaches perform well in the verification task. In addition, FNMR and FMR can be improved further by securely recognizing genuine users and correctly rejecting impostor attacks. Compared with the previous two approaches, the proposed approaches are more appropriate for providing security against the anticipated attacks.
8.8 SUMMARY
This chapter presented the experimental results and the analysis of the proposed methods in comparison with the previous methods. The experimental analyses were carried out using the well-known fingerprint databases in order to clearly evaluate the performance of the approaches. Subsequently, the ROC graph was plotted between FNMR and FMR to signify the relative effectiveness of the approach among the various methods. Finally, this chapter concludes that the recognition performance is significantly improved by using the proposed methods.
CHAPTER 9
SECURITY ANALYSIS
Systems methods will neither be trustworthy nor successful unless the general
research regarding systems methodology incorporates security analysis design as an
explicit objective. Richard Baskerville
9.1 INTRODUCTION
The main consideration of the research work is to ensure security against impostor attacks. Hence, to establish the security, the analysis should be carried out with respect to non-invertibility and the transformation used. For this purpose, a number of transformation functions have been developed for building revocable or non-invertible biometric templates. In the following sub-sections, we discuss the different transformations used by the methods to provide security in fingerprint matching, together with the security analysis.
column of the matrix. Therefore, the complete information content has an upper bound of … bits. Studying the approximate strength of the transformation process, each resultant cell after the transformation could have originated from … possible source cells. Therefore, a brute-force attack would have to attempt nearly … possibilities, corresponding to … bits.
… bits of data, where … is the number of unique directions. For an effective brute-force attack, note that an average fingerprint has about 35-40 minutiae, which corresponds to … bits.
that forms an algebraic basis in the set of polynomial symmetric functions, which basically takes a random algebraic basis of symmetric polynomials of degree less than or equal to … Then, the hash functions of the transformed minutiae are obtained. Accordingly, hashes of transformed minutiae can be conveyed using the original hashes. These equations permit matching localized minutiae sets and obtaining the equivalent transformation attributes.
9.4 SECURITY ANALYSIS OF THE PROPOSED FIRST METHOD (CANCELABLE BIOMETRIC KEY GENERATION SCHEME FOR CRYPTOGRAPHY)
The main aim of the cancellable transformation is to provide cancellability through a "non-invertible" transform. Usually, this lowers the discriminative power of the original template. Thus, the cancellable templates and the secure templates of an individual will differ across applications. In the proposed algorithm [160], security is further strengthened by AES encryption. Once the template is formed and the minutiae points are acquired, a feature matrix is generated through a sequence of steps. The feature matrix is then encrypted using AES. Reinforced by AES, the feasibility of decrypting the ciphered feature matrix is almost negligible. Even in a worst-case scenario in which a hacker succeeds in decrypting the AES-encrypted data with the intention of obtaining the feature matrix, the chances of reconstructing the minutiae points and the templates are almost nil. Furthermore, there is no possibility of guessing the steps used to generate the feature matrix, and there is little chance of reconstructing the template by any means. The key thus formed cannot be traced back to its origin, i.e. to the template, and moreover the key itself cannot be falsely regenerated from the template. This irreversible aspect makes the key armoured, reliable and even resistant to brute-force attacks. This shatter-proof property emanates from preserving the confidentiality of the battery of operations that transform minutiae points into a feature matrix.
9.5 SECURITY ANALYSIS OF THE PROPOSED SECOND METHOD (GENERATION OF CRYPTOGRAPHIC KEY FROM FINGERPRINTS USING CANCELABLE TEMPLATES)
In the second Bio-Crypto Key Generation Algorithm [161], an efficient approach for irrevocable cryptographic key generation is designed using a secured cancellable template obtained from fingerprint biometrics. Here, the security of the proposed cancellable template generation is enhanced by utilizing the well-known RSA factorization concepts along with exponentiation. The advantage of the RSA-style construction is that the template points are formed as products of two distinct large prime numbers derived from the minutiae point set, and it is computationally infeasible to determine the prime factors of such a product. Moreover, the security is further enhanced by computing the exponential values of the prime factors. The complexity of inverting this construction is exponential, and hence inversion is computationally infeasible. Henceforth, the cancellable template, even though irrevocable, serves as the source for the generation of the cryptographic key. The resultant cryptographic key thus generated is irrevocable and unique to a specific cancellable template, providing better protection and replacement features for lost or stolen biometrics.
9.6 SUMMARY
This chapter discussed the security analysis of the proposed methods along with the existing methods. In order to signify the effectiveness of the proposed methods, the security analysis was carried out in terms of the different transformations utilized in the various methods, so as to establish non-invertibility. Finally, conclusions were drawn from the extensive analysis regarding security against impostor attacks.
CHAPTER 10
CONCLUSION AND SCOPE FOR FUTURE WORK
I am turned into a sort of machine for observing facts and grinding out
conclusions. Charles Darwin
"Research is the process of going up alleys to see if they are blind."
Marston Bates
10.1 CONCLUSION
Then, comparison is carried out using the ROC curve, which signifies the efficiency of the proposed methods. The ROC curve also provides a good view of the trade-off between accuracy and efficiency. In addition, the EER obtained by the proposed approaches is significantly lower than those of the previous approaches over the four fingerprint databases. Finally, the security of the approaches is extensively discussed in terms of non-invertibility and the binding of the bio-crypto key, in ensuring security preservation.
This work opens up new avenues for future work. This research can be extended in
various directions and some of these are summarized below:
- Since the proposed approach achieved a good EER, the present work can be extended further for improving the accuracy of biometric-based security systems by designing more reliable matching strategies.
- Even though the minutiae points are extracted efficiently, a further extension can employ feature extraction methods suited to noisy fingerprint images.
- This methodology can be made more efficient and sophisticated by combining it with some of the evolving cryptographic systems.
- The proposed work can be improved by stabilizing the bio-crypto key via error correction methods.
GLOSSARY
TECHNICAL TERMINOLOGY
Acceptability: It is about how readily individuals adopt a biometric system, or how intrusive an individual feels the system is, based on the trait in question.
Access Control & Availability: Ensuring that authorized users have access to information
and associated assets when required i.e., services must be accessible and available to
intended users.
Accessibility: It measures how easy the particular biometric trait is to get to and measure.
Foot geometry, for example, would not be very accessible since individuals would have
to remove their shoes first.
Active Attacks: Active attacks involve attempts on security leading to deletion,
modification, insertion, redirection, blockage or destruction of data, device or links.
Adaptive Thresholding: Thresholding is called adaptive when a different threshold is used for different regions of the image. This is also known as local or dynamic thresholding.
Arch: a ridge that runs across the fingertip and curves up in the middle. Tented arches
have a spiked effect.
Authentication: It is the process of verifying the claimed identity of a user. i.e., sender
and receiver want to confirm the identity of each other.
Authorisation: authorizing access to resources.
Authorization Violation: An entity uses a service or resource it is not intended to use.
Availability: It ascertains how many different/unique, independent samples the system
could potentially acquire from an individual.
Forgery of Information: An entity creates new information in the name of another entity.
Gabor filter is a linear filter whose impulse response is defined by a harmonic function
multiplied by a Gaussian function. Because of the multiplication-convolution property,
the Fourier transform of a Gabor filter's impulse response is the convolution of the
Fourier transform of the harmonic function and the Fourier transform of the Gaussian
function. Gabor filters have the ability to perform multi-resolution decomposition due to their localization in both the spatial and spatial-frequency domains. They are used for feature extraction in many machine vision applications.
Hacker: a person specialized in a topic and enjoys exploring it for the sake of learning
and overcoming barriers. Applied to IT, the term refers to a person whose ability to
understand computer systems, their design and programming, allows him/her to master
the systems for a particular use.
Histogram Equalization is a technique frequently used in Image Processing in order to
improve the image contrast and brightness and to optimize the dynamic range of the
greyscale.
Identification and Authentication is the process of verifying the identity of a user
through the use of specific credentials (e.g., passwords, tokens, biometrics), as a
prerequisite for granting access to resources in a network system.
Identification is a 1:N matching process, where the user's input is compared with the templates of all the persons enrolled in the database, and the identity of the person whose template has the highest degree of similarity with the user's input is returned by the biometric system. If the highest similarity between the input and all the templates is less than a fixed minimum threshold, the system rejects the input, which implies that the user presenting the input is not among the enrolled users.
Image Binarization is an important process for image analysis. The inherently bi-level
nature of the image has led to many of the image analysis algorithms being designed for
use on bi-level images.
Information Security is the quality or state of being secure, i.e., to be free from danger.
Revocability: straightforward revocation and re-issue are allowed in the event of compromise.
Ridge thinning is the process utilized to remove the redundant pixels until the ridges become one pixel wide.
Robustness: the propensity of the biometric to stay constant or unchangeable over time. It becomes important when the biometric trait can be physically changed, either intentionally or accidentally.
Sabotage: any action that aims to reduce the availability and/or correct functioning of
services or systems.
Scenario evaluations determine the performance of a complete biometric system in an
environment that models a real-world target application. These test results can only be
repeatable if the modelled scenario is controlled.
Security is freedom from risk or danger.
Security Attack is the actual realization of a threat, i.e., any action that compromises the
security of the system.
Security Goals include Confidentiality, Authentication, Data Integrity, Access Control &
Availability, and Non-repudiation. Security goals can be defined depending on the
application environment, or in a more general and technical way.
Security Mechanism is a process (algorithm, protocol or device) that is designed to
detect, prevent, or recover from a security attack.
Security Service is a service that enhances the security of data processing and
information transmission, and makes use of one or more security mechanisms.
Security Threat in a communication network is any possible event or sequence of actions
that might lead to a violation of one or more security goals.
Specific Security Mechanisms include encryption, digital signature, access controls, data
integrity, authentication exchange, traffic padding, routing control, and notarization;
these may be incorporated into the appropriate protocol layer of the system.
Thresholding is a non-linear operation that converts a grey-scale image into a binary
image where the two levels are assigned to pixels that are below or above the specified
threshold value.
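A minimal sketch of this operation on a flat list of 8-bit grey values (the function name and the convention for the boundary value are illustrative):

```python
def threshold(pixels, t=128):
    """Binarize grey values: pixels at or above t map to white (255),
    the rest to black (0)."""
    return [255 if p >= t else 0 for p in pixels]

print(threshold([12, 130, 128, 127, 250]))  # [0, 255, 255, 0, 255]
```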
Uniqueness: no two persons should be the same in terms of the characteristic.
Universality: every person should have the characteristic.
Verification is a 1:1 matching process, where the user claims an identity and the system
verifies whether the user is genuine. If the user's input and the template of the claimed
identity have a high degree of similarity, the claim is accepted as genuine; otherwise, the
claim is rejected and the user is considered an impostor.
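By contrast with the 1:N identification case, the 1:1 decision rule compares the input against the single template of the claimed identity; again, the bit-agreement similarity measure and the threshold are illustrative stand-ins for a real matcher:

```python
def verify(probe, claimed_template, threshold=0.8):
    """1:1 verification: accept the claimed identity only if the
    probe is similar enough to that single enrolled template."""
    score = sum(x == y for x, y in zip(probe, claimed_template)) / len(probe)
    return score >= threshold

enrolled = [1, 0, 1, 1, 0, 1, 0, 1]
print(verify([1, 0, 1, 1, 0, 1, 0, 0], enrolled))  # True  (0.875)
print(verify([0, 1, 0, 0, 1, 0, 1, 0], enrolled))  # False (0.0)
```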
White Pixel: a pixel whose value is 255 in an 8 bits/pixel greyscale image.
Whorl: an oval formation, often making a spiral pattern around a central point.
Wiener filter: the filter whose output, according to a least-squares test, is the one most
similar to a desired form of signal output. The purpose of the Wiener filter is to reduce
the amount of noise present in a signal by comparison with an estimation of the desired
noiseless signal. It is based on a statistical approach.
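For a single frequency component with known signal power S and noise power N (uncorrelated noise), the least-squares criterion leads to the attenuation factor S / (S + N); a small illustrative sketch:

```python
def wiener_gain(signal_power, noise_power):
    """Wiener attenuation factor for one frequency component:
    S / (S + N), derived from the least-squares (MMSE) criterion.
    Strong signal -> gain near 1; noise-dominated -> gain near 0."""
    return signal_power / (signal_power + noise_power)

print(round(wiener_gain(9.0, 1.0), 2))  # 0.9
print(round(wiener_gain(1.0, 9.0), 2))  # 0.1
```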
BIBLIOGRAPHY
[1]
[2]
[3]
[4]
[5]
[6]
[7]
[8]
[9]
[10] (n.d.) Material on Network Security, Module 8, Version 2 CSE IIT, Kharagpur,
http://nptel.iitm.ac.in/courses/Webcoursecontents/IIT%20Kharagpur/Computer%20networks/pdf/M8L1.pdf
[11] B. Miller, Vital signs of identity, IEEE Spectrum, 31(2):22-30, 1994.
[12] Jain, A., Bolle, R. M. and Pankanti, S., Introduction to Biometrics, in Biometrics:
Personal Identification in Networked Society, Kluwer Academic Publishers,
Boston/Dordrecht/London, ch. 1, pp. 1-41, 1999.
[13] John D. Woodward, Christopher Horn, Julius Gatune, Aryn Thomas, Biometrics:
A look at facial recognition, Documented briefing, RAND, 2003.
[14] Davis D., High-tech passports spark stiff competition, Card Technology, 8(4),
pp. 22-24, April 2003.
[15] Wayman J.L., Alyea L., Picking the Best Biometric for Your Applications,
National Biometric Test Centre Collected Works, vol. 1, J.L. Wayman, Ed. San
Jose, CA: National Biometric Test Centre, 2000, pp. 269-275.
[16] Pfleeger C.P., Security in Computing, second edition, ISBN 0-13-337486-6
Prentice Hall, PTR.
[17] Tiwana A., Web Security, ISBN 1-55558-210-9, Digital Press, an imprint of
Butterworth-Heinemann.
[18] Wayman J.L., Biometric Identification Technologies in Election Processes,
Summary Report, National Biometric Test Centre Collected Works, vol. 1, J.L.
Wayman, Ed. San Jose, CA: National Biometric Test Centre, 2000, pp. 269-275.
[19] (n.d.) Material on Biometrics, Wikipedia, http://en.wikipedia.org/wiki/Biometric,
accessed 2 October, 2005
[20] Schneier, Bruce, Inside risks: the uses and abuses of biometrics, Communications
of the ACM, Volume 42, Issue 8, pp. 136, August, 1999.
[21] P. Reid, Biometrics and Network Security, Prentice Hall, PTR, 2003.
[98] T. Connie, A. Teoh, M. Goh, and D. Ngo, "Palmhashing: a novel approach for
cancellable biometrics", Information Processing Letters, vol. 93, no. 1, pp. 1-5,
2005.
[99] F. Hao, R. Anderson, and J. Daugman, "Combining Crypto with Biometrics
Effectively", IEEE Transactions on Computers, vol. 55, pp. 1081-1088, 2006.
[100] M. F. Santos, J. F. Aguilar, and J. O. Garcia, "Cryptographic key generation using
handwritten signature", Proceedings of SPIE, vol. 6202, pp. 225-231, Orlando, Fla,
USA, Apr. 2006.
[101] G. Zheng, W. Li, and C. Zhan, "Cryptographic key generation from biometric data
using lattice mapping", Proceedings of the 18th International Conference on Pattern
Recognition, vol. 4, pp. 513-516, 2006.
[102] Andrew Teoh Beng Jin, Tee Connie, "Remarks on Bio-Hashing based cancellable
biometrics in verification system", Neurocomputing, vol. 69, no. 16-18, pp. 2461-2464,
2006.
[103] A. B. Teoh and C. T. Yuang, "Cancellable biometrics realization with multispace
random projections", IEEE Transactions on Systems, Man, and Cybernetics, Part B,
vol. 37, no. 5, pp. 1096-1106, 2007.
[104] J. G. Jo, J. W. Seo, and H. W. Lee, "Biometric digital signature key generation and
cryptography communication based on fingerprint", First Annual International
Workshop 2007, LNCS 4613, pp. 38-49, Springer-Verlag, 2007.
[105] B. Chen and V. Chandran, "Biometric Based Cryptographic Key Generation from
Faces", Proceedings of the 9th Biennial Conference of the Australian Pattern
Recognition Society on Digital Image Computing Techniques and Applications, pp.
394-401, 2007.
[106] E. Maiorana, P. Campisi, J. O. Garcia, and A. Neri, "Cancellable biometrics for
HMM-based signature recognition", 2nd IEEE International Conference on
Biometrics: Theory, Applications and Systems, pp. 1-6, 2008.
[107] Beng, A., Jin Teoh, Kar-Ann Toh,
[138] R.M. Bolle, J.H. Connell, N.K. Ratha, Biometric perils and patches, Pattern
Recognition, vol. 35, pp. 2727-2738, 2002.
[139] Uludag U., Pankanti S., Prabhakar S. and Jain A.K., Biometric cryptosystems:
Issues and challenges, in Proceedings of the IEEE, vol. 92, no. 6, pp. 948-960,
2004.
[140] Kong A., Cheung K., Zhang D., Kamel M. and You J., An analysis of Bio-Hashing
and its variants, Pattern Recognition, vol. 39, no. 7, pp. 1359-1368, 2006.
[141] Sakata K., Maeda T., Matsushita M., Sasakawa K. and Tamaki H., Fingerprint
Authentication Based on Matching Scores with Other Data, in Proc. Int. Conf. on
Biometrics, LNCS 3832, pp. 280-286, 2006.
[142] Andrew B. J. Teoh, Yip Wai Kuan, Sangyoun Lee, "Cancellable biometrics and
annotations on Bio-Hash", Pattern Recognition, vol. 41, no. 6, June 2008.
[143] Sergey Tulyakov, Faisal Farooq, Praveer Mansukhani, Venu Govindaraju,
[158]Yahia S. Halabi, Zaid SASA, Faris Hamdan, Khaled Haj Yousef, "Modelling
Adaptive Degraded Document Image Binarization and Optical Character System",
European Journal of Scientific Research, Vol. 28, No.1, pp.14-32, 2009.
[159] (n.d.) Material on RSA Factoring Challenge,
http://en.wikipedia.org/wiki/RSA_Factoring_Challenge.
[160] Sunil V. K. Gaddam and Manohar Lal, "Efficient Cancellable Biometric Key
Generation Scheme for Cryptography", International Journal of Network Security,
vol. 10, no. 3, pp. 223-231, 2010.
[161]Sunil V. K. Gaddam and Manohar Lal, "Development of Bio-Crypto Key from
Fingerprints Using Cancellable Templates", International Journal on Computer
Science and Engineering (IJCSE), Vol. 3, No. 2, pp. 775-783, 2011.
[162] Maio D., Maltoni D., Cappelli R., Wayman J.L., Jain A.K., FVC2002: Second
Fingerprint Verification Competition, in Proceedings of the 16th International
Conference on Pattern Recognition, vol. 3, pp. 811-814, 2002.
[163]Julien Bringer, Herve Chabanne, Bruno Kindarji, Anonymous Identification with
Cancellable Biometrics, in the Proceedings of the 6th International Symposium on
Image and Signal Processing and Analysis, 2009.