CHAPTER 16
Privacy-Sympathetic Biometric Systems
Why do we need such systems?
We can now investigate how to provide informational and personal privacy in biometric deployments.
International Biometric Group (IBG)
IBG’s BioPrivacy Best Practices define what steps institutions can take when deploying biometrics to ensure
that biometric deployments do not intrude on individual privacy and are instead either privacy neutral or
privacy sympathetic.
It provides some of the best practices that deployers need to follow; in real-world deployments, however, these practices are rarely implemented in full, so loopholes remain.
Challenges for Privacy-Sympathetic System Design
Limit System Scope
Biometric deployments must not be expanded to perform broader verification- or identification-related functions than
originally intended.
Any expansion or retraction in scope should be accompanied by full public disclosure, under the oversight of an
independent body, allowing individuals to opt out of system usage if possible. A fundamental risk with any system of
identification is that the system can be employed for purposes beyond those that were originally
intended.
From a privacy perspective, function creep must be disallowed, even if the purposes of the system expansion are seen as
innocuous.
The scope of a biometric system can be limited by legislation, by internal or third-party oversight, and by the type of
data collected.
Systems can also be designed that preclude the artificial introduction of images or biometric data, requiring that a live
fingerprint or facial image be presented in order for a decision to be rendered. However, because it is difficult to design
a system that categorically cannot be used for purposes beyond its original intent, auditing, oversight, and transparency
are essential.
If a system is being misused, drawing attention to this misuse and enabling policies whereby system usage can
be suspended are required.
Scope limitation may be more difficult in countries with authoritarian governments, where frameworks to ensure public-
and private-sector accountability may be lacking.
Do Not Use Biometrics as a Unique Identifier
The use of biometric information as a unique identifier should be extremely limited.
Unique identifiers facilitate the gathering and collection of personal information from various databases and can represent
a significant threat to privacy.
Though biometric templates are not ideal unique identifiers (a user’s verification template differs every time he or she is
authenticated), the enrollment template is normally a fixed value, used in all subsequent verifications. If a user’s static
enrollment template were shared between various agencies or companies to enable verification to a range of systems, it
could be used as a highly effective unique identifier.
The unique identifier issue will become more problematic if a biometric technology is developed that generates the same
template every time a user interacts with a system. This type of template could be used as an identifier across multiple
databases and applications, and any single verification template could be linked with all of a user’s verification templates.
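The linkage risk described above can be made concrete with a short sketch. The databases, keys, and template value below are entirely hypothetical; the point is only that a deterministic template acts as a join key across otherwise unrelated stores.

```python
# Two unrelated, hypothetical databases that each happen to key records
# by the same deterministic biometric template.
bank_db = {b"\x5a\x11\x9c": {"account": "acct-1"}}
health_db = {b"\x5a\x11\x9c": {"clinic": "clinic-9"}}

def link(template: bytes) -> dict:
    """Merge every record sharing the template, which acts as a de facto unique identifier."""
    profile = {}
    for db in (bank_db, health_db):
        profile.update(db.get(template, {}))
    return profile
```

A single deterministic template joins records across both stores, which is exactly why templates should not serve as shared identifiers.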
Limit Retention of Biometric Information
Biometric information must only be stored for the specific purpose of usage in a biometric system and should not be stored
any longer than necessary.
Biometric information and associated account data should be destroyed, deleted, or otherwise rendered useless when the
system is no longer operational. However, data such as transactional logs can be kept for auditing purposes.
Different storage limitations apply to enrollment and verification data. While enrollment data, by definition, must be
retained in order for the system to be operational, verification data need only be retained for as long as necessary to
perform a match. Once a decision is rendered, there is no need to store the biometric verification attempt, and well-
designed systems will dispose of verification data once a decision is rendered.
A hash of the verification template may be stored to prevent compromised templates from being used in replay attacks.
System design can accomplish some of this task by deleting biometric information when an associated account is deleted
or updated.
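These retention rules can be sketched in a few lines. The store, function names, and toy matcher below are illustrative assumptions, not part of the IBG text: only a salted hash of each verification attempt is retained, and the hash doubles as a replay check.

```python
import hashlib

seen_hashes = set()  # salted hashes of past verification attempts, kept for replay detection

def attempt_hash(template: bytes, salt: bytes) -> str:
    """Salted SHA-256 of a verification template; the raw bytes are never retained."""
    return hashlib.sha256(salt + template).hexdigest()

def toy_match(attempt: bytes, enrolled: bytes, max_diff: int = 2) -> bool:
    """Stand-in matcher: real matchers score similarity, since no two live captures are identical."""
    if len(attempt) != len(enrolled):
        return False
    return sum(a != b for a, b in zip(attempt, enrolled)) <= max_diff

def verify(attempt: bytes, enrolled: bytes, salt: bytes) -> bool:
    h = attempt_hash(attempt, salt)
    if h in seen_hashes:
        return False     # bit-identical to a past attempt: treat as a replay
    seen_hashes.add(h)   # only the hash survives once the decision is rendered
    return toy_match(attempt, enrolled)
```

Because live captures always differ slightly, an exact byte-for-byte repeat of an earlier attempt signals a replayed template rather than a fresh presentation.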
Evaluate a System’s Potential Capabilities
When determining the risks a specific system might pose to privacy, the system’s potential capabilities must be assessed
in addition to the risks involved in its intended usage.
Best Practices require that the impact of the deliberate misuse of a biometric system be considered when assessing
whether a deployment is privacy invasive, neutral, or sympathetic.
Although systems with the potential to be used in a privacy-invasive fashion can still be deployed if accompanied by
proper precautions, their operations must be monitored and protections must be in place to prevent misuse by internal
or external parties.
Limit Storage of Identifiable Biometric Data
Whenever possible, biometric data in an identifiable state, such as a facial image, fingerprint, or vocal recording, should be
stored or used in a biometric system only for the initial purpose of generating a template.
After template generation, the identifiable data should be destroyed, deleted, or otherwise rendered useless. This is to prevent
the storage of fingerprints and facial images, as opposed to finger-scan and facial-scan templates.
Templates are resistant to misuse because they cannot be identified as biometric information and cannot be used to re-create
original biometric information.
Forensic systems and some public-sector programs store identifiable data in order to resolve borderline matches; in addition,
employee background screens, which involve the acquisition of multiple fingerprint images, store identifiable data for future
processing or auditing purposes.
In this type of system, physical access and operational controls are necessary to ensure that identifiable data cannot be
compromised.
Limit Collection and Storage of Extraneous Information
The non-biometric information collected for use in a biometric verification or identification system should be
limited to the minimum necessary to make identification or verification possible. Biometric databases generally
comprise an index and a biometric template, with direct or indirect links to other databases as necessary.
Storing names or account information is not only bad database design—this data will normally exist elsewhere
and does not need to be collected and stored again—but also significantly increases the likelihood that biometric
data may be associated with other personal information.
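This design can be sketched with two separate SQLite stores. The table and column names are illustrative, not prescribed by the Best Practices: the biometric store holds only an opaque index and a template, while names and account data live elsewhere and are linked only indirectly.

```python
import sqlite3

# The biometric store holds only an opaque index and a template.
bio_db = sqlite3.connect(":memory:")
bio_db.execute("CREATE TABLE templates (subject_ref TEXT PRIMARY KEY, template BLOB)")

# Account data lives in a separate store, linked only through the opaque reference.
account_db = sqlite3.connect(":memory:")
account_db.execute(
    "CREATE TABLE accounts (account_id TEXT PRIMARY KEY, name TEXT, subject_ref TEXT)"
)

bio_db.execute("INSERT INTO templates VALUES (?, ?)", ("ref-7f3a", b"\x10\x22\x31"))
account_db.execute(
    "INSERT INTO accounts VALUES (?, ?, ?)", ("acct-1", "A. Example", "ref-7f3a")
)

# On its own, the biometric store exposes no name or account information.
row = bio_db.execute("SELECT * FROM templates").fetchone()
```

Compromise of the biometric store alone yields no personal information, because the join to account data requires the second database.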
Make Provisions for System Termination
A method must be established by which a system used to commit or facilitate privacy-invasive biometric matching,
searches, or linking can be depopulated and dismantled.
The responsibility for making such a determination would rest with an independent auditing group and would be subject to
appropriate appeals and oversight.
This protection would apply primarily to public-sector systems, as they are most likely to be used in a privacy-invasive
fashion and are more in need of independent oversight and monitoring.
By contrast, private sector deployments found to be privacy invasive will likely be modified or terminated as the result of
pressure from investors, consumers, and the general public.
IBG BioPrivacy Best Practices: Data Protection
Use Security Tools and Access Policies to Protect Biometric Information
Implement Logical and Physical Separations between Biometric and Nonbiometric Data
Use Security Tools and Access Policies to Protect Biometric Information
Biometric information should be protected at all stages of its life cycle, including storage, transmission, and
matching. The protections enacted may include encryption, private networks, secure facilities, administrative
controls, and data segregation. The protections necessary within a given deployment are
determined by a variety of factors, including the location of storage, the location of matching, the type of biometric
used, the capabilities of the biometric system, whether processes take place in a trusted environment, and the risks
associated with data compromise.
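As one concrete example of a transmission-stage control, a keyed hash (HMAC) lets the matcher reject templates altered in transit. This is a sketch under simplifying assumptions: it provides integrity and authenticity only (encryption would additionally be needed for confidentiality), and real key management between sensor and matcher is omitted.

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)  # shared by sensor and matcher; key management omitted here

def seal(template: bytes):
    """Attach an authentication tag before the template leaves the sensor."""
    tag = hmac.new(key, template, hashlib.sha256).digest()
    return template, tag

def accept(template: bytes, tag: bytes) -> bool:
    """Matcher side: reject any template whose tag does not verify."""
    expected = hmac.new(key, template, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)
```

The constant-time comparison (`hmac.compare_digest`) avoids leaking tag information through timing differences.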
Protect Postmatch Decisions