
Copyright © 2005 Information Systems Audit and Control Association. All rights reserved. www.isaca.org.

How Can Security Be Measured?


By David A. Chapin, CISA, CISM, CISSP, IAM, and Steven Akridge, JD, CISM, CM, CISSP, IAM

Traditional security metrics are haphazard at best; at worst, they give a false impression of security that leads to inefficient or unsafe implementation of security measures. This paper presents an approach whereby maturity and quality are combined to provide a more complete and orderly picture of an organization's security posture. The approach will be referred to as the Security Program Maturity Model.

Security metrics—the measurement of the effectiveness of the organization's security efforts over time—have always been difficult to evaluate. How can an organization determine whether it is secure? The measure of the quality of the security program can be truly tested only when the organization is stressed by a crisis. Yet, this situation is exactly what the security effort is designed to prevent.

Management needs some measure of how secure the organization is. Organizations need to ask themselves:
• How many resources does it take to be "safe"?
• How can the cost of new security measures be justified?
• Is the organization getting its money's worth?
• When does the organization know it is "safe"?
• How does the organization compare its posture with others in the industry and with best practice standards?

The traditional answer to these questions relates to risk assessment and the residual risk the organization is willing to take based on business needs and budget limits. Risk management may beg the question, not necessarily leading to a stronger security stance.

Imagine, for instance, a risk assessment that lists a threat matrix and the cost to mitigate the risks. Some items on the list would be of insignificant cost. Other items will be very expensive (figure 1). Often, management may decide to mitigate the most items for the least amount of money, possibly bypassing the most expensive items. The assumption is that aggregating risk reduction controls is a better buy. Thus, there is a tendency to buy large quantities of security tools and avoid the more expensive, less glamorous controls. The more difficult controls tend to be organizational in nature, requiring cultural change (such as a disaster recovery plan) rather than specific turnkey solutions [such as firewalls and intrusion detection systems (IDSs)]. Management thinks it is buying more security for less money.

However, who is to say that more security is purchased? How can the organization measure the relative protection gained by each purchase? Is the organization purchasing the security safeguards in the right order? Is the organization being exposed to more risk because of the unsystematic approach toward implementation?

Building security programs from the ground up allows for the development of new approaches toward these traditional security metrics problems. A fresh look at these problems enables the development of a comprehensive solution for any industry.

This newer, more systematic approach toward security metrics will:
• Generate reproducible and justifiable measurements
• Measure something of value to the organization
• Determine real progress in security posture
• Apply to a broad range of organizations while producing similar results
• Determine the order in which security controls should be applied
• Determine the resources needed to apply to the security program

Figure 1—Weighing the Cost of Security Controls
[Figure: a scale weighing several inexpensive ($) controls against one expensive ($$$) control, asking: Is it better to buy more for the same amount of money?]

Traditional Security Metrics—What to Count?
A measurement, by itself, is not a metric. Time has to be brought into the picture, and a metric alone is not the answer to all the organization's problems. People have to think through and analyze the meaning of the metrics over time. The trick is to develop metrics that are simple and provide useful management information matching security-related, goal-setting objectives. The metrics have to enlighten the organization by showing some type of progress.

Obviously, the task of security metrics is to count or measure something. But what should be counted? How can security be measured? Figure 2 shows samples of traditionally used security metrics. Many organizations count incidents handled, e.g., cyberviruses caught or logged events. How does this provide a measure of the quality of the security program? How does this show progress?

INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 2, 2005


Figure 2—Traditional Security Metrics

Metric: Number of computer viruses/malicious code caught
Presumed Measurement: Effectiveness of automated antivirus controls
Pitfalls: Why are so many viruses getting through in the first place? How many got through that never got caught?

Metric: Number of security incidents and investigations
Presumed Measurement: Activity of monitoring security events
Pitfalls: What threshold causes an incident or investigation to get triggered? Are incidents triggered because of organizational process flaws?

Metric: Cost of security breaches
Presumed Measurement: True business loss related to security failures
Pitfalls: What residual risks did the business choose to take? Is it a measure of a crisis or disaster response, but not necessarily a function of reasonable safeguards in place?

Metric: Time and materials assigned to security functions
Presumed Measurement: True business cost of running a security program
Pitfalls: Are the tools, assignments or procedures inefficient, causing people to waste time?

Metric: Compliance with security rules
Presumed Measurement: Level of compliance matching security program goals
Pitfalls: How does compliance relate to effectiveness? What is the order of compliance? Once compliance is achieved, is the security program "finished"?

Incident totals are unreliable measures for this reason: imagine a small town with one police officer. He does no police work other than patrolling the highway with a radar gun, pulling over hundreds of speeders. Now imagine a large town with many police officers. They do not use radar guns and have caught very few speeders, but have a large defensive driving program and an active anti-drunk-driving program. Is the small town safer than the large town? The count of speeders is only as good as the sensing mechanism, and that number has no depth to it. What about the nonspeeders in the small town who are drunk—are they not potentially more dangerous?

Now compare this to an antivirus tool in an information systems environment. The fact that it reports a large number of virus infections may make the security team feel good that its antivirus tool is working, but what does it really say about security? Why are so many viruses getting through in the first place? How many get through and remain undetected? How does that measure the quality of the security program? It should be considered a great success if the antivirus tool never finds any viruses because none ever gets into the environment!

Another traditional security metric is time spent on a task—how long people spend doing security-related functions. In some cases, from a project management point of view, this may be a valuable metric, because the only two real resources people bring to an organization are their brainpower and the time they spend using it. But from a security standpoint, people's time may not be a valuable metric. For instance, when measuring time spent on security investigations, if more time is spent, is that necessarily an indication of improved security posture? It could be that time is inefficiently used investigating security incidents because the organization's procedures are weak—triggering more incidents to be handled by the security group when they could be prevented in other ways, such as better training.

Finally, another classic metric is cost to the business from damage by a security incident. This assumes something bad has happened in the first place. While it may measure the effectiveness of disaster response, it is not necessarily a good measure of the quality of the security program. Some incidents are a function of the residual risk the business is willing to take, combined with unlucky circumstances. Alternatively, other incidents may be the result of poor security practices opening the door for disaster. How can these be differentiated? Or, maybe one of these two events occurred and the crisis management was so good that it caused minimum business impact. Safeguards provide only a certain degree of security; there will always be some risks.

The key to security metrics is obtaining measurements that have the following ideal characteristics:
• They should measure organizationally meaningful things.
• They should be reproducible.
• They should be objective and unbiased.
• Over time, they should be able to measure some type of progression toward a goal.

In practice, almost all published security metrics are missing one or more of these characteristics. Traditional security metrics were a "take what you can get" affair, i.e., whatever metrics were available were grabbed and reported. This way of thinking should change. A more systematic approach is needed for the development of metrics that directly fit the previously mentioned characteristics.

Security Program Maturity
One piece of the security metrics puzzle is to measure progress of the security program against a maturity model. This approach directly targets at least two of the four characteristics listed above: it measures organizationally meaningful things and progression toward a goal.

The few published security maturity models are summarized in figure 3. For some reason, each has only five levels of maturity. Each model seems to suffer from its own biases about the definition of maturity.

It is herewith proposed that a new standard for maturity be used. Maturity should be a measure of only the program's progress over time, not necessarily the quality of the elements of the program. This definition of maturity has several important characteristics:
• It provides the blueprint for a complete security program.
• It tells management the order in which to implement security elements.
• It leads toward the use of best practice standards (e.g., ISO 17799).1



• As long as a standard is used, it provides a way to compare one organization's security program to another.

Following this maturity standard, previous models suffer from three key deficiencies:
1. They confuse quality with existence. One must learn to walk before one can run. The quality of how well one walks is not necessarily an indication of one's running ability.
2. The existing models need to be customized specifically for the organization. Therefore, it is difficult to directly compare results from one organization to another.
3. The current models tend to come from engineering or project management perspectives. Thus, they focus on program elements meeting certain engineering-style specifications. They do not necessarily drive the program toward a particular organizational goal and therefore work poorly for a security program. The incorporation of total quality management (TQM) and Six Sigma philosophies is an example of this. With security, the result should be more like an insurance policy. It is not like manufacturing a product. Instead, the corporation is socializing infrastructure and culture.

This new approach toward a detailed security maturity model (called the Security Program Maturity Model) takes a management systems approach. Therefore, it follows the ISO 17799 standards for developing a complete security program. It involves the existence or nonexistence of a large number of elements (figure 4).

Figure 3—Published Security Maturity Models

Model: NIST CSEAT IT Security Maturity Model2
Description: Five levels of progressive maturity: 1. Policy; 2. Procedure; 3. Implementation; 4. Testing; 5. Integration
Comments: Focused toward levels of documentation

Model: Citigroup's Information Security Evaluation Model (CITI-ISEM)3
Description: Five levels of progressive maturity: 1. Complacency; 2. Acknowledgment; 3. Integration; 4. Common practice; 5. Continuous improvement
Comments: Focused toward organizational awareness and adoption

Model: COBIT® Maturity Model4
Description: Five levels of progressive maturity: 1. Initial/ad hoc; 2. Repeatable but intuitive; 3. Defined process; 4. Managed and measurable; 5. Optimized
Comments: Focused toward auditing specific procedures

Model: SSE-CMM Model5
Description: Five levels of progressive maturity: 1. Performed informally; 2. Planned and tracked; 3. Well-defined; 4. Quantitatively controlled; 5. Continuously improving
Comments: Focused toward security engineering and software design

Model: CERT/CSO Security Capability Assessment6
Description: Five levels of progressive maturity: 1. Exists; 2. Repeatable; 3. Designated person; 4. Documented; 5. Reviewed and updated. Measures using four levels: 1. Initial; 2. Evolving; 3. Established; 4. Managed
Comments: Focused toward measurement of quality relative to levels of documentation

Figure 4—General Outline of the Security Maturity Model

ISO 17799 Category: 1. Overall security management
No. of Elements Measured: 11
Items Covered by Elements: Business need, strategy, commitment, roles and responsibilities, policies and procedures

ISO 17799 Category: 2. Asset classification and control
No. of Elements Measured: 5
Items Covered by Elements: Valuation, risk assessment, business ownership, labeling and handling, inventory

ISO 17799 Category: 3. Personnel security
No. of Elements Measured: 8
Items Covered by Elements: Hiring and termination, roles and responsibilities, screening, training, reporting, review

ISO 17799 Category: 4. Physical and environmental security
No. of Elements Measured: 12
Items Covered by Elements: Perimeters, environmental hazards, risk assessment, access controls, safety, asset removal and destruction, monitoring, incident handling, awareness, cooperation

ISO 17799 Category: 5. Access control
No. of Elements Measured: 11
Items Covered by Elements: Perimeters, risk assessment, access controls, authentication, need to know, user responsibility, access updating, monitoring, mobile computing, incident handling

ISO 17799 Category: 6. System development and maintenance
No. of Elements Measured: 9
Items Covered by Elements: Standards, life cycle model, review, gap analysis, requirements planning, testing integrity and certification, code repository, release management, retirement

ISO 17799 Category: 7. Communications and operations management
No. of Elements Measured: 16
Items Covered by Elements: Standards, all methods of e-communications, operations procedures, monitoring, backups, exception handling, updates and patches, help desk, change management, cryptographic systems, media handling, malicious code, system acceptance, documentation library, capacity planning

ISO 17799 Category: 8. Organizational security
No. of Elements Measured: 11
Items Covered by Elements: Security function, monitoring, advisory, auditing, forum, awareness training, segregation of duties, penetration and vulnerability testing, incident handling, cooperation

ISO 17799 Category: 9. Business continuity management
No. of Elements Measured: 7
Items Covered by Elements: Risk assessment, prioritization, backups, business continuity/disaster recovery planning, testing, updates

ISO 17799 Category: 10. Compliance
No. of Elements Measured: 10
Items Covered by Elements: Regulatory, contractual, intellectual property, labeling and handling, record retention, auditing, sanctions
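The outline in figure 4 suggests a purely existence-based score. As a sketch of that idea (mine, not the authors'; the snapshot counts are invented), maturity can be computed as the fraction of elements that exist, per category and overall:

```python
# Sketch of an existence-based maturity score over the figure 4 outline.
# Category names and element totals come from figure 4; everything else
# (function names, the sample snapshot) is illustrative.

CATEGORIES = {  # ISO 17799 category -> number of elements measured
    "Overall security management": 11,
    "Asset classification and control": 5,
    "Personnel security": 8,
    "Physical and environmental security": 12,
    "Access control": 11,
    "System development and maintenance": 9,
    "Communications and operations management": 16,
    "Organizational security": 11,
    "Business continuity management": 7,
    "Compliance": 10,
}

def maturity(implemented):
    """Per-category maturity: elements that exist / elements measured."""
    return {name: implemented.get(name, 0) / total
            for name, total in CATEGORIES.items()}

def overall_maturity(implemented):
    """Overall maturity across every element of the model."""
    return sum(implemented.values()) / sum(CATEGORIES.values())

# Invented snapshot: all 11 overall-management elements exist,
# 3 of the 5 asset classification elements exist, nothing else yet.
snapshot = {"Overall security management": 11,
            "Asset classification and control": 3}
print(overall_maturity(snapshot))  # 14 of 100 elements exist -> 0.14
```

Conveniently, the figure 4 totals sum to 100 elements, so the overall score doubles as a percentage, matching statements in the text such as "a maturity score of 75 percent."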



Since very little judgment is involved, the results are reproducible and objective. Quality or effectiveness of the element implementation is not measured, though certain elements (such as auditing programs), if executed, can lead toward other quality controls. This is similar to health department inspections of restaurants. Inspection scores can tell which restaurants to avoid, but they tell nothing about whether one will get a good meal at those that pass inspection.

Maturity level is an important measure when comparing one organization to another. If an organization's maturity score is 75 percent, would it want to network into another organization whose score is only 25 percent? Maturity level leads an organization to better understand its security program relative to its peers. It provides a yardstick with which to assess the degree of trust that can be placed in interconnected computer systems between different organizations.

This security maturity model is also a blueprint for the order in which program elements should be implemented. Figure 5 provides an example of the step-by-step approach toward the implementation of elements. In a maturing program, the elements are executed based on the outcome of previous implementation steps. Consequently, it directly tells management when to "buy" security. It answers the question posed in figure 1.

Figure 5—Sample Elements From Security Maturity Model
Program Maturity Elements—2. Asset Classification and Control (by order performed)
1. 2.1 Valuation is performed to identify and understand information assets to protect.
2. 2.2 Risk assessment is performed to identify and quantify threats to information assets.
3. 2.3 Information assets have defined system custodians and business owners.
4. 2.4 Information assets classification labeling and handling procedures are developed.
5. 2.5 An asset management inventory program is installed to handle assets on an ongoing basis.

It also avoids the pitfall of implementing security measures in the wrong order, introducing security risks precisely because safeguards are unsystematically implemented. For example, in figure 5 the organization could actually be harmed if an active inventory management system (element 2.5) is implemented before valuing the assets and performing a risk assessment (elements 2.1 and 2.2). If it is completed in the wrong order, it could take years of wasteful redesign work to the inventory system to properly categorize and ultimately protect the assets.

Since this model is essentially a detailed compliance tool, management can possibly misinterpret program maturity. A high level of maturity may give the false impression of project management closure. It might indicate to management that the organization is now "safe" and there is no more need to support the security effort, while it may instead be an indication of a safe posture only if the elements have been executed at a high-quality level. On the other hand, just because an organization has completed a particular security element does not necessarily mean it is doing a good job of it. This is why a separate measurement tool must be employed to measure the quality or effectiveness of the actual implementation.

Security Posture
An improvement to the maturity model is the security posture, which essentially modifies the maturity model based on the quality of the implementation of each element. An advantage of adding a quality measure is that, unlike the maturity score, security posture is not a static achievement level. It is dynamic and can change based on the quality of the continued execution of the program elements. It requires active management of the security program to maintain a certain security posture.

Quality is a subjective measure. But, since it is separate from maturity, the two values cannot be confused, as in other models. A three-tiered factor (high, medium and low), as shown in figure 6, is suggested. With detailed threshold descriptions, it is possible to be objective and produce results. This scheme is similar to the US National Security Agency (NSA) Infosec Assessment Methodology (IAM) criticality model, where items also have three tiers of quality.7 But, unlike the NSA IAM model, which is focused only on data and systems, a richer picture of the entire organization's security program is provided.

Figure 6—Sample of a Quality Measure From Security Maturity Model
Program Maturity Element: 2.4 Information assets classification labeling and handling procedures
If the maturity element is implemented, then…
Low-quality threshold: Procedures developed but not implemented
Medium-quality threshold: Assets partially classified
High-quality threshold: Pervasive classification throughout entire organization

An ideal security quality metric could be used as a dashboard display for management. It could give a near real-time view of the organization's security posture (see figures 7 and 8). It should be measured on a weekly basis; ownership of individual program maturity elements should be assigned to specific departments.

Figure 7—Simulated Example of Improving Program Maturity With Time
[Figure: bar chart of maturity (0% to 100%) at Time 1 and Time 2 for each ISO 17799 category—1. Overall Security Management (11), 2. Asset Classification and Control (5), 3. Personnel Security (8), 4. Physical & Environmental Security (12), 5. Cyber Access Control (11), 6. System Development and Maintenance (9), 7. Communications and Operations Management (16), 8. Organizational Security (11), 9. Business Continuity Management (7), 10. Compliance (10)—and for overall ISO 17799. The numbers in parentheses are the number of program elements used for each maturity score.]
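The step-by-step ordering of figure 5 can also be expressed mechanically. The sketch below is an illustrative reading of the model, not code from the article: the element order is taken from figure 5, while the helper names and logic are my own assumptions. It reports the next element to "buy" and flags elements executed before their prerequisites.

```python
# Illustrative ordering check for the figure 5 elements. The prescribed
# order comes from figure 5; function names and logic are assumptions.

ASSET_CONTROL_ORDER = ["2.1", "2.2", "2.3", "2.4", "2.5"]

def next_element_to_buy(performed, order=ASSET_CONTROL_ORDER):
    """First element in the prescribed order not yet performed."""
    for element in order:
        if element not in performed:
            return element
    return None  # category complete

def out_of_order(performed, order=ASSET_CONTROL_ORDER):
    """Elements performed before all of their predecessors—the pitfall of,
    e.g., installing the inventory program (2.5) before valuation (2.1)."""
    return [element for i, element in enumerate(order)
            if element in performed
            and any(e not in performed for e in order[:i])]

print(next_element_to_buy({"2.1", "2.2"}))  # 2.3
print(out_of_order({"2.1", "2.5"}))         # ['2.5']
```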

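To see how the three-tiered quality factor of figure 6 could modify maturity into a security posture, consider this sketch. The article proposes the tiers but no combining formula, so the numeric weights here are purely an assumption for illustration:

```python
# Hypothetical security posture: maturity discounted by implementation
# quality. The high/medium/low tiers follow figure 6; the numeric
# weights are an invented example, not the authors' method.

QUALITY_WEIGHT = {"high": 1.0, "medium": 0.5, "low": 0.25}

def security_posture(element_quality, total_elements):
    """Posture in [0, 1]: quality-weighted credit per implemented element.

    element_quality maps each *implemented* element id (e.g., "2.4") to
    "high", "medium" or "low"; unimplemented elements earn no credit.
    """
    earned = sum(QUALITY_WEIGHT[q] for q in element_quality.values())
    return earned / total_elements

# A category of 5 elements with 3 implemented at mixed quality:
print(security_posture({"2.1": "high", "2.2": "medium", "2.4": "low"}, 5))
# (1.0 + 0.5 + 0.25) / 5 = 0.35
```

Unlike the maturity score, this value can fall as well as rise: re-scoring quality weekly yields the dynamic, actively managed posture the text describes.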


Thus, in sorting the elements by departments, management can gain a security understanding specifically customized to the way the organization is structured.

Figure 8—Simulated Example of Improving Quality With Time Using a Security Posture Metric
[Figure: the quality measures (high, medium, low) for all existing program elements, aggregated at two different times (Quality—Time 1 and Quality—Time 2).]

Figure 9 provides a simulated example of one such fictional organization. At a glance, it can be seen how each department rates in program maturity and quality. For example, department 2 is more mature, yet its quality is lower than that of the other two departments, and while departments 1 and 3 are at nearly the same level of maturity, the quality of department 1's implementation is higher.

Figure 9—Simulated Example Showing a Management Dashboard Comparison of Security Performance by Department
Department 1: owns elements 1.1, 3.2, 4.1, 7.5; maturity level three (3) of four (4) implemented—75 percent
Department 2: owns elements 7.1, 7.2, 7.3, 7.4, 7.6, 7.7, 7.8, 7.9, 7.10, 7.11, 7.12, 7.13, 7.14, 7.15, 7.16; maturity level twelve (12) of 15 implemented—80 percent
Department 3: owns elements 4.2, 4.3, 4.4, 4.5, 4.6, 4.7, 4.8, 4.9, 4.10, 4.11, 4.12; maturity level eight (8) of 11 implemented—73 percent
[The published figure also marks, per department, which owned elements are implemented (in bold) and the high/medium/low quality of each implemented element.]

These appraisals need active management on an ongoing basis. By using security metrics in this manner, the organization incorporates security deeply into its structure. Security metrics then become a meaningful gauge of organizational performance, because they were designed to meet the initial objectives for metrics. An organization can easily demonstrate security posture improvements over time. Moreover, as security elements become adopted in a more systematic way, management can begin to understand the costs and benefits of an organized, mature and high-quality security program.

Endnotes
1 ISO/IEC 17799:2000(E), Information Technology—Code of Practice for Information Security Management, December 2000
2 National Institute of Standards and Technology (NIST) Computer Security Expert Assist Team (CSEAT) IT Security Maturity Model, csrc.nist.gov/cseat/CSEAT_IT_Security_Maturity_Levels.htm
3 Dunbar, Thomas M.; "Information Metrics @ Citigroup," 14 June 2000, Computer System Security and Privacy Advisory Board (CSSPAB) workshop "Approaches to Measuring Security," http://csrc.nist.gov/ispab/june13-15/Citigroup.pdf
4 IT Governance Institute (ITGI), Control Objectives for Information and related Technology (COBIT), USA, 2000, www.itgi.org. Also see www.auckland.ac.nz/security/InfomationSecurityMaturityAssessment.htm, draft version 0.1, 2003.
5 Systems Security Engineering Capability Maturity Model (SSE-CMM), Carnegie Mellon University, 1999, www.sse-cmm.org
6 "CERT Security Capability Assessment Tool," CSO Online, CXO Media Inc. and Carnegie Mellon University, 2003, www.csoonline.com/surveys/securitycapability.html
7 Miles, G.; et al.; Security Assessment: Case Studies for Implementing the NSA IAM, Syngress Publishing Inc., 2004



David Chapin, CISA, CISM, CISSP, IAM
is a private security consultant. His current interests include
security infrastructure design, security management,
information assessment and organizational security. He holds
three patents in business processes. He can be contacted at
dchapin@earthlink.net.

Steven Akridge, JD, CISM, CM, CISSP, IAM
is a private security consultant. His current interests include
security management, organizational security, assessment
methodology, forensics, cryptography and security intelligence.
He has served on numerous security industry committees,
including the Federal Public Key Infrastructure Steering
Committee, the Partnership for Critical Infrastructure Security
and the National Association of State CIOs. He may be
contacted at steveakridge@cissp.com.

Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by the Information Systems Audit and Control Association, Inc. Membership in the association, a voluntary
organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal.

Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of the Information Systems Audit
and Control Association and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors' employers, or the editors of this Journal. Information Systems Control Journal
does not attest to the originality of authors' content.

© Copyright 2005 by Information Systems Audit and Control Association Inc., formerly the EDP Auditors Association. All rights reserved. ISCA™ Information Systems Control Association™

Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the
association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles
owned by the Information Systems Audit and Control Association Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume,
and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the
association or the copyright owner is expressly prohibited.

www.isaca.org

INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 2, 2005
