© 2010 The Center for Internet Security



The Center for Internet Security

The CIS Security Metrics
November 1, 2010
Organizations struggle to make cost-effective security investment decisions; information security professionals lack widely accepted and unambiguous metrics for decision support. CIS established a consensus team of one hundred (100) industry experts to address this need. The result is a set of standard metrics and data definitions that can be used across organizations to collect and analyze data on security process performance and outcomes.

This document contains twenty-eight (28) metric definitions for seven (7) important business functions: Incident Management, Vulnerability Management, Patch Management, Application Security, Configuration Management, Change Management, and Financial Metrics.
CIS Security Metrics v1.1.0
Contents
Contents ............................................................................................................................................... ii
Terms of Use Agreement.................................................................................................................. viii
CIS Terms of Use............................................................................................................................ viii
Background............................................................................................................................................1
Consensus Guidance .........................................................................................................................1
Management Perspective and Benefits...........................................................................................1
Business Functions................................................................................................................................3
Metric Categories..................................................................................................................................4
Incident Management ..........................................................................................................................6
Updating Data over Time..................................................................................................................6
Data Attributes ..................................................................................................................................6
Security Incidents Table ................................................................................................................6
Security Incident Classification Table ..........................................................................................8
Security Events Table ....................................................................................................................9
Security Incident Impact Analysis Table ......................................................................................9
Security Incident Reporting Table ............................................................................................. 11
Technologies Table..................................................................................................................... 12
Security Incident Effect Rating Table ........................................................................................ 13
Security Incident Criticality Rating Table .................................................................................. 13
Classifications ............................................................................................................................. 15
Priority......................................................................................................................................... 18
Sources........................................................................................................................................ 19
Dimensions ................................................................................................................................. 19
Automation................................................................................................................................. 19
Visualization................................................................................................................................ 19
Defined Metrics .............................................................................................................................. 21
Mean-Time-To-Incident-Discovery ........................................................................................... 21
Mean Time between Security Incidents ................................................................................... 24
Mean Time to Incident Recovery .............................................................................................. 26
Cost of Incidents......................................................................................................................... 29
Mean Cost of Incidents .............................................................................................................. 33
Mean Incident Recovery Cost.................................................................................................... 36
Vulnerability Management ............................................................................................................... 40
Data Attributes ............................................................................................................................... 40
Technologies Table..................................................................................................................... 41
Vulnerability Information Table ................................................................................................ 42
CVSS Score Table ........................................................................................................................ 43
Identified Vulnerabilities Table ................................................................................................. 46
Identified Vulnerabilities Table ................................................................................................. 47
Technologies Table..................................................................................................................... 47
Classifications and Dimensions ................................................................................................. 49
Severity of Vulnerabilities.......................................................................................................... 50
Technology Value (CTV, ITV, ATV)............................................................................................. 50
Sources........................................................................................................................................ 51
Dimensions ................................................................................................................................. 51
Automation................................................................................................................................. 52
Visualization................................................................................................................................ 52
Management and Operational Metrics ........................................................................................ 53
Percent of Systems without Known Severe Vulnerabilities .................................................... 53
Mean-Time to Mitigate Vulnerabilities..................................................................................... 56
Mean Cost to Mitigate Vulnerabilities...................................................................................... 59
Patch Management ........................................................................................................................... 62
Data Attributes ............................................................................................................................... 62
Technologies Table..................................................................................................................... 63
Technologies Table..................................................................................................................... 64
Patch Information Table ............................................................................................................ 65
Patch Activity Table.................................................................................................................... 66
Patch Activity Review Table....................................................................................................... 67
Classifications ............................................................................................................................. 69
Criticality of Patches................................................................................................................... 69
Technology Value (CTV, ITV, ATV)............................................................................................. 69
Sources........................................................................................................................................ 70
Dimensions ................................................................................................................................. 70
Automation................................................................................................................................. 70
Visualization................................................................................................................................ 71
Management and Operational Metrics ........................................................................................ 72
Patch Policy Compliance ............................................................................................................ 72
Mean Time to Patch ................................................................................................................... 75
Mean Cost to Patch .................................................................................................................... 78
Configuration Management Metrics ................................................................................................ 81
Data Attributes ............................................................................................................................... 81
Technologies Table..................................................................................................................... 81
Configuration Status Accounting Table .................................................................................... 83
Configuration Deviation Table................................................................................................... 83
Configuration Deviation Table................................................................................................... 84
Defined Metrics .............................................................................................................................. 85
Percentage of Configuration Compliance................................................................................. 85
Change Management Metrics........................................................................................................... 89
Data Attributes ............................................................................................................................... 89
Technologies Table..................................................................................................................... 90
Change Exemption Table ........................................................................................................... 91
Change Request Table ............................................................................................................... 92
Change Item Table ..................................................................................................................... 93
Change Review Table ................................................................................................................. 93
Classifications ............................................................................................................................. 95
Sources........................................................................................................................................ 96
Dimensions ................................................................................................................................. 96
Automation................................................................................................................................. 96
Visualization................................................................................................................................ 96
Defined Metrics .............................................................................................................................. 97
Mean Time to Complete Changes............................................................................................. 97
Percent of Changes with Security Review .............................................................................. 100
Percent of Changes with Security Exceptions ........................................................................ 102
Application Security Metrics ........................................................................................................... 104
Data Attributes ............................................................................................................................. 106
Technologies Table................................................................................................................... 106
Business Applications Table..................................................................................................... 107
Business Application Status Table........................................................................................... 109
Risk Assessments Table............................................................................................................ 109
Security Testing Table .............................................................................................................. 110
Business Application Weaknesses Table ................................................................................ 111
Most Dangerous Programming Errors Table.......................................................................... 113
Classifications ........................................................................................................................... 115
Business Application Value...................................................................................................... 116
Sources...................................................................................................................................... 116
Dimensions ............................................................................................................................... 117
Automation............................................................................................................................... 117
Visualization.............................................................................................................................. 117
Defined Metrics ............................................................................................................................ 118
Percentage of Critical Applications ......................................................................................... 118
Risk Assessment Coverage....................................................................................................... 120
Financial Metrics .............................................................................................................................. 124
Data Attributes ............................................................................................................................. 126
Information Security Spending Table ..................................................................................... 126
Security Spending and Budget ................................................................................................ 127
Spending Categories and Purpose .......................................................................................... 127
Sources...................................................................................................................................... 128
Dimensions ............................................................................................................................... 128
Automation............................................................................................................................... 128
Visualization.............................................................................................................................. 128
Defined Metrics ............................................................................................................................ 129
Information Security Budget as % of IT Budget ..................................................................... 129
Information Security Budget Allocation ................................................................................. 132
Technical Metrics ............................................................................................................................. 135
Incidents........................................................................................................................................ 135
Number of Incidents ................................................................................................................ 135
Vulnerability Management.......................................................................................................... 138
Vulnerability Scan Coverage .................................................................................................... 138
Number of Known Vulnerability Instances............................................................................. 140
Patch Management...................................................................................................................... 142
Patch Management Coverage ................................................................................................. 142
Configuration Management ........................................................................................................ 144
Configuration Management Coverage ................................................................................... 144
Current Anti-Malware Coverage ............................................................................................. 148
Application Security ..................................................................................................................... 151
Number of Applications ........................................................................................................... 151
Appendix A: Glossary ....................................................................................................................... 153
Anti-malware ............................................................................................................................ 153
Application Security Testing .................................................................................................... 153
Bias ............................................................................................................................................ 153
Business Application ................................................................................................................ 153
Containment ............................................................................................................................. 154
Data Record .............................................................................................................................. 154
De-identified ............................................................................................................................. 154
Malware .................................................................................................................................... 154
Risk Assessment ....................................................................................................................... 155
Security Incident....................................................................................................................... 155
Security Patch ........................................................................................................................... 155
Technology................................................................................................................................ 155
Third party ................................................................................................................................ 155
Vulnerability ............................................................................................................................. 155
Appendix B: Acknowledgements .................................................................................................... 156
Appendix C: Examples of Additional Metrics ................................................................................. 156
Percentage of Incidents Detected by Internal Controls ........................................................ 156
Mean Time to Deploy Critical Patches.................................................................................... 161
Index of Tables ................................................................................................................................. 164

Terms of Use Agreement
The nonprofit Center for Internet Security (CIS) provides consensus-oriented information security products, services, tools, metrics, suggestions, and recommendations (the "CIS Products") as a public service to Internet users worldwide. Downloading or using any CIS Product in any way signifies and confirms your acceptance of and your binding agreement to these CIS Terms of Use.
CIS Terms of Use
Both CIS Members and non-Members may:
- Download, install, and use each of the CIS Products on a single computer, and/or
- Print one or more copies of any CIS Product that is in a .txt, .pdf, .doc, .mcw, or .rtf format, but only if each such copy is printed in its entirety and is kept intact, including without limitation the text of these CIS Terms of Use.
Under the Following Terms and Conditions:
CIS Products Provided "As Is." CIS is providing the CIS Products "as is" and "as available" without: (1) any representations, warranties, or covenants of any kind whatsoever (including the absence of any warranty regarding: (a) the effect or lack of effect of any CIS Product on the operation or the security of any network, system, software, hardware, or any component of any of them, and (b) the accuracy, utility, reliability, timeliness, or completeness of any CIS Product); or (2) the responsibility to make or notify you of any corrections, updates, upgrades, or fixes.
Intellectual Property and Rights Reserved. You are not acquiring any title or ownership rights in or to any
CIS Product, and full title and all ownership rights to the CIS Products remain the exclusive property of CIS. All
rights to the CIS Products not expressly granted in these Terms of Use are hereby reserved.
Restrictions. You acknowledge and agree that you may not: (1) decompile, disassemble, alter, reverse
engineer, or otherwise attempt to derive the source code for any software CIS Product that is not already in the
form of source code; (2) distribute, redistribute, sell, rent, lease, sublicense or otherwise transfer or exploit any
rights to any CIS Product in any way or for any purpose; (3) post any CIS Product on any website, bulletin board,
ftp server, newsgroup, or other similar mechanism or device; (4) remove from or alter these CIS Terms of Use on
any CIS Product; (5) remove or alter any proprietary notices on any CIS Product; (6) use any CIS Product or any
component of a CIS Product with any derivative works based directly on a CIS Product or any component of a
CIS Product; (7) use any CIS Product or any component of a CIS Product with other products or applications that
are directly and specifically dependent on such CIS Product or any component for any part of their functionality;
(8) represent or claim a particular level of compliance or consistency with any CIS Product; or (9) facilitate or
otherwise aid other individuals or entities in violating these CIS Terms of Use.
Your Responsibility to Evaluate Risks. You acknowledge and agree that: (1) no network, system, device,
hardware, software, or component can be made fully secure; (2) you have the sole responsibility to evaluate the
risks and benefits of the CIS Products to your particular circumstances and requirements; and (3) CIS is not
assuming any of the liabilities associated with your use of any or all of the CIS Products.
CIS Liability. You acknowledge and agree that neither CIS nor any of its employees, officers, directors, agents or
other service providers has or will have any liability to you whatsoever (whether based in contract, tort, strict
liability or otherwise) for any direct, indirect, incidental, consequential, or special damages that arise out of or
are connected in any way with your use of any CIS Product.
Indemnification. You agree to indemnify, defend, and hold CIS and all of CIS's employees, officers, directors,
agents and other service providers harmless from and against any liabilities, costs and expenses incurred by any
of them in connection with your violation of these CIS Terms of Use.
Jurisdiction. You acknowledge and agree that: (1) these CIS Terms of Use will be governed by and construed in
accordance with the laws of the State of Maryland; (2) any action at law or in equity arising out of or relating to
these CIS Terms of Use shall be filed only in the courts located in the State of Maryland; and (3) you hereby
consent and submit to the personal jurisdiction of such courts for the purposes of litigating any such action.

Special Rules for CIS Member Organizations:
CIS reserves the right to create special rules for: (1) CIS Members; and (2) Non-Member organizations and individuals with which CIS has a written contractual relationship. CIS hereby grants to each CIS Member Organization in good standing the right to distribute the CIS Products within such Member's own organization, whether by manual or electronic means. Each such Member Organization acknowledges and agrees that the foregoing grants in this paragraph are subject to the terms of such Member's membership arrangement with CIS and may, therefore, be modified or terminated by CIS at any time.
Background
Consensus Guidance
This guide was created through a consensus process composed of volunteer and contract subject matter experts. Consensus participants provide perspective from a diverse set of backgrounds including consulting, software development, audit and compliance, security research, operations, government, and legal.
Intent and Scope
This initial set comprises metrics and business functions selected as a starting point by the metrics community, both in terms of the scope of the metrics across business functions and the depth of the metrics in assessing security outcomes and performance. Once these foundational datasets and metrics are in place, additional metrics can and will be developed by the community covering additional functions and topics in each function.
Management Perspective and Benefits
The immediate objective of these definitions is to help enterprises improve their overall level of security and reduce costs by providing a set of standard metrics that can be implemented in a wide range of organizations. A future objective is to provide standard metrics as a basis for inter-enterprise benchmarking. These security control metrics were selected for common security functions and concepts based on the availability of data, the value provided for security management, and their ability to communicate the state of security performance.

Organizations can create a foundation for a metrics program by first selecting metrics from the business management areas of immediate interest and then implementing one or more of the metrics based on the definitions provided in this document. This well-defined set of standard metrics will enable the use of metrics in the security community by providing:
Clear Guidance for Organizations on Implementing Metrics. Practical definitions of security
metrics based on data most organizations are already collecting. This will make it easier, faster, and
cheaper to implement a metrics program that supports effective decision-making. Metrics provide a
means of communicating security performance and can be used to guide resource allocation, identify
best practices, improve risk management effectiveness, align business and security decision-making,
and demonstrate compliance.
Defined Metric Framework for Security Products and Services. A clear set of data requirements and consensus-based metric definitions will enable vendors to efficiently incorporate and enhance their security products with metrics. Consensus-driven metric standards will provide ways to demonstrate the effectiveness of vendor products, processes, and services that assist the security of their customers.
Common Standards for Meaningful Data Sharing and Benchmarking. Metric results will be calculated uniformly, enabling meaningful benchmarking among business partners and industry sectors. A shared metric framework and the ability to track and compare results will leverage the capabilities of the entire security community, leading to best practice identification and improvements in overall information security practices.
Business Functions
This initial document provides twenty-eight consensus metric definitions for seven important business functions. Organizations can adopt the metrics based on the business functions of highest priority. More metrics will be defined in the future for these and additional business functions.
Table 1: Business Functions

Incident Management
  Management perspective: How well do we detect, accurately identify, handle, and recover from security incidents?
  Defined metrics: Cost of Incidents; Mean Cost of Incidents; Mean Incident Recovery Cost; Mean-Time to Incident Discovery; Number of Incidents; Mean-Time Between Security Incidents; Mean-Time to Incident Recovery

Vulnerability Management
  Management perspective: How well do we manage the exposure of the organization to vulnerabilities by identifying and mitigating known vulnerabilities?
  Defined metrics: Vulnerability Scanning Coverage; Percent of Systems with No Known Severe Vulnerabilities; Mean-Time to Mitigate Vulnerabilities; Number of Known Vulnerability Instances; Mean Cost to Mitigate Vulnerabilities

Patch Management
  Management perspective: How well are we able to maintain the patch state of our systems?
  Defined metrics: Patch Policy Compliance; Patch Management Coverage; Mean-Time to Patch; Mean Cost to Patch

Configuration Management
  Management perspective: What is the configuration state of systems in the organization?
  Defined metrics: Percentage of Configuration Compliance; Configuration Management Coverage; Current Anti-Malware Compliance

Change Management
  Management perspective: How do changes to system configurations affect the security of the organization?
  Defined metrics: Mean-Time to Complete Changes; Percent of Changes with Security Reviews; Percent of Changes with Security Exceptions

Application Security
  Management perspective: Can we rely on the security model of business applications to operate as intended?
  Defined metrics: Number of Applications; Percent of Critical Applications; Risk Assessment Coverage; Security Testing Coverage

Financial Metrics
  Management perspective: What is the level and purpose of spending on information security?
  Defined metrics: IT Security Spending as % of IT Budget; IT Security Budget Allocation

Future Functions
  Community recommendations for additional business functions include: Data / Information; Software Development Life-Cycle; Configuration Management; Third Party Risk Management; Additional Financial and Impact Metrics; Authentication and Authorization; Data and Network Security; Remediation Efforts; Anti-Malware Controls
Metric Categories
Metrics are organized into a hierarchy based on their purpose and audience. Management
metrics are generally the most valuable to the organization but may require that foundational
technical metrics be in place.
Table 2: Metric Categories

Management Metrics
  Purpose: Provide information on the performance of business functions, and the impact on the organization.
  Audience: Business Management
  Metrics: Cost of Incidents; Mean Cost of Incidents; Percent of Systems with No Known Severe Vulnerabilities; Patch Policy Compliance; Percentage of Configuration Compliance; Percent of Changes with Security Reviews; IT Security Spending as % of IT Budget

Operational Metrics
  Purpose: Used to understand and optimize the activities of business functions.
  Audience: Security Management
  Metrics: Mean Incident Recovery Cost; Mean-Time to Incident Discovery; Mean-Time Between Security Incidents; Mean-Time to Incident Recovery; Mean-Time to Mitigate Vulnerabilities; Mean Cost to Mitigate Vulnerabilities; Mean Cost to Patch; Mean-Time to Patch; Mean-Time to Complete Changes; Percent of Changes with Security Exceptions; IT Security Budget Allocation

Technical Metrics
  Purpose: Provide technical details as well as a foundation for other metrics.
  Audience: Security Operations
  Metrics: Number of Incidents; Vulnerability Scanning Coverage; Number of Known Vulnerability Instances; Patch Management Coverage; Configuration Management Coverage; Current Anti-Malware Compliance; Number of Applications; Percent of Critical Applications; Risk Assessment Coverage; Security Testing Coverage
Incident Management
This section describes metrics for measuring the processes for detecting, handling, and
recovering from security incidents.
As described in the Glossary section of this document, a security incident results in the actual
outcomes of a business process deviating from expected outcomes for confidentiality, integrity,
and availability, resulting from people, process, or technology deficiencies or failures.[1]
Incidents that should not be considered security incidents include disruption of service due to
equipment failures.
Updating Data over Time
It is possible that data gathered for metrics may change over time. For example, the number of
affected records or hosts may change during the investigation of an incident. Metric values
should be calculated using the best-known data values available at the time of metric
calculation. A data element should only be included in a metric calculation if data is
available. For example, if it is only known that an incident has occurred but no analysis of the
scope has occurred by the calculation date of monthly incident metrics, that incident should be
included in incident counts but not included in calculations of mean records lost. When
updated data is available, it should be included in future metric calculations, and the updated
values should be used when presenting metric results. Additional metrics could be added later
to compare initial estimates to final (actual) values.
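The rule above can be sketched in Python: an incident is counted as soon as it is known, but excluded from the mean-records-lost average until its scope has been analyzed. The Incident fields and example values here are illustrative, not part of the CIS data schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Incident:
    incident_id: int
    records_affected: Optional[int]  # None until scope analysis is complete

def monthly_incident_metrics(incidents):
    """Count every known incident, but average records lost only over
    incidents whose scope has been analyzed (best-known data rule)."""
    analyzed = [i.records_affected for i in incidents
                if i.records_affected is not None]
    return {
        "incident_count": len(incidents),
        "mean_records_lost": sum(analyzed) / len(analyzed) if analyzed else None,
    }

# Hypothetical month: incident 3 is known but its scope is not yet analyzed.
march = [Incident(1, 1200), Incident(2, 300), Incident(3, None)]
print(monthly_incident_metrics(march))
# {'incident_count': 3, 'mean_records_lost': 750.0}
```

When the scope of incident 3 is later analyzed, the same calculation is simply re-run with the updated record, giving the revised value to present going forward.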
Data Attributes
The following is a list of attributes that should be populated as completely as possible for each
security incident.

Table 3: Security Incidents Table
The Security Incidents Table contains information regarding each of the incidents discovered by
the organization.

Name | Type | De-Identified | Required | Description
Incident ID | Number | No | Yes | Unique identifier for the incident. Generally auto-generated.
Technology ID | Text / Number | Yes | No | Unique identifier for the technology. Generally auto-generated.
Event ID | Text / Number | No | No | Unique identifier for the event.
Date of Occurrence | Date / Time | No | Yes | Date and time the incident occurred.
Date of Discovery | Date / Time | No | Yes | Date and time the incident was discovered.
Discovered By | Text | Yes | No | Unique identifier for the entity that first discovered the incident.
Date of Verification | Date / Time | No | No | Date and time the incident was verified by an Incident Handler.
Verified By | Text | Yes | No | The name of the person or system that verified the incident.
Date of Containment | Date / Time | No | Yes | Date and time the incident was contained.
Date of Recovery | Date / Time | No | Yes | Date and time the affected systems were brought back to a fully operational state.
Scope of Incident | Text | No | No | Free-form text comment indicating the scope and size of the incident; for example, the number of hosts, networks, or business units affected by the incident.
Report ID | Number | Yes | No | Unique identifier for reporting of the incident.
Incident Analysis ID | Number | No | No | Unique identifier for the incident analysis.
Attacker | Text | No | No | Type of attacker. Use values Hackers, Spies, Terrorists, Corporate Raiders, Professional Criminals, Vandals, or Voyeurs.

[1] Source: Operational Risk Exchange. <http://www.orx.org/reporting/>
Table 4: Security Incident Classification Table
The Security Incident Classification Table contains information regarding the classification of
incidents using taxonomies agreed upon by the organization.

Name | Type | De-Identified | Required | Description
Incident ID | Number | No | No | Unique identifier for the incident. Generally auto-generated.
Incident Name | Text | No | No | Name of the incident.
Incident Description | Text | No | No | Description of the incident.
Classification | Text | No | No | Classification of the incident using the Howard-Longstaff taxonomy.
Additional Classification | Text | No | No | Additional, optional classifications of the incident for internal or other reporting purposes. Incidents may include more than one tag.
Effect Rating | Text | Yes | No | Estimated effect of the incident on the organization, using the US-CERT effect table.
Criticality Rating | Text | Yes | No | Criticality of the systems involved in this incident, using the US-CERT criticality table.
Additional Priority | Text | No | No | One-to-many list of values used to indicate the severity or priority of the incident for each affected organization, using a priority classification (links below). Priorities may vary by affected organization.
Country of Origination | Text | No | No | The ISO code of the country where the source of the incident resides.
Country of Destination | Text | No | No | The ISO codes of the country where the target company/server(s) reside.
Table 5: Security Events Table
The Security Events Table contains information regarding the relationship among security
events and incidents.

Name | Type | De-Identified | Required | Description
Event ID | Number | No | Yes | Unique identifier for the event.
Event Name | Text | No | No | Name of the event.
Date of Occurrence | Date / Time | No | No | Date and time the event occurred.
Date of Discovery | Date / Time | No | No | Date and time the event was discovered.
Discovered By | Text | No | No | Unique identifier for the entity that first discovered the event.
Attacker | Text | No | No | Type of attacker. Use values Hackers, Spies, Terrorists, Corporate Raiders, Professional Criminals, Vandals, or Voyeurs.
Tool | Text | No | No | Type of tool used. Use values Physical Attack, Information Exchange, User Command, Script or Program, Autonomous Agent, Toolkit, Distributed Tool, or Data Tap.
Vulnerability | Text | No | No | Type of vulnerability exploited. Use values Design, Implementation, or Configuration.
Action | Text | No | No | Type of action performed. Use values Probe, Scan, Flood, Authenticate, Bypass, Spoof, Read, Copy, Steal, Modify, or Delete.
Target | Text | No | No | Type of target. Use values Account, Process, Data, Component, Computer, Network, or Internetwork.
Objective | Text | No | No | Reason for attack. Use values Challenge, Status, Thrill, Political Gain, Financial Gain, or Damage.
Table 6: Security Incident Impact Analysis Table
The Security Incident Impact Analysis Table contains information resulting from the review and
analysis of security incidents that occurred within the organization.

Name | Type | De-Identified | Required | Description
Incident Analysis ID | Number | No | Yes | Unique identifier for the incident analysis.
Incident ID | Number | No | No | Unique identifier for the incident.
Technology ID | Text / Number | Yes | No | Unique identifier for the technology.
Vulnerability ID | Text / Number | No | No | Unique identifier for the vulnerability instance.
Detected by Internal Controls | Boolean | No | No | Whether the incident was detected by a control operated by the organization.
Response Protocol Followed | Boolean | No | No | Whether the incident response protocol was followed.
Business Continuity Plan Executed | Boolean | No | No | Whether the business continuity plan was executed following the incident.
Reoccurring | Boolean | No | No | Whether the incident has occurred before.
Root Cause | Text | No | No | Text description of the root cause of the incident.
Direct Loss Amount | Number | No | No | Quantifiable, direct financial loss verified by management due to money, IP, or other assets lost or stolen.
Business System Downtime | Number | No | No | The number of hours that a business system was unavailable or non-operational (if any), on a per-business-system (not per-host) basis.
Cost of Business System Downtime | Number | No | No | Total losses (if any) attributed to the time business systems were unavailable or non-operational.
Cost of Containment | Number | No | No | Total cost to contain the incident.
Cost of Recovery | Number | No | No | Total cost to recover from the incident, including effort, equipment, and costs to repair or replace affected systems.
Customers Affected | Boolean | No | No | Whether or not customer data was affected by the incident.
Loss of Personally Identifiable Information | Boolean | No | No | Whether or not PII was lost during the incident.
Data Types Lost | Text | No | No | CCN (Credit Card Numbers), SSN (Social Security Numbers or Non-US Equivalent), NAA (Names and/or Addresses), EMA (Email Addresses), MISC (Miscellaneous), MED (Medical), ACC (Financial Account Information), DOB (Date of Birth), FIN (Financial Information).
Records Affected | Number | No | No | Total number of records affected in data breach incidents.
Cost of Restitution | Number | No | No | Total cost of notification, restitution, and additional security services offered to affected customers in data breach incidents.
PCI Penalties | Number | No | No | Total cost of PCI penalties defined by PCI DSS.
Table 7: Security Incident Reporting Table
The Security Incident Reporting Table contains information regarding the incident reports the
organization may have published. These reports may fulfill internal management requests or
external governance and compliance requirements.

Name | Type | De-Identified | Required | Description
Report ID | Number | Yes | No | Unique identifier for reporting of the incident.
Report Date | Date / Time | No | No | Date the incident was reported.
Internal | Boolean | No | No | Whether the report is internal or external.
Industry Sector | Text | No | No | Sector the organization belongs to.
Organization Size | Number | No | No | Size of the organization.
Table 8: Technologies Table
The following is a list of attributes that should be populated as completely as possible for each
technology within the organization:

Name | Type | De-Identified | Required | Description
Technology ID | Text / Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Name | Text | No | No | Name from the CPE Dictionary, which follows the structure: cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDATE}:{EDITION}:{LANGUAGE}.
Part | Text | No | No | Platform. Use value H, O, or A, representing hardware, operating system, and application environment respectively.
Vendor | Text | No | No | Vendor from the CPE Dictionary. This is the highest organization-specific label of the DNS name.
Product | Text | No | No | Product from the CPE Dictionary. This is the most recognizable name of the product.
Version | Text | No | No | Version from the CPE Dictionary. Same format as seen with the product.
Update | Text | No | No | Update or service pack information from the CPE Dictionary.
Edition | Text | No | No | Edition from the CPE Dictionary. May define specific target hardware and software architectures.
Language | Text | No | No | Language from the CPE Dictionary.
Technology Value | Text | No | Recommended | Impact from the loss of this technology (C/I/A) to the organization. Uses value Low, Medium, High, or Not Defined.[2]
Business Unit | Text | No | No | Organizational business unit that the technology belongs to.
Owner | Text | No | No | Unique identifier for the individual within the organization that is responsible for the technology.
Classification | Text | No | No | Classification of technology: Servers, Workstations, Laptops, Network Device, Storage Device, Applications, Operating Systems.
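As a brief illustration of the CPE naming structure in the Name row above, the following sketch assembles a CPE 2.2-style URI from its components, dropping trailing empty fields. The helper function and example values are illustrative assumptions, not part of any official CPE tooling:

```python
def make_cpe(part, vendor, product, version="", update="", edition="", language=""):
    """Assemble a CPE 2.2-style URI: cpe:/{part}:{vendor}:{product}:...
    Empty trailing components are omitted, as is conventional."""
    fields = [part, vendor, product, version, update, edition, language]
    return "cpe:/" + ":".join(fields).rstrip(":")

# An operating-system entry: part "o", vendor, product, no version, update "sp2".
cpe = make_cpe("o", "microsoft", "windows_xp", "", "sp2")
print(cpe)  # cpe:/o:microsoft:windows_xp::sp2
```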
Table 9: Effect Rating Table
The Effect Rating Table contains the values for the Effect Rating dimension used in the Security
Incident Classification Table.

Value | Rating | Definition
0.00 | None | No effect on a single agency, multiple agencies, or critical infrastructure
0.10 | Minimal | Negligible effect on a single agency
0.25 | Low | Moderate effect on a single agency
0.50 | Medium | Severe effect on a single agency, or negligible effect on multiple agencies or critical infrastructure
0.75 | High | Moderate effect on multiple agencies or critical infrastructure
1.00 | Critical | Severe effect on multiple agencies or critical infrastructure
Table 10: Criticality Rating Table
The Criticality Rating Table contains the values for the Criticality Rating dimension used in the
Security Incident Classification Table.

Value | Rating | Definition
0.10 | Minimal | Non-critical system, systems, or infrastructure (e.g., employee workstations)
0.25 | Low | System or systems that support a single agency's mission (e.g., DNS servers, domain controllers) but are not mission critical
0.50 | Medium | System or systems that are mission critical (e.g., payroll system) to a single agency
0.75 | High | System or systems that support multiple agencies or sectors of the critical infrastructure (e.g., root DNS servers)
1.00 | Critical | System or systems that are mission critical to multiple agencies or critical infrastructure

[2] This is adopting 2.3.3 Security Requirements Scoring Evaluation from CVSS v2, <http://www.first.org/cvss/cvss-guide.html#i2.3>.
The diagram below shows the relationship of the tables described in Incident Management Data
Attributes:
Diagram 1: Relational Diagram for Incidents Data Attributes
[Entity-relationship diagram, summarized: Security Incident (PK: Incident ID, Technology ID, Event ID; foreign keys to the Security Events, Security Incident Impact Analysis, and Technologies tables); Security Incident Classification (PK: Incident ID); Security Incident Reporting (PK: Report ID); Security Incident Impact Analysis (PK: Incident Analysis ID, Incident ID, Technology ID; FK: Vulnerability ID); Security Events (PK: Event ID); Technologies (PK: Technology ID); Vulnerability (PK: Vulnerability ID; attributes include Vulnerability Name, CVE ID, CWE ID, Description, Release Date, Severity, and Classification).]
Classifications
Tagging of information is a very valuable way to provide context to collected data records.
Classification tags provide a way to group incidents. A single incident might fall into one or
more categories, so the security incident records management system must support one-to-
many tagging capabilities.
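One minimal way to support such one-to-many tagging is to store a set of classification tags per incident. The sketch below uses NIST category names as example tags; the storage layout and incident IDs are illustrative assumptions, not part of the CIS schema:

```python
from collections import defaultdict

# Map each incident ID to a *set* of classification tags, so a single
# incident can carry several categories at once.
incident_tags = defaultdict(set)
incident_tags[1001].update({"Malicious code", "Unauthorized access"})
incident_tags[1002].add("Denial of service")

def incidents_with_tag(tags, wanted):
    """Return the sorted incident IDs carrying a given classification tag."""
    return sorted(i for i, t in tags.items() if wanted in t)

print(incidents_with_tag(incident_tags, "Unauthorized access"))  # [1001]
```

The same set-per-record layout extends naturally to the Additional Classification and Additional Priority attributes, which are also one-to-many.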
Classification tags for security incidents may include NIST incident categories as defined in
Special Publication 800-61,[3] for example:
• Denial of service: an attack that prevents or impairs the authorized use of networks, systems, or applications by exhausting resources
• Malicious code: a virus, worm, Trojan horse, or other code-based malicious entity that infects a host
• Unauthorized access: a person gains logical or physical access without permission to a network, system, application, data, or other resource
• Inappropriate usage: a person violates acceptable computing use policies
Howard and Longstaff[4] recommend the following taxonomy:
• Attackers: an individual who attempts one or more attacks in order to achieve an objective
  o Hackers: attackers who attack computers for challenge, status, or the thrill of obtaining access
  o Spies: attackers who attack computers for information to be used for political gain
  o Terrorists: attackers who attack computers to cause fear for political gain
  o Corporate Raiders: employees who attack competitors' computers for financial gain
  o Professional Criminals: attackers who attack computers for personal financial gain
  o Vandals: attackers who attack computers to cause damage
  o Voyeurs: attackers who attack computers for the thrill of obtaining sensitive information
• Tool: a means that can be used to exploit a vulnerability in a computer or network
  o Physical Attack: a means of physically stealing or damaging a computer, network, its components, or its supporting systems
  o Information Exchange: a means of obtaining information either from other attackers, or from the people being attacked
  o User Command: a means of exploiting a vulnerability by entering commands to a process through direct user input at the process interface
  o Script or Program: a means of exploiting a vulnerability by entering commands to a process through the execution of a file of commands or a program at the process interface
[3] Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security Incident Handling Guide. US National Institute of Standards and Technology, 2004. <http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
[4] Howard & Longstaff. A Common Language for Computer Security Incidents. (October 1998).
  o Autonomous Agent: a means of exploiting a vulnerability by using a program, or program fragment, which operates independently from the user
  o Toolkit: a software package which contains scripts, programs, or autonomous agents that exploit vulnerabilities
  o Distributed Tool: a tool that can be distributed to multiple hosts
  o Data Tap: a means of monitoring the electromagnetic radiation emanating from a computer or network using an external device
• Vulnerability: a weakness in a system allowing unauthorized action
  o Design: a vulnerability inherent in the design or specification of hardware or software whereby even a perfect implementation will result in a vulnerability
  o Implementation: a vulnerability resulting from an error made in the software or hardware implementation of a satisfactory design
  o Configuration: a vulnerability resulting from an error in the configuration of a system
• Action: a step taken by a user or process in order to achieve a result
  o Probe: an action used to determine the characteristics of a specific target
  o Scan: an action where a user or process accesses a range of targets sequentially in order to determine which targets have a particular characteristic
  o Flood: an action to access a target repeatedly in order to overload the target's capacity
  o Authenticate: an action taken by a user to assume an identity
  o Bypass: an action taken to avoid a process by using an alternative method to access a target
  o Spoof: an active security attack in which one machine on the network masquerades as a different machine
  o Read: an action to obtain the content of the data contained within a file or other data medium
  o Copy: an action to reproduce a target, leaving the original target unchanged
  o Steal: an action that results in the target coming into the possession of the attacker and becoming unavailable to the original owner or user
  o Modify: an action to change the content or characteristics of a target
  o Delete: an action to remove a target or render it irretrievable
• Target
  o Account: a domain of user access on a computer or network which is controlled according to a record of information which contains the user's account name, password, and user restrictions
  o Process: a program in execution, consisting of the executable program, the program's data and stack, its program counter, stack pointer and other registers, and all other information needed to execute the program
  o Data: representations of facts, concepts, or instructions in a manner suitable for communication, interpretation, or processing by humans or by automatic means
  o Component: one of the parts that make up a computer or network
  o Computer: a device that consists of one or more associated components
  o Network: an interconnected or interrelated group of host computers, switching elements, and interconnecting branches
  o Internetwork: a network of networks
• Unauthorized Result: an unauthorized consequence of an event
  o Increased Access: an unauthorized increase in the domain of access on a computer or network
  o Disclosure of Information: dissemination of information to anyone who is not authorized to access that information
  o Corruption of Information: unauthorized alteration of data on a computer or network
  o Denial of Service: intentional degradation or blocking of computer or network resources
  o Theft of Resources: unauthorized use of computer or network resources
• Objectives
  o Challenge, Status, Thrill
  o Political Gain
  o Financial Gain
  o Damage
Priority
Priorities for security incidents may include CERT severity levels or priorities as summarized in
the CERT publication State of the Practice of Computer Security Incident Response Teams
(CSIRTs),[5] for example:
• [Kruse 02]: Highest (e-commerce, authentication/billing) to Low (network switch, chat, shell server)
• [Schultz 01]: Level 4 (high impact, affecting many sites) to Level 1 (affects one location)
• [ISS 01]: Severity 5 (penetration or DoS with significant impact on operations) to Severity 1 (low-level probes/scans, known virus)
• [Schultz 90]: Priority 1 (human life, human safety) to Priority 5 (minimize disruption to computing processes)
[5] Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident Response Teams (CSIRTs). Carnegie-Mellon Software Engineering Institute, 2003: p94-96. <http://www.cert.org/archive/pdf/03tr001.pdf>
• [Schiffman 01]: Devilish (extremely skilled, able to cover tracks, leave covert channels) to Low (script-kiddie attacks, low innovation)
• [McGlashan 01]: Priority 5 (life and health) to Priority 1 (preservation of non-critical systems)
Sources
Incident data can come from a variety of sources, including incident tracking systems, help
desk ticket systems, incident reports, and SIM/SEM systems.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These
dimensions should be applied or tagged at the level of the underlying incident record as
described in Security Incident Metrics: Data Attributes. For example:
• Priority dimension allows metrics to be computed for high, medium, or low severity incidents
• Classifications for characterizing types of incidents, such as denial of service, theft of information, etc.
• Affected Organization for identifying the affected part of the organization
• Cause dimension, such as Missing Patch, Third-Party Access, etc., could be used to improve mitigation efforts
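Aggregation along such dimensions amounts to a simple group-and-count over the tagged incident records. The sketch below is illustrative; the dimension values and records are assumed, not drawn from a real incident store:

```python
from collections import Counter

# Each underlying incident record carries its dimension values as fields.
incidents = [
    {"id": 1, "priority": "High", "classification": "Denial of service"},
    {"id": 2, "priority": "Low",  "classification": "Unauthorized access"},
    {"id": 3, "priority": "High", "classification": "Unauthorized access"},
]

def count_by(records, dimension):
    """Aggregate incident counts along one dimension (e.g. priority)."""
    return Counter(r[dimension] for r in records)

print(count_by(incidents, "priority"))  # Counter({'High': 2, 'Low': 1})
```

The same grouping applies to any metric in this section: filter or partition the records by a dimension value first, then compute the metric over each partition.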
Automation
The ability to automate source data collection for these metrics is low, because humans, rather
than machines, declare when an incident occurs, is contained, and is resolved. Calculation of
these metrics on an ongoing basis, after source data has been obtained, lends itself to a high
degree of automation.
Visualization
These metrics may be visually represented in several ways:
Simple visualizations may include a table showing the metric result for the organization, with
each row displaying the value as of selected time periods (each week or each month). Columns
may be used for different incident classes (e.g., Denial of Service, Unauthorized Access, etc.).
Graphical visualizations may include time-series charts where the metric result is plotted on
the vertical axis and time periods displayed on the horizontal axis. To provide maximum insight,
plotted values for each period may include stacked series for the differing incident classifications.
Complex visualizations should be used for displaying the metric result for cross-sections by
organization, incident classification, or incident priority. For example, small multiples could be used to
compare the number of high-priority incidents of unauthorized access across business units or regions.
Defined Metrics
Mean-Time-To-Incident-Discovery
Objective
Mean-Time-To-Incident-Discovery (MTTID) characterizes the efficiency of detecting incidents,
by measuring the average elapsed time between the initial occurrence of an incident and its
subsequent discovery. The MTTID metric also serves as a leading indicator of resilience in
organization defenses because it measures detection of attacks from both known and unknown
vectors.
Table 11: Mean Time to Incident Discovery

Metric Name: Mean-Time-To-Incident-Discovery
Version: 1.0.0
Status: Final
Description: Mean-Time-To-Incident-Discovery (MTTID) measures the effectiveness of the organization in detecting security incidents. Generally, the faster an organization can detect an incident, the less damage it is likely to incur. MTTID is the average amount of time, in hours, that elapsed between the Date of Occurrence and the Date of Discovery for a given set of incidents. The calculation can be averaged across a time period, type of incident, business unit, or severity.
Type: Operational
Audience: Security Management
Question: What is the average (mean) number of hours between the occurrence of a security incident and its discovery?
Answer: A positive decimal value that is greater than or equal to zero. A value of 0 indicates hypothetical instant detection.
Formula: For each record, the time-to-discovery metric is calculated by subtracting the Date of Occurrence from the Date of Discovery. These values are then averaged across a scope of incidents, for example by time, category, or business unit:

MTTID = Σ (Date_of_Discovery - Date_of_Occurrence) / Count(Incidents)
Units: Hours per incident
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: MTTID values should trend lower over time. The value of 0 hours indicates hypothetical instant detection. There is evidence that the metric result may be in a range from weeks to months (2008 Verizon Data Breach Report). Because of the lack of experiential data from the field, no consensus exists on the range of acceptable goal values for MTTID.
Sources: Since humans determine when an incident occurs, when the incident is contained, and when the incident is resolved, the primary data sources for this metric are manual inputs as defined in Security Incident Metrics: Data Attributes. However, these incidents may be reported by operational security systems, such as anti-malware software, security incident and event management (SIEM) systems, and host logs.
Visualization: Column chart
  x-axis: Time (Week, Month, Quarter, or Year)
  y-axis: MTTID (Hours per Incident)
Usage
Mean Time to Incident Discovery is a type of security incident metric, and relies on the
common definition of security incident as defined in Terms and Definitions.
Optimal conditions would be reflected by a low MTTID value. The lower the MTTID, the
healthier the security posture. The higher the MTTID, the more time malicious activity is likely
to have occurred within the environment prior to containment and recovery activities. Given
the current threat landscape and the ability of malicious code to link to other modules once
entrenched, there may be a direct correlation between a higher MTTID and a higher level-of-
effort value (or cost) of the incident.
MTTIDs are calculated across a range of incidents over time, typically per week or per month.
To gain insight into the relative performance of one business unit over another, MTTIDs may
also be calculated for cross-sections of the organization, such as individual business units or
geographies.
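The MTTID formula can be sketched in a few lines of Python. The incident records below are hypothetical (occurrence, discovery) timestamp pairs; the metric definition does not mandate this layout:

```python
from datetime import datetime

def mttid_hours(incidents):
    """Mean hours between Date of Occurrence and Date of Discovery.
    `incidents` is a list of (occurred, discovered) datetime pairs."""
    if not incidents:
        return 0.0  # no incidents in scope; avoid division by zero
    total = sum((discovered - occurred).total_seconds() / 3600.0
                for occurred, discovered in incidents)
    return total / len(incidents)

incidents = [
    (datetime(2010, 6, 1, 8, 0), datetime(2010, 6, 2, 8, 0)),   # found in 24 h
    (datetime(2010, 6, 3, 9, 0), datetime(2010, 6, 3, 21, 0)),  # found in 12 h
]
print(mttid_hours(incidents))  # 18.0
```

The same function can be run over any cross-section of the incident set (per month, per business unit, per severity) to produce the scoped averages described above.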
Limitations
This metric measures the incident detection capabilities of an organization. As such, the
importance of this metric will vary between organizations. Some organizations have much
higher profiles than others, and would thus be more attractive targets for attackers, whose
attack vectors and capabilities will vary. As such, MTTIDs may not be directly comparable
between organizations.
In addition, the ability to calculate meaningful MTTIDs assumes that incidents are, in fact,
detected and reported. A lack of participation by system owners could skew these metrics.
A higher rate of participation in the reporting of security incidents increases the accuracy
of these metrics.
The date of occurrence of an incident may be hard to determine precisely. The date of
occurrence field should be the latest date on which the incident could have occurred, given
the best available information. This date may be subject to revision as more information
becomes known about a particular incident.
Mean values may not provide a useful representation of the time to detect incidents if the
distribution of the data is significantly bi-modal or multi-modal. In such cases, additional
dimensions and results for each of the major modes will provide more representative results.
References
Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security
Incident Handling Guide. US National Institute of Standards and Technology, 2004.
<http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute, 2003.
<http://www.cert.org/archive/pdf/03tr001.pdf>
Baker, Hylender and Valentine. 2008 Data Breach Investigations Report. Verizon Business RISK
Team, 2008. <http://www.verizonbusiness.com/resources/security/databreachreport.pdf>
Mean Time between Security Incidents
Objective
Mean Time between Security Incidents (MTBSI) identifies the relative level of security incident
activity.
Table 10: Mean Time between Security Incidents
Metric
Name
Mean Time between Security Incidents
Version 1.0.0
Status Final
Description Mean Time between Security Incidents (MTBSI) calculates the average time, in
hours, between security incidents. This metric is analogous to the Mean Time
Between Failures (MTBF) metric found in break-fix processes for data centers.
Type Operational
Audience Security Management
Question For all security incidents that occurred within a given time period, what is the
average (mean) number of hours between incidents?
Answer A floating-point value that is greater than or equal to zero. A value of 0
indicates instantaneous occurrence of security incidents.
Formula For each pair of consecutive incidents, the interval is the number of hours
between the Date of Occurrence of the current incident and the Date of
Occurrence of the previous incident. These intervals are summed and divided
by the total number of incidents:
MTBSI = Σ(Date_of_Occurrence[Incident_n] − Date_of_Occurrence[Incident_n−1]) / Count(Incidents)
Units Hours per incident interval
Frequency Weekly, Monthly, Quarterly, Annually
Targets MTBSI values should trend higher over time. The value of 0 indicates hypothetical
instantaneous occurrence of security incidents. Because of the lack of
experiential data from the field, no consensus on the range of acceptable goal
values for Mean Time between Security Incidents exists.
Sources Since humans determine when an incident occurs, when the incident is
contained, and when the incident is resolved, the primary data sources for this
metric are manual inputs as defined in Security Incident Metrics: Data
Attributes. However, these incidents may be reported by operational security
systems, such as anti-malware software, security incident and event
management (SIEM) systems, and host logs.
Visualization Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: MTBSI (Hours per Incident)
Usage
This metric provides an indication of activity within the environment. A higher value for this
metric might indicate a less active threat landscape. However, an inactive landscape might be
caused by a lack of reporting or a lack of detection of incidents.
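As an illustration, a minimal Python sketch of the calculation, assuming a chronologically sorted list of Date of Occurrence values. It averages the gaps between consecutive incidents, consistent with the hours-per-incident-interval unit; a strictly literal reading of the formula would divide by the raw incident count instead:

```python
from datetime import datetime

def mtbsi_hours(occurrences):
    """Mean hours between consecutive incident occurrences.
    `occurrences` must be sorted in chronological order."""
    if len(occurrences) < 2:
        return 0.0  # no interval can be formed from fewer than two incidents
    gaps = [(later - earlier).total_seconds() / 3600.0
            for earlier, later in zip(occurrences, occurrences[1:])]
    return sum(gaps) / len(gaps)

occurred = [datetime(2010, 6, 1), datetime(2010, 6, 3), datetime(2010, 6, 8)]
print(mtbsi_hours(occurred))  # (48 + 120) / 2 = 84.0
```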
Limitations
The date of occurrence of an incident may be hard to determine precisely. The date of
occurrence field should be the latest date on which the incident could have occurred. This date
may be subject to revision as more information becomes known about a particular incident.
References
Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security
Incident Handling Guide. US National Institute of Standards and Technology, 2004.
<http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute, 2003.
<http://www.cert.org/archive/pdf/03tr001.pdf>
Mean Time to Incident Recovery
Objective
Mean Time to Incident Recovery (MTIR) characterizes the ability of the organization to return to
a normal state of operations. It is measured as the average elapsed time between when an
incident occurred and when the organization recovered from it.
Table 11: Mean Time to Incident Recovery
Metric
Name
Mean Time to Incident Recovery
Version 1.0.0
Status Final
Description Mean Time to Incident Recovery (MTIR) measures the effectiveness of the
organization in recovering from security incidents. The sooner the
organization can recover from a security incident, the less impact the
incident will have on the overall organization. This calculation can be
averaged across a time period, type of incident, business unit, or severity.
Type Operational
Audience Business Management, Security Management
Question What is the average (mean) number of hours from when an incident occurs
to recovery?
Answer A positive decimal value that is greater than or equal to zero. A value of 0
indicates instantaneous recovery.
Formula Mean Time to Incident Recovery (MTIR) is calculated by dividing the sum of
the differences between the Date of Occurrence and the Date of Recovery for
each incident recovered in the metric time period by the total number of
incidents recovered in the metric time period:
MTIR = Σ(Date_of_Recovery − Date_of_Occurrence) / Count(Incidents)
Units Hours per incident
Frequency Weekly, Monthly, Quarterly, Annually
Targets MTIR values should trend lower over time. There is evidence the metric
result will be in a range from days to weeks (2008 Verizon Data Breach
Report). The value of 0 indicates hypothetical instantaneous recovery.
Because of the lack of experiential data from the field, no consensus on the
range of acceptable goal values for Mean Time to Incident Recovery exists.
Sources Since humans determine when an incident occurs, when the incident is
contained, and when the incident is resolved, the primary data sources for
this metric are manual inputs as defined in Security Incident Metrics: Data
Attributes. However, these incidents may be reported by operational
security systems, such as anti-malware software, security incident and event
management (SIEM) systems, and host logs.
Visualization Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: MTIR (Hours per Incident)
Usage
MTIR is a type of security incident metric and relies on the common definition of security
incident as defined in the Glossary.
Optimal conditions would be reflected by a low MTIR value. A low MTIR indicates a
healthier security posture, as the organization quickly recovered from the incident. Given the
impact that an incident can have on an organization's business processes, there may be a direct
correlation between a higher MTIR and a higher incident cost.
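Because the definition allows MTIR to be averaged across a time period, incident type, business unit, or severity, a sketch that groups by one such dimension may be useful. The (category, occurred, recovered) tuples here are illustrative, not a mandated schema:

```python
from collections import defaultdict
from datetime import datetime

def mtir_by_category(incidents):
    """Mean recovery hours per incident category.
    `incidents` is a list of (category, occurred, recovered) tuples."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for category, occurred, recovered in incidents:
        sums[category] += (recovered - occurred).total_seconds() / 3600.0
        counts[category] += 1
    return {category: sums[category] / counts[category] for category in sums}

data = [
    ("malware", datetime(2010, 6, 1, 0, 0), datetime(2010, 6, 2, 0, 0)),   # 24 h
    ("malware", datetime(2010, 6, 5, 0, 0), datetime(2010, 6, 5, 12, 0)),  # 12 h
    ("dos",     datetime(2010, 6, 7, 0, 0), datetime(2010, 6, 7, 6, 0)),   # 6 h
]
print(mtir_by_category(data))  # {'malware': 18.0, 'dos': 6.0}
```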
Limitations
This metric measures the incident recovery capabilities of an organization. As such, the
importance of this metric will vary between organizations. Some organizations have much
higher profiles than others and would be more attractive targets for attackers, whose attack
vectors and capabilities vary. MTIRs may not be directly comparable between organizations.
The date of occurrence of an incident may be hard to determine precisely. The date of
occurrence field should be the latest date on which the incident could have occurred. This date
may be subject to revision as more information becomes known about a particular incident.
References
Baker, Hylender and Valentine. 2008 Data Breach Investigations Report. Verizon Business RISK
Team, 2008. <http://www.verizonbusiness.com/resources/security/databreachreport.pdf>
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute, 2003.
<http://www.cert.org/archive/pdf/03tr001.pdf>
Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security
Incident Handling Guide. US National Institute of Standards and Technology, 2004.
<http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
Cost of Incidents
Objective
Organizations need to understand the impact of security incidents. Impact can take many forms,
from negative publicity to money directly stolen. Monetary costs provide a set of units that can
be directly compared across incidents and across organizations.
In order to make effective risk management decisions, the impact of incidents needs to be
measured and considered. Understanding the costs experienced by the organization can be
used to improve security process effectiveness and efficiency.
Table 12: Cost of Incidents
Metric
Name
Cost of Incidents
Version 1.0.0
Status Final Draft for Review
Description Cost of Incidents (COI) measures the total cost to the organization from
security incidents occurring during the metric time period. Total cost from
security incidents consists of the following components:
Direct Loss
o Value of IP, customer lists, trade secrets, or other assets that
are destroyed
Cost of Business System Downtime
o Cost of refunds for failed transactions
o Cost of lost business directly attributable to the incident
Cost of Containment
o Effort and cost
o Consulting services
Cost of Recovery
o Cost of incident investigation and analysis
o Effort required to repair and replace systems
o Replacement cost of systems
o Consulting services for repair or investigation
o Additional costs not covered by an insurance policy
Cost of Restitution
o Penalties and other funds paid out due to breaches of
contracts or SLAs resulting from the incident
o Cost of services provided to customers as a direct result of the
incident (e.g., ID theft insurance)
o Public relations costs
o Cost of disclosures and notifications
o Legal costs, fines, and settlements
Type Management
Audience Business Management, Security Management
Question What is the total cost to the organization from security incidents during the
given period?
Answer A positive decimal value that is greater than or equal to zero. A value of 0.0
indicates there were no measured costs to the organization.
Formula Cost of Incidents (COI) is calculated by summing all costs associated with
security incidents during the time period:
COI = (Direct Loss + Cost of Business System Downtime + Cost of
Containment + Cost of Recovery + Cost of Restitution)
Units $USD
Frequency Monthly
Targets Ideally there would be no security incidents with material impacts on the
organization, and the metric value would be zero. In practice, a target can be
set based on the expected loss budget determined by risk assessment
processes.
Sources Incident tracking systems will provide incident data. Cost data can come
from management estimates, ticket tracking systems, and capital and
services budgets.
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: COI ($)
Usage
Cost of Incidents (COI) represents the overall known outcome of security systems, processes,
and policies. The lower the COI, the less the organization is impacted by security incidents.
Optimal conditions would be reflected by a low COI value. Costs experienced by organizations
may vary as a result of the threat environment, the controls in place, and the resiliency of the
organization. Over time, as processes and controls become more effective, COI should be reduced.
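The COI formula is a straight sum over the five cost components. A minimal sketch, assuming each incident record carries whatever subset of the components was actually measured (the field names are illustrative):

```python
COST_COMPONENTS = ("direct_loss", "business_system_downtime",
                   "containment", "recovery", "restitution")

def cost_of_incidents(incidents):
    """Total cost of incidents for the period: sum every measured
    cost component across every incident; missing components count as 0."""
    return sum(incident.get(component, 0.0)
               for incident in incidents
               for component in COST_COMPONENTS)

june_incidents = [
    {"direct_loss": 10000.0, "recovery": 2500.0, "restitution": 500.0},
    {"business_system_downtime": 1200.0, "containment": 800.0},
]
print(cost_of_incidents(june_incidents))  # 15000.0
```

Tagging each record with a priority or classification field allows the same sum to be restricted to any of the dimensions listed below.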
Limitations
Some incidents, such as exposure of business strategy via social engineering, may not
have direct incident costs. Significant harm, bad press, or competitive disadvantage
may still be experienced for which it is not practical to assign a cost.
Some new controls may have significant costs and/or address recovery from multiple
incidents.
This metric relies on the common definition of security incident as defined in Terms
and Definitions.
This metric relies on an organization being able to produce costs or cost estimates
related to security incidents.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These
dimensions should be applied or tagged at the level of the underlying incident record as
described in Security Incident Metrics: Data Attributes. For example:
A Priority dimension allows COI to be computed for high-, medium-, or low-severity
incidents
Classifications for characterizing types of incidents, such as denial of service, theft of
information, etc.
Affected Organization for identifying the affected part of the organization
References
Computer Security Incident Handling Guide. NIST Special Publication 800-61. National
Institute of Standards and Technology. January 2004.
Dorofee, Killcrece, et al. Incident Management Capability Metrics Version 0.1. Software
Engineering Institute. April 2007.
Incident Cost Analysis and Modeling Project (ICAMP) Final Report 2. Committee on
Institutional Cooperation Security Working Group. 2000.
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute. October 2003.
West-Brown, Stikvoort, et al. Handbook for Computer Security Incident Response Teams
(CSIRTs). Carnegie Mellon Software Engineering Institute. April 2003.
Mean Cost of Incidents
Objective
Organizations need to understand the impact of security incidents. Impact can take many forms,
from negative publicity to money directly stolen. Monetary costs provide a set of units that can
be directly compared across incidents and across organizations.
In order to make effective risk management decisions, the impact of incidents needs to be
measured and considered. Understanding the mean cost the organization incurs from
security incidents allows the organization to improve security process effectiveness and
efficiency.
Table 13: Mean Cost of Incidents
Metric Name Mean Cost of Incidents
Version 1.0.0
Status Final Draft for Review
Description Mean Cost of Incidents (MCOI) measures the mean cost to the organization
from identified security incidents relative to the number of incidents that
occurred during the metric time period. Total cost from security incidents
consists of the following components:
Direct Loss
o Value of IP, customer lists, trade secrets, or other assets that are
destroyed
Cost of Business System Downtime
o Cost of refunds for failed transactions
o Cost of lost business directly attributable to the incident
Cost of Containment
o Effort and cost
o Consulting services
Cost of Recovery
o Cost of incident investigation and analysis
o Effort required to repair and replace systems
o Replacement cost of systems
o Consulting services for repair or investigation
o Additional costs not covered by an insurance policy
Cost of Restitution
o Penalties and other funds paid out due to breaches of contracts
or SLAs resulting from the incident
o Cost of services provided to customers as a direct result of the
incident (e.g., ID theft insurance)
o Public relations costs
o Cost of disclosures and notifications
o Legal costs, fines, and settlements
Type Management
Audience Business Management, Security Management
Question What is the average (mean) cost to the organization from security incidents
during the given period?
Answer A positive decimal value that is greater than or equal to zero. A value of 0.0
indicates there were no measured costs to the organization.
Formula Mean Cost of Incidents (MCOI) is calculated by dividing the sum of all costs
associated with security incidents by the number of security incidents that
occurred during the time period:
MCOI = (Direct Loss + Cost of Business System Downtime + Cost of
Containment + Cost of Recovery + Cost of Restitution) / Count(Incidents)
Units $USD per incident
Frequency Monthly
Targets Ideally there would be no security incidents with material impacts on the
organization, and the metric value would be zero. In practice, a target can be set
based on the expected loss budget determined by risk assessment processes.
Sources Incident tracking systems will provide incident data. Cost data can come from
management estimates, ticket tracking systems, and capital and services
budgets.
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: MCOI ($/Incident)
Usage
Mean Cost of Incidents (MCOI) represents the average impact of a security incident on the
organization. This impact is the average known outcome resulting from the interaction of the
threat environment with the security systems, processes, and policies of the organization. The
lower the MCOI, the less the organization is impacted by security incidents on average. Optimal
conditions would be reflected by a low MCOI value. Costs experienced by organizations can vary
as a result of the threat environment, the systems and processes in place, and the resiliency of
the organization. Over time, the effectiveness of changes to an organization's security activities
should result in a reduction in the Mean Cost of Incidents.
MCOI should provide a management indicator of the ability of the organization to alter the
known impact expected from security incidents.
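Given per-incident totals, MCOI is simply the total cost divided by the incident count. The sketch below also shows restricting the average to one value of the Priority dimension this metric defines; the `cost` and `priority` field names are illustrative:

```python
def mean_cost_of_incidents(incidents):
    """MCOI: total incident cost divided by the number of incidents.
    Each record carries a precomputed per-incident `cost` total."""
    if not incidents:
        return 0.0  # no incidents in the period
    return sum(incident["cost"] for incident in incidents) / len(incidents)

records = [
    {"priority": "high", "cost": 9000.0},
    {"priority": "low",  "cost": 1000.0},
    {"priority": "high", "cost": 3000.0},
]
high_priority = [r for r in records if r["priority"] == "high"]
print(mean_cost_of_incidents(high_priority))  # 6000.0
```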
Limitations
Some incidents, such as exposure of business strategy via social engineering, may not
have direct incident costs. Significant harm, bad press, or competitive disadvantage
may still be experienced for which it is not practical to assign a cost.
Some new controls may have significant costs and/or address recovery from multiple
incidents.
This metric relies on the common definition of security incident as defined in Terms
and Definitions.
This metric relies on an organization being able to produce costs or cost estimates
related to security incidents.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These
dimensions should be applied or tagged at the level of the underlying incident record as
described in Security Incident Metrics: Data Attributes. For example:
A Priority dimension allows MCOI to be computed for high-, medium-, or low-severity
incidents
Classifications for characterizing types of incidents, such as denial of service, theft of
information, etc.
Affected Organization for identifying the affected part of the organization
References
Computer Security Incident Handling Guide. NIST Special Publication 800-61. National
Institute of Standards and Technology. January 2004.
Dorofee, Killcrece, et al. Incident Management Capability Metrics Version 0.1. Software
Engineering Institute. April 2007.
Incident Cost Analysis and Modeling Project (ICAMP) Final Report 2. Committee on
Institutional Cooperation Security Working Group. 2000.
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute. October 2003.
West-Brown, Stikvoort, et al. Handbook for Computer Security Incident Response Teams
(CSIRTs). Carnegie Mellon Software Engineering Institute. April 2003.
Mean Incident Recovery Cost
Objective
Mean Incident Recovery Cost (MIRC) measures the costs directly associated with the
operational recovery from incidents. While the impact of similar incidents may vary across
organizations, the cost of technical recovery should be comparable on a per-system basis
across firms.
Table 14: Mean Incident Recovery Cost
Metric
Name
Mean Incident Recovery Cost
Version 1.0.0
Status Final Draft for Review
Description Mean Incident Recovery Cost (MIRC) measures the cost of returning business
systems to their pre-incident condition. The following costs may be taken
into consideration:
Cost to repair and/or replace systems
Opportunity cost of staff implementing the incident handling plan
Cost to hire external technical consultants to help recover from the
incident
Cost to install new controls or procure new resources that directly
address the recurrence of the incident (e.g., installation of AV software)
Legal and regulatory liabilities resulting from the incident
Type Operational
Audience Security Management
Question What is the average (mean) cost of recovery from a security incident during
the given period?
Answer A positive decimal value that is greater than or equal to zero. A value of 0.0
indicates there were no measured costs to the organization.
Formula Mean Incident Recovery Cost (MIRC) is calculated by dividing the sum of all
costs associated with recovering from security incidents by the number of
security incidents that occurred during the time period:
MIRC = Σ(Cost of Recovery) / Count(Incidents)
Units $USD per incident
Frequency Monthly
Targets Ideally, recovery from security incidents would have no material impact on
the organization, and the metric value would be zero. In practice, a target can
be set based on the expected loss budget determined by risk assessment
processes and planned incident recovery resources.
Sources Incident tracking systems will provide incident data. Cost data can come
from management estimates, ticket tracking systems, and capital and
services budgets.
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: MIRC ($/Incident)
Usage
Mean Incident Recovery Cost (MIRC) represents the average cost the organization incurs while
recovering from a security incident. This cost is correlated with the capabilities and resiliency of
the organization's systems, processes, and policies. The lower the MIRC, the less the organization
is impacted by security incidents on average, and the greater the general resiliency of the
organization's systems. Optimal conditions would be reflected by a low MIRC value. Costs
experienced by organizations can vary as a result of the threat environment and the systems and
processes in place. Over time, the effectiveness of changes to an organization's security activities
should result in a reduction in the Mean Incident Recovery Cost.
MIRC should provide a management indicator of the organization's resiliency and ability to
recover from security incidents.
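MIRC differs from the mean cost metric only in that it sums the recovery component alone. A sketch under an illustrative record layout, where `recovery_cost` is a hypothetical field name:

```python
def mean_incident_recovery_cost(incidents):
    """MIRC: sum only each incident's recovery cost, then divide by
    the incident count; incidents with no measured recovery cost count as 0."""
    if not incidents:
        return 0.0
    total = sum(incident.get("recovery_cost", 0.0) for incident in incidents)
    return total / len(incidents)

june_incidents = [
    {"recovery_cost": 4000.0, "restitution_cost": 2000.0},  # restitution is excluded
    {"recovery_cost": 1000.0},
    {},  # incident with no measured recovery cost
]
print(mean_incident_recovery_cost(june_incidents))  # 5000 / 3, about 1666.67
```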
Limitations
Some incidents, such as theft via social engineering, may not have direct recovery
costs, as there may not be a clear end point, or the incident may be the result of
several related incidents.
Establishment of new controls or procurement of new resources may have significant
costs.
This metric is dependent upon when during the incident management process cost
information is collected. Whether information is collected during the occurrence of
the incident or after it may influence the metric outcome.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These
dimensions should be applied or tagged at the level of the underlying incident record as
described in Security Incident Metrics: Data Attributes. For example:
A Priority dimension allows MIRC to be computed for high-, medium-, or low-severity
incidents
Classifications for characterizing types of incidents, such as denial of service, theft of
information, etc.
Affected Organization for identifying the affected part of the organization
References
Computer Security Incident Handling Guide. NIST Special Publication 800-61. National
Institute of Standards and Technology. January 2004.
Dorofee, Killcrece, et al. Incident Management Capability Metrics Version 0.1. Software
Engineering Institute. April 2007.
Incident Cost Analysis and Modeling Project (ICAMP) Final Report 1. Committee on
Institutional Cooperation Security Working Group. 1998.
Incident Cost Analysis and Modeling Project (ICAMP) Final Report 2. Committee on
Institutional Cooperation Security Working Group. 2000.
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute. October 2003.
West-Brown, Stikvoort, et al. Handbook for Computer Security Incident Response Teams
(CSIRTs). Carnegie Mellon Software Engineering Institute. April 2003.
Vulnerability Management
This section describes metrics for measuring the process used for the identification and
management of vulnerabilities within an organization's environment.
As described in the Glossary section of this document, a vulnerability is a flaw or
misconfiguration that causes a weakness in the security of a system that could be exploited.
Sources of vulnerabilities include new systems or applications introduced to the organization's
environment and the discovery of new vulnerabilities on existing systems and applications.
Vulnerability management is a vital part of keeping an organization's assets safe; identifying
and mitigating weaknesses found on systems and applications reduces the risk of negatively
impacting the business should these vulnerabilities be exploited. It consists of the following
high-level process steps:
Vulnerability Notification, through becoming aware of disclosed vulnerabilities and
performing security assessments.
Vulnerability Identification, through manual or automated scanning of technologies
throughout the organization.
Vulnerability Remediation & Mitigation, through application of patches, adjustment of
configurations, modification of systems, or acceptance of risk.
The primary question this activity is concerned with is: Are my systems safe? In vulnerability
management terms this question can be decomposed to: Are there vulnerable systems? Have
systems been checked, and if so, what was found?
Data Attributes
Vulnerability metrics are comprised of the following datasets:
Technologies. Contains information about the technologies in the organization's environment.
Technologies should be identified and named according to the Common Platform Enumeration
Dictionary maintained by NIST (http://nvd.nist.gov/cpe.cfm).
Vulnerability Information. Contains information about the vulnerability, such as its severity
and classification, as denoted by the National Vulnerability Database (http://nvd.nist.gov/) or
another source.
Identified Vulnerabilities. Contains the set of vulnerability instances identified in the
organization's environment for the metric time period (this can be a larger set that is filtered by
scan date).
Table 15: Technologies Table
The following is a list of attributes that should be populated as completely as possible for each technology within the organization:

Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Name | Text | No | No | Name from the CPE Dictionary, which follows the structure cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDATE}:{EDITION}:{LANGUAGE}.
Part | Text | No | No | Platform. Uses value H, O, or A, representing hardware, operating system, and application environment respectively.
Vendor | Text | No | No | Vendor from the CPE Dictionary. This is the highest organization-specific label of the DNS name.
Product | Text | No | No | Product from the CPE Dictionary. This is the most recognizable name of the product.
Version | Text | No | No | Version from the CPE Dictionary. Same format as seen with the product.
Update | Text | No | No | Update or service pack information from the CPE Dictionary.
Edition | Text | No | No | Edition from the CPE Dictionary. May define specific target hardware and software architectures.
Language | Text | No | No | Language from the CPE Dictionary.
Technology Value | Text | No | Recommended | Impact from the loss of this technology (C/I/A) to the organization. Uses value Low, Medium, High, or Not Defined.6
Business Unit | Text | No | No | Organizational business unit that the technology belongs to.
Owner | Text | No | No | Unique identifier for the individual within the organization who is responsible for the technology.
Classification | Text | No | No | Classification of technology: Servers, Workstations, Laptops, Network Devices, Storage Devices, Applications, Operating Systems.
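The CPE URI structure referenced above can be split into the table's fields mechanically. The following is a minimal sketch, assuming a well-formed CPE 2.2 URI; the function name and the sample entry are illustrative, not part of the CPE specification:

```python
# Split a CPE 2.2 URI (cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDATE}:{EDITION}:{LANGUAGE})
# into the Technologies Table fields. Missing trailing components are left empty.
CPE_FIELDS = ["Part", "Vendor", "Product", "Version", "Update", "Edition", "Language"]

def parse_cpe(name: str) -> dict:
    if not name.startswith("cpe:/"):
        raise ValueError("not a CPE 2.2 URI: %r" % name)
    components = name[len("cpe:/"):].split(":")
    # Pad to seven components so every field is present in the result.
    components += [""] * (len(CPE_FIELDS) - len(components))
    return dict(zip(CPE_FIELDS, components))

# Illustrative dictionary entry:
record = parse_cpe("cpe:/a:mozilla:firefox:3.6")
print(record["Part"])     # "a" -> application
print(record["Product"])  # "firefox"
```

Populating the Part, Vendor, Product, and Version columns this way keeps them consistent with the Name column rather than maintained by hand.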
Table 16: Vulnerability Information Table
This is a table of information about known vulnerabilities, such as affected versions, severities, and references. The NVD will be the reference database, and CVSS v2 the reference severity rating system. Vendors of vulnerability identification systems may also enhance or expand both the listing and specifications of known vulnerabilities. The following is a list of attributes that should be populated as completely as possible for each vulnerability:

Name | Type | De-identified | Required | Description
Vulnerability ID | Text/Number | No | Yes | Unique identifier for the vulnerability. Generally auto-generated. This can be an organization-specific identifier for the vulnerability.
Vulnerability Name | Text | No | No | Name of the vulnerability.
CVE ID | Number | No | No | Common Vulnerabilities and Exposures (CVE) identifier for this vulnerability.
CWE ID | Number | No | No | Common Weakness Enumeration identifier for the weakness associated with this vulnerability.
Description | Text | No | No | Text description of the vulnerability (from NVD or elsewhere).
Release Date | Date/Time | No | No | Date that the vulnerability was made publicly known.
Severity | Text | No | No | Severity rating for the vulnerability. May use Low, Medium, or High.
Classification | Text | No | No | Classification of the vulnerability.

6 This is adopting 2.3.3 Security Requirements Scoring Evaluation from CVSS v2, http://www.first.org/cvss/cvss-guide.html#i2.3.
Table 17: CVSS Score Table
The Common Vulnerability Scoring System (CVSS) score computed for each of the vulnerabilities.

Name | Type | De-identified | Required | Description
Vulnerability ID | Text/Number | No | Yes | Unique identifier for the vulnerability.
Overall CVSS Score | Number | No | No | Overall CVSS Score.
CVSS Base Score | Number | No | Recommended | CVSS Base Score.
CVSS Temporal Score | Number | No | No | CVSS Temporal Score.
CVSS Environmental Score | Number | No | No | CVSS Environmental Score.
Access Vector | Text | No | No | CVSS classification of how the vulnerability is exploited. Uses values of Undefined, Local, or Remote.
Access Complexity | Text | No | No | CVSS rating of the complexity of the attack required to exploit the vulnerability. Uses values of Undefined, High, or Low.
Authentication | Text | No | No | CVSS rating of the number of times an attacker must authenticate to exploit a vulnerability. Uses values of Undefined, Required, or Not Required.
Confidentiality Impact | Text | No | No | CVSS rating of the impact the vulnerability has on confidentiality of the technology. Uses values of Undefined, None, Partial, or Complete.
Integrity Impact | Text | No | No | CVSS rating of the impact the vulnerability has on integrity of the technology. Uses values of Undefined, None, Partial, or Complete.
Availability Impact | Text | No | No | CVSS rating of the impact the vulnerability has on availability of the technology. Uses values of Undefined, None, Partial, or Complete.
Impact Bias | Text | No | No | CVSS weight for impact. Uses values of Normal, Weight Confidentiality, Weight Integrity, or Weight Availability.
Collateral Damage Potential | Text | No | No | Potential for loss through damage or theft of the asset. Uses values of Undefined, None, Low, Medium, or High.
Target Distribution | Text | No | No | Proportion of vulnerable systems. Uses values of None, Low, Medium, High, or Not Defined.
Exploitability | Text | No | No | CVSS current state of exploit techniques. Uses values of Undefined, Unproven that Exploit Exists, Proof of Concept Code, Functional Exploit Exists, or High.
Remediation Level | Text | No | No | CVSS stage of the remediation lifecycle. Uses values of Official Fix, Temporary Fix, Workaround, Unavailable, or Not Defined.
Report Confidence | Text | No | No | CVSS degree of confidence in the existence of the vulnerability. Uses values of Unconfirmed, Uncorroborated, Confirmed, or Undefined.
Generated On | Date/Time | No | No | Date and time the CVSS score was generated.
Table 18: Identified Vulnerabilities Table
This table represents information regarding vulnerability instances on technologies. The following is a list of attributes that should be populated as completely as possible for the current set of vulnerability instances identified on technologies within the organization:

Name | Type | De-identified | Required | Description
Vulnerability ID | Text/Number | No | Yes | Reference to the vulnerability in the Vulnerability Information Table.
Technology ID | Text/Number | Yes | Yes | Reference in the Technologies Table to the specific technology with this vulnerability instance.
Date of Detection | Date/Time | No | Yes | Date and time when the vulnerability was initially detected.
Detection Method | Text | No | No | Method by which the vulnerability was detected. Uses values of Vulnerability Scanner Name or Manual Detection.
Scan Name | Text | No | No | If using a vulnerability scanner, name of the scan.
Scan Date | Date/Time | No | No | If using a vulnerability scanner, date the scan took place.
Vulnerability Status | Text | No | Yes | Current status of the vulnerability instance. Uses values of Open, Not Valid, or Mitigated. Vulnerabilities should be flagged Open by default.
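Because the same instance can reappear across repeated scans, consumers of this table typically keep only the most recent record per (Technology ID, Vulnerability ID) pair before computing metrics. A minimal sketch with hypothetical records (field names follow the table above; the helper function is ours):

```python
from datetime import datetime

# Hypothetical identified-vulnerability records.
records = [
    {"Technology ID": "T1", "Vulnerability ID": "V1",
     "Date of Detection": datetime(2010, 9, 1), "Vulnerability Status": "Open"},
    {"Technology ID": "T1", "Vulnerability ID": "V1",
     "Date of Detection": datetime(2010, 10, 1), "Vulnerability Status": "Mitigated"},
]

def latest_status(records):
    """Keep the newest record for each (technology, vulnerability) pair."""
    latest = {}
    for r in records:
        key = (r["Technology ID"], r["Vulnerability ID"])
        if key not in latest or r["Date of Detection"] > latest[key]["Date of Detection"]:
            latest[key] = r
    return latest

current = latest_status(records)
print(current[("T1", "V1")]["Vulnerability Status"])  # Mitigated
```

This kind of deduplication is what makes a status such as Mitigated meaningful when the metric time period spans several scans.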
Table 19: Vulnerability Remediation Table
This table represents information regarding the remediation of vulnerability instances on technologies. The following is a list of attributes that should be populated as completely as possible for the current set of vulnerability instances identified on technologies within the organization:

Name | Type | De-identified | Required | Description
Vulnerability ID | Text/Number | No | Yes | Unique identifier for the vulnerability instance.
Technology ID | Text/Number | No | Yes | Unique identifier for the technology.
Open Date | Date/Time | No | No | Date and time when the vulnerability was submitted for remediation.
Status | Text | No | No | Current status of the remediation effort. Uses values of Open or Closed.
Priority | Text | No | No | How quickly the vulnerability should be remediated. Uses values of High, Medium, or Low.
Close Date | Date/Time | No | No | Date and time when the vulnerability was remediated, if applicable.
Closed By | Text | No | No | Unique identifier for the entity that remediated the vulnerability.
Table 20: Exempt Technologies Table
Displays technologies exempt from vulnerability management:

Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Exempt By | Text | Yes | No | Unique identifier of the person who approved the exemption.
Exemption Date | Date/Time | No | No | Date and time the technology was exempted.
Reason | Text | No | No | Reason why the technology was exempted.
Diagram 2: Relational Diagram for Vulnerability Management Data Attributes
The diagram below shows the relationships among the tables described in Vulnerability Management Data Attributes:
Vulnerability
PK Vulnerability ID CHAR(10)
Vulnerability Name TEXT(10)
CVE ID CHAR(10)
CWE ID CHAR(10)
Description TEXT(20)
Release Date DATETIME
Severity CHAR(10)
Classification CHAR(10)
CVSS Base Score
PK,FK1 Vulnerability ID CHAR(10)
PK Generated On DATETIME
Overall CVSS Score SHORT
CVSS Base Score SHORT
CVSS Temporal Score SHORT
CVSS Environmental Score SHORT
Access Vector CHAR(10)
Access Complexity CHAR(10)
Authentication CHAR(10)
Confidentiality Impact CHAR(10)
Integrity Impact CHAR(10)
Availability Impact CHAR(10)
Impact Bias CHAR(10)
Collateral Damage Potential CHAR(10)
Target Distribution CHAR(10)
Exploitability CHAR(10)
Remediation Level CHAR(10)
Report Confidence CHAR(10)
Identified Vulnerabilities
PK Technology ID CHAR(10)
PK,FK2 Vulnerability ID CHAR(10)
Date of Detection DATETIME
Detection Method CHAR(10)
Scan Name TEXT(10)
Scan Date DATETIME
Vulnerability Status CHAR(10)
Vulnerability Remediation
PK Vulnerability Remediation ID CHAR(10)
FK1 Vulnerability ID CHAR(10)
Technology ID CHAR(10)
Open Date DATETIME
Status CHAR(10)
Priority CHAR(10)
Close Date DATETIME
Closed By CHAR(10)
Technologies
PK,FK1,FK2 Technology ID CHAR(10)
Name TEXT(10)
Part CHAR(1)
Vendor TEXT(10)
Product TEXT(10)
Version TEXT(10)
Update CHAR(10)
Edition TEXT(10)
Language TEXT(10)
Technology Value CHAR(10)
Business Unit CHAR(10)
Owner CHAR(10)
Classification CHAR(10)
Business Application Weaknesses
PK,FK1 Mitigation ID CHAR(10)
Application ID CHAR(10)
FK2 Technology ID CHAR(10)
CWE ID CHAR(10)
Discovered Date DATETIME
Discovered By CHAR(10)
Status CHAR(10)
Priority CHAR(10)
Type CHAR(10)
Mitigation Date DATETIME
Mitigated By CHAR(10)
Patches
PK Patch ID CHAR(10)
FK1 Vulnerability ID CHAR(10)
Patch Name TEXT(10)
Patch Source CHAR(10)
Criticality CHAR(10)
Date of Notification DATETIME
Date of Availability DATETIME
Patch Type CHAR(10)
Test Date DATETIME
Tested By CHAR(10)
Approval Date DATETIME
Approved By CHAR(10)
Security Incident Impact Analysis
PK Incident Analysis ID CHAR(10)
PK Incident ID CHAR(10)
PK Technology ID CHAR(10)
FK1 Vulnerability ID CHAR(10)
Detected by Internal Controls BIT
Response Protocol Followed BIT
Business Continuity Plan Executed BIT
Reoccurring BIT
Root Cause TEXT(50)
Direct Loss Amount CURRENCY
Business System Downtime SHORT
Cost of Business System Downtime CURRENCY
Cost of Containment CURRENCY
Cost of Recovery CURRENCY
Customers Affected BIT
Loss of Personally Identifiable Information BIT
Data Types Loss CHAR(10)
Records Affected SHORT
Cost of Restitution CURRENCY
Covered Costs CURRENCY
Exempt Technologies
PK Technology ID CHAR(10)
Exempt By CHAR(10)
Exemption Date CHAR(10)
Reason CHAR(10)
Classifications and Dimensions
Tagging of information is a valuable way to provide context to collected data records. Classification tags provide a way to group vulnerabilities. Currently, the only classification used is the severity of the vulnerability. In the future, vulnerabilities can be grouped by other categories, such as vulnerability type or source of the vulnerability.
It is expected that dimensions will be added to these tables to provide the ability to view metric results that address key questions and concerns. Examples of dimensions include:
Technologies: business unit, geography, business value, or technology category by technology
Vulnerability Information: vulnerability severity, classification, or vendor
Identified Vulnerabilities: remediation status, identification date, environment-specific severity
Within an organization, the combination of dimensions can provide key insight into concentrations of risk.
Severity of Vulnerabilities
Severity ratings are determined by the CVSS v2 scoring system and can commonly be found in reference systems such as the National Vulnerability Database (NVD). Severity ratings for vulnerabilities are along several dimensions, with Base Scores derived from exploitability factors (such as attack complexity) and impact factors (such as integrity impact). CVSS Base Scores are expressed in a 0-10 range, commonly summarized as:
"Low" severity if they have a CVSS base score of 0.0-3.9
"Medium" severity if they have a CVSS base score of 4.0-6.9
"High" severity if they have a CVSS base score of 7.0-10.0
The severity of a specific vulnerability instance in an organization can be more accurately determined by combining environmental and temporal factors with the base score. Metrics can be generated using organization-specific values in place of external values for fields such as vulnerability impact or exploitability scores to account for an organization's specific environment. These calculations are beyond the current scope of these metrics.
Technology Value (CTV, ITV, ATV)
Technology values will be rated by adopting the Common Vulnerability Scoring System (v2) section 2.3.3 Security Requirements Scoring Evaluation ratings. These Technology Value scores can be used independently as well as for the complete scoring of a vulnerability that affects the technology. Each technology is assigned one of three possible values, Low, Medium, or High (or Not Defined), depending on the impact from loss of confidentiality (CTV), integrity (ITV), or availability (ATV). These ratings are reproduced here:
Low (L). Loss of [confidentiality | integrity | availability] is likely to have only a limited adverse effect on the organization or individuals associated with the organization (e.g., employees, customers).
Medium (M). Loss of [confidentiality | integrity | availability] is likely to have a serious adverse effect on the organization or individuals associated with the organization (e.g., employees, customers).
High (H). Loss of [confidentiality | integrity | availability] is likely to have a catastrophic adverse effect on the organization or individuals associated with the organization (e.g., employees, customers).
Not Defined (ND). Assigning this value to the metric will not influence the score. It is a signal to the equation to skip this metric.
As described in CVSS v2, these values should be based on network location, business function, and the potential for loss of revenue or life. No specific methodology is defined to assign these values.
Sources
The primary data source for both systems scanned and vulnerabilities identified on systems will be network scanning and vulnerability identification tools. Generally, a list of all discovered and scanned systems can be extracted from vulnerability scanning systems and compared to reports of all systems with identified vulnerabilities. The totals of all systems in the organization can come from asset management systems and/or network discovery scans.
Dimensions
These metrics may include additional dimensions for grouping and aggregation purposes. These dimensions should be applied or tagged at the level of the technology record as described in Vulnerability Management Metrics: Data Attributes. For example:
Technology Value allows the metric to be computed for high, medium, or lower value technologies.
Remediation Level of the vulnerability allows metrics to be computed around the current state of vulnerabilities and remediation efforts.
Tags for characterizing types of technologies, such as coverage by vendor, etc.
Business Units for identifying the concentrations of risk across different parts of the organization.
Severity of the vulnerabilities is a dimension that should be used. While the CVSS Base Score uses a scale of 0-10, this is generally summarized into low, medium, and high severity vulnerabilities. Generally, many low severity vulnerabilities are found.
Automation
The ability to automate source data collection for these metrics is high because most automated vulnerability identification systems can provide the necessary reports, in combination with asset tracking and/or discovery scans providing counts of all technologies. Calculation of these metrics is on an ongoing basis. Once source data has been obtained, it lends itself to a high degree of automation.
Visualization
These metrics may be visually represented in several ways:
Simple visualizations may include a table showing the metric result for the organization, with each row displaying the value as of selected time periods (each week or each month). Columns may be used for different vulnerability severities (e.g. Low, Medium, High).
Graphical visualizations may include time-series charts where the metric result is plotted on the vertical axis and time periods are displayed on the horizontal axis. To provide maximum insight, plotted values for each period may include stacked series for the differing severity values.
Complex visualizations should be used for displaying the metric result for cross-sections by organization, vulnerabilities, or technology values. For example, small multiples could be used to compare the number of high severity vulnerabilities across business units or technology values.
Management and Operational Metrics
Percent of Systems without Known Severe Vulnerabilities
Objective
Percent of Systems without Known Severe Vulnerabilities (PSWKSV) measures the organization's relative exposure to known severe vulnerabilities. The metric evaluates the percentage of systems scanned that do not have any known high severity vulnerabilities.
Table 21: Percentage of Systems without Known Severe Vulnerabilities
Metric Name: Percent of Systems Without Known Severe Vulnerabilities
Version: 1.0.0
Status: Final
Description: Percent of Systems Without Known Severe Vulnerabilities (PSWKSV) measures the percentage of systems that, when checked, were not found to have any known high severity vulnerabilities during a vulnerability scan. Vulnerabilities are defined as "High" severity if they have a CVSS base score of 7.0-10.0.
Since vulnerability management involves both the identification of new severe vulnerabilities and the remediation of known severe vulnerabilities, the percentage of systems without known severe vulnerabilities will vary over time. Organizations can use this metric to gauge their relative level of exposure to exploits; it also serves as a potential indicator of expected levels of security incidents (and therefore impacts on the organization).
This severity threshold is important, as there are numerous informational, local, and exposure vulnerabilities that can be detected that are not necessarily material to the organization's risk profile. Managers generally will want to reduce the level of noise to focus on the greater risks first. This metric can also be calculated for subsets of systems, such as by asset criticality or business unit.
Type: Management
Audience: Business Management, Security Management
Question: Of the systems scanned, what percentage does not have any known severe vulnerabilities?
Answer: A percentage value between 0 and 100. A value of 100% indicates that none of the organization's systems have any known high severity vulnerabilities.
Formula: Percent of Systems Without Known Severe Vulnerabilities is calculated by counting those systems that have no open high severity vulnerabilities (no vulnerability instance with Vulnerability Status == Open and CVSS Base Score >= 7.0). This result is then divided by the total number of systems in the scanning scope:

PSWKSV = Count(Systems_Without_Known_Severe_Vulnerabilities) / Count(Scanned_Systems) * 100

Units: Percentage of systems
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: PSWKSV values should trend higher over time. It would be ideal to have no known severe vulnerabilities on systems; therefore, an ideal target value would be 100%. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Percent of Systems Without Known Severe Vulnerabilities exists.
Sources: Vulnerability management systems will provide information on which systems were identified with severe vulnerabilities.
Visualization: Bar Chart. X-axis: Time (Week, Month, Quarter, Year). Y-axis: PSWKSV (%)
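The formula can be sketched directly over the Identified Vulnerabilities data. In this sketch the field names follow Tables 17 and 18, but the function name, variable names, and sample records are illustrative, not mandated by the definition:

```python
def pswksv(scanned_systems, identified_vulns, severe_threshold=7.0):
    """Percent of Systems Without Known Severe Vulnerabilities.

    scanned_systems: iterable of technology IDs in the scanning scope.
    identified_vulns: records carrying "Technology ID", "Vulnerability Status",
                      and "CVSS Base Score" fields.
    """
    scanned = set(scanned_systems)
    # Systems with at least one open high-severity vulnerability instance.
    severe = {v["Technology ID"] for v in identified_vulns
              if v["Vulnerability Status"] == "Open"
              and v["CVSS Base Score"] >= severe_threshold}
    if not scanned:
        raise ValueError("no systems in scanning scope")
    return 100.0 * len(scanned - severe) / len(scanned)

# Hypothetical scope: 4 scanned systems, one with an open high-severity finding.
vulns = [{"Technology ID": "T2", "Vulnerability Status": "Open", "CVSS Base Score": 9.3}]
print(pswksv(["T1", "T2", "T3", "T4"], vulns))  # 75.0
```

Note that systems outside the scanning scope never enter the denominator, which is exactly the limitation discussed below.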
Usage
Percent of Systems Without Known Severe Vulnerabilities is a type of vulnerability management metric and relies on the common definition of vulnerability as defined in the Glossary. Due to the number of vulnerabilities and exposures found by most scanning tools, this metric should be calculated for High severity vulnerabilities.
Optimal conditions would be reflected by a high value of the metric. A value of 100% would indicate that none of the organization's systems are known to possess severe vulnerabilities. The lower the value, the greater the risk that systems will be exploited. Since many attacks are designed to exploit known severe vulnerabilities, there may be a direct correlation between a higher percentage of vulnerable systems and the number of security incidents.
Percent of Systems Without Known Severe Vulnerabilities can be calculated over time, typically per-week or per-month. To gain insight into the relative performance and risk of one business unit over another, the metric may also be calculated for cross-sections of the organization such as individual business units or geographies.
Limitations
Due to technical or operational incompatibility, certain systems may be excluded from scanning activities, while other systems such as laptops may be only intermittently present for network scans. Systems not scanned, even if they possess severe vulnerabilities, will not be included in this metric result. In addition, scanning activities can vary in depth, completeness, and capabilities.
This metric assumes that systems scanned for vulnerabilities are systems known to and under full management by the organization. These systems do not include partially managed or unknown systems. Future risk metrics may account for these to provide a clearer view of all system ranges.
References
ISO/IEC 27002:2005
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability Management Program. US National Institute of Standards and Technology, 2005.
Mean-Time to Mitigate Vulnerabilities
Objective
Mean-Ti me to Mi tigate Vul nerabilities (MTTMV) measures the average amount of ti me
requi red to mi tigate an identified vulnerability. This metri c i ndicates the performance of the
organization i n reacting to vulnerabilities identified i n the envi ronment. It only measures the
ti me average ti mes for explicitly mi tigated vulnerabilities, and not mean ti me to mi ti gate any
vul nerability, or account for vulnerabilities that no l onger appear in scanning activities.
Table 22: Mean-Time to Mitigate Vulnerabilities
Metric Name: Mean-Time to Mitigate Vulnerabilities
Version: 1.0.0
Status: Final
Description: Mean-Time to Mitigate Vulnerabilities measures the average time taken to mitigate vulnerabilities identified in an organization's technologies. The vulnerability management process consists of the identification and remediation of known vulnerabilities in an organization's environment. This metric is an indicator of the performance of the organization in addressing identified vulnerabilities. The less time required to mitigate a vulnerability, the more likely an organization can react effectively to reduce the risk of exploitation of vulnerabilities.
It is important to note that only data from vulnerabilities explicitly mitigated is included in this metric result. The metric result is the mean time to mitigate vulnerabilities that were actively addressed during the metric time period, not a mean time to mitigate based on the time for all known vulnerabilities to be mitigated.
Type: Operational
Audience: Security Management
Question: How long does it take the organization to mitigate a vulnerability?
Answer: A positive floating-point value that is greater than or equal to zero. A value of 0 indicates that vulnerabilities were instantaneously mitigated.
Formula: Mean-Time to Mitigate Vulnerabilities is calculated by determining the number of hours between the Date of Detection and the Date of Mitigation for each identified vulnerability instance in the current scope (for example, by time period, severity, or business unit). These results are then averaged across the number of mitigated vulnerabilities in the current scope:

MTTMV = Σ(Date_of_Mitigation - Date_of_Detection) / Count(Mitigated_Vulnerabilities)

Units: Hours per vulnerability
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: MTTMV values should trend lower over time. Lower levels of MTTMV are preferred. Most organizations put mitigation plans through test and approval cycles prior to implementation. Generally, the target time for MTTMV will be a function of the severity of the vulnerability and the business criticality of the technology. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Mean-Time to Mitigate Vulnerabilities exists.
Sources: Vulnerability management systems will provide information on which systems were identified with vulnerabilities and when those vulnerabilities were detected and mitigated.
Visualization: Bar Chart. X-axis: Time (Week, Month, Quarter, Year). Y-axis: MTTMV (hours per vulnerability)
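The averaging step can be sketched as follows (times in hours; the function name and record fields are illustrative, not mandated by the definition):

```python
from datetime import datetime

def mttmv(mitigated):
    """Mean hours from detection to mitigation over explicitly mitigated instances."""
    if not mitigated:
        raise ValueError("no mitigated vulnerabilities in scope")
    total_hours = sum(
        (r["Date of Mitigation"] - r["Date of Detection"]).total_seconds() / 3600.0
        for r in mitigated)
    return total_hours / len(mitigated)

# Hypothetical instances mitigated 24 and 72 hours after detection.
scope = [
    {"Date of Detection": datetime(2010, 10, 1), "Date of Mitigation": datetime(2010, 10, 2)},
    {"Date of Detection": datetime(2010, 10, 1), "Date of Mitigation": datetime(2010, 10, 4)},
]
print(mttmv(scope))  # 48.0
```

Restricting the input list to a given severity band or business unit yields the per-scope variants described above.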
Usage
Mean-Time to Mitigate Vulnerabilities is a type of vulnerability management metric and relies on the common definition of vulnerability as defined in the Glossary. Due to the number of vulnerabilities and exposures found by most scanning tools, this metric should generally be calculated for High and Medium severity vulnerabilities. Combined with the number of identified vulnerabilities, this metric can provide visibility into the time and effort required to manage the known vulnerabilities in the organization.
Optimal conditions would be reflected by a low value of the metric. The lower the value, the more quickly the organization is able to react to and mitigate identified vulnerabilities. Since many attacks are designed to exploit known vulnerabilities, there may be a direct correlation between a lower time to mitigate vulnerabilities and the number of security incidents.
MTTMV can be calculated over time, typically per-month. To gain insight into relative performance and risk, this metric can be calculated for vulnerabilities with differing severity levels, as well as for cross-sections of the organization such as individual business units or geographies.
Limitations
Only data from mitigated vulnerabilities is included in this calculation. Therefore it is an indicator of the organization's ability to mitigate vulnerabilities as they are identified, but not necessarily a true representation of the average time taken to mitigate all vulnerabilities that may exist in the organization's environment. Other indicators of the scale or scope of unmitigated vulnerabilities should also be used to assess the performance of the vulnerability management function.
Mitigation effort can vary depending on the scope and depth of the mitigation solution; modification of firewall rules or other changes to the environment may be less effort than directly addressing vulnerabilities in an application's code. It is possible that the vulnerabilities that are easier to mitigate are the ones completed in the metric scope, and the remaining vulnerabilities represent the most challenging to mitigate. Therefore the metric result could be biased low compared to the mean time to mitigate the remaining known vulnerabilities.
References
ISO/IEC 27002:2005
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability Management Program. US National Institute of Standards and Technology, 2005.
Mean Cost to Mitigate Vulnerabilities
Objective
This defines a metric for measuring the mean effort required to mitigate an identified vulnerability that can be remedied.
The metric is expressed in the context of a vulnerability management process, with the assumptions that the organization is scanning for known vulnerabilities, that a formal system (i.e., a change management and electronic ticketing system) is used to track activities to mitigate known vulnerabilities, and that there is a known remedy for the vulnerability.
The metric is useful whether a single vulnerability or remedy (no matter how many systems are affected) is expressed as a single change ticket or as one change ticket per affected network node.
Table 23: Mean Cost to Mitigate Vulnerabilities
Metric
Name
Mean Cost to Mi ti gate Vul nerabiliti es
Version 1.0.0
Status Fi nal for Review
Description The goal of thi s metric i s to understand the effort required for vulnerability
remedi ation acti vities.
Ri sk management decisions can take i nto account the effi ciency of
vul nerability remediation and make more i nformed decisions around
vul nerability policies, SLAs, and resource allocation i n the IT envi ronment.
Type Operational
Audience Securi ty Management
Question What i s the average (mean) cost to the organization to mi tigate an identified
vul nerability during the gi ven period?
Answer A posi tive i nteger value that is greater than or equal to zero. A val ue of 0.0
i ndi cates there were no measured costs to the organization.
Formula This metric is calculated by summing the total cost to mitigate each vulnerability and dividing it by the total number of mitigated vulnerabilities. This count should also be done for each severity value (Low, Medium, and High):

MCMV = Sum((Person_Hours_to_Mitigate * Hourly_Rate) + Other_Mitigation_Costs) / Count(Mitigated_Vulnerabilities)
Units USD per vulnerability
Frequency Monthly
Targets Ideally, all vulnerabilities would be remedied by an automated vulnerability remediation system, and the mean cost to remediate would be zero. In practice a target can be set based on the expected loss budget determined by risk assessment processes.
Sources Vulnerability tracking systems will provide vulnerability data. Cost data can come from management estimates, ticket tracking systems, and capital and services budgets.
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: MCMV ($)
Usage
Mean Cost to Mitigate Vulnerabilities is a type of vulnerability management metric and relies on the common definition of vulnerability as defined in the Glossary. Due to the number of vulnerabilities and exposures found by most scanning tools, this metric should generally be calculated for High and Medium severity vulnerabilities.
Combined with the number of identified vulnerabilities, this metric can provide visibility into the total cost and effort required to remediate and manage the known vulnerabilities in the organization.
Optimal conditions would be reflected by a low value in the metric. The lower the value, the more efficiently and cheaply the organization is able to mitigate identified vulnerabilities.
There may be a direct correlation between the number of un-mitigated vulnerabilities and the number of security incidents. Since vulnerabilities may not be addressed due to cost concerns, an organization with a lower average remediation cost may be able to mitigate more vulnerabilities.
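The calculation above can be sketched as follows. This is a minimal illustration, not part of the metric definition: the ticket records, field names, and hourly rate are hypothetical examples of data that would come from a ticket tracking system.

```python
# Sketch: Mean Cost to Mitigate Vulnerabilities (MCMV), grouped by severity.
# Ticket records and the hourly rate below are hypothetical, not a prescribed schema.
from collections import defaultdict

tickets = [
    {"severity": "High",   "person_hours": 6.0, "other_costs": 150.0},
    {"severity": "High",   "person_hours": 2.0, "other_costs": 0.0},
    {"severity": "Medium", "person_hours": 1.5, "other_costs": 40.0},
]
HOURLY_RATE = 80.0  # assumed fully loaded labor rate in USD

def mcmv(records, hourly_rate):
    """Sum of (person-hours * rate + other costs) over the count of mitigated vulnerabilities."""
    if not records:
        return 0.0
    total = sum(r["person_hours"] * hourly_rate + r["other_costs"] for r in records)
    return total / len(records)

# Compute the metric per severity value, as the definition recommends.
by_severity = defaultdict(list)
for t in tickets:
    by_severity[t["severity"]].append(t)

for sev, recs in sorted(by_severity.items()):
    print(sev, round(mcmv(recs, HOURLY_RATE), 2))
```

Grouping the tickets before averaging keeps the High and Medium results separate, so a flood of cheap low-severity fixes cannot mask expensive high-severity remediation.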
Limitations
Note that this assumes:
 Effort is tracked for vulnerability remediation
 Tickets are closed when the change is known to have mitigated the vulnerability
 Vulnerabilities can be tracked between scans on a per-host, per-vulnerability-instance basis
This metric does not include in-progress tickets, vulnerabilities that have not been mitigated, or vulnerabilities that do not have a resolution.
References
ISO/IEC 27002:2005.
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability Management Program. US National Institute of Standards and Technology, 2005.
Patch Management
This section describes metrics for measuring the effectiveness of patch management processes. Many security incidents are caused by exploitation of known vulnerabilities for which patches are available. Patches are released by vendors on regular and ad-hoc schedules, and the cycle of testing and deploying patches is a regular part of an organization's IT activities. Many patches are released to directly address security issues in applications and operating systems, and the performance of the patch management process will directly affect the security posture of the organization.
These metrics are based upon a patch management process with the following structure:
1. Security and Patch Information Sources
2. Patch Prioritization and Scheduling
3. Patch Testing
4. Configuration (Change) Management
5. Patch Installation and Deployment
6. Patch Verification and Closure
Data Attributes
Patch metrics are comprised of the following datasets:
Technologies. Contains information about the technologies in the organization's environment. Technologies should be identified and named according to the Common Platform Enumeration (CPE) Dictionary maintained by NIST (http://nvd.nist.gov/cpe.cfm).
Patch Information. This table contains information about the patch, such as the release date, vendor references, vulnerability references, etc. The Open Vulnerability and Assessment Language (OVAL) Repository provides a structured data source of patch information that can be used for this purpose.
Patch Activity. This table contains local information about specific patch deployments in an environment, such as the number of systems patched, patch installation date, etc.
8 http://oval.mitre.org/repository/index.html
Table 24: Technologies Table
The following is a list of attributes that should be populated as completely as possible for each technology:
Technologies Table
Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Name | Text | No | No | Name from CPE Dictionary, which follows the structure cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDATE}:{EDITION}:{LANGUAGE}.
Part | Text | No | No | Platform. Use value H, O, or A, representing hardware, operating system, and application environment respectively.
Vendor | Text | No | No | Vendor from CPE Dictionary. This is the highest organization-specific label of the DNS name.
Product | Text | No | No | Product from CPE Dictionary. This is the most recognizable name of the product.
Version | Text | No | No | Version from CPE Dictionary. Same format as seen with the product.
Update | Text | No | No | Update or service pack information from CPE Dictionary.
Edition | Text | No | No | Edition from CPE Dictionary. May define specific target hardware and software architectures.
Language | Text | No | No | Language from CPE Dictionary.
Technology Value | Text | No | Recommended | Impact from the loss of this technology (C/I/A) to the organization. Uses value Low, Medium, High, or Not Defined.
Business Unit | Text | No | No | Organizational business unit that the technology belongs to.
Owner | Text | No | No | Unique identifier for the individual within the organization that is responsible for the technology.
Classification | Text | No | No | Classification of technology: Servers, Workstations, Laptops, Network Devices, Storage Devices, Applications, Operating Systems.
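The CPE naming structure used by the Name attribute can be split into the table's component fields. A minimal sketch, assuming a CPE 2.2-style `cpe:/` URI; the example strings are illustrative and not taken from the CPE Dictionary:

```python
# Sketch: split a CPE 2.2-style URI into the fields used by the Technologies table.
FIELDS = ["part", "vendor", "product", "version", "update", "edition", "language"]

def parse_cpe(uri):
    """Return a dict of CPE components; missing trailing fields become empty strings."""
    if not uri.startswith("cpe:/"):
        raise ValueError("not a cpe:/ URI")
    parts = uri[len("cpe:/"):].split(":")
    parts += [""] * (len(FIELDS) - len(parts))  # pad optional trailing fields
    return dict(zip(FIELDS, parts))

# Hypothetical example: an operating system entry with a service-pack update field.
tech = parse_cpe("cpe:/o:microsoft:windows_7::sp1")
print(tech["part"], tech["vendor"], tech["product"], tech["update"])
```

Trailing fields such as Edition and Language are frequently omitted in CPE names, which is why the sketch pads them to empty strings rather than treating short URIs as errors.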
Table 25: Exempt Technologies Table
This table contains a list of technologies exempt from patch management.
Exempt Technologies Table
Name | Type | De-identified | Required | Description
Technology ID | Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Exemption Date | Date/Time | No | No | Date that the technology was exempted from patch management.
Exempt By | String | No | No | Unique identifier for the entity granting the exemption.
Reason | String | No | No | Reason for exemption.
Table 26: Patch Information Table
The following is a list of attributes that should be populated as completely as possible for each patch:

9 This is adopting 2.3.3 Security Requirements Scoring Evaluation from CVSS v2, http://www.first.org/cvss/cvss-guide.html#i2.3.
Patch Information Table
Name | Type | De-identified | Required | Description
Patch ID | Number | No | Yes | Unique identifier for the patch. Generally auto-generated. This can be an organization-specific identifier for the patch.
Patch Source | Text | No | No | The name of the vendor or group issuing the patch.
Patch Name | Text | No | No | The name of the patch.
Vulnerability ID | Number | No | Yes | One-to-many references to vulnerabilities in the NVD addressed by this patch.
Criticality Level | Text | No | No | Level of criticality as determined by the classification process, typically High, Medium, or Low.
Organization-Specific Criticality Level | Text | No | No | Level of criticality as determined by the organization. This may be distinct from a vendor- or community-determined patch criticality.
Date of Notification | Date/Time | No | No | Date and time when the patch notification was first received. Generally this should be the release date of the patch.
Date of Availability | Date/Time | No | No | Date and time when the patch was released.
Date of Patch Approval | Date/Time | No | No | Date and time when the patch was approved by the organization for deployment.
Patch Type | String | No | No | Type of patch (service pack, application update, driver, etc.).
Table 27: Patch Activity Table
The following is a list of attributes that should be populated as completely as possible for each patch deployed in the environment. Some organizations may wish to track patch activity with greater granularity, at the level of each patch instance. In this case, the same table structure can be used, with the number of Technology Instances and Patch Instances being 1 for each row.
Patch Activity Table
Name | Type | De-identified | Required | Description
Patch Instance ID | Number | No | Yes | Unique identifier for the patch instance. Generally auto-generated.
Patch ID | Number | No | Yes | Reference to the patch in the Patch Information Table.
Technology ID | Number | Yes | Yes | Number of instances of a specific technology. This is a count of all the technologies to which this patch applies.
Date of Installation | Date/Time | No | No | Date and time when the patch was installed (including any rebooting or reloading process).
Patch Status | Text | No | No | Current status of the patch. Use values Installed and Not Installed.
Priority | Text | No | No | Priority of patch installation. Use values of High, Medium, or Low.
Table 28: Patch Activity Review Table
This table contains the verification that patches were installed properly and consistently throughout the organization.
Patch Activity Review Table
Name | Type | De-identified | Required | Description
Patch Instance ID | Number | No | Yes | Unique identifier for the patch instance. Generally auto-generated.
Review Date | Date | No | No | Date the review was conducted.
Reviewed By | String | No | No | Entity that conducted the review.
Compliant | Boolean | No | No | Whether or not the patch was installed in accordance with policy.
Patch Cost | Numeric | No | No | Cost of the patch deployment (USD).
Patch Effort | Numeric | No | No | Total person-hours of effort for the patch deployment.
Diagram 3: Relational Diagram for Patch Management Data Attributes
The diagram below shows the relationship of the tables described in Patch Management Data Attributes:
Patch Information
PK Patch ID CHAR(10)
FK1 Vulnerability ID CHAR(10)
Patch Name TEXT(10)
Patch Source CHAR(10)
Criticality CHAR(10)
Organization-Specific Criticality Level CHAR(10)
Date of Notification DATETIME
Date of Availability DATETIME
Date of Patch Approval DATETIME
Patch Type CHAR(10)
Patch Activity
PK Patch Instance ID CHAR(10)
PK,FK1 Patch ID CHAR(10)
PK,FK2 Technology ID CHAR(10)
Installation Date DATETIME
Patch Status BIT
Priority CHAR(10)
Patch Activity Review
PK,FK1 Patch Instance ID CHAR(10)
PK Review Date DATETIME
Reviewed By CHAR(10)
Compliant BIT
Patch Cost CURRENCY
Patch Effort SHORT
Exempt Technologies
PK Technology ID CHAR(10)
FK1 Patch ID CHAR(10)
Exempt By CHAR(10)
Exemption Date DATETIME
Reason TEXT(50)
Technologies
PK,FK1 Technology ID CHAR(10)
Name TEXT(10)
Part CHAR(1)
Vendor TEXT(10)
Product TEXT(10)
Version TEXT(10)
Update CHAR(10)
Edition TEXT(10)
Language TEXT(10)
Technology Value CHAR(10)
Business Unit CHAR(10)
Owner CHAR(10)
Classification CHAR(10)
Vulnerability
PK Vulnerability ID CHAR(10)
Vulnerability Name TEXT(10)
CVE ID CHAR(10)
CWE ID CHAR(10)
Description TEXT(20)
Release Date DATETIME
Severity CHAR(10)
Classification CHAR(10)
Classifications
Tagging of information is a very valuable way to provide context to collected data records. Classification tags provide a way to group patches. While currently the only classification is the criticality of the patch, in the future patches may fall into one or more categories, so the patch management record system should support one-to-many tagging capabilities.
Criticality of Patches
Criticality ratings for patches are usually provided by vendors, although alternate ratings may be provided by security companies. An example of such a scale is Microsoft's Severity Rating System:
Critical – A vulnerability whose exploitation could allow the propagation of an Internet worm without user action.
Important – A vulnerability whose exploitation could result in compromise of the confidentiality, integrity, or availability of users' data, or of the integrity or availability of processing resources.
Moderate – Exploitability is mitigated to a significant degree by factors such as default configuration, auditing, or difficulty of exploitation.
Low – A vulnerability whose exploitation is extremely difficult, or whose impact is minimal.
Technology Value (CTV, ITV, ATV)
Technology values will be rated by adopting the Common Vulnerability Scoring System (v2) section 2.3.3 Security Requirements Scoring Evaluation ratings. These Technology Value scores can be used independently as well as for the complete scoring of a vulnerability that affected the technology. Each technology is assigned one of three possible values, Low, Medium, or High (or Not Defined), depending on the impact from loss of confidentiality (CTV), integrity (ITV), or availability (ATV). These ratings are reproduced here:
Low (L) – Loss of [confidentiality | integrity | availability] is likely to have only a limited adverse effect on the organization or individuals associated with the organization (e.g., employees, customers).
Medium (M) – Loss of [confidentiality | integrity | availability] is likely to have a serious adverse effect on the organization or individuals associated with the organization (e.g., employees, customers).
11 http://www.microsoft.com/technet/security/bulletin/rating.mspx
High (H) – Loss of [confidentiality | integrity | availability] is likely to have a catastrophic adverse effect on the organization or individuals associated with the organization (e.g., employees, customers).
Not Defined (ND) – Assigning this value to the metric will not influence the score. It is a signal to the equation to skip this metric.
As described in CVSS v2, these values should be based on network location, business function, and the potential for loss of revenue or life, although no specific methodology is defined to assign these values.
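When these ratings feed a full CVSS v2 environmental score, the v2 guide assigns each one a numeric weight. A small sketch; the weight values are those published in the CVSS v2 guide, while the lookup helper itself is illustrative:

```python
# Sketch: CVSS v2 security-requirement weights for CTV/ITV/ATV ratings.
# Weight values are from the CVSS v2 guide; the helper function is illustrative.
WEIGHTS = {"L": 0.5, "M": 1.0, "H": 1.51, "ND": 1.0}

def requirement_weight(rating):
    """Map a Low/Medium/High/Not Defined rating to its CVSS v2 numeric weight."""
    return WEIGHTS[rating.upper()]

print(requirement_weight("H"))  # a high-value technology amplifies the score
```

Note that Not Defined carries the same weight as Medium (1.0), which is what makes it score-neutral as the text above describes.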
Sources
The primary data source for patch deployments, systems under management, and time to patch can be found in automated patch management systems and processes. The primary source for data about systems not under management can be derived from asset management systems or network discovery activities. Generally, a list of all assets under management can be extracted from patch management systems and compared to lists of all assets generated from asset management systems and/or network discovery scans.
Dimensions
These metrics may include additional dimensions for grouping and aggregation purposes. These dimensions should be applied or tagged at the level of the technology record as described in Patch Management Metrics: Data Attributes. For example:
 Technology Value allows Coverage to be computed for high, medium, or lower value technologies.
 Patch Criticality could be a dimension if data with sufficient granularity is available.
 Business Units for identifying the coverage by parts of the organization.
 Asset Value allows Coverage to be computed for high, medium, or lower value assets.
 Tags for characterizing types of assets, such as coverage by vendor, etc.
Automation
The ability to automate source data collection for these metrics is high, because most automated patch management systems can provide the necessary reports, in combination with asset tracking and discovery across networks providing counts of all technologies. Calculation of these metrics occurs on an ongoing basis; once source data has been obtained, it lends itself to a high degree of automation.
Visualization
These metrics may be visually represented in several ways:
Simple visualizations may include a table showing the metric result for the organization, with each row displaying the value as of selected time periods (each week or each month).
Graphical visualizations may include time-series charts where the metric result is plotted on the vertical axis and time periods are displayed on the horizontal axis. To provide maximum insight, plotted values for each period may include stacked series for the differing severity values.
Complex visualizations should be used for displaying the metric result for cross-sections of dimensions to expose concentrations of risk, such as patch criticality, business units, or technology value. For example, small multiples could be used to compare the number of high severity vulnerabilities across business units or technology values.
Management and Operational Metrics
Patch Policy Compliance
Objective
Patch Policy Compliance (PPC) indicates the scope of the organization's patch level for supported technologies as compared to their documented patch policy. While specific patch policies may vary within and across organizations, performance versus stated patch state objectives can be compared as a percentage of compliant systems.
Table 29: Patch Policy Compliance
Metric Name Patch Policy Compliance
Version 1.0.0
Status Final
Description Patch Policy Compliance (PPC) measures an organization's patch level for supported technologies as compared to their documented patch policy. Policy refers to the patching policy of the organization; more specifically, which patches are required for what type of computer systems at any given time. This policy might be as simple as "install the latest patches from system vendors" or may be more complex to account for the criticality of the patch or system.
"Patched to policy" reflects an organization's risk/reward decisions regarding patch management. It is not meant to imply that all vendor patches are immediately installed when they are distributed.
Type Management
Audience Business Management, Security Management
Question What percentage of the organization's technologies is in compliance with current patch policy?
Answer A positive value between zero and 100 inclusive. A value of 100% indicates that all technologies are in compliance with the patch policy.
Formula Patch Policy Compliance (PPC) is calculated by dividing the sum of the technologies currently compliant by the sum of all technologies under patch management (where the current patch state is known). This metric can be calculated for subsets of technologies, such as by technology value or business unit:

PPC = (Count(Compliant_Instances) / Count(Technology_Instances)) * 100
Units Percentage of technology instances
Frequency Weekly, Monthly, Quarterly, Annually
Targets PPC values should trend higher over time. An ideal result would be 100% of technologies. The expected trend for this metric over time is to remain stable or increase towards 100%. There will be variations when new patches are released for large numbers of technologies (such as a common operating system) that could cause this value to vary significantly. Measurement of this metric should take such events into consideration. Higher values would generally result in less exposure to known security issues. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Patch Policy Compliance exists.
Sources Patch management and IT support tracking systems will provide patch deployment data. Audit reports will provide compliance status.
Visualization Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: PPC (%)
Usage
Patch Policy Compliance is a type of patch management metric and relies on the common definition of patch as defined in the Glossary.
Patch Policy Compliance can be calculated over time, typically per-week or per-month. To gain insight into the relative risk to one business unit over another, compliance may also be calculated for cross-sections of the organization, such as individual business units, geographies, or technology values and types.
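The per-cross-section calculation can be sketched as follows. The instance records and the "compliant" flag are hypothetical; in practice that flag would come from audit or patch management data:

```python
# Sketch: Patch Policy Compliance as a percentage of technology instances,
# overall and for a business-unit cross-section. Records are hypothetical.
instances = [
    {"business_unit": "Finance", "compliant": True},
    {"business_unit": "Finance", "compliant": False},
    {"business_unit": "Ops",     "compliant": True},
    {"business_unit": "Ops",     "compliant": True},
]

def ppc(records):
    """Count(Compliant_Instances) / Count(Technology_Instances) * 100."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for r in records if r["compliant"]) / len(records)

print(round(ppc(instances), 1))  # overall compliance
finance = [r for r in instances if r["business_unit"] == "Finance"]
print(round(ppc(finance), 1))    # per-business-unit cross-section
```

Filtering the record list before applying the same formula is all that a cross-section by geography or technology value requires.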
Limitations
This metric is highly dependent upon the current set of patch policy requirements. When patches are released that affect large numbers of technologies (such as common operating systems), this value can vary greatly with time, as the absence of the new patches makes systems non-compliant.
References
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability Management Program. US National Institute of Standards and Technology, 2005.
Mean Time to Patch
Objective
Mean Time to Patch (MTTP) characterizes the effectiveness of the patch management process by measuring the average time taken from date of patch release to installation in the organization for patches deployed during the metric time period. This metric serves as an indicator of the organization's overall level of exposure to vulnerabilities by measuring the time the organization takes to address systems known to be in vulnerable states that can be remediated by security patches. This is a partial indicator, as vulnerabilities may have no patches available or may occur for other reasons such as system configurations.
Table 30: Mean Time to Patch
Metric Name Mean Time to Patch
Version 1.0.0
Status Final
Description Mean Time to Patch (MTTP) measures the average time taken to deploy a patch to the organization's technologies. The more quickly patches can be deployed, the lower the mean time to patch and the less time the organization spends with systems in a state known to be vulnerable.
Type Operational
Audience Security Management
Question How long does it take the organization to deploy patches into the environment?
Answer A positive floating-point value that is greater than or equal to zero. A value of 0 indicates that patches were theoretically instantaneously deployed.
Formula Mean Time to Patch is calculated by determining the number of hours between the Date of Availability and the Date of Installation for each patch completed in the current scope, for example by time period, criticality, or business unit. These results are then averaged across the number of completed patches in the current scope:
MTTP = Sum(Date_of_Installation - Date_of_Availability) / Count(Completed_Patches)
Units Hours per patch
Frequency Weekly, Monthly, Quarterly, Annually
Targets MTTP values should trend lower over time. Most organizations put patches through test and approval cycles prior to deployment. Generally, the target time for MTTP will be a function of the criticality of the patch and the business criticality of the technology. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Mean Time to Patch exists.
Sources Patch management and IT support tracking systems will provide patch deployment data.
Visualization Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: MTTP (Hr/Patch)
Usage
Mean Time to Patch is a type of patch management metric and relies on the common definition of patch as defined in the Glossary.
Given that many known vulnerabilities result from missing patches, there may be a direct correlation between lower MTTP and lower levels of security incidents. MTTP can be calculated over time, typically per-week or per-month. To gain insight into the relative performance and risk of one business unit over another, MTTP may also be calculated for different patch criticalities and cross-sections of the organization, such as individual business units or geographies.
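The hour-based calculation can be sketched as follows; the dates below are hypothetical and would normally come from the Date of Availability and Date of Installation fields in the Patch Information and Patch Activity tables:

```python
# Sketch: Mean Time to Patch in hours, from availability to installation.
# The patch records are hypothetical examples, not a prescribed schema.
from datetime import datetime

completed = [
    {"available": datetime(2010, 10, 1, 8, 0), "installed": datetime(2010, 10, 4, 8, 0)},
    {"available": datetime(2010, 10, 2, 0, 0), "installed": datetime(2010, 10, 3, 12, 0)},
]

def mttp(patches):
    """Average hours between Date_of_Availability and Date_of_Installation."""
    if not patches:
        return 0.0
    hours = [(p["installed"] - p["available"]).total_seconds() / 3600.0
             for p in patches]
    return sum(hours) / len(hours)

print(mttp(completed))
```

Only completed deployments enter the list, which mirrors the limitation noted below: in-progress installations are excluded from the average.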
Limitations
Critical Technologies. This metric assumes that the critical technologies are known and recorded. If the critical technologies are unknown, this metric cannot be accurately measured. As new technologies are added, their criticality needs to be determined and, if appropriate, included in this metric.
Vendor Reliance. This metric is reliant upon the vendor's ability to notify the organization of updates and vulnerabilities that need patching. If the vendor does not provide a program for notifying their customers, then the technology, if critical, will always be a black mark on this metric.
Criticality Ranking. This metric is highly dependent upon the ranking of critical technologies by the organization. If this ranking is abused, then the metric will become unreliable.
Patches in Progress. This metric calculation does not account for patch installations that are incomplete or ongoing during the time period measured. It is not clear how this will bias the results, although potentially an extended patch deployment will not appear in the results for some time.
References
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability Management Program. US National Institute of Standards and Technology, 2005.
Mean Cost to Patch
Objective
This defines a metric for measuring the mean effort required to deploy a patch into an environment.
The metric is expressed in the context of a patch management process, with the assumption that the organization has a formal system (i.e. a patch management and electronic ticketing system) used to track activities to deploy patches.
The metric is useful where a single patch deployment (no matter how many systems are affected) is expressed as a single change ticket or as one change ticket per affected network node. This data can also be recorded as monthly totals where per-patch granularity is not possible.
Table 31: Mean Cost to Patch
Metric Name Mean Cost to Patch
Version 1.0.0
Status Final
Description The goal of this metric is to understand the effort required for patch management activities.
Risk management decisions can take into account the efficiency of patch deployment to make more informed decisions around patch compliance policies, Service Level Agreements, and resource allocation in the IT environment.
Type Operational
Audience Security Management
Question What is the average (mean) cost to the organization to deploy a patch during the given period?
Answer A positive floating-point value that is greater than or equal to zero. A value of 0.0 indicates there were no measured costs to the organization.
Formula Mean Cost to Patch is calculated by determining the total cost to deploy patches. These results are then averaged across the number of patches deployed in the current scope:

MCP = Sum(Patch_Cost + Other_Patch_Cost) / Count(Deployed_Patches)

Patch Cost may be a determined aggregate cost, or may be calculated based upon the amount of effort put into the patching process: Patch Effort * Hourly Rate.
Other Patch Costs may include:
 purchases of additional equipment
 purchases of new software versions
 costs associated with mitigation of a specific vulnerability
 costs associated with the vendor waiting to release a patch until its next release cycle for identified vulnerabilities
 costs associated with delaying updates until the next update cycle
 costs associated with identifying missing patches
 costs associated with downtime during testing and installation of missing patches
The cost of patch management systems should not be included in this cost. If a one-time cost is associated with multiple vulnerabilities, the cost should be distributed evenly across the relevant vulnerabilities.
Units USD per patch
Frequency Monthly
Targets Ideally, all patches would be deployed by an automated system, and the mean cost to patch would approach zero (given patch testing costs, etc.).
Sources Patch management and IT support tracking systems will provide patch deployment data. Cost data can come from management estimates, ticket tracking systems, and services budgets.
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: MCP ($/Patch)
Usage
Keeping systems fully patched should reduce risk and result in lower incident costs to the organization. Organizations generally have to balance the desire to patch systems with the cost and effort of preparing, testing, and deploying patches in their environment. Mean Cost to Patch allows the organization to understand the cost of patch deployment, manage trends, and perform cost-benefit analysis of patch updates, comparing the cost to the organization to the costs of any increases in security incidents.
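The averaging step can be sketched as follows; the deployment records are hypothetical, and the per-patch cost would in practice be built from Patch Effort, an hourly rate, and the Other Patch Costs listed above:

```python
# Sketch: Mean Cost to Patch over patches deployed in the period.
# Records are hypothetical; patch_cost could itself be effort * hourly rate.
deployed = [
    {"patch_cost": 400.0, "other_cost": 100.0},  # e.g. effort cost plus downtime
    {"patch_cost": 250.0, "other_cost": 0.0},
]

def mcp(patches):
    """Sum of (Patch_Cost + Other_Patch_Cost) over Count(Deployed_Patches)."""
    if not patches:
        return 0.0
    return sum(p["patch_cost"] + p["other_cost"] for p in patches) / len(patches)

print(mcp(deployed))
```

A one-time cost spread across several patches would be divided evenly and added to each record's other_cost before averaging, consistent with the distribution rule in the formula above.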
Limitations
Note that this assumes:
 Effort is tracked for patch deployment
 Tickets are closed when the patch is known to have been deployed
 Patch deployments can be tracked on a per-host basis
This metric does not include in-progress tickets or patches that have not been deployed.
References
Cavusoglu, Cavusoglu, and Zhang. Economics of Security Patch Management. 2006.
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability Management Program. US National Institute of Standards and Technology, 2005.
Configuration Management Metrics
This section describes metrics for measuring security around configuration management in an organization's environment. Configuration management is important to organizations for both the deployment and ongoing management of systems.
The goal of configuration management is to provide control over the state of the IT infrastructure. Configuration management covers processes for the identification, recording, and reporting of the configuration state of the IT infrastructure. Some aspects of configuration management are also covered by other sets of security metrics, such as security patch management and vulnerability management.
Configuration management processes include: identification of IT components, establishing control and authorization over changes, monitoring the status of configuration items, and verification and audit.
Key questions in this business function are:
 What systems are in the organization?
 Are these systems configured as intended?
 What are the exceptions to intended configurations?
The initial metrics for this business function are:
 Configuration Compliance
 Configuration Assessment Coverage
 AV/AM Compliance
Table 32: Technologies Table
The following is a list of attributes that should be populated as completely as possible for each technology:
Technologies Table
Name Type De-
identi
fied
Required Description
The CIS Security Metrics v1.1.0 November 1, 2010

82 | P a g e
2010 The Center for Internet Security
Technol
ogy ID
Text
/
Num
ber
No Yes Uni que i dentifier for the technology. Generally auto-
generated.
Name Text No No Name from CPE Di cti onary which follows the fol lowing
structure:
cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDAT
E}:{EDITION}:{LANGUAGE}.
Part Text No No Pl atform. Use value: H, O, or A. H, O, and A represent
hardware, operating system, and application
envi ronment respectively.
Vendor Text No No Vendor from CPE Di ctionary. This i s the hi ghest
organization-specific l abel of the DNS name.
Product Text No No Product from CPE Di ctionary. Thi s is the most
recogni zable name of the product.
Versi on Text No No Versi on from CPE Di ctionary. Same format as seen
wi th the product.
Update Text No No Update or service pack i nformation from CPE
Di cti onary.
Edi ti on Text No No Edi ti on from CPE Di ctionary. May defi ne specific target
hardware and software architectures.
Languag
e
Text No No Language from CPE Dictionary.
Technol
ogy
Val ue
Text No Recomme
nded
Impact from the l oss of this technology (C/I/A) to the
organization. Uses value Low, Medium, High, or Not
Defined.
12

Busi ness
Uni t
Text No No Organizational business unit that the technology
bel ongs to.

12
This is adopting 2.3.3 Security Requirements Scoring Evaluation from CVSS v2, http://www.first.org/cvss/cvss-guide.ht ml#i2.3.
The CIS Security Metrics v1.1.0 November 1, 2010

83 | P a g e
2010 The Center for Internet Security
Owner Text No No Uni que i dentifier for individual within the organization
that i s responsible for the technology.
Cl assific
ati on
Text No No Cl assification of technology: Servers, Workstations,
Laptops, Network Device, Storage Devi ce,
Appl i cations, Operating systems

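The Name attribute follows the CPE 2.2 URI structure shown above, so it can be split mechanically into the Part, Vendor, Product, Version, Update, Edition, and Language fields. A minimal sketch in Python (the `parse_cpe` helper and the sample name are illustrative, not part of the CIS definitions):

```python
def parse_cpe(name: str) -> dict:
    """Split a CPE 2.2-style URI of the form
    cpe:/{part}:{vendor}:{product}:{version}:{update}:{edition}:{language}
    into the attribute fields used by the Technologies Table."""
    if not name.startswith("cpe:/"):
        raise ValueError("not a cpe:/ name")
    fields = ["part", "vendor", "product", "version", "update", "edition", "language"]
    values = name[len("cpe:/"):].split(":")
    # Trailing components may be omitted in a CPE name; leave them empty.
    values += [""] * (len(fields) - len(values))
    return dict(zip(fields, values))

record = parse_cpe("cpe:/o:microsoft:windows_7:-:sp1")
print(record["vendor"], record["product"])  # prints: microsoft windows_7
```

A populated record of this shape can then be stored directly against the table attributes above.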
Table 33: Configuration Status Accounting Table
The following is a list of attributes that should be populated as completely as possible for each technology:

Configuration Status Accounting Table
Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Baseline | Text | No | No | Description of the baseline.
Change ID | Text/Number | No | Yes | Unique identifier for the change, if applicable.
Table 34: Configuration Deviation Table
The following is a list of technologies with requests for deviation:

Configuration Deviation Table
Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Deviation ID | Text/Number | No | Yes | Unique identifier for the deviation request. Generally auto-generated.
Requested By | Text/Number | No | No | Unique identifier of the entity submitting the deviation request.
Request Date | Date/Time | No | No | Date and time the deviation request was submitted.
Reason | Text | No | No | Reason for the deviation.
Status | Text | No | No | Current status of the deviation request. Use values Pending, Approved, or Not Approved.
Approval By | Text/Number | No | No | Unique identifier of the entity approving or not approving the deviation request.
Approval Date | Date/Time | No | No | Date and time the deviation request was approved or not approved.
Table 35: Configuration Audit Table
The following is a list of technologies that have undergone configuration audit:

Configuration Audit Table
Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Audit ID | Text/Number | No | Yes | Unique identifier for the configuration audit.
Audit Date | Date/Time | No | No | Date and time the configuration audit occurred.
Audit By | Text/Number | No | No | Unique identifier for the entity that conducted the configuration audit.
Compliant | Boolean | No | No | Whether or not the technology is compliant with configuration standards. Use values Compliant or Not Compliant.
Defined Metrics
Percentage of Configuration Compliance
Objective
The goal of this metric is to provide an indicator of the effectiveness of an organization's configuration management policy relative to information security, especially against emerging exploits. If 100% of systems are configured to standard, then those systems are relatively more secure and manageable. If this metric is less than 100%, then some systems are relatively more exposed to exploits and to unknown threats.
Table 36: Percentage of Configuration Compliance
Metric Name: Percentage of Configuration Compliance
Version: 1.0.0
Status: Final
Description: This document defines a metric for the effectiveness of configuration management in the context of information security. A percentage metric will allow benchmarking across organizations.
This metric attempts to answer the question "Are system configuration compliance levels acceptable?" This question presumes the organization has defined an acceptable level of compliance, which may be less than 100% to account for the realities of ongoing change in operational environments.
The metric is the percentage of total computer systems in an organization that are configured in compliance with the organization's approved standards. Compliance is a binary evaluation: a given system is either configured correctly according to the standard or it is not. Compliance can be evaluated by automated methods, manual inspection, audit, or some combination.
The computer system population base is the total number of computer systems with approved configuration standards. This may be all systems or only a subset (i.e., only desktops, or only servers, etc.).
The configuration benchmark used is the CIS benchmark if available (http://cisecurity.org). Additional metric results can be calculated for other or internal configuration benchmarks.
Organizations that do not have approved standards for their computer systems should report "N/A" rather than a numeric value (0% or 100%).
In Scope
Examples of percentage of systems configured to approved standard could include:
 Configuration of servers
 Configuration of workstations/laptops
 Configuration of hand-held devices
 Configuration of other supported computer systems covered by the organization's configuration policy

Out of Scope
Examples of computer system configurations that are not in scope include:
 Temporary guest systems (contractors, vendors)
 Lab/test systems performing to or in support of a specific non-production project
 Networking systems (routers, switches, access points)
 Storage systems (i.e., network-accessible storage)
Type: Management
Audience: Business Management, Security Management
Question: What percentage of the organization's systems are in compliance with approved standards?
Answer: A positive integer value between zero and 100 inclusive, expressed as a percentage. A value of 100% indicates that all in-scope systems are configured in compliance with approved standards.
Formula: Percentage of Configuration Compliance (PCC) is calculated by determining the total number of in-scope systems with approved configuration and dividing by the total number of in-scope systems:

PCC = Count(In_Scope_Systems_With_Approved_Configuration) / Count(In_Scope_Systems)

Units: Percentage of systems
Frequency: Monthly
Targets: The expected trend for this metric over time is to remain stable or increase towards 100%.
Sources: Configuration management and IT support tracking system audit reports will provide compliance status. Automated testing tools for CIS benchmarks are also available.
Visualization: Bar Chart
X-axis: Time (Month)
Y-axis: PCC (%)
Usage
The Percentage of Configuration Compliance (PCC) represents the overall compliance with configuration policies. The higher the PCC, the more consistent the organization's systems are and the easier it is to establish and maintain those systems.
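As a sketch of the PCC calculation, assuming per-system compliance results have already been gathered (the data layout here is hypothetical, loosely following the Compliant field of the Configuration Audit Table):

```python
# Hypothetical audit results: (technology_id, compliant) for each in-scope system.
audits = [
    ("T1", True), ("T2", True), ("T3", False), ("T4", True),
]

def pcc(audits):
    """Percentage of in-scope systems configured in compliance with approved standards."""
    if not audits:
        return None  # mirrors the definition: report "N/A" when there is no population base
    compliant = sum(1 for _, ok in audits if ok)
    return 100.0 * compliant / len(audits)

print(pcc(audits))  # 3 of 4 systems compliant -> 75.0
```

Returning None rather than 0% or 100% for an empty population follows the definition's instruction to report "N/A" when no approved standards exist.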
Limitations
 This metric relies on the organization being able to identify all systems that are under configuration management. Some systems may be exempt from configuration management policies.
 This metric relies on the organization being able to verify that each system is in compliance with configuration policies.
References
Center for Internet Security, Benchmarks (http://cisecurity.org)
IEEE Standard 828-1990, Software Configuration Management Plans.
ISO/IEC 12207:2008, Information technology - Software life cycle processes, and ISO/IEC 15288:2008, Information technology - System life cycle processes.
Ross, Katzke, Johnson, Swanson, Stoneburner and Rogers. Special Publication 800-53: Recommended Security Controls for Federal Information Systems (Rev 2). US National Institute of Standards and Technology, 2007.
Chew, Swanson, Stine, Bartol, Brown and Robinson. Special Publication 800-55: Performance Measurement Guide for Information Security (Rev 1). US National Institute of Standards and Technology, 2008.
Change Management Metrics
This section describes metrics for measuring security around change management in an organization's environment.
Changes are likely to be constantly occurring in large and complex environments. Managers will want to know how these changes impact the security of their systems and need metrics that answer questions such as:
 How much change is happening?
 How frequently are we making changes?
 How quickly can changes be implemented?
 Do we know the security impacts of these changes?
 Are we deviating from existing security policies?
The following initial set of metrics for Change Management is designed to provide managers with information on the organization's ability to implement change, to understand the security impacts of those changes, and how these changes affect their overall risk profile.
1. Mean Time to Complete Changes. The average time taken to complete change requests.
2. Percent of Changes with Security Review. The percentage of completed change requests that had a review of the security impacts.
3. Percent of Changes with Security Exceptions. The percentage of completed changes that received an exception to current security policy.
Data Attributes
The following is a list of attributes that should be populated as completely as possible for each change data record. These attributes were derived from the ITIL v3 Request for Change data record.[13] Please note that some fields in the Request for Change record are not documented here because they are not needed for change metrics calculations.
Table 37: Technologies Table
The following is a list of attributes that should be populated as completely as possible for each technology:

[13] S. Kempter and A. Kempter, ITIL V3 Checklist Request for Change RFC, 2008. <http://wiki.en.it-processmaps.com/index.php/Checklist_Request_for_Change_RFC>
Technologies Table
Name | Type | De-identified | Required | Description
Technology ID | Text/Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Name | Text | No | No | Name from CPE Dictionary, which follows the structure cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDATE}:{EDITION}:{LANGUAGE}.
Part | Text | No | No | Platform. Use value H, O, or A, representing hardware, operating system, and application environment respectively.
Vendor | Text | No | No | Vendor from CPE Dictionary. This is the highest organization-specific label of the DNS name.
Product | Text | No | No | Product from CPE Dictionary. This is the most recognizable name of the product.
Version | Text | No | No | Version from CPE Dictionary. Same format as seen with the product.
Update | Text | No | No | Update or service pack information from CPE Dictionary.
Edition | Text | No | No | Edition from CPE Dictionary. May define specific target hardware and software architectures.
Language | Text | No | No | Language from CPE Dictionary.
Technology Value | Text | No | Recommended | Impact from the loss of this technology (C/I/A) to the organization. Uses values Low, Medium, High, or Not Defined.[14]
Business Unit | Text | No | No | Organizational business unit that the technology belongs to.
Owner | Text | No | No | Unique identifier for the individual within the organization that is responsible for the technology.
Classification | Text | No | No | Classification of technology: Servers, Workstations, Laptops, Network Devices, Storage Devices, Applications, Operating Systems.
Table 38: Change Exemption Table
This table lists technologies that are exempt from changes.

Change Exemption Table
Name | Type | De-identified | Required | Description
Change Request ID | Number | No | Yes | Unique identifier for the change request. Generally auto-generated.
Technology ID | Text/Number | No | Yes | One-to-many reference to technologies that should undergo change.
Exempt By | Text | Yes | No | Unique identifier of the person who approved the exemption.
Exemption Date | Date/Time | No | No | Date and time the technology was exempted.
Reason | Text | No | No | Reason why the technology was exempted.
[14] This is adopted from 2.3.3 Security Requirements Scoring Evaluation of CVSS v2, http://www.first.org/cvss/cvss-guide.html#i2.3.
Table 39: Change Request Table
Change Request contains information regarding the approval of change requests.

Change Request Table
Name | Type | De-identified | Required | Description
Change Request ID | Number | No | Yes | Unique identifier for the change request. Generally auto-generated.
Submission Date | Date/Time | No | No | Date and time the change item was submitted.
Requested By | Text | Yes | No | Unique identifier of the person that submitted the change.
Priority | Text | No | No | How soon the request should take place. Uses values High, Medium, and Low.
Change Type | Text | No | No | Type of change. Use values Architectural, Component, or Emergency.
Estimated Cost | Text | No | No | Estimated cost of the change in level of effort or actual dollar amounts.
Status | Text | No | No | Current status of the request. Use values Approved, Not Approved, or Pending Approval.
Approval Date | Date/Time | No | No | Date and time the request was approved or disapproved.
Approved By | Text | Yes | No | Unique identifier of the person who approved the change.
Technology ID | Text/Number | No | No | One-to-many reference to technologies that should undergo configuration change.
Table 40: Change Item Table
This table lists configuration changes that occurred on technologies within organizations.

Change Item Table
Name | Type | De-identified | Required | Description
Change ID | Number | No | Yes | Unique identifier for the change. Generally auto-generated.
Change Request ID | Number | No | Yes | Unique identifier for the change request. Generally auto-generated.
Changed By | Text | Yes | No | Unique identifier of the individual that performed the change.
Technology ID | Text/Number | No | No | One-to-many reference to the technologies that underwent configuration change.
Scheduled Start Date | Date/Time | No | No | Suggested date and time for the change.
Start Date | Date/Time | No | No | Date and time the change started. May use Approval Date.
Completion Date | Date/Time | No | No | Date and time the change was completed.
Table 41: Change Review Table
This table lists change requests that were reviewed following a change.

Change Review Table
Name | Type | De-identified | Required | Description
Change ID | Number | No | Yes | Unique identifier for the change. Generally auto-generated.
Change Request ID | Number | No | Yes | Unique identifier for the change request. Generally auto-generated.
Technology ID | Text/Number | No | No | One-to-many reference to technologies that should undergo configuration change.
Actual Cost | Text | No | No | Actual cost of the change in level of effort or actual dollar amounts.
Change Review ID | Text/Number | No | Yes | Unique identifier for the change review.
Reviewed By | Text | Yes | No | Unique identifier of the person who reviewed the change.
Reviewed Date | Date/Time | No | No | Date and time the change was reviewed.
Results | Text | No | No | Results of the change review. Use values In Compliance or Not in Compliance.
Compliant | Boolean | No | No | Whether or not the change was completed in accordance with policy. Use values Compliant or Not Compliant.
Diagram 5: Relational Diagram for Configuration Management Data Attributes
The diagram below shows the relationships of the tables described in Configuration Management Data Attributes:
Change Request
  PK   Change Request ID   CHAR(10)
  FK1  Technology ID       CHAR(10)
       Submission Date     DATETIME
       Requested By        CHAR(10)
       Priority            CHAR(10)
       Purpose             CHAR(10)
       Change Type         CHAR(10)
       Estimated Cost      CURRENCY
       Status              CHAR(10)
       Approval Date       DATETIME
       Approved By         CHAR(10)

Exempt Technologies
  PK,FK2  Technology ID       CHAR(10)
  FK1     Change Request ID   CHAR(10)
          Exempt By           CHAR(10)
          Exemption Date      DATETIME
          Reason              TEXT(50)

Change Item
  PK,FK1  Change Request ID   CHAR(10)
  PK      Change ID           CHAR(10)
  PK      Technology ID       CHAR(10)
          Changed By          CHAR(10)
          Scheduled Start Date DATETIME
          Start Date          DATETIME
          Completion Date     DATETIME

Change Review
  PK          Change Review ID   CHAR(10)
  PK,FK1      Change ID          CHAR(10)
  PK,FK1,FK2  Change Request ID  CHAR(10)
  PK,FK1     Technology ID      CHAR(10)
              Actual Cost        CURRENCY
              Reviewed By        CHAR(10)
              Reviewed Date      DATETIME
              Results            TEXT(50)
              Compliant          BIT

Technologies
  PK  Technology ID      CHAR(10)
      Name               TEXT(10)
      Part               CHAR(1)
      Vendor             TEXT(10)
      Product            TEXT(10)
      Version            TEXT(10)
      Update             CHAR(10)
      Edition            TEXT(10)
      Language           TEXT(10)
      Technology Value   CHAR(10)
      Business Unit      CHAR(10)
      Owner              CHAR(10)
      Classification     CHAR(10)
Classifications
Tagging of information is a valuable way to provide context to collected data records. Classification tags provide a way to group change requests, requesting parties, affected business applications or technologies, implementation teams, and change approval and review methods.
Within an organization, the combination of dimensions can provide key insight into concentrations of risk, such as urgent requests on critical applications or changes to critical applications without security review.
Sources
The primary data source for these metrics is a configuration management system or a change-control tracking system.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These dimensions should be applied or tagged at the level of the underlying change record as described in Configuration Management Metrics: Data Attributes. For example:
 Priority of the change request
 Group requesting the change
 Whether or not security reviews were involved
 Location or business unit of the changed technology
 Results of the security review
 Importance of the technology to the organization requiring the change request
Automation
The ability to automate the source data collection for these metrics is medium because most organizations maintain a tracking system for configuration changes, although these systems may vary in their degree of automation. Once the initial dataset has been collected, use of the dataset can be automated for metric calculation purposes.
Visualization
Configuration change metrics may be visually represented in several ways:
 Simple visualizations may include a table showing metric results for the organization, with each row displaying the value as of a selected time period (each week or each month). Columns may be used for different request priority levels (e.g., Low, Medium, and High).
 Graphical visualizations may include time-series charts where metric results are plotted on the vertical axis and the time periods are displayed on the horizontal. To provide maximum insight, plotted values for each period may include stacked series for the differing request priorities.
 Complex visualizations should be used for displaying metric results for cross-sections, such as by organization or request priority. For example, small multiples could be used to compare the number of urgent change requests across business units or across values of the target technologies or applications.
Defined Metrics
Mean Time to Complete Changes
Objective
The goal of this metric is to provide managers with information on the average time it takes for a configuration change request to be completed.
Table 42: Mean Time to Complete Changes
Metric Name: Mean Time to Complete Changes
Version: 1.0.0
Status: Final
Description: The average time it takes to complete a configuration change request.
Type: Operational
Audience: Security Management
Question: What is the mean time to complete a change request?
Answer: A positive value greater than or equal to zero. A value of 0 indicates that the organization immediately implements changes.
Formula: The mean time to complete a change request is calculated by taking the difference between the date the request was submitted and the date the change was completed, for each change completed within the time period of the metric. This sum is then divided by the total number of changes completed during the metric's time period:

MTCC = Sum(Completion_Date - Submission_Date) / Count(Completed_Changes)

Units: Days per configuration change request
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: MTCC values should generally trend lower over time, provided operational system uptime is very high. This number will depend on the organization's business, structure, and use of IT. While a lower value indicates greater effectiveness at managing the IT environment, it should be examined in combination with the use of approval and change review controls. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Mean Time to Complete Changes exists.
Sources: Configuration management and IT support tracking systems will provide configuration change data.
Visualization: Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: MTCC (Days/Request)
Usage
Managers can use this metric to understand their ability to react to changing needs in their environment. The faster the approval cycle, the shorter the response time will be. The exact value that reflects a healthy environment will be subjective to the type of company; however, values should be similar for companies of the same size and business focus.
By focusing on high-value applications or urgent change requests, managers can improve their understanding of risk management capabilities. It is useful to pair this metric with data on the absolute number of changes in order to understand the effectiveness of the organization's change management capabilities.
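The MTCC formula can be sketched as follows, assuming completed change records carry submission and completion timestamps as in the Change Request and Change Item tables (the sample data is hypothetical):

```python
from datetime import datetime

# Hypothetical completed change requests: (submission_date, completion_date).
completed = [
    (datetime(2010, 10, 1), datetime(2010, 10, 4)),  # 3 days
    (datetime(2010, 10, 2), datetime(2010, 10, 9)),  # 7 days
]

def mtcc(completed):
    """Mean Time to Complete Changes, in days, over changes completed in the period."""
    total_days = sum((done - submitted).days for submitted, done in completed)
    return total_days / len(completed)

print(mtcc(completed))  # (3 + 7) / 2 -> 5.0
```

Only changes completed within the reporting period belong in the input list, matching the "only completed changes" limitation below.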
Limitations
Only completed changes. This metric only calculates the result for changes that have been completed during the time period. Changes that have not occurred will not influence the metric results until they are completed, perhaps several reporting periods later. This may over-report performance while the changes are not completed and under-report performance after the changes have been completed.
Scheduled changes. Changes that have been submitted with a scheduled change date may result in metric values that do not provide material information. The time taken for the change request to be approved and any delays due to work queue volumes should be considered, but not time in which a change request is not being processed in some manner.
Variations in the scale of changes. All changes are weighted equally for this metric, regardless of the level of effort required or the priority of the request. Organizations wanting increased precision could group results by categories of change size (e.g., Large, Medium, Small) or normalize based on level of effort.
References
S. Kempter and A. Kempter, ITIL V3 Checklist Request for Change RFC, 2008. <http://wiki.en.it-processmaps.com/index.php/Checklist_Request_for_Change_RFC>
S. Kempter and A. Kempter, ITIL V3 Configuration Management Process, 2008. <http://wiki.en.it-processmaps.com/index.php/Change_Management>
A. Riley et al. Open Guide: ITIL Configuration Management, 2008. <http://www.itlibrary.org/index.php?page=Configuration_Management>
Percent of Changes with Security Review
Objective
The goal of this metric is to provide managers with information about the amount of changes and system churn in their environment that have an unknown impact on their security state.
Table 43: Percent of Changes with Security Review
Metric Name: Percent of Changes with Security Review
Version: 1.0.0
Status: Final
Description: This metric indicates the percentage of configuration or system changes that were reviewed for security impacts before the change was implemented.
Type: Management
Audience: Business Management, Security Management
Question: What percentage of changes received security reviews?
Answer: A positive integer value between zero and one hundred that represents a percentage. A value of 100% indicates that all changes received security reviews during the metric time period.
Formula: The Percent of Changes with Security Review (PCSR) metric is calculated by counting the number of completed configuration changes that had a security review during the metric time period, divided by the total number of configuration changes completed during the metric time period:

PCSR = Count(Completed_Changes_with_Security_Reviews) / Count(Completed_Changes) * 100

Units: Percentage of configuration changes
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: PCSR values should trend higher over time. Generally speaking, change management processes should contain review and approval steps that identify potential business and security risks. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Percent of Changes with Security Review exists.
Sources: Configuration management and IT support tracking systems will provide configuration change data.
Visualization: Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: PCSR (%)
Usage
Managers can use this metric to understand the degree to which changes with unknown security impacts are occurring in their environment. The metric results indicate the amount of churn that has a known impact on the intended security model of the organization. As changes with unknown security implications accumulate, it would be expected that the security model of these systems would degrade.
By focusing on changes to high-value applications and technologies or key business units, managers can understand the degree to which security risks may be introduced to these systems.
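A sketch of the PCSR calculation, assuming a list of completed change IDs and the set of change IDs that appear in the Change Review table (names and data are hypothetical):

```python
# Hypothetical completed changes and the change IDs that received a security review,
# as would be drawn by joining the Change Item and Change Review tables.
completed_changes = ["C1", "C2", "C3", "C4", "C5"]
reviewed = {"C1", "C3", "C4"}

def pcsr(completed, reviewed):
    """Percent of completed changes that had a security review."""
    had_review = sum(1 for c in completed if c in reviewed)
    return 100.0 * had_review / len(completed)

print(pcsr(completed_changes, reviewed))  # 3 of 5 -> 60.0
```

In practice the `reviewed` set would be restricted to reviews dated within the metric time period.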
Limitations
Only completed changes. This metric only calculates the results for changes that have been completed during the time period. Changes in security review policies may not be reflected in this metric if the affected changes have not been completed in the metric time period.
Variations in the scale of changes. All changes are weighted equally for this metric, regardless of the level of effort required or the priority of the request. Organizations wanting increased precision could group results by categories of change size (e.g., Large, Medium, Small) or normalize based on level of effort.
References
S. Kempter and A. Kempter, ITIL V3 Checklist Request for Change RFC, 2008. <http://wiki.en.it-processmaps.com/index.php/Checklist_Request_for_Change_RFC>
S. Kempter and A. Kempter, ITIL V3 Configuration Management Process, 2008. <http://wiki.en.it-processmaps.com/index.php/Change_Management>
A. Riley et al. Open Guide: ITIL Configuration Management, 2008. <http://www.itlibrary.org/index.php?page=Configuration_Management>
Percent of Changes with Security Exceptions
Objective
The goal of this metric is to provide managers with information about the potential risks to their environment resulting from configuration or system changes exempt from the organization's security policy.
Table 44: Percent of Changes with Security Exceptions
Metric Name: Percent of Changes with Security Exceptions
Version: 1.0.0
Status: Final
Description: This metric indicates the percentage of configuration or system changes that received an exception to existing security policy.
Type: Operational
Audience: Security Management
Question: What percentage of changes received security exceptions?
Answer: A positive integer value between zero and one hundred, expressed as a percentage. A value of 100% indicates that all changes received exceptions.
Formula: The Percent of Changes with Security Exceptions (PCSE) metric is calculated by counting the number of completed configuration changes that received security exceptions during the metric time period, divided by the total number of configuration changes completed during the metric time period:

PCSE = Count(Completed_Changes_with_Security_Exceptions) / Count(Completed_Changes) * 100

Units: Percentage of configuration changes
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: PCSE values should trend lower over time. Generally speaking, exceptions made to security policies increase the complexity and difficulty of managing the security of the organization. Because of the lack of experiential data from the field, no consensus on the range of acceptable goal values for Percent of Changes with Security Exceptions exists.
Sources: Configuration management and IT support tracking systems will provide configuration change data.
Visualization: Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: PCSE (%)
Usage
Managers can use this metric to understand their exposure in terms of the percentage of changes
granted exceptions to their security policy. While exceptions may be granted based on negligible
risk or additional controls, it is possible that accumulated change exceptions could degrade the
security posture. By focusing on exceptions granted to changes to high-value applications and
technologies, or key business units, managers can focus their attention and resources and
increase their understanding of the degree to which security risks may be introduced to these
systems.
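As an illustration only, the PCSE formula can be computed from a list of completed change records. The record layout below is hypothetical, not part of the CIS data attributes:

```python
# Hypothetical completed-change records for one metric time period.
# The field names here are illustrative, not part of the CIS schema.
completed_changes = [
    {"change_id": 1, "security_exception": False},
    {"change_id": 2, "security_exception": True},
    {"change_id": 3, "security_exception": False},
    {"change_id": 4, "security_exception": True},
]

def pcse(changes):
    """Percent of Changes with Security Exceptions for one metric time period."""
    if not changes:
        return 0.0
    with_exceptions = sum(1 for c in changes if c["security_exception"])
    return with_exceptions / len(changes) * 100

print(pcse(completed_changes))  # 50.0
```

Changes still in progress are simply absent from the input list, matching the "only completed changes" limitation below.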
Limitations
Only completed changes. This metric only calculates results for changes that have been
completed during the time period. Changes in progress are not included in this metric if
they have not been completed in the metric time period.
Variations in the scale of changes. All changes are weighted equally for this metric, without
regard to the amount of effort required. For a better understanding of the scale of
exceptions, organizations should group results by categories of change size (Large, Medium,
Small) or normalize based on the scale of the change.
Dependency on security reviews. Security exceptions may only have been granted for systems
that received security reviews. Changes implemented without security reviews may include
unknown and untracked exceptions to security policies.
References
S. Kempter and A. Kempter, ITIL V3 Checklist Request for Change RFC, 2008.
<http://wiki.en.it-processmaps.com/index.php/Checklist_Request_for_Change_RFC>
S. Kempter and A. Kempter, ITIL V3 Configuration Management Process, 2008.
<http://wiki.en.it-processmaps.com/index.php/Change_Management>
A. Riley et al., Open Guide ITIL Configuration Management, 2008.
<http://www.itlibrary.org/index.php?page=Configuration_Management>
Application Security Metrics
This section describes metrics for measuring security around the business applications in an
organization's environment.
Business applications perform many functions, from order processing to inventory
management. Organizations are increasingly dependent on business applications, especially
applications connected to the Internet for transactions between customers, suppliers, business
units and employees.
While an individual application may be more or less critical than another, all managers want to
know whether they can rely on their business applications to function reliably as intended.
Security issues with business applications can put both information assets and the
capability to operate at risk.
The security of these business applications depends on several factors:
- Design of the underlying security model
- Selection and incorporation of component technologies
- Development of the applications, through software development and integration processes
- Underlying infrastructure such as the operating systems and applications
The following initial set of metrics for Application Security is designed to provide managers
with information on the distribution of the types of applications they are managing, what the
known risks to those applications are, and how well their applications have been examined for
weaknesses:
1. Number of Applications. The absolute number of applications provides a useful
measure that allows an organization to understand what it has and to interpret
the results provided by other metrics. As a key indicator of risk, the number of critical
and high-value applications should be viewed.
2. Percentage of Critical Applications. This metric identifies the percentage of an
organization's applications that are critical to its operations. This helps the organization
understand its relative level of exposure to application security risks.
3. Risk Assessment Coverage. This metric examines the percentage of applications that
have undergone a risk assessment. Understanding the percentage of applications that
have had a risk assessment performed gives managers a better understanding
of the risks among their applications. A key risk indicator is the Risk Assessment
Coverage for high-value applications.
4. Security Testing Coverage. The percentage of post-deployment applications that have
undergone material security testing for weaknesses is a key indicator of the level of
application security risk.
Data Attributes
The following is a list of attributes that should be populated as completely as possible for each
application security data record.
Table 45: Technologies Table
The following is a list of attributes that should be populated as completely as possible for each
technology:
Technologies Table
Name | Type | De-identified | Required | Description
Technology ID | Text / Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Name | Text | No | No | Name from CPE Dictionary, which follows the structure: cpe:/{PART}:{VENDOR}:{PRODUCT}:{VERSION}:{UPDATE}:{EDITION}:{LANGUAGE}.
Part | Text | No | No | Platform. Use value H, O, or A, representing hardware, operating system, and application environment respectively.
Vendor | Text | No | No | Vendor from CPE Dictionary. This is the highest organization-specific label of the DNS name.
Product | Text | No | No | Product from CPE Dictionary. This is the most recognizable name of the product.
Version | Text | No | No | Version from CPE Dictionary. Same format as seen with the product.
Update | Text | No | No | Update or service pack information from CPE Dictionary.
Edition | Text | No | No | Edition from CPE Dictionary. May define specific target hardware and software architectures.
Language | Text | No | No | Language from CPE Dictionary.
Technology Value | Text | No | Recommended | Impact from the loss of this technology (C/I/A) to the organization. Uses values Low, Medium, High, or Not Defined. [16]
Business Unit | Text | No | No | Organizational business unit that the technology belongs to.
Owner | Text | No | No | Unique identifier for the individual within the organization who is responsible for the technology.
Classification | Text | No | No | Classification of technology: Servers, Workstations, Laptops, Network Devices, Storage Devices, Applications, Operating Systems.
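The CPE name structure shown in the table can be split into its attribute fields with a short sketch. This assumes the CPE 2.2 URI form (`cpe:/part:vendor:product:...`); the function name is illustrative:

```python
def parse_cpe_uri(cpe):
    """Split a CPE 2.2 URI of the form
    cpe:/{part}:{vendor}:{product}:{version}:{update}:{edition}:{language}
    into the attribute fields used by the Technologies Table."""
    prefix = "cpe:/"
    if not cpe.startswith(prefix):
        raise ValueError("not a CPE 2.2 URI")
    fields = ["Part", "Vendor", "Product", "Version", "Update", "Edition", "Language"]
    values = cpe[len(prefix):].split(":")
    # Trailing empty components may be omitted in a CPE URI; pad them out.
    values += [""] * (len(fields) - len(values))
    return dict(zip(fields, values))

record = parse_cpe_uri("cpe:/a:apache:http_server:2.2.31")
print(record["Part"], record["Vendor"], record["Version"])  # a apache 2.2.31
```

Note that CPE URIs use lowercase part codes (h/o/a), while the table records the value as H, O, or A.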
Table 46: Business Applications Table
This table contains information regarding an organization's business applications.

Business Applications Table
Name | Type | De-identified | Required | Description
Application ID | Number | No | Yes | Unique identifier for the application. Generally auto-generated.
Application Name | Text | Yes | No | The name of the business application from CPE Dictionary. This is the most recognizable name of the product.
Version | Text | No | No | Version from CPE Dictionary. Same format as seen with the product.
Vendor | Text | No | No | Vendor from CPE Dictionary. This is the highest organization-specific label of the DNS name.
Language | Text | No | No | Language from CPE Dictionary. Use value C/C++, JSP, ASP, .NET, J2EE, CGI, Perl, PHP, Web Services, or Other.
Web-Enabled | Boolean | No | No | Whether or not the application is web-enabled.
In-House Development | Number | No | No | Percentage of the application developed in-house, if applicable.
Vendor Development | Number | No | No | Percentage of the application developed by the vendor, if applicable.
Custom | Number | No | No | Percentage of the application that was customized, if applicable.
Database | Text | No | No | Primary database used. Use Oracle, MySQL, SQLServer, or Other.
Application Value | Text | No | Recommended | A value that indicates the impact from the loss of this business system to the organization. Use values Low, Medium, High, or Not Defined.
Owner | Text | Yes | No | Unique identifier for the individual responsible for the application.
Hosting | Boolean | No | No | Whether the application is managed internally or externally. Use values Internal or External.

16. This adopts section 2.3.3, Security Requirements Scoring Evaluation, from CVSS v2, http://www.first.org/cvss/cvss-guide.html#i2.3.
Table 47: Business Application Status Table
Current status of all business applications within the organization:

Business Application Status Table
Name | Type | De-identified | Required | Description
Application ID | Number | No | Yes | Unique identifier for the application. Generally auto-generated.
Technology ID | Text / Number | No | Yes | Unique identifier for the technology. Generally auto-generated.
Application Status | Text | No | No | Indicator of the application's current status. Uses values In Testing, In Development, or Production.
Status Changed Date | Date / Time | No | No | Date and time when the application status was last changed.
Status Changed By | Text | No | No | Unique identifier for the entity that updated the application status.
Table 48: Risk Assessments Table
This table contains information on the risk assessments performed in the organization.
Currently, for the initial set of metrics, relatively few fields are required. Organizations can
include additional fields to enhance their ability to measure and understand their risks.

Risk Assessments Table
Name | Type | De-identified | Required | Description
Assessment ID | Text / Number | No | Yes | Unique identifier for the assessment. Generally auto-generated.
Technology ID | Text / Number | No | Yes | Unique identifier for the technology the application resides on.
Date of Assessment | Date / Time | No | No | Date that the risk assessment was completed.
Application ID | Text / Number | Yes | Yes | Unique identifier for the application.
Assessed By | Text | No | No | Unique identifier of the entity conducting the assessment.
Assessment Type | Text | No | No | Methodology or process used for the risk assessment, such as FAIR, FRAP, OCTAVE, SOMAP, ISO 27005, or NIST 800-30.
Assessment Effort | Number | No | No | Total person-hours of the assessment.
Assessment Cost | Number | No | No | Total cost of the assessment.
Assessment Scope | Text | No | No | Scope of the risk assessment covering this application: organization, system, or application.
Compliant | Boolean | No | No | Whether or not the application is compliant with security control standards. Use values Compliant or Not Compliant.
Assessment Results | Text | No | No | Results of the assessment.
Table 49: Security Testing Table
This table contains information about security tests, such as manual penetration tests, static or
dynamic binary analysis, and other application security testing. Organizations can include
additional fields to enhance their ability to measure and understand their risks.

Security Testing Table
Column Name | Type | De-identified | Required | Column Description
Testing ID | Text / Number | No | Yes | Unique identifier for the test. Generally auto-generated.
Technology ID | Text / Number | No | Yes | Unique identifier for the technology the application resides on.
Date of Testing | Date / Time | No | No | Date that security testing was performed.
Application ID | Text / Number | Yes | Yes | Reference identifier for the application.
Tested By | Text | No | No | Unique identifier of the entity conducting the testing.
Test Type | Text | No | No | Methodology or process used for the security testing, such as Source Code Analysis, Static Binary Analysis, Dynamic Analysis, Fuzzing, or Penetration Testing.
Test Method | Text | No | No | Whether or not the security test was automated. Use values Manual or Automated.
Test Results | Text | No | No | Results of the testing.
Security Test Effort | Numeric | No | No | Total person-hours of test effort.
Security Test Cost | Numeric | No | No | Cost of the security test (USD).
Table 50: Business Application Weaknesses Table
Current mitigation status of weaknesses discovered on business applications.

Business Application Weaknesses Table
Column Name | Type | De-identified | Required | Column Description
Mitigation ID | Text / Number | No | Yes | Unique identifier for the mitigation ticket. Generally auto-generated.
Application ID | Text / Number | No | Yes | Unique identifier for the application. Generally auto-generated.
Technology ID | Text / Number | No | Yes | Unique identifier for the technology the application resides on.
CWE ID | Text / Number | No | Yes | Unique identifier for the weakness.
Discovered Date | Date/Time | No | No | Date and time the weakness was discovered. May be the same date the risk assessment or security testing was performed.
Discovered By | Text | No | No | Unique identifier of the entity that discovered the weakness. May be a Security Testing ID or Risk Assessment ID.
Status | Text | No | No | Current status of the mitigation effort. Use values Mitigated or Not Mitigated.
Priority | Text | No | No | How quickly the weakness should be mitigated. Use values High, Medium, or Low.
Type | Text | No | No | Type of weakness.
Mitigation Date | Date/Time | No | No | Date and time when the weakness was mitigated, if applicable.
Mitigated By | Text | No | No | Unique identifier for the entity that mitigated the weakness, if applicable.
Table 51: Most Dangerous Programming Errors Table
The CWE/SANS Top 25 Most Dangerous Programming Errors is a list of the most widespread and
critical programming errors that can lead to serious software vulnerabilities.

Most Dangerous Programming Errors Table
Column Name | Type | De-identified | Required | Column Description
CWE ID | Text / Number | No | Yes | Unique identifier for the weakness.
Name | Text | No | No | Name of the weakness.
Weakness Prevalence | Text | No | No | How often the weakness is encountered. Use values Limited, Medium, Common, High, or Widespread.
Remediation Cost | Text | No | No | The amount of effort required to fix the weakness. Use values Low, Medium, or High.
Attack Frequency | Text | No | No | How often the weakness occurs in vulnerabilities that are exploited by an attacker. Use values Rarely, Sometimes, or Often.
Consequences | Text | No | No | Impact on the organization should the weakness be exploited. Use values Code Execution, Security Bypass, Data Loss, or Denial of Service.
Ease of Detection | Text | No | No | How easy it is for an attacker to find this weakness. Use values Easy, Moderate, or Difficult.
Attacker Awareness | Text | No | No | The likelihood that an attacker will be aware of this particular weakness, methods for detection, and methods for exploitation. Use values Medium or High.
Discussion | Text | No | No | Discussion of the nature of the weakness and its consequences.
Prevention | Text | No | No | Steps to mitigate or eliminate the weakness.
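To illustrate how the Business Application Weaknesses and Most Dangerous Programming Errors tables relate through the CWE ID, here is a sketch that counts unmitigated weaknesses matching a Top 25 entry. The sample rows are invented for illustration:

```python
# Invented sample data; real rows would come from the two tables above.
top25_cwe_ids = {"CWE-79", "CWE-89", "CWE-120"}  # CWE IDs present in Table 51

weaknesses = [
    {"mitigation_id": "M1", "cwe_id": "CWE-89", "status": "Not Mitigated"},
    {"mitigation_id": "M2", "cwe_id": "CWE-200", "status": "Not Mitigated"},
    {"mitigation_id": "M3", "cwe_id": "CWE-79", "status": "Mitigated"},
]

def open_top25_weaknesses(weaknesses, top25):
    """Count unmitigated weaknesses whose CWE ID appears in the Top 25 table."""
    return sum(
        1 for w in weaknesses
        if w["status"] == "Not Mitigated" and w["cwe_id"] in top25
    )

print(open_top25_weaknesses(weaknesses, top25_cwe_ids))  # 1
```

Grouping such counts by application or business unit follows the same dimension approach described under Classifications below.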
Diagram 6: Relational Diagram for Application Management Data Attributes
The diagram below shows the relationship of the tables described in Application Management
Data Attributes:
Technologies
PK Technology ID CHAR(10)
Name TEXT(10)
Part CHAR(1)
Vendor TEXT(10)
Product TEXT(10)
Version TEXT(10)
Update CHAR(10)
Edition TEXT(10)
Language TEXT(10)
Technology Value CHAR(10)
Business Unit CHAR(10)
Owner CHAR(10)
Classification CHAR(10)
Business Application
PK Application ID CHAR(10)
Application Name TEXT(10)
Version TEXT(10)
Vendor TEXT(10)
Language TEXT(10)
Web-Enabled BIT
In-house Development SINGLE
Vendor Development SINGLE
Custom SINGLE
Database CHAR(10)
Application Value CHAR(10)
Owner CHAR(10)
Hosting BIT
Business Application Status
PK,FK1 Application ID CHAR(10)
PK,FK2 Technology ID CHAR(10)
Application Status CHAR(10)
Date of Status Change DATETIME
Status Changed By CHAR(10)
Business Application Risk Assessment
PK Assessment ID CHAR(10)
FK2 Application ID CHAR(10)
FK1 Technology ID CHAR(10)
Assessed By CHAR(10)
Date of Assessment DATETIME
Assessment Type CHAR(10)
Compliant BIT
Results TEXT(50)
Assessment Effort SHORT
Assessment Cost CURRENCY
Assessment Scope CHAR(10)
Business Application Security Testing
PK Testing ID CHAR(10)
FK2 Application ID CHAR(10)
FK1 Technology ID CHAR(10)
Tested By CHAR(10)
Date of Testing DATETIME
Test Type CHAR(10)
Test Method BIT
Results TEXT(50)
Test Effort SHORT
Test Cost CURRENCY
Most Dangerous Programming Errors
PK CWE ID CHAR(10)
Name TEXT(10)
Weakness Prevalence CHAR(10)
Remediation Cost CHAR(10)
Attack Frequency CHAR(10)
Consequences CHAR(10)
Ease of Detection CHAR(10)
Attacker Awareness CHAR(10)
Discussion LONGTEXT
Prevention LONGTEXT
Business Application Weaknesses
PK,FK4 Mitigation ID CHAR(10)
FK2 Application ID CHAR(10)
FK3 Technology ID CHAR(10)
FK1 CWE ID CHAR(10)
Discovered Date DATETIME
Discovered By CHAR(10)
Status CHAR(10)
Priority CHAR(10)
Type CHAR(10)
Mitigation Date DATETIME
Mitigated By CHAR(10)
Vulnerability Remediation
PK Vulnerability Remediation ID CHAR(10)
FK1 Vulnerability ID CHAR(10)
Technology ID CHAR(10)
Open Date DATETIME
Status CHAR(10)
Priority CHAR(10)
Close Date DATETIME
Closed By CHAR(10)
Vulnerability
PK Vulnerability ID CHAR(10)
Vulnerability Name TEXT(10)
CVE ID CHAR(10)
CWE ID CHAR(10)
Description TEXT(20)
Release Date DATETIME
Severity CHAR(10)
Classification CHAR(10)
Classifications
Tagging of information is a very valuable way to provide context to collected data records.
Classification tags provide a way to group change requests, requesting parties, affected
business applications or technologies, implementation teams, and change approval and review
methods.
It is expected that dimensions will be added to these tables to provide the ability to view metric
results that address key questions and concerns. Examples of dimensions that can be added to
the metric datasets include:
- Technologies: Application status, business unit, geography, business value, or technology category by technology
- Risk Assessments: Assessment method or development stage
- Security Testing: Testing effort, testing team, or test duration
Within an organization, the combination of dimensions can provide key insight into
concentrations of risk, such as the percentage of critical applications without
risk assessments or security testing.
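The "critical applications without risk assessments" cut could be sketched as follows. The application records and field names are invented for illustration; in practice the business-unit tag and assessment flag would come from the tables above:

```python
from collections import defaultdict

# Invented application records tagged with a business-unit dimension.
applications = [
    {"app_id": "A1", "business_unit": "Retail", "value": "High", "risk_assessed": False},
    {"app_id": "A2", "business_unit": "Retail", "value": "High", "risk_assessed": True},
    {"app_id": "A3", "business_unit": "Finance", "value": "High", "risk_assessed": False},
    {"app_id": "A4", "business_unit": "Finance", "value": "Low", "risk_assessed": False},
]

def unassessed_critical_by_unit(apps):
    """Percent of high-value applications with no risk assessment, per business unit."""
    totals, unassessed = defaultdict(int), defaultdict(int)
    for a in apps:
        if a["value"] != "High":
            continue
        totals[a["business_unit"]] += 1
        if not a["risk_assessed"]:
            unassessed[a["business_unit"]] += 1
    return {bu: unassessed[bu] / totals[bu] * 100 for bu in totals}

print(unassessed_critical_by_unit(applications))  # {'Retail': 50.0, 'Finance': 100.0}
```

The same grouping pattern applies to any other dimension, such as geography or assessment method.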
Business Application Value
Business applications will be rated for their value by adopting a simplified version of the
Common Vulnerability Scoring System (v2) section 2.3.3 Security Requirements Scoring
Evaluation ratings. Each business application is assigned one of three possible values,
Low, Medium, or High (or Not Defined), depending on the impact from loss of this
system to the business. These ratings are reproduced here:
Low (L). Loss of [confidentiality | integrity | availability] is likely to have only a
limited adverse effect on the organization or individuals associated with the
organization (e.g., employees, customers).
Medium (M). Loss of [confidentiality | integrity | availability] is likely to have a
serious adverse effect on the organization or individuals associated with the
organization (e.g., employees, customers).
High (H). Loss of [confidentiality | integrity | availability] is likely to have a
catastrophic adverse effect on the organization or individuals associated with the
organization (e.g., employees, customers).
Not Defined (ND). Assigning this value to the metric will not influence the score. It is
a signal to the equation to skip this metric.
As described in CVSS v2, these values should be based on network location, business function,
and the potential for loss of revenue or life, although no specific methodology is defined to
assign these values.
Sources
The data sources for these metrics are application tracking systems that contain applications
and their values, risk assessment tracking systems that contain the dates and results of
assessments, and security testing histories.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These
dimensions should be applied or tagged at the level of the underlying application record as
described in Application Security Metrics: Data Attributes. For example:
- Value of applications allows for analysis of the volume of applications that are of high, medium, or low value to the organization
- Location or business unit in the organization allows for the identification of concentrations of risk
- Assessment types and scope
- Development stage of the application
- Testing type, such as manual penetration, automated testing, or binary analysis
- Testing organizations (e.g. in-house or external consultants)
Automation
The ability to automate the source data collection for this metric is medium. While most
organizations maintain tracking systems for business applications, risk assessments and security
testing, these systems are generally maintained manually. Once the initial dataset has been
collected, the potential for ongoing automation is high.
Visualization
Application security metrics may be visually represented in several ways:
- Simple visualizations may include a table showing the number of applications for the organization, with each row displaying the value for selected time periods (each week or each month). Columns may be used for different application value levels (e.g. Low, Medium, High).
- Graphical visualizations may include time-series charts where the number of applications is plotted on the vertical axis and the time periods are displayed on the horizontal. To provide maximum insight, plotted values for each period may include stacked series for the differing values of applications.
- Complex visualizations should be used for displaying the number of applications for cross-sections such as by organization or asset value. For example, small multiples could be used to compare the number of high-value applications across business units.
Defined Metrics
Percentage of Critical Applications
Objective
This metric tracks the percentage of applications that are critical to the business.
Table 52: Percentage of Critical Applications
Metric Name Percentage of Critical Applications
Version 1.0.0
Status Final
Description The percentage of critical applications measures the percent of applications
that are critical to the organization's business processes, as defined by the
application's value rating.
Type Technical
Audience Security Operations
Question What percentage of the organization's applications are of critical value?
Answer A positive value equal to or greater than zero and less than or equal to one
hundred, reported as a percentage. A value of 100% indicates that all
applications are critical.
Formula The Percentage of Critical Applications (PCA) metric is calculated by dividing
the number of applications that have high value to the organization by the
total number of applications in the organization:

PCA = Count(Critical_Applications) / Count(Applications) * 100
Units Percent of applications
Frequency Weekly, Monthly, Quarterly, Annually.
Targets Because of the lack of experiential data from the field, no consensus on goal
values for the percentage of critical applications exists. The result will depend on
the organization's business and use of IT.
Usage
Managers can use this metric to gain a better understanding of the quantity of applications that
are critical to their organization. This metric provides a reference to the scale of the
organization's use of applications and assists managers in better understanding the scope
and scale of their application security risk.
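A minimal sketch of the PCA formula, assuming application records tagged with the Low/Medium/High value ratings described earlier (the record layout is illustrative):

```python
# Illustrative records; "value" follows the application-value ratings (Low/Medium/High).
applications = [
    {"app_id": "A1", "value": "High"},
    {"app_id": "A2", "value": "Low"},
    {"app_id": "A3", "value": "High"},
    {"app_id": "A4", "value": "Medium"},
]

def pca(apps):
    """Percentage of Critical Applications: high-value applications over all applications."""
    if not apps:
        return 0.0
    critical = sum(1 for a in apps if a["value"] == "High")
    return critical / len(apps) * 100

print(pca(applications))  # 50.0
```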
Limitations
Variations in application scope. Different organizations might count as a single application a
system that another organization would consider several distinct applications, resulting in
significantly different numbers of applications between organizations.
Variations in application scale. Applications within or across organizations might be
significantly different in size, so the level of effort required to assess, test or fix vulnerabilities
may vary between applications.
Risk Assessment Coverage
Objective
This metric reports the percentage of applications that have been subjected to risk
assessments.
Table 53: Risk Assessment Coverage
Metric Name Risk Assessment Coverage
Version 1.0.0
Status Final
Description Risk assessment coverage indicates the percentage of business applications
that have been subject to a risk assessment at any time.
Type Technical
Audience Security Operations
Question What percentage of applications have been subjected to risk assessments?
Answer A positive value between zero and one hundred, reported as a percentage.
A value of 100% would indicate that all applications have had risk
assessments.
Formula The metric is calculated by dividing the number of applications that have
been subject to any risk assessment by the total number of applications in
the organization:

RAC = Count(Applications_Undergone_Risk_Assessment) / Count(Applications) * 100
Units Percent of applications
Frequency Weekly, Monthly, Quarterly, Annually.
Targets RAC values should trend higher over time. A higher result indicates
that more applications have been examined for risks. Most security process
frameworks suggest or require risk assessments for applications deployed in
production environments. Because of the lack of experiential data from the
field, no consensus on the range of acceptable goal values for Risk
Assessment Coverage exists.
Usage
Managers can use this metric to evaluate their risk posture in terms of applications that have
undergone a risk assessment. A better understanding of the quantity of applications that have
not been exposed to a risk assessment allows the organization to evaluate its level of
unknown risk associated with these applications. With metric results for different dimensions it
is possible to identify and evaluate concentrations of risk, such as results for critical
applications or applications containing confidential information.
Sources
The data source for this metric is a risk assessment tracking system.
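Given an application inventory and the set of Application IDs that appear in any risk assessment record, RAC reduces to a set-membership count. The identifiers below are invented:

```python
# Invented identifiers; assessments reference applications by ID,
# as in the Risk Assessments Table.
application_ids = ["A1", "A2", "A3", "A4", "A5"]
assessed_app_ids = {"A1", "A4"}  # Application IDs appearing in any assessment record

def rac(app_ids, assessed):
    """Risk Assessment Coverage: share of applications with at least one assessment."""
    if not app_ids:
        return 0.0
    return sum(1 for a in app_ids if a in assessed) / len(app_ids) * 100

print(rac(application_ids, assessed_app_ids))  # 40.0
```

Using a set of assessed IDs means an application with several assessments is still counted once, matching the "at any time" wording of the definition.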
Limitations
Variations in application scope. Different organizations might count as a single application a
system that another organization would consider several distinct applications, resulting in
significantly different numbers of applications between organizations.
Variations in application scale. Applications within or across organizations might be
significantly different in size, so the level of effort required to assess, test or fix vulnerabilities
may vary between applications.
Depth of risk assessments. Risk assessments can vary in depth due to the methodology used,
the amount of time spent, and the quality of the assessment team.
Stage when assessed. Risk assessments can occur at varying times in an application's
development cycle, which may influence the assessment.
References
Web Application Security Consortium. Web Application Security Statistics Project.
http://www.webappsec.org/projects/statistics/
Security Testing Coverage
Objective
This metric indicates the percentage of the organization's applications that have been tested for
security risks.
Table 54: Security Testing Coverage
Metric Name Security Testing Coverage
Version 1.0.0
Status Final
Description This metric tracks the percentage of applications in the organization that
have been subjected to security testing. Testing can consist of manual or
automated white- and/or black-box testing and generally is performed on
systems post-deployment (although they could be in pre-production testing).
Studies have shown that there are material differences in the number and
type of application weaknesses found. As a result, testing coverage should
be measured separately from risk assessment coverage.
Type Technical
Audience Security Operations
Question What percentage of applications has been subjected to security testing?
Answer A positive value between zero and one hundred, reported as a percentage.
A value of 100% would indicate that all applications have had security
testing.
Formula This metric is calculated by dividing the number of applications that have
had post-deployment security testing by the total number of deployed
applications in the organization:

STC = Count(Applications_Undergone_Security_Testing) / Count(Deployed_Applications) * 100
Units Percent of applications
Frequency Weekly, Monthly, Quarterly, Annually.
Targets STC values should trend higher over time. Generally, the higher the value
and the greater the testing scope, the more vulnerabilities in the
organization's application set will be identified. A value of 100% indicates
that every application has been subject to post-deployment testing. Because
of the lack of experiential data from the field, no consensus on the range of
acceptable goal values for Security Testing Coverage exists.
Usage
Managers can use this metric to evaluate the degree to which applications have been tested for
weaknesses during the post-development phase (dimensions could be used to expand this
metric to cover various stages of the development lifecycle). Quantifying the applications not
subjected to security testing allows the organization to evaluate its application risk.
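Because the STC denominator is deployed applications only, a sketch needs to filter on the application status first. The records below are invented, with status values following the Business Application Status Table:

```python
# Illustrative records; "status" follows the Business Application Status Table values.
applications = [
    {"app_id": "A1", "status": "Production", "security_tested": True},
    {"app_id": "A2", "status": "Production", "security_tested": False},
    {"app_id": "A3", "status": "In Development", "security_tested": False},
    {"app_id": "A4", "status": "Production", "security_tested": True},
]

def stc(apps):
    """Security Testing Coverage over deployed (Production) applications only."""
    deployed = [a for a in apps if a["status"] == "Production"]
    if not deployed:
        return 0.0
    tested = sum(1 for a in deployed if a["security_tested"])
    return tested / len(deployed) * 100

print(stc(applications))  # two of three deployed applications tested
```

Note that A3 is excluded from both numerator and denominator because it is not yet deployed.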
Automation
The ability to automate source data collection for this metric is medium. While the results of
security testing are often maintained in a tracking system, these systems are generally
maintained manually. Once the initial dataset has been collected, use of the dataset can be
automated for metric calculation purposes.
Limitations
Variations in application scope. Different organizations might count as a single application a
system that another organization would consider several distinct applications, resulting in
significantly different numbers of applications between organizations.
Variations in application scale. Applications within or across organizations might be
significantly different in size, so the level of effort required to assess, test or fix vulnerabilities
may vary between applications.
Depth of security testing. Security tests can vary in depth due to the methodology used,
the amount of time spent, and the quality of the testing team.
References
Web Application Security Consortium. Web Application Security Statistics Project.
http://www.webappsec.org/projects/statistics/
Financial Metrics
The combination of security costs and security outcome metrics can be used to understand if
security spending is optimized, if projects meet their projected goals, and if organizations are
focusing on the right areas. If cost data is not available, it may be possible to use effort data
instead (e.g., FTEs and time). For instance, metrics centered around the effort involved in
security processes, such as the effort to remediate a vulnerability, can be used to improve
efficiency. Metrics around the impact and benefits to the organization, such as reductions in
the number of security incidents, can improve overall security effectiveness.
When organizations consider their security costs and benefits, the three questions they seek to
answer are:
1. How much is being spent on information security? Companies would like to know if their
security spending is in line with that of other organizations with similar characteristics. If they
are over- or under-spending compared to their peers and their security posture seems
equivalent, then they know that their spending is likely to be less or more effective than
their peers'. An issue with comparing financial metrics in isolation is that there are
several unobserved values, namely the effectiveness of the security that is being
purchased.
2. What is the security budget being spent on? Looking at the ways in which security budgets
are allocated can help optimize spending. This can help identify if the most resources are
being directed at the areas of greatest risk, and if spending is aligned with the
organization's strategy.
3. What are the benefits received for this spending? Directly measuring the benefits of
security spending is challenging. Currently most benefits can only be captured as reduced
time spent by personnel in maintaining a level of security activity, reduced numbers of
common incidents (password resets, virus clean-ups), and reduced operational downtime;
averted threats cannot easily be measured. It is also possible to consider the benefits of
particular projects and spending segments by looking at improvements in the performance
of business functions, for example, and the marginal change resulting from additional
spending.
Initial Metrics:
1. Percent of IT budget allocated to information security. How much of the IT budget is
allocated to security, normalized as a percentage of overall IT spending.
2. Security Budget Allocation. What the security budget is being spent on, such as
systems, personnel, software licenses, managed services, etc. Percentage of spending on:
personnel, software and hardware, services (of different types), managed services,
products of various types and purposes, and training.
Data Attributes
The following is a list of attributes that should be populated as completely as possible.
Table 55: Security Spending Table

Information Security Spending Table
Name | Type | De-identified | Required | Description
Reference ID | Number | No | No | Unique identifier for the security spending record. Generally auto-generated.
Time Period Start Date | Date | No | Yes | The starting date for the time period for which this spending occurred.
Time Period End Date | Date | No | Yes | The ending date for the time period for which this spending occurred.
IT Budget | Number | Yes | Yes* | The total IT budget (including security activities) for this time period.
IT Actual | Number | Yes | No | The actual IT spending during this time period (including security activities).
IT Security Budget | Number | Yes | Yes* | The total amount budgeted for information security personnel, services, and systems during the time period.
IT Security Actual | Number | Yes | No | The actual spending on information security personnel, services, and systems during the time period.
Spending Category | Text/Drop-down | Yes | No | An indicator of the purpose of the security spending, from categories: Personnel, Systems, Managed Services, Services, Training, and Other.
Purpose | Text/Drop-down | No | No | Purpose of the spending: Prevention, Detection, Incident Response, Auditing.
Additional Dimensions | Text | Yes | No | Additional dimensional tags such as business unit, location, etc. These additional fields could include references to technologies or applications.
* This table could be assembled with multiple rows for each time period, with one for the IT
budget, and other rows for the budget for specific security items, summing in the rows for the
relevant metric time period. For simplicity, if this is done, it is recommended that all rows
provide values for the same time periods as the metric calculations.
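As a rough illustration, the attributes in Table 55 could be represented as a record type like the following sketch. The class and field names are hypothetical, chosen only to mirror the table; any real schema would follow the organization's own data model.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional, List

@dataclass
class SecuritySpendingRecord:
    """Illustrative mapping of the Table 55 attributes; names are assumptions."""
    reference_id: int                            # unique identifier, generally auto-generated
    period_start: date                           # Time Period Start Date (required)
    period_end: date                             # Time Period End Date (required)
    it_budget: Optional[float] = None            # total IT budget, including security (de-identified)
    it_actual: Optional[float] = None            # actual IT spending for the period
    it_security_budget: Optional[float] = None   # budgeted information security spending
    it_security_actual: Optional[float] = None   # actual information security spending
    spending_category: Optional[str] = None      # Personnel, Systems, Managed Services, Services, Training, Other
    purpose: Optional[str] = None                # Prevention, Detection, Incident Response, Auditing
    dimensions: List[str] = field(default_factory=list)  # e.g. business unit, location

# One row covering a quarter, with illustrative dollar figures
record = SecuritySpendingRecord(
    reference_id=1,
    period_start=date(2010, 1, 1),
    period_end=date(2010, 3, 31),
    it_budget=2_000_000.0,
    it_security_budget=150_000.0,
    spending_category="Personnel",
)
```

Multiple such rows per time period, as described in the footnote above, would then be summed for the relevant metric calculations.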
Security Spending and Budget
The products, procedures, and personnel (employees and contractors) that are primarily
dedicated to or used for provision of IT security for the specific IT investment, such as the
activities covered under ISO 27002. All capital and operational costs for IT Operational Security,
IT Risk Management, IT Compliance, IT Privacy, and IT Disaster Recovery should be included
even though these costs may cross organizational boundaries. Dimensions can be used to
maintain information on spending by organizational units.
Following guidance presented in OMB Circular No. A-11 Section 53 (2008), security spending is
defined as spending on or intended for activities and systems including:
Risk assessment;
Security planning and policy;
Certification and accreditation;
Specific management, operational, and technical security controls (to include access control systems as
well as telecommunications and network security);
Authentication or cryptographic applications;
Security education, awareness, and training;
System reviews/evaluations (including security control testing and evaluation);
Oversight or compliance inspections;
Contingency planning and testing;
Physical and environmental controls for hardware and software;
Auditing and monitoring;
Computer security investigations and forensics;
Reviews, inspections, audits and other evaluations performed on contractor facilities and operations; and
Managed services and consulting services providing any of the above.
Spending Categories and Purpose
Security spending can be tracked in more detail by indicating the category of item the spending
is for, such as Personnel (in-house), Systems (software, appliances, and hardware), Managed
Services, Security Services (such as penetration testing), Training, Auditing, and Other.
The spending can be assigned a purpose, such as prevention (on controls and hardening),
detection (IDS systems, log monitoring, etc.), auditing and measurement, and incident response
and recovery. These dimensions can be used to gain a more complete picture of the allocation
of security spending and its impact on the performance of business functions.
Sources
Sources for financial data include published budgets and financial management systems. In
some cases manual effort will be required to separate security spending from IT budgets, or to
sum security spending across multiple divisions or departments.
Dimensions
This metric may include additional dimensions for grouping and aggregation purposes. These
dimensions should be tagged at the row level, and can include:
Business functions, to track financial metrics on security around specific business
activities
Business units owning the systems to which the security spending is directed
Geographic locations, for analyzing spending across multiple locations
Automation
The ability to automate source data collection for these metrics is medium, because most
organizations use financial management systems for budgeting activities; however, these results
may require additional work to determine total security spending across multiple units, group
locations, and systems. Calculation of these metrics on an ongoing basis, after source data has
been obtained, lends itself to a moderate degree of automation, as a process can be defined,
but some recurring analysis is likely to be required.
Visualization
These metrics may be visually represented in several ways:
Simple visualizations may include a table showing metric results for the organization with each row displaying the
value for selected time periods (each week or each month). Columns may be used for spending categories (e.g.
Personnel) or purposes (e.g. Prevention).
Graphical visualizations may include time-series charts where the metric result is plotted on the vertical axis and
time periods displayed on the horizontal axis. To provide maximum insight, plotted values for each period may
include stacked series for the differing categories or purposes or business units (for Information Security Budget as
% of IT Budget).
Complex visualizations should be used for displaying the metric result for cross-sections by organization,
categories, or purposes. For example, small multiples could be used to compare the spending on systems for
prevention across business units.
Defined Metrics
Information Security Budget as % of IT Budget
Objective
Organizations are seeking to understand if their security spending is reasonable for the level of
security performance and in line with other organizations. This metric presents the IT security
budget as a percentage of the organization's overall IT budget, tracking the relative cost of security
compared to IT operations. This result can also be used to benchmark spending against other
organizations.
Table 56: Security Budget as % of IT Budget
Metric Name: Information Security Budget as a Percentage of IT Budget
Version: 1.0.0
Status: Final
Description: Security budget as a percentage of IT budget tracks the percentage of IT
spending on security activities and systems. For the purposes of this metric, it
is assumed that information security is included in the IT budget.
Type: Management
Audience: Business Management
Question: What percentage of the IT budget is allocated to information security?
Answer: A positive value equal to or between 0 and 1, expressed as a percentage. A
value of 100% indicates that the entire information technology budget is
dedicated to information security.
Formula: The total budget allocated for security activities and systems for the metric
time period is divided by the total IT budget:

SBPITB = SecurityBudget / ITBudget

Units: Percentage of IT Budget
Frequency: Quarterly, Annually, depending on budget cycle
The CIS Security Metrics v1.1.0 November 1, 2010

130 | P a g e
2010 The Center for Internet Security
Targets: Because of the lack of experiential data from the field, no strong consensus
on the range of acceptable goal values for security spending exists. In general,
this value should be comparable with peer organizations with similar IT
profiles and security activities.
Sources: Financial management systems and/or annual budgets
Usage
Examining and tracking the percentage of the IT budget allocated to security allows an
organization to compare the costs of securing their infrastructure between the organization's
divisions, against other organizations, as well as to observe changes over time. These results
will also provide a foundation for the optimization of security spending through comparison of
spending with the outcomes of other metrics such as numbers of incidents, time to detection,
time to patch, etc.
The percentage of budget allocated to security should be calculated over time, typically per-quarter
or per-year. To gain insight into the relative performance of one business unit over
another, this result may also be calculated for cross-sections of the organization, such as
individual business units or geographies where they have discrete budgets.
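The SBPITB formula above can be sketched as a small calculation; the dollar figures below are purely illustrative, not targets.

```python
def security_budget_pct_of_it_budget(security_budget: float, it_budget: float) -> float:
    """SBPITB = SecurityBudget / ITBudget, expressed as a percentage of the IT budget."""
    if it_budget <= 0:
        raise ValueError("IT budget must be positive")
    return security_budget / it_budget * 100.0

# e.g. a $150,000 security budget within a $2,000,000 IT budget for the same period
pct = security_budget_pct_of_it_budget(150_000, 2_000_000)
print(f"{pct:.1f}%")  # prints "7.5%"
```

The same function could be applied per business unit or per quarter to produce the cross-sections described above.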
Limitations
Different threat profiles across organizations. While there is systemic risk to common viruses
and attacks, there is also firm-specific risk based on the company's specific activities that may
require a higher or lower level of security spending relative to peer organizations.
Different IT profiles across organizations. Although in theory all organizations will make
market-efficient use of IT, legacy systems and specific implementations will impact the costs of
otherwise-similar IT operations as well as the costs of similar levels of security performance.
Differences in accounting. Different organizations may account for both IT and security
spending in different ways that make it hard to compare this value across organizations. Some
may leverage IT resources for security purposes that make it hard to account for such partial
FTEs without significant activity-based costing exercises; others may have lump-sum outsourced
IT contracts without specific information on security spending.
References
Chew, Swanson, Stine, Bartol, Brown and Robinson. Special Publication 800-55: Performance
Measurement Guide for Information Security (Rev 1). US National Institute of Standards and
Technology, 2008.
Open Web Application Security Project, Security Spending Benchmarks Project.
<https://www.owasp.org/index.php/Category:OWASP_Security_Spending_Benchmarks>
Office of Management and Budget, OMB Circular No. A-11 (2008), Forms 300 and 53.
Information Security Budget Allocation
Objective
Looking at the ways in which security budgets are allocated can help optimize spending. This
can help identify if the most resources are being directed at the areas of greatest risk, and if
spending is aligned with the organization's strategy.
Table 57: Information Security Budget Allocation
Metric Name: Information Security Budget Allocation
Version: 1.0.0
Status: Final
Description: Information security budget allocation tracks the distribution of security
spending across a variety of security activities, systems, and sources, as a
percentage of overall information security spending.
Type: Management
Audience: Business Management, Security Management
Question: What percentage of the information security budget is allocated to each
category of spending?
Answer: A positive value equal to or between 0 and 1, expressed as a percentage for
each spending category. A value of 100% indicates that the entire
information security budget is dedicated to that spending category.
Formula: For each budget category, divide the amount allocated to the category by the
total information security budget. These values should be for the relevant
time period only. If the category of any budget costs is unknown, they should
be allocated to an "unknown" category.
Units: Percentage of Information Security Budget
Frequency: Quarterly, Annually, depending on budget cycle
Targets: Because of the lack of experiential data from the field, no consensus on a goal
value for the allocation of security spending exists. In general, this value
should be comparable with peer organizations with similar security
performance across each of the spending categories, and will vary depending
The CIS Security Metrics v1.1.0 November 1, 2010

133 | P a g e
2010 The Center for Internet Security
on the use of in-house vs. external resources, software license structures,
reliance on outsourcing, etc.
Sources: Financial management systems and/or annual budgets
Usage
Examining and tracking the allocation of the information security budget allows an
organization to compare the relative costs of their various information security activities. This
can help identify if security spending is being directed toward the areas of greatest risk to the
organization, i.e., is security spending aligned with the results of risk assessments? It also
enables organizations to start to optimize spending by observing incremental changes in
business function performance correlating to changes in spending on various security activities,
such as numbers of incidents, time to detection, time to patch, etc.
The percentage of the information security budget allocated to each category should be
calculated over time, typically per-quarter or per-year.
To gain insight into the relative performance of one business unit over another, this result may
also be calculated for cross-sections of the organization, such as individual business units or
geographies where they have discrete budgets.
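A minimal sketch of the per-category calculation, including the "unknown" bucket called for in the formula; the category amounts are illustrative only.

```python
def budget_allocation(category_amounts: dict, total_security_budget: float) -> dict:
    """Percent of the information security budget allocated to each spending category.
    Any budgeted amount not assigned a category is reported under 'Unknown'."""
    if total_security_budget <= 0:
        raise ValueError("total security budget must be positive")
    alloc = {cat: amt / total_security_budget * 100.0
             for cat, amt in category_amounts.items()}
    remainder = total_security_budget - sum(category_amounts.values())
    if remainder > 0:
        alloc["Unknown"] = remainder / total_security_budget * 100.0
    return alloc

# Illustrative figures: $150,000 security budget with $15,000 uncategorized
shares = budget_allocation(
    {"Personnel": 90_000, "Systems": 30_000, "Training": 15_000},
    total_security_budget=150_000,
)
# Personnel 60%, Systems 20%, Training 10%, Unknown 10%
```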
Limitations
Different threat profiles across organizations. While there is systemic risk to common viruses
and attacks, there is also firm-specific risk based on the company's specific activities that may
require a higher or lower level of security spending relative to peer organizations.
Different IT profiles across organizations. Although in theory all organizations will make
market-efficient use of IT, legacy systems and specific implementations will impact the costs of
otherwise-similar IT operations as well as the costs of similar levels of security performance.
Differences in accounting. Different organizations may account for both IT and security
spending in different ways that make it hard to compare this value across organizations. Some
may leverage IT resources for security purposes that make it hard to account for such partial
FTEs without significant activity-based costing exercises; others may have lump-sum outsourced
IT contracts without specific information on security spending.
References
Chew, Swanson, Stine, Bartol, Brown and Robinson. Special Publication 800-55: Performance
Measurement Guide for Information Security (Rev 1). US National Institute of Standards and
Technology, 2008.
Open Web Application Security Project, Security Spending Benchmarks Project.
<https://www.owasp.org/index.php/Category:OWASP_Security_Spending_Benchmarks>
Office of Management and Budget, OMB Circular No. A-11 (2008), Forms 300 and 53.
Technical Metrics
Incidents

Number of Incidents
Objective
Number of Incidents indicates the number of detected security incidents the organization has
experienced during the metric time period. In combination with other metrics, this can indicate
the level of threats, the effectiveness of security controls, or incident detection capabilities.
Table 58: Number of Incidents
Metric Name: Number of Incidents
Version: 1.0.0
Status: Final
Description: Number of Incidents measures the number of security incidents for a given
time period.
Type: Technical
Audience: Security Operations
Question: What is the number of security incidents that occurred during the time
period?
Answer: A non-negative integer value. A value of 0 indicates that no security
incidents were identified.
Formula: To calculate Number of Incidents (NI), the number of security incidents is
counted across a scope of incidents, for example a given time period,
category, or business unit:

NI = Count(Incidents)

Units: Incidents per period; for example, incidents per week or incidents per month
Frequency: Weekly, Monthly, Quarterly, Annually
The CIS Security Metrics v1.1.0 November 1, 2010

136 | P a g e
2010 The Center for Internet Security
Targets: NI values should trend lower over time, assuming perfect detection
capabilities. A value of 0 indicates hypothetical "perfect security," since
there were no security incidents. Because of the lack of experiential data
from the field, no consensus on the range of acceptable goal values for
Number of Incidents exists.
Sources: Since humans determine when an incident occurs, when the incident is
contained, and when the incident is resolved, the primary data sources for
this metric are manual inputs as defined in Security Incident Metrics: Data
Attributes. However, these incidents may be reported by operational
security systems, such as anti-malware software, security incident and event
management (SIEM) systems, and host logs.
Visualization: Column Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: NI (Incidents)
Usage
Number of Incidents is a type of security incident metric and relies on the common definition of
"security incident" as defined in the Glossary.
Optimal conditions would reflect a low number of incidents. The lower the number of incidents,
the healthier the security posture would be, assuming perfect detection. However, a low
number of incidents might also indicate a weak capability to detect incidents. This metric can
also indicate the effectiveness of security controls. Assuming similar threat levels and detection
capabilities, fewer incidents could indicate better performance of one set of security controls.
The Number of Incidents metric is calculated over time, typically per-week or per-month. Not
all incidents are easily detected, so the trend of incidents can be useful for indicating patterns
in the environment.
To gain insight into the relative performance of one business unit over another, the number of
incidents may also be calculated for cross-sections of the organization such as individual
business units or locations.
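The per-period count described above can be sketched as follows; the grouping keys and sample dates are illustrative, and a real implementation would read incident records from the tracking system.

```python
from collections import Counter
from datetime import date

def incidents_per_period(incident_dates, period="month"):
    """NI = Count(Incidents), grouped by the week or month in which each occurred."""
    def key(d):
        if period == "month":
            return (d.year, d.month)
        iso = d.isocalendar()          # ISO year/week for weekly grouping
        return (iso[0], iso[1])
    return Counter(key(d) for d in incident_dates)

# Three hypothetical incident occurrence dates
counts = incidents_per_period(
    [date(2010, 1, 5), date(2010, 1, 20), date(2010, 2, 3)], period="month"
)
# counts: {(2010, 1): 2, (2010, 2): 1}
```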
Limitations
A security program may or may not have direct control over the number of incidents that occur
within its environment. For instance, if all the incidents that occur are due to zero-day or
previously unidentified attack vectors, then there are not many options left to improve posture.
However, this metric could be used to show that improving countermeasures and processes
within operations reduces the number of incidents that occur. Thus, Number of Incidents
must be considered in the context of other metrics, such as MTTID.
The definition of "incident" may not be consistently applied across organizations. For
meaningful comparisons, similar definitions are necessary.
The importance of this metric will vary between organizations. Some organizations have much
higher profiles than others and would be a more attractive target for attackers, whose attack
vectors and capabilities will vary. The Number of Incidents may not be directly comparable
between organizations.
References
Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security
Incident Handling Guide. US National Institute of Standards and Technology, 2008.
<http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie-Mellon Software Engineering Institute, 2003.
<http://www.cert.org/archive/pdf/03tr001.pdf>
Vulnerability Management
Vulnerability Scan Coverage
Objective
Vulnerability Scan Coverage (VSC) indicates the scope of the organization's vulnerability
identification process. Scanning of systems known to be under the organization's control
provides the organization the ability to identify open known vulnerabilities on its systems.
The percentage of systems covered allows the organization to become aware of areas of exposure
and proactively remediate vulnerabilities before they are exploited.
Table 59: Vulnerability Scan Coverage
Metric Name: Vulnerability Scan Coverage
Version: 1.0.0
Status: Final
Description: Vulnerability Scan Coverage (VSC) measures the percentage of the
organization's systems under management that were checked for
vulnerabilities during vulnerability scanning and identification processes.
This metric is used to indicate the scope of vulnerability identification
efforts.
Type: Technical
Audience: Security Operations
Question: What percentage of the organization's total systems has been checked for
known vulnerabilities?
Answer: A percentage value that is greater than or equal to 0% and less than or
equal to 100%. A value of 100% indicates that all systems are covered by
the vulnerability scanning process.
Formula: Vulnerability Scan Coverage is calculated by dividing the total number of
systems scanned by the total number of systems within the metric scope,
such as the entire organization:

VSC = Count(Scanned_Systems) / Count(All_Systems_Within_Organization) * 100
Units: Percentage of systems
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: VSC values should trend higher over time. Higher values are better, as they
mean more systems have been checked for vulnerabilities. A value of
100% means that all systems are checked in vulnerability scans. For
technical and operational reasons, this number will likely be below the
theoretical maximum.
Sources: Vulnerability management and asset management systems will provide
information on which systems are scanned for vulnerabilities.
Visualization: Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: VSC (%)
Usage
This metric provides information about how much of the organization's environment is checked
for known vulnerabilities. Organizations can use this metric to evaluate their risk position in
terms of concentrations of unknown vulnerability states of systems. In combination with other
vulnerability metrics, it provides insight into the organization's exposure to known
vulnerabilities.
The results of the coverage metric indicate the:
Scope of the vulnerability scanning activities
Applicability of other metric results across the organization
Relative amount of information known about the organization's vulnerabilities
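The VSC formula can be sketched directly from asset and scan inventories; the system names below are hypothetical, and intersecting the two sets (an assumption, not stated in the definition) keeps systems no longer under management out of the numerator.

```python
def vulnerability_scan_coverage(scanned_systems: set, all_systems: set) -> float:
    """VSC = Count(Scanned_Systems) / Count(All_Systems_Within_Organization) * 100.
    Only systems known to and managed by the organization count in the denominator."""
    if not all_systems:
        raise ValueError("at least one system must be in scope")
    # Intersect so a stale scan record for a retired system is not counted
    return len(scanned_systems & all_systems) / len(all_systems) * 100.0

# Hypothetical inventory: four managed systems, three of which were scanned
coverage = vulnerability_scan_coverage(
    {"web01", "db01", "app01"},
    {"web01", "db01", "app01", "mail01"},
)
# coverage: 75.0
```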
Limitations
Due to techni cal or operational i ncompatibility certain systems may be excluded from scanning
acti vities while other systems such as l aptops and guest systems may be i ntermittently present
for network scans, resulting i n variability of metric results. In addition, scanning activities can
vary i n depth, completeness, and capability.
Thi s metri c assumes that systems scanned for vulnerabilities are systems known to and under
ful l management by the organization. These systems do not i nclude partial or unknown
systems. Future risk metrics may account for these to provide a clearer view of all system
ranges.
References
ISO/IEC 27002:2005
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability
Management Program. US National Institute of Standards and Technology, 2005.
Number of Known Vulnerability Instances
Objective
Number of Known Vulnerability Instances (NKVI) measures the total number of instances of
known vulnerabilities within an organization among scanned assets, based on the scanning
process at a point in time.
Table 60: Number of Known Vulnerability Instances
Metric Name: Number of Known Vulnerability Instances
Version: 1.0.0
Status: Final
Description: Number of Known Vulnerability Instances (NKVI) measures the number of
known vulnerabilities that have been found on the organization's systems during
the vulnerability identification process.
Type: Technical
Audience: Security Operations
Question: How many open vulnerability instances were found during the scanning
process?
Answer: An integer value that is greater than or equal to zero. A value of 0
indicates that no instances of known vulnerabilities were found.
Formula: This metric is calculated by counting the number of open vulnerability
instances identified. This count should also be done for each severity value
(Low, Medium, and High):

Number of Known Vulnerabilities = Count(Vulnerability Status = Open)
Units: Number of Vulnerabilities
Frequency: Weekly, Monthly, Quarterly, Annually
Targets: NKVI values should trend lower over time. In the ideal case, there would be
no known vulnerability instances on any technologies in the organization.
Because of the lack of experiential data from the field, no consensus on the
range of acceptable goal values for Number of Known Vulnerability Instances
exists.
Sources: Vulnerability management systems will provide information on which
systems were identified with severe vulnerabilities.
Visualization: Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: NKVI (Number of Vulnerabilities)
Usage
By understanding the number of instances of known exploitable vulnerabilities, the
organization can assess relative risk levels across the organization over time, estimate and
manage remediation efforts, and correlate and predict the volume of security incidents.
The vulnerability scanning process can consist of a number of vulnerability scanning activities
occurring over a set time period, in cases where multiple scans are necessary to cover all of an
organization's technologies or potential vulnerability types.
This metric should be used in conjunction with other vulnerability metrics to provide context
around the magnitude of known vulnerabilities in an organization. Since other metrics are
expressed as ratios, this metric quantifies the volume of known vulnerabilities the organization
is managing. Combined with the mean time to mitigate vulnerabilities, this metric can provide
visibility into the time and effort required to manage the known vulnerabilities in the
organization.
When comparing performance over time and between organizations, this metric can be
normalized across the total number of systems. This and additional vulnerability metrics are an
area noted for further development by the CIS metrics community.
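The count of open instances, broken out per severity as the formula requires, can be sketched as below; the `(status, severity)` pair format is an assumption about the scanner export, not part of the definition.

```python
from collections import Counter

def known_vulnerability_instances(findings):
    """NKVI: count open vulnerability instances, overall and per severity value.
    `findings` is assumed to be an iterable of (status, severity) pairs."""
    open_by_severity = Counter(sev for status, sev in findings if status == "Open")
    return sum(open_by_severity.values()), dict(open_by_severity)

# Hypothetical scan output: four findings, one already closed
total, by_severity = known_vulnerability_instances([
    ("Open", "High"), ("Open", "Medium"), ("Closed", "High"), ("Open", "High"),
])
# total: 3; by_severity: {"High": 2, "Medium": 1}
```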
Limitations
The vulnerability scans may not be comprehensive, instead only attempting to identify a subset
of potential vulnerabilities. Different scanning sessions and products can check for
different numbers and types of vulnerabilities; some may consist of thousands of checks for
vulnerabilities, while other products or sessions may only check for hundreds of known
vulnerabilities.
The scope of the scanning effort may not be complete and may also not be representative of
the organization's overall systems. Those systems out of scope may potentially be areas of risk.
In some cases key servers or production systems may be excluded from scanning activities.
This metric only reports on known vulnerabilities. This does not mean that there are no
unknown vulnerabilities. Severe vulnerabilities that the organization is unaware of can exist,
and potentially be exploited, for years before any public disclosure may occur.
When reporting a total number of vulnerabilities, severe vulnerabilities are counted equally with
informational vulnerabilities. Reporting this metric by the dimension of Vulnerability Severity
will provide more actionable information.
References
ISO/IEC 27002:2005
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability
Management Program. US National Institute of Standards and Technology, 2005.
Patch Management
Patch Management Coverage
Objective
Patch Management Coverage (PMC) characterizes the efficiency of the patch management
process by measuring the percentage of total technologies that are managed in a regular or
automated patch management process. This metric also serves as an indicator of the ease with
which security-related changes can be pushed into the organization's environment when
needed.
Table 61: Patch Management Compliance
Metric
Name
Patch Management Coverage
Version 1.0.0
Status Final
Description Patch Management Coverage (PMC) measures the relative amount of an
organization's systems that are managed under a patch management
process, such as an automated patch management system. Since patching is
a regular and recurring process in an organization, the higher the percentage
of technologies managed under such a system, the more timely and
effective patch deployment will be, reducing the number and duration of
exposed vulnerabilities.
Type Technical
Audience Security Operations
Question What percentage of the organization's technology instances are part of the
patch management process? Instances outside this process represent
potential residual risk from unpatched vulnerabilities.
Answer A positive value between zero and 100 inclusive, expressed as a percentage.
A value of 100% indicates that all technologies are under management.
Formula Patch Management Coverage is calculated by dividing the number of
technology instances under patch management by the total number of
technology instances within the organization. This metric can be calculated
for subsets of technologies, such as by asset criticality or business unit:

PMC = Count(Technology_Instances_Under_Patch_Management) /
      Count(Technology_Instances) * 100
Units Percentage of technology instances
Frequency Weekly, Monthly, Quarterly, Annually
Targets PMC values should trend higher over time. Given the difficulties of manually
managing systems at scale, having technologies under patch management
systems is preferred. An ideal result would be 100% of technologies.
However, given incompatibilities across technologies and systems, this is
unlikely to be attainable. Higher values would generally result in more
efficient use of security resources. Because of the lack of experiential data
from the field, no consensus on the range of acceptable goal values for PMC
exists.
Sources Patch management and IT support tracking systems will provide patch
deployment data.
Visualization Bar Chart
X-axis: Time (Week, Month, Quarter, Year)
Y-axis: PMC (%)
Usage
Patch Management Coverage is a type of patch management metric and relies on the common
definition of "patch" as defined in the Glossary.
Optimal conditions would be reflected by a high value in the metric. A value of 100% would
indicate that every technology in the environment falls under the patch management system.
The lower the value, the greater the reliance on ad hoc and manual patch deployment, and the
slower and less effective patching will be. Given that many known vulnerabilities result from
missing patches, there may be an inverse correlation between the level of Patch Management
Coverage and the number of known vulnerabilities in an environment. Patch Management
Coverage can be calculated over time, typically per week or per month. To gain insight into the
relative performance and risk of one business unit over another, Coverage may also be
calculated for cross-sections of the organization, such as individual business units or
geographies.
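As an illustrative sketch, the PMC formula and its per-business-unit cross-section can be computed from an asset inventory export. The record layout (dicts with "managed" and "business_unit" fields) is an assumption for this example, not part of the CIS definition:

```python
# Sketch of the PMC formula; the inventory record layout is an
# illustrative assumption, not defined by the CIS metric.

def pmc(instances):
    """Patch Management Coverage: percentage of technology instances
    under patch management."""
    if not instances:
        return 0.0
    managed = sum(1 for i in instances if i["managed"])
    return managed / len(instances) * 100

def pmc_by_unit(instances):
    """PMC calculated per business unit, as suggested for cross-sections
    of the organization."""
    units = {}
    for i in instances:
        units.setdefault(i["business_unit"], []).append(i)
    return {unit: pmc(group) for unit, group in units.items()}

inventory = [
    {"managed": True,  "business_unit": "Finance"},
    {"managed": True,  "business_unit": "Finance"},
    {"managed": False, "business_unit": "Engineering"},
    {"managed": True,  "business_unit": "Engineering"},
]
print(round(pmc(inventory), 1))  # 75.0
print(pmc_by_unit(inventory))    # {'Finance': 100.0, 'Engineering': 50.0}
```

The same counting pattern applies to the other coverage metrics in this document, with only the membership test swapped out.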
Limitations
Not all technologies within an organization may be capable of being placed under a patch
management system, for technical or performance reasons, so the results and interpretation of
this metric will depend on the specifics of an organization's infrastructure.
References
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability
Management Program. US National Institute of Standards and Technology, 2005.
Configuration Management
Configuration Management Coverage
Objective
The goal of this metric is to provide an indicator of the scope of configuration management
control systems and monitoring.
Accurate and timely detection of configuration changes, as well as the ability to assess the state
of the current configuration through regular processes or automated means, provides
organizations with improved visibility into their security posture.
If 100% of systems are under configuration monitoring, then the organization is relatively less
exposed to exploits and to unknown threats resulting from unapproved, untested, or unknown
configuration states.
Table 62: Configuration Management Coverage
Metric
Name
Configuration Management Coverage
Version 1.0.0
Status Final
Description This metric attempts to answer the question "Are systems under
configuration management control?" This question presumes the
organization has a configuration management system to test and monitor
the configuration states of systems.
The percentage of total computer systems in an organization that are under
the scope of a configuration monitoring/management system.
Scope of configuration monitoring is a binary evaluation: a given system is
either part of a system that can assess and report its configuration state or
it is not. Configuration state can be evaluated by automated methods,
manual inspection, audit, or some combination.
The computer system population base is the total number of computer
systems with approved configuration standards. This may be all systems or
only a subset (e.g. only desktops, or only servers).
Organizations that do not have approved standards for their computer
systems should report "N/A" rather than a numeric value (0% or 100%).
In Scope
Examples of systems under configuration management include:
Configuration of servers
Configuration of workstations/laptops
Configuration of hand-held devices
Configuration of other supported computer systems covered by the
organization's configuration policy

Out of Scope
Examples of computer system configurations that are not in scope include:
Temporary guest systems (contractors, vendors)
Lab/test systems dedicated to or supporting a specific non-production
project
Networking systems (routers, switches, access points)
Storage systems (e.g. network-accessible storage)
Type Technical
Audience Security Operations
Question What percentage of the organization's systems are under configuration
management?
Answer A positive value between zero and 100 inclusive, expressed as a
percentage. A value of 100% indicates that all systems are in
configuration management system scope.
Formula Configuration Management Coverage (CMC) is calculated by dividing the
number of in-scope systems under configuration management by the total
number of in-scope systems:

CMC = Count(In_Scope_Systems_Under_Configuration_Management) /
      Count(In_Scope_Systems)
Units Percentage of systems
Frequency Monthly
Targets The expected trend for this metric over time is to remain stable or increase
towards 100%.
Sources Configuration management and asset management systems will provide
coverage data.
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: CMC (%)
Usage
The Configuration Management Coverage metric provides information about how well the
organization ensures the integrity of its network. Organizations can use this metric to
evaluate their risk position in terms of concentrations of systems in inconsistent states.
The results of the coverage metric indicate the:
Scope of the configuration scanning activities
Applicability of other metric results across the organization
Relative amount of information known about the organization's configuration
Limitations
The organization's critical systems (e.g. production servers) may be out of scope of the
configuration management system by design, for performance or network architecture reasons.
References
Ross, Katzke, Johnson, Swanson, Stoneburner and Rogers. Special Publication SP 800-53:
Recommended Security Controls for Federal Information Systems (Rev 2). US National Institute
of Standards and Technology, 2007.
IEEE Standard 828-1990, Software Configuration Management Plans.
ISO/IEC 12207:2008, Information technology - Software life cycle processes, and ISO/IEC
15288:2008, Information technology - System life cycle processes.
Chew, Swanson, Stine, Bartol, Brown and Robinson. Special Publication 800-55: Performance
Measurement Guide for Information Security (Rev 1). US National Institute of Standards and
Technology, 2008.
Current Anti-Malware Coverage
Objective
The goal of this metric is to provide an indicator of the effectiveness of an organization's
anti-malware management. If 100% of systems have current anti-malware detection engines
and signatures, then those systems are relatively more secure. If this metric is less than 100%,
then the remaining systems are relatively more exposed to viruses and other malware.
The expected trend for this metric over time is to remain stable or increase towards 100%.
Table 63: Current Anti-Malware Coverage
Metric
Name
Current Anti-Malware Coverage
Version 1.0.0
Status Final
Description This metric attempts to answer the question "Do we have acceptable levels
of anti-malware coverage?" This question presumes the organization has
defined an acceptable level of compliance, which may be less than
100% to account for ongoing changes in the operational environment.
Malware includes computer viruses, worms, trojan horses, most rootkits,
spyware, dishonest adware, crimeware and other malicious and unwanted
software [http://en.wikipedia.org/wiki/Malware].
The percentage of total computer systems in an organization that have
current, up-to-date anti-virus (or anti-malware) software and definition files.
"Current" is a binary evaluation: a given system is either configured with
both up-to-date detection engines and signatures or it is not. Compliance
can be evaluated by automated methods, manual inspection, audit, or some
combination.
Current coverage of a system is defined as the most recent version of the
engine, and a signature file that is no more than 14 days older than the most
recent signature file released.

In Scope
Examples of systems under consideration for this metric include:
Servers
Workstations/laptops
Hand-held devices
Other supported computer systems

Out of Scope
Examples of systems that are not under consideration for this metric include:
Temporary guest systems (contractors, vendors)
Lab/test systems dedicated to or supporting a specific non-production
project
Networking systems (routers, switches, access points)
Storage systems (e.g. network-accessible storage)
Type Technical
Audience Security Operations
Question What percentage of the organization's systems have current anti-malware
protection?
Answer A positive value between zero and 100 inclusive, expressed as a
percentage. A value of 100% indicates that all systems have current
anti-malware coverage.
Formula Current Anti-Malware Coverage (CAMC) is calculated by dividing the
number of in-scope systems with current coverage by the total number of
in-scope systems:

CAMC = Count(In_Scope_Systems_with_Current_Anti_Malware) /
       Count(In_Scope_Systems)
Units Percentage of systems
Frequency Monthly
Targets The expected trend for this metric over time is to remain stable or increase
towards 100%.
Sources Configuration management and anti-malware systems (locally or centrally
managed).
Visualization Bar Chart
X-axis: Time (Month)
Y-axis: CAMC (%)
Usage
Current Anti-Malware Coverage (CAMC) represents overall compliance with anti-malware
policies. The higher the CAMC, the greater the number of systems in the organization
running anti-malware with recent signature files, and the less likely it is that existing known
malware will infect or spread across the organization's systems, or fail to be detected in a
timely manner.
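The binary "current" evaluation defined above (most recent engine version, and a signature file no more than 14 days older than the most recently released signature file) can be sketched as follows. The record fields and sample data are illustrative assumptions:

```python
from datetime import date, timedelta

# Sketch of the CAMC "current" test; the field names and example fleet
# are assumptions for illustration, not part of the CIS definition.

SIGNATURE_WINDOW = timedelta(days=14)  # per the metric definition

def is_current(system, latest_engine, latest_signature_date):
    """A system is current only if it runs the most recent engine version
    and its signature file is no more than 14 days older than the most
    recently released signature file."""
    return (system["engine_version"] == latest_engine and
            latest_signature_date - system["signature_date"] <= SIGNATURE_WINDOW)

def camc(systems, latest_engine, latest_signature_date):
    """Current Anti-Malware Coverage as a percentage of in-scope systems."""
    if not systems:
        return 0.0
    current = sum(1 for s in systems
                  if is_current(s, latest_engine, latest_signature_date))
    return current / len(systems) * 100

fleet = [
    {"engine_version": "5.2", "signature_date": date(2010, 10, 25)},  # current
    {"engine_version": "5.2", "signature_date": date(2010, 10, 1)},   # stale signatures
    {"engine_version": "5.1", "signature_date": date(2010, 10, 30)},  # old engine
]
print(round(camc(fleet, "5.2", date(2010, 11, 1)), 1))  # 33.3
```

Because the evaluation is binary, a system with a current engine but a stale signature file counts as not covered, just as one with no anti-malware at all.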
Limitations
Systems critical to the organization (e.g. production servers) may be out of scope of the
anti-malware management system by design, for performance, or network architecture
reasons.
Variation in the type of anti-malware, such as inbound email scanning vs. resident process
scanning, may be material. The completeness of signature files and frequency of updates
may also vary.
The time window defined as "current" may not be adequate if malware has its impact on
the organization before signature files are developed, or before the current window has
expired.
References
Ross, Katzke, Johnson, Swanson, Stoneburner and Rogers. Special Publication SP 800-53:
Recommended Security Controls for Federal Information Systems (Rev 2). US National Institute
of Standards and Technology, 2007.
Application Security
Number of Applications
Objective
The goal of this metric is to provide managers with the number of applications in the
organization and to help translate the results of other metrics to the scale of the organization's
environment.
Table 64: Number of Applications
Metric
Name
Number of Applications
Version 1.0.0
Status Final
Description This metric counts the number of applications in the organization's
environment.
Type Technical
Audience Security Operations
Question What is the number of applications in the organization?
Answer A positive integer value that is greater than or equal to zero. A value of 0
indicates that the organization does not have any applications.
Formula The number of applications (NOA) is determined by simply counting the
number of applications in the organization:

NOA = Count(Applications)
Units Number of applications
Frequency Weekly, Monthly, Quarterly, Annually
Targets NOA values generally should trend lower over time, although this number
will depend on the organization's business, structure, acquisitions, growth
and use of IT. This number will also help organizations interpret the results
of other application security metrics. Because of the lack of experiential
data from the field, no consensus on the range of acceptable goal values for
Number of Applications exists.
Usage
Managers can use this metric to understand and monitor changes to their application
environment. This metric provides a reference point for metrics concerning the organization's
applications.
Limitations
Variations in application scope: different organizations might count as a single application a
system that another organization may consider several distinct applications, resulting in
significantly different numbers of applications between organizations.
Variations in application scale: applications within or across organizations might be
significantly different in size, so the level of effort required to assess, test or fix vulnerabilities
may vary between applications.
References
Web Appl i cation Security Consortium. Web Application Security Statistics Project.,
http://www.webappsec.org/projects/statistics/
Appendix A: Glossary
Anti-malware
Anti-malware is security software that detects, blocks, and neutralizes malware of various types
(see Malware).
Application Security Testing
The term "application security testing" is defined as a material test of the security of a business
application after it has been developed and deployed (although it may be a pre-production
test). It can consist of a combination of one or more of the following techniques:
Source code analysis (automated and/or manual)
Manual penetration testing (white- or black-box)
Static or dynamic binary analysis
Automated testing
Fuzzing or other techniques that identify vulnerabilities in an application
Bias
Bias refers to how far the average statistic lies from the parameter it is estimating; that is, the
error that arises when estimating a quantity. Errors from chance will cancel each other out in
the long run; those from bias will not.[18] Systemic bias is "the inherent tendency of a process
to favor a particular outcome."[19]
Business Application
The term "business application" can mean many things in IT systems, ranging from productivity
applications on individual desktop computers to complex manufacturing systems running on
multiple pieces of custom hardware. In this context, the term refers to a set of technologies
that form a system performing a distinct set of business operations. Examples include an
order processing system, an online shopping cart, or an inventory tracking system.
Since applications can consist of more than one technology, the scope of an application is
defined as a process or set of processes that the organization manages and makes decisions
about as a single entity. Generally, this scope is not intended to include infrastructure
components of the application, such as the web or application server itself, although these may
not be separable for certain types of testing.
[18] Source: Wikipedia. <http://en.wikipedia.org/wiki/Bias>
[19] Source: Wikipedia. <http://en.wikipedia.org/wiki/Systemic_bias>
Containment
Containment is identified as "limiting the extent of an attack."[20] Another way to look at
containment is to "stop the bleeding": the impact of the incident has been constrained and is
not increasing. Measures can now be taken to recover systems, and effective recovery of
primary capabilities may be complete.
Data Record
A data record is a single sample of data for a particular metric. Each data record roughly
approximates a row in a relational database table. Data records contain data attributes that
describe the data that should be collected to calculate the metric. Each data attribute roughly
approximates a column in the database table. Attributes contain the following characteristics:
Name: a short, descriptive name.
Type: the data type of the attribute. Types include Boolean, Date/Time,[21] Text,
Numeric and ISO Country Code.
De-identification: a Boolean value describing whether the field of the data record
should optionally be cleansed of personally or organizationally identifying information. If
yes, then prior to consolidation or reporting to a third party, the data in this field
should be de-identified using a privacy-preserving algorithm, or deleted. For example,
severity tags for security incidents might require de-identification.
Description: additional information describing the attribute in detail.
In this document, the beginning of each major section describes the attributes that should be
collected in order to calculate the metric.
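The attribute characteristics above can be sketched as a simple schema. The class layout and the sample "Severity" attribute are illustrative assumptions, not a CIS-defined structure:

```python
from dataclasses import dataclass

# Illustrative schema mirroring the attribute characteristics listed
# above (Name, Type, De-identification, Description); the class layout
# is an assumption, not defined by CIS.

@dataclass
class DataAttribute:
    name: str          # short, descriptive name
    type: str          # Boolean, Date/Time, Text, Numeric, ISO Country Code
    de_identify: bool  # must be cleansed before third-party reporting?
    description: str   # additional detail about the attribute

# A data record is one sample for a metric: roughly a table row whose
# columns are described by attributes like this one.
severity = DataAttribute(
    name="Severity",
    type="Text",
    de_identify=True,  # e.g. severity tags might require de-identification
    description="Severity tag assigned to the security incident",
)
print(severity.de_identify)  # True
```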
De-identified
De-identified information is information from which all potentially identifying information that
would individually identify the provider has been removed. For the purposes of these metrics,
these are data records for which de-identification needs to occur in order to maintain the
anonymity of the data provider.
Malware
Malware includes computer viruses, worms, trojan horses, most rootkits, spyware, dishonest
adware, crimeware and other malicious and unwanted software
[http://en.wikipedia.org/wiki/Malware].
[20] G. Miles, Incident Response Part #3: Containment. Security Horizon, Inc., 2001.
<http://www.securityhorizon.com/whitepapersTechnical/IncidentResponsepart3.pdf>
[21] Also known as a timestamp.
Risk Assessment
The term "risk assessment" is defined as a process for analyzing a system and identifying the
risks from potential threats and vulnerabilities to the information assets or capabilities of the
system. Although many methodologies can be used, a risk assessment should consider threats
to the target systems, potential vulnerabilities of the systems, and the impact of system
exploitation. It may or may not include risk mitigation strategies and countermeasures.
Methodologies could include FAIR, OCTAVE or others.
Security Incident
A security incident results in the actual outcomes of a business process deviating from the
expected outcomes for confidentiality, integrity and availability due to deficiencies or failures
of people, process or technology.[22] Incidents that should not be considered security incidents
include disruptions of service due to equipment failures.
Security Patch
A patch is a modification to existing software in order to improve functionality, fix bugs, or
address security vulnerabilities. Security patches are patches that are solely or in part created
and released to address one or more security flaws, such as, but not limited to, publicly
disclosed vulnerabilities.
Technology
A technology is an application, operating system, or appliance that supports business processes.
A critical technology is one upon which normal business operations depend, and whose
impairment would cause such operations to halt.
Third Party
A third party is an organizational entity unrelated to the organization that calculates a metric,
or supplies the source data for it. Note that "third party" is a subjective term and may be
interpreted differently by each recording entity. It may denote another group within the same
corporation or an independent entity outside of the corporation.
Vulnerability
A vulnerability is defined as a weakness in a system that could be exploited by an attacker to
gain access or take actions beyond those expected or intended by the system's security model.
According to the definition used by CVE, vulnerabilities are mistakes in software design and
execution, while exposures are mistakes in configuration or mistakes in software used as a
component of a successful attack. For the purposes of these metrics, the term "vulnerabilities"
includes exposures as well as technical vulnerabilities.
[22] Source: Operational Risk Exchange. <http://www.orx.org/reporting/>
Appendix B: Acknowledgements
Over one hundred (100) industry experts contributed to prioritizing business functions and
guiding the development of the consensus-based metric definitions in this document.
Additionally, the following people significantly contributed to the definition of these metrics:
Kip Boyle, Rodney Caudle, Anton Chuvakin, Dean Farrington, Brad Gobble, Ben Hamilton, Pat
Hymes, Andrew Jaquith, Adam Kliarsky, Clint Kreitner, David Lam, Charlie Legrand, Bill Marriott,
Elizabeth Nichols, Orlando Padilla, Steven Piliero, Fred Pinkett, Mike Rothman, Andrew Sudbury,
Chad Thunberg, Chris Walsh, Lilian Wang, Craig Wright, Chris Wysopal, Caroline Wong and
others.
Appendix C: Examples of Additional Metrics
The datasets provided can be used to create additional metrics to suit an organization's specific
needs. For example, an organization focusing on incident containment could create additional
incident metrics to track its ability to detect incidents internally, as well as provide additional
granularity around incident recovery by measuring the time from incident discovery to
containment (as well as recovery). Two new metrics, Percentage of Incidents Detected by
Internal Controls and Mean Time from Discovery to Containment, can be created using the
Incidents Dataset. Another organization may wish to focus on the patching process and provide
the Mean-Time to Deploy metric just for critical patches as a key indicator to management.
Mean-Time to Deploy Critical Patches can be created from the Patch datasets, using the
severity field as a dimension to focus management attention on a key risk area. The definitions
of these additional metrics, defined using the CIS datasets, are provided below:
Percentage of Incidents Detected by Internal Controls
Objective
Percentage of Incidents Detected by Internal Controls (PIDIC) indicates the effectiveness of the
security monitoring program.
Table 65: Percentage of Incidents Detected by Internal Controls
Metric
Name
Percentage of Incidents Detected by Internal Controls
Version 0.9.0
Status Reviewed
Description Percentage of Incidents Detected by Internal Controls (PIDIC) calculates the
ratio of the incidents detected by standard security controls to the total
number of incidents identified.
Type
Audience Operations
Question Of all security incidents identified during the time period, what percent were
detected by internal controls?
Answer A positive floating point value between zero and 100. A value of 0 indicates
that no security incidents were detected by internal controls, and a value of
100 indicates that all security incidents were detected by internal controls.
Formula Percentage of Incidents Detected by Internal Controls (PIDIC) is calculated by
dividing the number of security incidents for which the Detected by Internal
Controls field is equal to true by the total number of all known security
incidents:

PIDIC = Count(Incident_DetectedByInternalControls = TRUE) /
        Count(Incidents) * 100
Units Percentage of incidents
Frequency Monthly, Quarterly, Annually
Targets PIDIC values should trend higher over time. A value of 100% indicates
hypothetically perfect internal controls, since no incidents were detected by
outside parties. Because of the lack of experiential data from the field, no
consensus on the range of acceptable goal values for Percentage of Incidents
Detected by Internal Controls exists.
Sources Since humans determine when an incident occurs, when the incident is
contained, and when the incident is resolved, the primary data sources for
this metric are manual inputs as defined in Security Incident Metrics: Data
Attributes. However, these incidents may be reported by operational
security systems, such as anti-malware software, security information and
event management (SIEM) systems, and host logs.
Usage
This metric measures the effectiveness of a security monitoring program by determining which
incidents were detected by the organization's own internal activities (e.g. intrusion detection
systems, log reviews, employee observations) instead of by an outside source, such as a
business partner or agency. A low value can be due to poor visibility into the environment,
ineffective processes for discovering incidents, ineffective alert signatures, and other factors.
Organizations should report on this metric over time to show improvement of the monitoring
program.
Limitations
An organization may not have direct control over the percentage of incidents that are detected
by its security program. For instance, if all the incidents that occur are due to zero-day or
previously unidentified vectors, then there are not many options left to improve posture.
However, this metric could be used to show that improving countermeasures and processes
within operations could increase the number of incidents that are detected by the organization.
References
Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security
Incident Handling Guide. US National Institute of Standards and Technology, 2004.
<http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie-Mellon Software Engineering Institute, 2003.
<http://www.cert.org/archive/pdf/03tr001.pdf>
Baker, Hylender and Valentine. 2008 Data Breach Investigations Report. Verizon Business RISK
Team, 2008. <http://www.verizonbusiness.com/resources/security/databreachreport.pdf>
Mean Time from Discovery to Containment
Objective
Mean Time from Discovery to Containment (MTDC) characterizes the effectiveness of
containing a security incident, as measured by the average elapsed time between when the
incident has been discovered and when the incident has been contained.
Table 66: Mean Time from Discovery to Containment
Metric
Name
Mean Time from Discovery to Containment
Version 0.9.0
Status Reviewed
Description Mean Time from Discovery to Containment (MTDC) measures the
effectiveness of the organization at identifying and containing security
incidents. The sooner the organization can contain an incident, the less
damage it is likely to incur. This calculation can be averaged across a time
period, type of incident, business unit, or severity.
Audience Operations
Question What is the average (mean) number of hours from when an incident has
been detected to when it has been contained?
Answer A positive value that is greater than or equal to zero. A value of 0
indicates instantaneous containment.
Formula For each incident contained in the metric time period, the mean time from
discovery to containment is calculated by dividing the sum of the differences
in hours between the Date of Containment and the Date of Discovery for
each incident by the total number of incidents contained in the metric time
period:

MTDC = Sum(Date_of_Containment - Date_of_Discovery) /
       Count(Incidents)
Units Hours per incident
Frequency Weekly, Monthly, Quarterly, Annually
Targets MTDC values should trend lower over time. A value of 0 indicates
hypothetical instantaneous containment. Because of the lack of experiential
data from the field, no consensus on the range of acceptable goal values for
Mean Time from Discovery to Containment exists.
Sources Since humans determine when an incident occurs, when the incident is
contained, and when the incident is resolved, the primary data sources for
this metric are manual inputs as defined in Security Incident Metrics: Data
Attributes. However, these incidents may be reported by operational
security systems, such as anti-malware software, security information and
event management (SIEM) systems, and host logs.
Usage
MTDC is a type of security incident metric and relies on the common definition of security
incidents as defined in the Glossary.
An incident is determined to be contained when the immediate effect of the incident has
been mitigated. For example, a DDoS attack has been throttled or unauthorized external
access to a system has been blocked, but the system has not yet been fully recovered or
business operations have not been restored to pre-incident levels.
Optimal conditions would be reflected by a low MTDC value. A low MTDC value indicates a
healthier security posture, as malicious activity has less time to cause harm. Given the
modern threat landscape and the ability of malicious code to link to other modules once
entrenched, there may be a direct correlation between a higher MTDC and a higher incident
cost.
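As an illustration, the MTDC formula can be computed directly from incident records. The sketch below assumes a hypothetical record layout with Date of Discovery and Date of Containment fields; the field names and sample dates are illustrative, not part of the CIS data attribute definitions:

```python
from datetime import datetime

def mean_time_to_containment(incidents):
    """Average hours from discovery to containment over contained incidents."""
    deltas = [
        (i["date_of_containment"] - i["date_of_discovery"]).total_seconds() / 3600
        for i in incidents
        if i.get("date_of_containment") is not None  # skip uncontained incidents
    ]
    if not deltas:
        return None  # no incidents contained in the metric time period
    return sum(deltas) / len(deltas)

# Hypothetical sample data: one incident contained in 12 hours, one in 24.
incidents = [
    {"date_of_discovery": datetime(2010, 6, 1, 9, 0),
     "date_of_containment": datetime(2010, 6, 1, 21, 0)},
    {"date_of_discovery": datetime(2010, 6, 3, 8, 0),
     "date_of_containment": datetime(2010, 6, 4, 8, 0)},
]
print(mean_time_to_containment(incidents))  # 18.0 hours per incident
```

Note that incidents not yet contained are excluded from the average, consistent with the formula counting only incidents contained in the metric time period.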
Limitations
This metric measures the incident containment capabilities of an organization. As such, the
importance of this metric will vary between organizations. Some organizations have much
higher profiles than others, and are thus more attractive targets for attackers, whose
attack vectors and capabilities will vary. As such, MTDCs may not be directly comparable
between organizations.

In addition, the ability to calculate meaningful MTDCs assumes that incidents are detected. A
lack of participation by system owners could skew these metrics. A higher rate of
participation in the reporting of security incidents can increase the accuracy of these metrics.
The date of occurrence of an incident may be hard to determine precisely. The date of
occurrence field should be the latest date by which the incident could have occurred, given
the best available information. This date may be subject to revision as more information
becomes known about a particular incident.
Incidents can vary in size and scope. This can result in a variety of containment times that,
depending on their distribution, may not provide meaningful comparisons between
organizations when mean values are used.
References
Scarfone, Grance and Masone. Special Publication 800-61 Revision 1: Computer Security
Incident Handling Guide. US National Institute of Standards and Technology, 2004.
<http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf>
Killcrece, Kossakowski, Ruefle and Zajicek. State of the Practice of Computer Security Incident
Response Teams (CSIRTs). Carnegie Mellon Software Engineering Institute, 2003.
<http://www.cert.org/archive/pdf/03tr001.pdf>
Baker, Hylender and Valentine. 2008 Data Breach Investigations Report. Verizon Business RISK
Team, 2008. <http://www.verizonbusiness.com/resources/security/databreachreport.pdf>
Mean Time to Deploy Critical Patches
Objective
Mean Time to Deploy Critical Patches (MTDCP) characterizes the effectiveness of the patch
management process by measuring the average time taken from notification of a critical patch
release to its installation in the organization. This metric serves as an indicator of the
organization's exposure to severe vulnerabilities by measuring the time taken to address
systems in known states of high vulnerability for which security patches are available. This is a
partial indicator, as vulnerabilities may have no patches available or may arise for other reasons
such as system configurations.
Table 67: Mean Time to Deploy Critical Patches
Metric
Name
Mean Time to Deploy Critical Patches
Version 0.9.0
Status Draft
Description Mean Time to Deploy Critical Patches (MTDCP) measures the average time
taken to deploy a critical patch to the organization's technologies. The
sooner critical patches can be deployed, the lower the mean time to patch
and the less time the organization spends with systems in a state known to
be vulnerable.
In order for managers to better understand the exposure of their
organization to vulnerabilities, Mean Time to Deploy Critical Patches should
be calculated for the scope of patches with Patch Criticality levels of
Critical. This metric result, reported separately, provides more insight than
a result blending all patch criticality levels, as seen in the Mean Time to
Patch metric.
Audience Management
Question How many hours does it take the organization to deploy critical patches into
the environment?
Answer A positive floating-point value that is greater than or equal to zero. A value
of 0 indicates that critical patches were theoretically instantaneously
deployed.
Formula Mean Time to Deploy Critical Patches is calculated by determining the
number of hours between the Date of Notification and the Date of
Installation for each critical patch completed in the current scope, for
example by time period or business unit. These results are then averaged
across the number of completed critical patches in the current scope:

MTDCP = Sum(Date_of_Installation - Date_of_Notification) / Count(Completed_Critical_Patches)
Units Hours per patch
Frequency Weekly, Monthly, Quarterly, Annually
Targets MTDCP values should trend lower over time. Most organizations put critical
patches through test and approval cycles prior to deployment. Generally, the
target time for Mean Time to Deploy Critical Patches is within several hours
to days. Because of the lack of experiential data from the field, no
consensus on the range of acceptable goal values for Mean Time to Deploy
Critical Patches exists.
Usage
Mean Time to Deploy Critical Patches is a type of patch management metric, and relies on the
common definition of patch as defined in the Glossary.
Given that many known severe vulnerabilities result from missing critical patches, there may be
a direct correlation between a lower MTDCP and lower levels of security incidents. MTDCP can
be calculated over time, typically per week or per month. To gain insight into the relative
performance and risk of one business unit over another, MTDCP can be compared against
MTTP by cross-sections of the organization, such as individual business units or geographies.
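The cross-sectional comparison described above can be sketched in code: compute the mean notification-to-installation time separately for each business unit. The record fields and sample data below are hypothetical, not part of the CIS data attribute definitions:

```python
from collections import defaultdict
from datetime import datetime

def mtdcp_by_unit(patches):
    """Mean hours from notification to installation, per business unit."""
    totals = defaultdict(lambda: [0.0, 0])  # unit -> [sum of hours, patch count]
    for p in patches:
        hours = (p["date_of_installation"]
                 - p["date_of_notification"]).total_seconds() / 3600
        totals[p["business_unit"]][0] += hours
        totals[p["business_unit"]][1] += 1
    return {unit: total / count for unit, (total, count) in totals.items()}

# Hypothetical completed critical patches for two business units.
patches = [
    {"business_unit": "Finance",
     "date_of_notification": datetime(2010, 7, 1),
     "date_of_installation": datetime(2010, 7, 3)},   # 48 hours
    {"business_unit": "Finance",
     "date_of_notification": datetime(2010, 7, 5),
     "date_of_installation": datetime(2010, 7, 6)},   # 24 hours
    {"business_unit": "Retail",
     "date_of_notification": datetime(2010, 7, 1),
     "date_of_installation": datetime(2010, 7, 5)},   # 96 hours
]
print(mtdcp_by_unit(patches))  # {'Finance': 36.0, 'Retail': 96.0}
```

A per-unit breakdown like this highlights relative performance without blending all units into a single organization-wide mean.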
Limitations
Critical Technologies. This metric assumes that the critical technologies are known and
recorded. If the critical technologies are unknown, this metric cannot be accurately measured.
As new technologies are added, their criticality needs to be determined and, if appropriate,
included in this metric.
Vendor Reliance. This metric relies on the vendor's ability to notify the organization of
updates and vulnerabilities that need patching. If the vendor does not provide a program for
notifying its customers, then the technology, if critical, will always count against this
metric.
Criticality Ranking. This metric is highly dependent upon the ranking of critical technologies by
the organization. If this ranking is abused, the metric will become unreliable.
Patches in Progress. This metric calculation does not account for patch installations that are
incomplete or ongoing during the time period measured. It is not clear how this will bias the
results, although potentially an extended patch deployment will not appear in the results for
some time.
References
Mell, Bergeron and Henning. Special Publication 800-40: Creating a Patch and Vulnerability
Management Program. US National Institute of Standards and Technology, 2005.
Index of Tables
Table 1: Business Functions .................................................................................................................3
Table 2: Metric Categories ...................................................................................................................4
Table 3: Security Incidents Table .........................................................................................................6
Table 4: Security Incident Classification Table ....................................................................................8
Table 5: Security Incident Reporting Table ...................................................................................... 11
Table 6: Technologies Table .............................................................................................................. 11
Table 7: Effect Rating Table ............................................................................................................... 13
Table 8: Criticality Rating Table ......................................................................................................... 13
Table 9: Mean Time to Incident Discovery ....................................................................................... 21
Table 10: Mean Time between Security Incidents .......................................................................... 24
Table 11: Mean Time to Incident Recovery ..................................................................................... 26
Table 12: Cost of Incidents ................................................................................................................ 29
Table 13: Mean Cost of Incidents ..................................................................................................... 33
Table 14: Mean Cost of Incidents ..................................................................................................... 36
Table 15: Technologies Table ............................................................................................................ 41
Table 17: CVSS Score Table ............................................................................................................... 43
Table 18: Identified Vulnerabilities Table ........................................................................................ 46
Table 20: Exempt Technologies Table .............................................................................................. 47
Table 21: Percentage of Systems without Known Severe Vulnerabilities ..................................... 53
Table 22: Mean-Time to Mitigate Vulnerabilities ............................................................................ 56
Table 23: Mean Cost to Mitigate Vulnerabilities ............................................................................. 59
Table 24: Technologies Table ............................................................................................................ 63
Table 26: Patch Information Table ................................................................................................... 64
Table 28: Patch Activity Review Table .............................................................................................. 67
Table 29: Patch Policy Compliance ................................................................................................... 72
Table 30: Mean Time to Patch .......................................................................................................... 75
Table 31: Mean Cost to Patch ........................................................................................................... 78
Table 32: Technologies Table ............................................................................................................ 81
Table 33: Configuration Status Accounting Table ........................................................................... 83
Table 34: Configuration Deviation Table .......................................................................................... 83
Table 35: Configuration Audit Table ................................................................................................. 84
Table 36: Percentage of Configuration Compliance ........................................................................ 85
Table 37: Technologies Table ............................................................................................................ 89
Table 43: Percent of Changes with Security Review ..................................................................... 100
Table 44: Percent of Changes with Security Exceptions ............................................................... 102
Table 45: Technologies Table .......................................................................................................... 106
Table 47: Business Application Status Table .................................................................................. 108
Table 48: Risk Assessments Table ................................................................................................... 109
Table 50: Business Application Weaknesses Table ........................................................................ 111
Table 51: Most Dangerous Programming Errors Table ................................................................. 112
Table 52: Percentage of Critical Applications ................................................................................ 118
Table 53: Risk Assessment Coverage .............................................................................................. 120
Table 54: Security Testing Coverage ............................................................................................... 122
Table 55: Security Spending Table .................................................................................................. 126
Table 56: Security Budget as % of IT Budget .................................................................................. 129
Table 57: Information Security Budget Allocation ........................................................................ 132
Table 58: Number of Incidents ........................................................................................................ 135
Table 59: Vulnerability Scan Coverage ........................................................................................... 138
Table 60: Number of Known Vulnerability Instances .................................................................... 140
Table 61: Patch Management Compliance .................................................................................... 142
Table 62: Configuration Management Coverage ........................................................................... 145
Table 63: Current Anti-Malware Coverage .................................................................................... 148
Table 64: Number of Applications .................................................................................................. 151
Table 65: Percentage of Incidents Detected by Internal Controls ............................................... 156
Table 66: Mean Time from Discovery to Containment ................................................................. 159
Table 67: Mean Time to Deploy Critical Patches ........................................................................... 161