
CISO Scorecard

How do you answer the boss's "how secure are we?" question? Quantifying what really matters in cyber
security is the first step in deciding what to measure, and thus in developing any sort of scorecard. The
current cyber industry trend is toward a risk-based security strategy, to best apply resources where the
impact reduction is greatest. What matters, then, is which protection capabilities and vulnerability
mitigations add the best value in reducing the organization's overall risk. These risk reduction
capabilities should be based on the organization's key business success factors and support threat
vector minimization, vulnerability impact reduction, and enhanced competitive advantage.
Thus any "CISO scorecard" represents the overall "best risk value" for the organization from those key
views. The 'why' of collecting security metrics also includes justifying security spending, supporting
compliance and regulations, minimizing data breach risk (loss of brand / reputation), etc.
Many of the downsides of ineffective risk management are hard to quantify (intangible effects, specific
damages potentially incurred, missed cost avoidance, etc.), yet in the aggregate they are very costly.

Ideally the organization has documented business success objectives that help determine the related
technical objectives by which the security controls are applied. If not, one needs to use the vision
and mission statements to distill a set of operational success objectives, then translate those
requirements into technical objectives. After that, a risk assessment is applied and security controls are
determined. In many cases this is the hardest step, as it involves two transformations followed by
mapping the major controls, often starting from fuzzy objectives. Nevertheless, the security
strategy must be risk value based, so this is a great exercise for the security team to undertake to ensure
their major cyber tasks and resources are focused on the most effective risk value elements. In fact, the
ultimate security objective is to provide a competitive advantage through a differentiating cyber strategy,
which includes minimizing risks. The following sections describe how we distill the core risk factors and
major focus areas to develop the CISO scorecard; the initial notional views suggested below help
frame the bigger picture as we start the 'what matters' journey.
Figure 1.0 – Notional risk value elements

The approach shown above focuses on the risk perspective, minimizing risks and barriers, yet all
efforts must also contribute to a competitive advantage, as suggested below:
1. Known to be a trustworthy organization that ‘walks the walk’ - focusing on the P&L benefits of
mitigating cyber risks, aligned with appropriate technical solutions and security architecture.
2. Embody safety, integrity & TRUST - demonstrate due care and due diligence therein.
3. Facilitate a class leader position, enhance BRAND, go beyond the norms - lead the industry.
4. Improve operational efficiency - high resiliency, lower security costs and risk impacts.
5. Improve compliance levels, reduce costs therein (Build in security, map to CIS 20 controls).
6. Show excellence in DATA protection – put security in the forefront, show privacy stewardship.
7. Improve processes and add new security services and products (directly support the new
technology initiatives and external services groups).
8. Support programs, processes, and use cases that foster innovation and growth (managed risk,
user transparency, “Special Projects” - etc).
9. Facilitate new business, enhance productivity and collaboration, commoditized services, etc.

Next we apply these major risk reduction and competitive advantage elements to the company's
strategic plan illustrated below, then distill the major business factors on which to base our CISO scorecard.
Figure 2.0 – Company strategic business factors (example - insert YOURS here!)

Using the suggested risk reduction (and differentiating) measures quantified later in the document,
along with these strategic plan objectives, we propose the key operational business success objectives
and their corresponding technical objectives. The most likely risks and barriers to those objectives are
then assessed, and from that assessment the security measures that make up the CISO scorecard are
selected. These measures must be relatively easy to capture and process, to both minimize the
resource overhead and support a clear picture of the 'best risk value' outcomes.

All measures need to be framed and managed within an overall enterprise 'balanced scorecard' (BSC)
perspective, to help show a clear, relative risk value to senior leadership. Traditionally, the BSC method
describes the cause-and-effect linkages between four high-level perspectives of strategy and execution.
Those four perspectives can be applied to an overall information security strategy:
• Financials / Value expresses the information security organization's strategic objectives and the
value it provides to the organization (safety, security, integrity, and other safeguards for stakeholders).
(e.g., cost of security per workstation, assessment costs, compliance activities, security services status, etc.)
• Customers expresses how the security team believes it should be viewed in order to satisfy
and deliver value to target customers or stakeholders and achieve its strategic objectives.
(e.g., reportable privacy breaches, customer & partner compliance, regulatory audits per period, etc.)
• Business / Internal Operations identifies the handful of critical capabilities at which the
security team needs to excel to demonstrate the overall company risk value to its target customers.
(e.g., average patch time, percentage (%) of endpoints with no vulnerabilities, time to de-provision users,
time to fix critical vulnerabilities, cost of a data breach, etc.)
• Innovation, Learning and Growth identifies the most critical people, processes and
systems that enable security to excel at, and add value to, the important business capabilities.
(e.g., number of security consultations by business, % of users with weak passwords, security training
statistics and effectiveness, etc.)
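To make the grouping concrete, below is a minimal sketch (Python; the metric names and values are illustrative assumptions drawn from the examples above, not actuals) of how each metric could be tagged with the BSC perspective it supports and rolled up into a one-page view:

from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    perspective: str  # one of the four BSC perspectives above
    value: float      # latest measured value (notional)
    unit: str

PERSPECTIVES = ("Financials / Value", "Customers",
                "Business / Internal Operations", "Innovation, Learning and Growth")

metrics = [
    Metric("Cost of security per workstation", PERSPECTIVES[0], 212.0, "$"),
    Metric("Reportable privacy breaches", PERSPECTIVES[1], 1, "count"),
    Metric("Average patch time", PERSPECTIVES[2], 29, "days"),
    Metric("Users with weak passwords", PERSPECTIVES[3], 6, "%"),
]

def by_perspective(items):
    # Group metrics under their BSC perspective for a one-page rollup.
    groups = {p: [] for p in PERSPECTIVES}
    for m in items:
        groups[m.perspective].append(m)
    return groups

for perspective, group in by_perspective(metrics).items():
    print(perspective)
    for m in group:
        print(f"  {m.name}: {m.value} {m.unit}")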

Scorecards are used for strategic decision support – especially for financials and operations. For business
decision support, scorecards are created to present targeted answers to common leadership questions:
• Are we meeting our fiduciary and statutory requirements?
• How do we compare to our peers? Are we advancing our objectives?
• How are we identifying and managing risk? Are we improving our risk posture?
• Are we investing in and advancing initiatives in the right order?
A Cyber Security Scorecard will help steer an organization towards the desired cyber security strategy,
while providing answers to both the leadership questions and others raised by executives:
Board of Directors Questions
1. What is the status of our cyber resilience capabilities compared to the threat level?
2. What is the impact that cyber security risks have on our business strategy, risk posture?
3. How do our measures and investments compare to the rest of our sector?
4. Are we compliant with the relevant cyber security and related regulations?
CIO Questions
1. What are the key drivers in cyber security risk management and how are they developing?
2. What is the status of our cyber security detection and prevention capabilities?
3. What is the status of the risk framework and processes, including compliance / audit / V&V?
4. What were the root causes and actions taken for any high-impact incidents in the last period?

Ultimately, the Security scorecard objective is to help CISOs be more successful at communicating the
business value of information security and at linking the strategy with execution. A pseudo-formula for
how to do it: Strategy Map + Measures and Targets + A Set of Funded Initiatives = A Complete Program
of Action. Table 2.0 outlines the mapping between the company objectives and the most likely risks (as
listed in Table 1.0 and further categorized later). The next section then assesses the various security
measures suggested by several authoritative sources (plus other operational metrics in the appendix) to
best capture and align to those risks. We note upfront that there is one over-riding risk that is
pervasive in negatively affecting all objectives: a data breach. That should not be a surprise to
anyone, given the record numbers of events in 2016, in files exposed, organizations affected and costs.
The security team developed a separate data breach risk posture approach that assessed the top ten
vulnerabilities and risks therein, and those results are used here. A summary risk table from that
approach is excerpted below; those risks are then further categorized as 'themes' to better align with
the business objectives in Table 2.0.

In Table 1.0 the major data breach causal factors are listed as 'methods' since they contain the typical
mix of policy, process, people and product (technology) elements. To start the risk estimation journey,
especially for readers who have only a general awareness of the security methods, we begin with a
high-level sense of the relative 'goodness' level, providing a heuristic view of the security 'posture.' The
'key areas / exposure' column (not shown below) provides a high-level view of the major residual
problem areas, to help link the relationships and potential impacts across the other methods. The relative
posture level and numeric risk value estimate for each method give a sense of the impact
reduction potential for minimizing that vulnerability's risk level, taking into account the defense-in-depth
status of the system architecture as a whole (the dependencies, hierarchy, inheritance, etc. at play).
(Note - risk values are notional examples used for illustrative purposes, NOT actuals; insert yours here!)
Table 1.0 – Main Data Breach methods and mitigations
Method | Posture | Risk (1-25) | Mitigations (common objectives)
1 – Adequate encryption (and key management) | Below average | 20 | Implement targeted encryption at rest and in motion (HTTPS), integrated with DLP solutions that control access, sharing and protection means. Reduce shadow IT.
2 – Incident Response Plan | Good | 9 | Processes exist; need to audit and ensure immediate actions are clear. Run exercises with the major stakeholders, legal, etc.
3 – Identity and Access Management (IAM) | Average | 15 | Periodically audit all accounts. PAM policy / PI. Implement enterprise MFA & DRM/IRM (align with DLP).
4 – Data Loss Prevention (DLP) | Below average | 15 | DLP strategy exists. Keep DLP simple and business focused. Start with selecting DLP capabilities (with IRM). Get mission-critical apps buy-in, then other stakeholders; have a POA&M.
5 – Vulnerability management / adequate cyber suite | Average | 15 | Develop an effective CMDB. Automatic patch processes help. Show CIS 20 security controls adequacy. More effective SIEM / SOC / tools integration.
6 – Insider threat detection | Marginal | 15 | Must monitor behaviors in both IT and HR. Control / limit file and resource access. Leverage IdAM & DLP. Add controls: limit users to their own workstations or department, limit concurrent logins and session time/hours.
7 – Cyber Education and Awareness Training | Average | 9 | User security actions need to be simple and direct. Defense-in-depth technical controls are the essential core protection. More periodic, targeted user engagement for cyber awareness, phishing et al. Clearly identify consequences for poor behavior.
8 – Cyber Insurance and detailed notification processes | Good | 3 | IR plan has effective steps and controls. Practice the steps and communicate more often with all major stakeholders. Policies can be inadequate (too many caveats); usually based on client data loss, versus also IP / PII (which is difficult to put a value on, to assess damages).
9 – Audit / monitor / verification and validation | Below average | 9 | Need a data security policy first, then an audit / V&V effort to track and report; provide mitigations. Create a security assessments dashboard area / metric, with effective log reviews.
10 – Security and Privacy Policy and process instructions | Below average | 6 | Need to enforce PII/Privacy and confidential data storage policies. IT security policy needs implementing and linking to company policies, supported by process instructions.
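As a frame of reference, the 1-25 risk values above are consistent with a conventional 5x5 risk matrix. A minimal sketch, assuming (the paper does not state this) that each value is the product of a 1-5 likelihood rating and a 1-5 impact rating:

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def risk_score(likelihood: str, impact: str) -> int:
    # Return a 1-25 risk value from qualitative likelihood and impact ratings.
    return LIKELIHOOD[likelihood] * IMPACT[impact]

# Notional ratings only (insert yours): 'likely' x 'severe' = 20, matching the
# sample value for method 1; 'possible' x 'moderate' = 9, matching method 2.
print(risk_score("likely", "severe"))      # 20
print(risk_score("possible", "moderate"))  # 9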

As discussed later, we define overarching themes and break those down into components with defined
objectives, each having defined major initiatives, which are themselves funded, tactical activities
that support key risk mitigation activities. These themes are risk based, so it is structurally
beneficial to categorize the key risks in Table 1.0 into six key focus areas (as also done in the data
breach risk approach). The risk heat chart therein categorizes the top risks #1-10 above as: (A) Data
protection (encryption (#1) and DLP (#4)), (B) IAM (#3), (C) Vulnerability management / effective
cyber suite (#5), and (D) Intruder detection (insider threat (#6), adequate rogue identification). To
those four risk categories we add two management categories: (E) Policy (#10) and Monitoring (#9), and
(F) Risk Management (effectiveness, processes and metrics), as sketched below.
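A small structural sketch of that categorization (Python; the theme-to-method mapping follows the paragraph above, and the risk values are the notional Table 1.0 samples), useful for rolling up residual exposure per theme:

THEMES = {
    "A - Data protection": [1, 4],        # encryption, DLP
    "B - IAM": [3],
    "C - Vulnerability management": [5],  # effective cyber suite
    "D - Intruder detection": [6],        # insider threat, rogue identification
    "E - Policy & monitoring": [10, 9],
    "F - Risk management": [],            # effectiveness, processes, metrics (no Table 1.0 method)
}

# Notional Table 1.0 risk values, keyed by method number.
RISK_VALUES = {1: 20, 2: 9, 3: 15, 4: 15, 5: 15, 6: 15, 7: 9, 8: 3, 9: 9, 10: 6}

for theme, methods in THEMES.items():
    exposure = sum(RISK_VALUES[m] for m in methods)
    print(f"{theme}: aggregate residual risk {exposure}")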

Risks depend on a threat's ability to take advantage of existing vulnerabilities. A risk-based
security strategy should therefore focus on the highest threats to the company, specifically the "threat
actors" that can take advantage of existing vulnerabilities and thus create the highest risks. These
'intruder' threat vectors also need to be minimized and accounted for in the metrics we capture, be they
scorecard or operational. The typical threat sources to be accounted for in the risk value process are:
(1) - Insiders (85+% of all attacks);
(2) - Privileged users (stolen credentials cause most data breaches);
(3) - Hackers (nation states, criminals, all malevolent security types - typically using malware); and
(4) - Inattentive users (phishing attacks, poor social media behavior, poor passwords, etc.).
Table 2.0 aligns the operational objectives with the technical aspects therein and the corresponding key
risks and barriers.
Table 2.0 – Business Objectives and Risks Mapping
Operational objectives | Technical aspects | Key risks & overall barriers
A. Grow market share | Demonstrate safety, service, integrity & trust to show clear value. Overall industry ecosphere approach, versus just clients and customers; maintain visibility. | A, C and F. Unclear common companywide objectives; no integrated capability roadmap.
B. Sector leader | Enable innovation, establish standards, support ISAC, support research and bold / disruptive changes for the industry; leapfrog legacy obsolescence. | B, D and E. Limited R&D mapping to future clients' needs; limited standards efforts to set the industry baseline.
C. Drive efficiency | Lean processes & remove overlap, key metrics & data-driven decisions, smart monitoring and audit / V&V. | B, C and E. Limited companywide process improvement initiatives; no common "CM" or metrics processes or champion.
D. Company R&D drives capability transformation | Special Projects, data / predictive analytics, cyber safety, effective standards implementation, whole-company risk value view. | B, C and F. Limited top-down ERM focus; unclear capability value chain.
E. Company division synergies to improve product / services cohesion | Common processes, user productivity focus, business-centric IT/security, new markets, technology-agnostic solutions. | A, E and F. Unclear harmonization of overall company success factors; no 'commodity' view of core capabilities' key attributes.

When it comes to the security metrics themselves, there is a whole litany of perspectives, views,
references, et al. to consider. The security team developed a companion security metrics strategy that
covers most of the security metrics background in detail (several key security measures from it are
excerpted in the appendix for comparison). That metrics approach was from a security operations
view, versus the risk value determination herein. We will use its security metrics recommendations
to help guide our choice of the CISO scorecard elements that follow, as the two are for the most part
intertwined and inseparable, being different views of the same risk environment.
Everyone has their own view on the security metrics that matter (e.g., another metrics reference view is):
http://www.csoonline.com/article/2118152/metrics-budgets/a-few-good-information-security-metrics.html
1: Baseline Defenses Coverage (Antivirus, Antispyware, Firewall, and so on)
2: Patch Latency
3: Password Strength
4: Platform Compliance Scores
5: Legitimate E-Mail Traffic Analysis
There are many other views on security metrics of course:
http://www.csoonline.com/article/2123361/metrics-budgets/security-metrics--critical-issues.html

The NIST process for developing security metrics is shown below as another process frame of reference.
Even at this high level, it is clearly an extensive process in itself, whereas our development and
execution resources are fairly constrained and there are no formal, overarching corporate metrics
processes to align with. We will account for these steps as we begin to socialize and phase in the
implementation of the recommended CISO scorecard metrics.
Figure 3.0 – NIST Metrics Life Cycle

Effective metrics should have five qualities. Most organizations know how to pick metrics that satisfy the
first four: namely, (1) they are expressed as numbers, (2) have one or more units of
measure, (3) are measured in a consistent and objective way, and (4) can be gathered efficiently
(cheaply). Yet only a few entities choose metrics that satisfy the most important criterion: contextual
relevance. That is, the metrics must help someone -- usually the boss -- make a decision about an
important risk or business issue. Too many organizations use metrics to erect byzantine processes
(excessively complicated, involving a great deal of administrative detail) to measure the minutiae of
what they understand, rather than what the boss needs to know. Failure to pass the "so-what" test
makes a metric potentially interesting, but not insightful enough to support decisions. We use two
sources of recommended security metrics for comparison and then distill those into a select set to use
within the BSC framework suggested earlier (focusing on both risk reduction and competitive advantage
benefits): (1) the "CISO Desk Reference Guide (DRG)" key suggested metrics, and (2) the Center for
Internet Security (CIS) top 20 controls recommendations - both discussed next.

The CISO DRG key metrics (chapter 5 therein) are listed below (numbered for later reference, not
priority); those in bold text are the initial metrics proposed (to be phased in as well):
Administrative metrics:
1---Legal – (a) - Percentage of material contracts (defined as involving data sets that require privacy and
security controls) that have been evaluated by security (typically as a high level risk assessment).
(b) - Percentage of material contracts (defined as involving services with sensitive information that could
impact the core company operations) that have been evaluated by security, including requirements for
breach notification and confidentiality language.
2---Financial – (a) - Percentage of IT budget allocated to security (where this value varies widely
depending on industry and the cyber baseline effectiveness - a minimum of 5% is normally expected).
(b) - As applicable, status of major security projects for the year. (added)
3--- Human Resources (HR) – (a) - Percentage of employees who had a thorough background check
(including any previous questionable activities, be that illegal or unethical).
(b) - Percentage of job descriptions that highlight the employee’s responsibility to protect the
organizations’ assets.
(c) - Percentage of employees who have taken the annual security awareness training (and ideally
passed an assessment of their retention of key cyber safe tenets)
(d) - Percentage of employees who have read, acknowledged and been tested on the security policy.
4 ---Vendor Management – (a) Percentage of material vendors (defined as those that have the potential
for critical or severe risk, should a breach or service interruption occur) who have been audited either
directly or a third party (e.g., SSAE 16 SOC 1 & 2 audit, etc.).
(b) - Percentage of material vendor relationships that are inventoried and documented.
(c) - Percentage of material vendors that participate in quarterly reviews with the security function.
Operational Metrics:
5---Software Inventory – (a) – Existence of an IT Asset Management database (“ITAM”) (added).
(b) - Percentage of known assets & ‘systems’ accounted for in “ITAM/CMDB.”
6---Information (data) Inventory – (a) Percentage of information assets accurately inventoried
(b) Percentage of information classified accurately (assumes a data classifications standard exists)
(c) – Percentage of systems documented (e.g., at least a data flow diagram exists, ideally a security plan)
7--- Systems Upgrades & Patching – (a) Percentage of major systems (both hardware and software) that
are still supported by the manufacturer or a validated third party.
(b) – Percentage of systems that are patched within 30 days of critical security patches.
(c) – Percentage of systems that are scanned for vulnerabilities on a monthly basis.
8 – Multi-Factor Authentication (MFA) – (a) Percentage of systems with IP, PII, or other sensitive data
that use MFA.
(b) – Percentage of critical capabilities (Servers, firewalls, etc) & remote connections using MFA. (added)
9 – Mean time from incident detection to remediation – time delay from detection to vulnerability fix.
10 – Aggregate threat level – composite score of the leading cyber threat intel (CTI) indicators. (added)
(A computation sketch for metrics 7(b), 9 and 10 follows this list.)
Governance Metrics (compliance):
11 – Incident Response (IR) Plan – (a) Existence of a management reviewed and approved IR Plan.
(b) – Time since the IR plan was tested (at least in a table top, ideally a short breach exercise yearly).
12 - Business Impact Assessment (BIA) Analysis - (a) – Existence of a management approved BIA plan.
(b) – Last update date. (c) – Number of high-risk business processes with critical "RPOs."
13 – Business Continuity / Disaster Recovery (BC / DR) plans - (a) – Existence of a management approved
BC / DR plan. (b) – Time since last test.
(c) – Percentage of systems, processes or applications that met RPO / RTO objectives.
14 – Project Security Plans / risk assessment – % of projects / applications that have one. (added)
15 – Cyber Maturity level – aggregated score of the major policies and processes required. (added)
16 – Cyber events / cost – number of yearly incidents and their cost to remediate. (added)
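Below is a minimal computation sketch for three of the metrics above (9, 7(b) and 10). The record layouts, field names and weights are illustrative assumptions, not a prescribed schema:

from datetime import datetime
from statistics import mean

# Notional incident log: detection and remediation timestamps.
incidents = [
    {"detected": datetime(2017, 3, 1), "remediated": datetime(2017, 3, 30)},
    {"detected": datetime(2017, 5, 2), "remediated": datetime(2017, 5, 31)},
]

def mean_time_to_remediate(rows):
    # Metric 9: mean days from detection to vulnerability fix.
    return mean((r["remediated"] - r["detected"]).days for r in rows)

def pct_patched_within(assets, days=30):
    # Metric 7(b): percentage of systems patched within `days` of a critical patch.
    on_time = sum(1 for a in assets if a["patch_lag_days"] <= days)
    return 100.0 * on_time / len(assets)

def aggregate_threat_level(indicators, weights):
    # Metric 10: weighted composite of leading CTI indicators (each scaled 0-1).
    return sum(indicators[k] * w for k, w in weights.items()) / sum(weights.values())

assets = [{"patch_lag_days": d} for d in (5, 12, 45, 28, 90)]
cti = {"malware_volume": 0.6, "phishing_rate": 0.7, "sector_alerts": 0.4}
print(mean_time_to_remediate(incidents))  # 29.0 days
print(pct_patched_within(assets))         # 60.0 (% patched on time)
print(aggregate_threat_level(cti, {"malware_volume": 2, "phishing_rate": 2, "sector_alerts": 1}))  # 0.6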

Next we list the CIS top 20 security controls metrics recommendations (Table 3.0 below, 28
measures overall) as one key reference, followed by their recommendation for the top ten metrics to
use in a BSC. These CIS metrics will be harmonized with the CISO DRG metrics recommendations above
to provide our 'best of breed' key risk reduction and competitive advantage BSC metrics, which map
back to the major business objectives and can show a cost benefit where applicable as well.
Table 3.0 – CIS Security Controls Key Metrics
Function | Management Perspective | Defined Metrics
Incident Management | How well do we detect, accurately identify, handle, and recover from security incidents? | 1. Cost of Incidents; 2. Mean Cost of Incidents; 3. Mean Incident Recovery Cost; 4. Mean-Time to Incident Discovery; 5. Number of Incidents; 6. Mean-Time Between Security Incidents; 7. Mean-Time to Incident Recovery
Vulnerability Management | How well do we manage the exposure of the organization to vulnerabilities by identifying and mitigating known vulnerabilities? | 8. Vulnerability Scanning Coverage; 9. Percent of Systems with No Known Severe Vulnerabilities; 10. Mean-Time to Mitigate Vulnerabilities; 11. Number of Known Vulnerabilities; 12. Mean Cost to Mitigate Vulnerabilities
Patch Management | How well are we able to maintain the patch state of our systems? | 13. Patch Policy Compliance; 14. Patch Management Coverage; 15. Mean-Time to Patch; 16. Mean Cost to Patch
Configuration Management | What is the configuration state of the systems in the organization? | 17. Percentage of Configuration Compliance; 18. Configuration Management Coverage; 19. Current Anti-Malware Compliance
Change Management | How do changes to system configurations affect the security of the organization? | 20. Mean-Time to Complete Changes; 21. % of Changes with Security Reviews; 22. Percent of Changes with Security Exceptions
Application Security | Can we rely on the security model of business applications to operate as intended? | 23. Number of Applications; 24. Percent of Critical Applications; 25. Risk Assessment Coverage; 26. Security Testing Coverage
Financial Metrics | What is the level and purpose of spending on information security? | 27. IT Security Spending as Percent of IT Budget; 28. IT Security Budget Allocation

We also need to integrate the security strategy key tasks' 'risk value contribution' into the CISO
scorecard, recognizing that any cyber strategy is essentially a hypothesis, since risk conditions change
frequently. Strategy is the combination of leveraging resources and executing a plan of action to achieve
a set of desired outcomes. A BSC/metrics strategy is typically described from the top down (i.e., starting
from the objectives to be achieved), but all strategies must be executed from the bottom up (i.e.,
starting with the allocation and alignment of the people, processes and technologies necessary to carry
out a plan of action). In concert with the latter, the CIS core metrics BSC view is:
Table 4.0 – CIS "BSC" Top Ten Metrics
Scorecard Area | Action Required | Metrics
Impact | Report on security incidents and their impact on the organization. | 1. Number of Incidents; 2. Cost of Incidents
Performance by Function: Outcomes | Report the outcome of business functions' Configuration Management, Patch Management and Vulnerability Management. | 3. Configuration Policy Compliance (using CIS benchmarks); 4. Patch Policy Compliance (using current patch level); 5. Percent of Systems with No Known Severe Vulnerabilities (using CVSS base scores)
Performance by Function: Scope | Report the scope of business functions and the scope of outcome metrics for those functions. | 6. Configuration Management Coverage; 7. Patch Management Coverage; 8. Vulnerability Scanning Coverage
Financial Metrics | Report on the allocation and efficiency of security spending. | 9. IT Security Spending as Percent of IT Budget; 10. IT Security Budget Allocation

Discussion - so now what? We have listed the CISO DRG overall measures, the CIS top 28 and BSC top 10
measures, and the risk-based factors in the previous table that we proposed from the company's
business strategy. These measures all need to be harmonized into a 'best of breed' metric set and then
associated with the four BSC categories described earlier. How do we reconcile them, provide
performance indicators to senior leadership, and decide which business outcome is weighted
more: risk reduction (cost avoidance) or innovation (increased revenue), while including a competitive
advantage aspect that adds to the overall business success objectives' value? In addition, the security
team has to be able to effectively collect these metrics and transform them where required into the key
leadership-level interest items. Then we need to be able to benchmark the key measures, or use some
other well-known company comparison methods, to show relative goodness. These risk focused areas
then serve as the foundation for identifying the most relevant measures to be considered on a
company's dashboard. They cover the core areas of cyber security: risks, compliance, incidents,
awareness & culture, threat level, maturity state, and key cyber security projects in development. (Note
- for more examples see:)
https://assets.kpmg.com/content/dam/kpmg/pdf/2015/04/Cyber-Security-Dashboard1.pdf
http://www.securitymetrics.org/content/Wiki.jsp?page=Chapter10ScorecardDesign

So how do we define "cyber success" using the common BSC format in a way that also demonstrates the
security value sphere, which in turn supports the company's core business model, vision and mission? That is,
aligning the CISO scorecard to show a triad of key contributions: (1) competitive advantage (facilitate
new business, enhance reputation, establish collaborative partnerships), (2) operational efficiency
(reduce security and compliance costs, increase productivity) and (3) risk reduction (cost avoidance,
minimize the number and impact of security events).

We need to enhance the business strategy map using the security value triad elements to create a visual
tool that illustrates the key company success factors from the four BSC perspectives, providing a holistic
view of the key risk mitigation factors. We start by defining overarching themes (the six risk categories
defined earlier) and break those down into components with defined objectives for long-term growth.
Each objective then has quantified major initiatives, which are funded, tactical activities that support
delivery and will be measured and tracked; these are the foundation of the triad's model. The model
is then self-sustaining, as it shows the interplay and integrated nature of the overarching strategy,
themes, objectives and initiatives. The four important considerations in the pursuit of a security value
triad are: (1) effective holistic solutions, (2) key stakeholder relationship management, (3) all levels of
decision support, and (4) integrated performance management, all of which must then be part of the
planning and execution of each initiative.

The earlier table mapped our business objectives into technical objectives and the main associated risks;
we use those to frame our selection of CISO scorecard metrics. In addition, the other day-to-day
security metrics we recommend (listed in the appendix) that have operational utility also need to
show the value of supporting the scorecard view. Security adopted the risk-based security strategy
(RBSS) approach and uses it to prioritize our tasks and resources; it follows that this approach can also
be used to select the best value metrics for the targeted risks in the table. Given there are numerous
ways and frameworks to map metrics to risks, we start with the CISO DRG metrics, integrate the CIS
metrics recommendations where needed into a core, best of breed measures set, then map those
metrics to the targeted risks. This approach provides scorecard traceability to the key business
objectives and provides management with 'data driven' decision support to obtain the best risk value
while also enhancing the company's competitive advantage.

Next we align the main risks with the corresponding metrics and the data they provide to assist in
decision support. The table provides a link to the risks, suggested metrics and decision support data.
Table 5.0 – Risks / Metrics / Decisions supported
(Format: associated metric – data / decision support, grouped by risk theme.)
A – Data protection
1.A - % material contracts assessed – Potential third party privacy / data breach risks
2.A - % IT budget for security – Resource the critical protections for this risk
3.C - % employees with annual training – Significant 'user threat' vector for this risk
4.A - % vendors / contractors assessed – Inherited security posture on their side
6.C - % systems documented (data flows) – Lack of documentation means more data risks
11 – Time since last IR Plan test – Speed / accuracy of damage control is critical
14 - % projects with risk assessment (SP) – Lack of a security plan means unknown risks
16 - # cyber incidents / cost to remediate – Precursor to follow-on data breach costs
B – IAM
2.A - % IT budget for security – Resource the critical protections for this risk
3.A - % employees with background check – Critical 'insider threat' propensity attribute
8.A - % sensitive data systems using MFA – Stronger access controls for more data security
8.B - % critical IT systems using MFA – Improved protection of essential infrastructure
C – Vulnerability Management
1.A - % material contracts assessed – Potential third party privacy / data breach risks
2.A - % IT budget for security – Resource the critical protections for this risk
4.A - % vendors / contractors assessed – Inherited security posture on their side
5.B - % key assets / systems in ITAM/CMDB – Ability to prioritize, quantify and track risks
6.C - % systems documented (data flows) – Lack of documentation means more data risks
7.B - % systems patched within 30 days – Risk impact grows with time and exposure
7.C - % key systems scanned monthly – Verify & validate remaining vulnerabilities / risk
9 – Mean time from detection to remediation – Risk impact grows with more time exposure
10 – Aggregate threat level (and trends) – Focus / tune tools for increased threat actors
13.B – BC/DR – time since last test – Resiliency risks increase with time exposure
14 - % projects with risk assessment (SP) – Lack of a security plan means unknown risks
16 - # cyber incidents / cost to remediate – Shows where vulnerability management is weak
D – Intruder detection
2.A - % IT budget for security – Resource the critical protections for this risk
3.A - % employees with background check – Critical 'insider threat' propensity attribute
3.C - % employees with annual training – Significant 'user threat' vector for this risk
4.A - % vendors / contractors assessed – Inherited security posture on their side
8.A - % sensitive data systems using MFA – Stronger access controls for more data security
8.B - % critical IT systems using MFA – Improved protection of essential infrastructure
10 – Aggregate threat level (and trends) – Focus / tune tools for increased threat actors
15 – Cyber maturity level (policy / processes) – General readiness and effectiveness state
E – Policy & Monitoring
1.A - % material contracts assessed – Potential third party privacy / data breach risks
2.A - % IT budget for security – Resource the critical protections for this risk
11 – Time since last IR Plan test – Speed / accuracy of damage control is critical
12.C – BIA - # high risk / critical "RPOs" – Resiliency and risk time exposure increased
13.B – BC/DR – time since last test – Resiliency risks increase with time exposure
15 – Cyber maturity level (policy / processes) – General readiness and effectiveness state
F – Risk Management
2.A - % IT budget for security – Resource the critical protections for this risk
4.A - % vendors / contractors assessed – Inherited security posture on their side
5.B - % key assets / systems in ITAM/CMDB – Ability to prioritize, quantify and track risks
6.C - % systems documented (Sec Plan) – Lack of documentation means more data risks
7.B - % systems patched within 30 days – Risk impact grows with time and exposure
7.C - % key systems scanned monthly – Verify & validate remaining vulnerabilities / risk
9 – Mean time from detection to remediation – Risk impact grows with more time exposure
10 – Aggregate threat level (and trends) – Focus / tune tools for increased threat actors
11 – Time since last IR Plan test – Speed / accuracy of damage control is critical
12.C – BIA - # high risk / critical "RPOs" – Resiliency and risk time exposure increased
13.B – BC/DR – time since last test – Resiliency risks increase with time exposure
14 - % projects with risk assessment (SP) – Lack of a security plan means unknown risks
15 – Cyber maturity level (policy / processes) – General readiness and effectiveness state
16 - # cyber incidents / cost to remediate – Shows where risk management is weak
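One practical use of this mapping: rank the metrics by how many of the six risk themes they inform, to help pick a high-leverage 'starter set'. A sketch (Python; the theme-to-metric lists are transcribed from Table 5.0, using the CISO DRG numbering above):

from collections import Counter

THEME_METRICS = {
    "A": ["1.A", "2.A", "3.C", "4.A", "6.C", "11", "14", "16"],
    "B": ["2.A", "3.A", "8.A", "8.B"],
    "C": ["1.A", "2.A", "4.A", "5.B", "6.C", "7.B", "7.C", "9", "10", "13.B", "14", "16"],
    "D": ["2.A", "3.A", "3.C", "4.A", "8.A", "8.B", "10", "15"],
    "E": ["1.A", "2.A", "11", "12.C", "13.B", "15"],
    "F": ["2.A", "4.A", "5.B", "6.C", "7.B", "7.C", "9", "10", "11", "12.C", "13.B", "14", "15", "16"],
}

# Count how many themes each metric appears under.
coverage = Counter(m for metric_ids in THEME_METRICS.values() for m in metric_ids)
for metric, n_themes in coverage.most_common(5):
    print(f"{metric}: informs {n_themes} of 6 risk themes")
# e.g., 2.A (% IT budget for security) appears under all six themes.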

Any CISO scorecard effort also needs to link those metrics into the security strategy, of course, and do
that in one slide too! We offer proposed IT security strategy statements below, which should capture
the security value triad intent, along with an example strategy slide. Once the BSC-based measures that
clarify the security value triad are finalized, update this slide; the proposed draft serves to help
frame the final scorecard recommendations. (Again, an example, notional strategy - put yours here!)

Next steps / Execution Plans.
The key metrics proposed in Table 5.0 can now be assessed for their utility to our end-state set of
measures that matter. Whichever ones are chosen, we need an implementation plan, including
how to best capture those not currently collected (which is the majority of them). In addition, we need to
decide on the best 'starter set' of metrics to use in the CISO scorecard, both visually and in the key themes
to highlight initially. We believe that leadership's major expectation of security is to protect the
company and minimize risk, with operational efficiency and competitive advantage complementary
to that; thus we use a risk lens for our CISO scorecard.

Process note - developing a CISO scorecard (or any type, for that matter) requires quite a lot of effort to
get to the final formalized and approved metrics and format (for example, see appendix item "D", the SANS
scorecard checklist, with over two dozen recommended steps). Our scorecard efforts start with this
research / community paper to set the background and provide key references with potential
recommended security metrics, yielding an initial "strawman / notional" output to facilitate
the steps and discussions required in the checklist. It is rather like a 'rapid prototyping' effort: the
general intent is known, along with best practices therein, so a baseline is developed from those key
sources, from which it is then generally easier to verify requirements, et al. This matters especially since
many in management do not have a full grasp of what security does, or of the high potential damage
that a data breach risk can cause, for one example; showing a sample end-state scorecard can help
facilitate their understanding and concurrence on the utility and value of the whole effort.

The operative question then becomes, after all the options, mapping and assessment so far: which
metrics do we start with, and what format should we use? Given that we do not have a formal metrics
program approved yet, and that we have a long road to complete the checklist steps and educate
management, a simple and short format is best. The risk vernacular is well known to all management,
with results based on impact to business success objectives; thus a natural format for starting our CISO
scorecard is the risk theme. In this case we start with the top four risk themes (less the policy and
management ones) to tell the current cyber story, using an example 'risk heat map' illustration (sketched
below), followed by our proposed CISO scorecard. This two-slide communication method both sets the
stage on what matters to the organization and provides a snapshot of the metrics that move the risks to
lower levels.
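A hedged sketch of that heat-map slide (Python/matplotlib; the likelihood and impact positions are notional examples, not assessed values - insert your own):

import matplotlib.pyplot as plt

themes = {  # (likelihood 1-5, impact 1-5), notional positions
    "A - Data protection": (4, 5),
    "B - IAM": (3, 4),
    "C - Vulnerability mgmt": (4, 4),
    "D - Intruder detection": (3, 5),
}

fig, ax = plt.subplots(figsize=(5, 5))
# Shade the grid green-to-red as likelihood * impact grows.
for x in range(1, 6):
    for y in range(1, 6):
        score = x * y
        color = (min(score / 15, 1.0), max(1 - score / 15, 0.0), 0.2)
        ax.add_patch(plt.Rectangle((x - .5, y - .5), 1, 1, color=color, alpha=.35))
# Plot each risk theme's letter at its assessed cell.
for name, (lik, imp) in themes.items():
    ax.annotate(name.split(" - ")[0], (lik, imp), ha="center", fontweight="bold")
ax.set(xlim=(.5, 5.5), ylim=(.5, 5.5), xticks=range(1, 6), yticks=range(1, 6),
       xlabel="Likelihood", ylabel="Impact", title="Top risk themes (notional)")
plt.savefig("risk_heat_map.png", dpi=150)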

Next we select the initial metrics that best capture those top four risk themes, starting with the suggested
metrics in Table 5.0, which we can potentially capture in our current environment. (Note - once again, the
values used below are notional examples for illustrative purposes only; insert YOURS therein.)
Table 6.0 – Notional CISO Scorecard elements
Perspective | Objective | Measures | Value / trend (sample #'s) | Risks / activities
Financial | Manage cost | 2A - % IT budget for security | 5% / steady | 2017 budget approved
Financial | Data efficiency | 9 – Time to fix critical risks | 29 days / steady | Assess more automation
Customers | Breaches / incidents | 16 - # incidents / cost | 1 (ransomware) / $0 | CSC training on IR Plan
Customers | Partner compliance | 4A - % vendors assessed | 22% / decreasing | Working on backlog
Customers | Threat awareness | 10 – Aggregate threat level | Moderate / increasing | Ransomware & insiders
Business | IdAM | 3A - % background check | 94% / steady | User password changes
Business | IdAM | 8B - % key IT using MFA | 10% / increasing | Tokens for critical devices
Business | Find risks / issues | 7C - % systems scanned monthly | 15% / increasing | Targeted scans in progress
Business | Fix vulnerabilities | 5B - % assets in CMDB | 63% / increasing | Developing IT / OPS CMDB
Business | Fix vulnerabilities | 7B - % patched in 30 days | 19% / increasing | Legacy servers exposed
Innovation | User education | 3C - % users trained | 88% / decreasing | User awareness / phishing
Innovation | Project support | 14 - % projects with risk assessments | 29% / steady | Working on backlog
Innovation | Cyber maturity | 15 – Maturity level | 66% / increasing | V&V and conduct survey

We then translate the table into a C-Suite / BoD level set of graphics, as ‘presentation’ counts!
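For instance, a minimal text-rendering sketch (Python; the arrow convention and layout are assumptions, and the values are the notional Table 6.0 samples) showing how each scorecard row can compress into a single status line for an executive slide:

ARROWS = {"increasing": "\u2191", "decreasing": "\u2193", "steady": "\u2192"}

rows = [  # (perspective, measure, value, trend), notional samples from Table 6.0
    ("Financial", "2A - % IT budget for security", "5%", "steady"),
    ("Customers", "10 - Aggregate threat level", "Moderate", "increasing"),
    ("Business", "7B - % patched in 30 days", "19%", "increasing"),
    ("Innovation", "3C - % users trained", "88%", "decreasing"),
]

for perspective, measure, value, trend in rows:
    # Fixed-width columns keep the slide scannable at a glance.
    print(f"{perspective:<11} {measure:<32} {value:>9} {ARROWS[trend]}")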

Now that we have our notional CISO scorecard, we need to develop an implementation plan based on
the key recommended steps in the SANS guide (appendix item D), providing a draft POA&M for the
suggested follow-on activities below:
• Develop a formal implementation plan (treat as a project: include objectives, stakeholders, and a
charter as needed; define the team: IT / OPS, PMO and Security (external POCs?))
• Identify the primary audience (CIO, BoD, others?). Determine major organizational and business unit
objectives; review recent audit reports and recommendations; develop security survey
questions and audience; identify key risk concerns and priorities therein
• Establish an inventory of security controls associated with mitigation of the aforementioned key risks;
identify roles for security controls using RACI and identify associated sponsors (with resources)
• Research benchmarks and authoritative sources for security controls (e.g., best practices, quality
standards, etc.); research relevant regulation and contract obligations
• Update the metrics that are to be presented in the Security Scorecard; establish data sources and
data owners; create/select tools to gather data and create the Security Scorecard; update the
draft Security Scorecard template
• Circulate the template and request comment based on RACI / key stakeholders; perform resource
planning with functional managers and data owners; establish a communication protocol for the
Security Scorecard audience
• Pilot the first draft of the Security Scorecard; document Security Scorecard update procedures;
document lessons learned and make appropriate corrections; finalize the Security Scorecard
layout and content; establish feedback mechanisms; formally announce the Security Scorecard, et al.
(Part of this metrics effort is benchmarking… http://www.bankinfosecurity.com/whitepapers/cybersecurity-
benchmarking-cios-guide-for-reducing-security-anxiety-w-2888 )

Appendix:
So the operative challenge is to first decide WHAT information to collect to support security
metrics that enable management decisions. We use the NIST, CISO DRG and CIS metrics
recommendations to start and bound our selection set (which also aligns to ISO 27001/2). The following
items are taken from the security metrics strategy to align that view with this CISO scorecard view.

A---NIST Special Publication (SP) 800-55, Performance Measurement Guide for Information Security
http://csrc.nist.gov/publications/nistpubs/800-55-Rev1/SP800-55-rev1.pdf
1 – Security budget
2 – Vulnerability management
3 – Access control
4 – Awareness and training
5 – Audit and accountability
6 – Certification and accreditation
7 - Configuration management
8 – Contingency planning
9 – Identification and authentication
10 – Incident response
11 – Maintenance
12 – Media protection
13 – Physical environment
14 – Planning
15 – Personnel security
16 – Risk assessment
17 – Systems and services acquisition
18 – Systems and communications protection
19 – System and information integrity
(see also NIST 800-100, Section 7.0 (summarizes 800-55))

B---CIS top 28 metrics suggestions (based on / mapped to their CIS top 20 controls) (listed in main body):
https://benchmarks.cisecurity.org/tools2/metrics/CIS_Security_Metrics-Quick_Start_Guide_v1.0.0.pdf
The CIS Measurement Companion to the CIS Critical Security Controls provides specific information
sets and suggested thresholds to measure over 90 of the CSCs; these will be used to execute and
implement the final recommended metrics set that supports our BSC view.
https://www.cisecurity.org/critical-controls/
In addition, SANS has a CIS 20 CSC based paper on metrics for each of the controls, to assist in execution:
https://www.sans.edu/student-files/projects/jwp-caincouture-whitepaper.doc

C---A SAMPLE Security Metrics list:


Process Area / Defined Metrics
---Vulnerability Management
VM-1: Vulnerability Scan Coverage
VM-2: Percent of Systems without Known High Vulnerabilities
VM-3: Number of Known Vulnerability Instances (High, Med, Low)
---Patch Management
PM-1: Mean Time to Patch Covered Systems (Clients, Servers)

PM-2: Number of Patches Deployed
---Configuration Management
CM-1: Number of Devices (Clients, Servers, Network, Other)
CM-2: Number of Blocked Internet Emails
CM-3: Number of Blocked Internet Access Attempts
---Incident Management
IM-1: Number of Investigative Support Requests
IM-2: Number of Incidents
IM-3: Number of Discovered Malware Types
IM-4: Number of Malware Agents Remediated
IM-5: Number of Compromised Clients
---Risk Management
RM-1: Number of Risk Assessments
RM-2: List of Risk Assessments Completed
RM-3: Number of US-CERT items that apply / remediated
---Awareness and Training
AT-1: Number of Awareness Briefings/Communications
---Program Management
PG-1: List of Audits (YTD with # of Findings…)
PG-2: Project status

D--- Security Scorecard Checklist (SANS)


1. Create statement of objectives
2. Establish target audience and stakeholders
3. Create a formal project charter and plan
4. Present plan and obtain letter of authorization from CIO to stakeholders
5. Host project kick-off meeting (ideally with CIO present)
6. Identify major organizational and business unit objectives
7. Review recent audit reports and recommendations
8. Author questions for Executive Interview(s)
9. Perform executive interviews to identify key risk concerns and priorities
10. Establish inventory of security controls associated with mitigation of aforementioned key risks
11. Identify roles for security controls using RACI and identify associated sponsors ($)
12. Research benchmarks and authoritative sources for security controls (e.g., best practices, quality
standards, etc.)
13. Research relevant regulation and contract obligations
14. Identify and document metrics that are to be presented in Security Scorecard
15. Establish data sources and data owners
16. Creation/Selection of tools to gather data and create Security Scorecard
17. Draft Security Scorecard template
18. Circulate template and request comment based on RACI / $
19. Perform resource planning with functional managers and data owners
20. Establish communication protocol for Security Scorecard audience
21. Pilot first draft of Security Scorecard
22. Document Security Scorecard update procedures
23. Document lessons learned and make appropriate corrections to Security Scorecard
24. Finalize Security Scorecard layout and content
25. Establish Feedback mechanisms
26. Formally announce Security Scorecard
27. Close project

E--- The final "IT" security metrics approach should account for several perspectives:
(1) Supporting the vision / mission / objectives as implemented in each major business unit,
(2) A set of effective overall programmatic performance metrics that most organizations should employ,
and
(3) User / customer support metrics (e.g., satisfaction, productivity issues and value / RoI).
Each "IT" security effort has its own metrics to support its key efforts in support of the overall "IT" security
objectives. These metrics must be well integrated and complementary among themselves, and link to
the overall organizational metrics where applicable.

All security metrics must support key organizational goals and objectives, as listed earlier; we thus work
backwards to quantify the specific data that can be easily captured at some level and then used to
support decision making, resource allocation, etc. For "COMPANY" we propose the following top-down
measurements, which must also support a balanced scorecard (BSC) view. We start with three major
areas to report on: (1) key residual risks and impacts, (2) business value supported and (3) the overall
security state. These can then be further defined with suggested supporting data elements.
(1). Top technical security risks, which must then be put into a business impact view: (a) weak and
inconsistently enforced IdAM (CIS #5, 9, 14, 15 & 16), (b) minimal ITAM / CMDB process and ad hoc
release management (CIS #1, 2, 3 & 11), (c) intruders & malware (CIS #6, 12, 19), (d) unclear privacy
and data protection methods (CIS #10 & 13), and (e) weakly correlated security products (CIS
#20) (minimizing the ability to quickly spot an intruder). These must all be quantified in terms of business
impacts.
(2). Business value as measured against the company’s key success factors. (… starting with the company
objectives listed earlier, then list / map them…). Translate the key security priorities into supporting
those factors, as well as enhanced productivity overall and added competitive advantage.
(3) Overall state of security. This high level cyber metric can be overly complex if not confined to
‘security data that matters’ (e.g., the voluminous IT/security logs kept) and founded on ‘outcome’ based
metrics. Areas to include at some level are: Cyber vulnerabilities, threat report, Security projects status,
and process compliance (and maturity), again as mapped into a BSC view showing risk value and
competitive advantage.

Proposed ‘operational’ security metrics


(Still a master wish list; needs iteration and a phase-in of key value measures)
Metric | Process / map to scorecard | Comments
Trouble tickets | |
Security tool #1 statistics | |
Security tool #2 statistics | |
Security tool #3 statistics | |
Firewall statistics | |
Time to detect | IoCs ???? | Speed to halt attack / damages
Number of compromised systems | Botnet / malware infections per device | Help Desk collect? How prevalent malware is
Access controls / attempts | Snapshot of IdAM entry points | How often employee accounts are updated / re-verified
User password status | Assess passwords monthly | Relative state of PWs; time from termination to account deletion
Admin account status | Assess admin PWs weekly | Relative state of PWs (% / change)
Third party / suppliers | % reviewed, how often | Verifies controls
% of outbound DNS monitored | |
Peer-to-peer sharing | |
Number of open ports | During some time frame |
Incident volume | Potential attacks / week? | Total "B" events per month?
Blocked emails | |
Blocked internet access | |
Antivirus activity | AV statistics? | Devices reimaged that month
Patch management efficiency | | Vulnerabilities listed versus still open
# patched systems | |
Mean time to patch | |
Systems with NO highs | |
ALL known vulnerabilities | |
2nd phase | ---- | ----
Baseline defense coverage metric | | Devices that have some sort of defense
Data in / data out | | How much data is moving in and out of the network, and through which ports
Physical access | | Failures, IoT, CCTV, etc.
Industry benchmarks | | What others measure / report
Cyber maturity | | Deloitte follow-on
Risk assessments | | Program statistics
Audit status | |
Training / awareness | | Phishing, SME courses, new methods
