A Taxonomy of Operational Cyber Security Risks

James J. Cebula Lisa R. Young

December 2010 TECHNICAL NOTE CMU/SEI-2010-TN-028 CERT Program
Unlimited distribution subject to the copyright.

http://www.sei.cmu.edu

This report was prepared for the SEI Administrative Agent, ESC/XPK, 5 Eglin Street, Hanscom AFB, MA 01731-2100.

The ideas and findings in this report should not be construed as an official DoD position. It is published in the interest of scientific and technical information exchange. This work is sponsored by the U.S. Department of Defense. The Software Engineering Institute is a federally funded research and development center sponsored by the U.S. Department of Defense.

Copyright 2010 Carnegie Mellon University.

NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.

Internal use. Permission to reproduce this document and to prepare derivative works from this document for internal use is granted, provided the copyright and "No Warranty" statements are included with all reproductions and derivative works.

External use. This document may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other external and/or commercial use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu.

This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.
The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013. For information about SEI publications, please visit the library on the SEI website (http://www.sei.cmu.edu/library).

Table of Contents

Acknowledgements
Abstract
Introduction
Taxonomy of Operational Cyber Security Risks
    Class 1 Actions of People
        Subclass 1.1 Inadvertent
        Subclass 1.2 Deliberate
        Subclass 1.3 Inaction
    Class 2 Systems and Technology Failures
        Subclass 2.1 Hardware
        Subclass 2.2 Software
        Subclass 2.3 Systems
    Class 3 Failed Internal Processes
        Subclass 3.1 Process Design or Execution
        Subclass 3.2 Process Controls
        Subclass 3.3 Supporting Processes
    Class 4 External Events
        Subclass 4.1 Hazards
        Subclass 4.2 Legal Issues
        Subclass 4.3 Business Issues
        Subclass 4.4 Service Dependencies
Harmonization with Other Risk Practices
    FISMA
    NIST Special Publications
    SEI OCTAVE Threat Profiles
Conclusion
Appendix A: Mapping of NIST SP 800-53 Rev. 3 Controls to Selected Taxonomy Subclasses and Elements
Appendix B: Mapping of Selected Taxonomy Subclasses and Elements to NIST SP 800-53 Rev. 3 Controls
References


List of Figures

Figure 1: Relationships Among Assets, Business Processes, and Services [Caralli 2010a]
Figure 2: Protection, Sustainability, and Risk [Caralli 2010a]
Figure 3: OCTAVE Generic Threat Profile for Human Actors Using Network Access [Alberts 2001b]
Figure 4: OCTAVE Generic Threat Profile for Human Actors Using Physical Access [Alberts 2001b]
Figure 5: OCTAVE Generic Threat Profile for System Problems [Alberts 2001b]
Figure 6: OCTAVE Generic Threat Profile for Other Problems [Alberts 2001b]


List of Tables

Table 1: Taxonomy of Operational Risk
Table 2: Mapping of NIST Control Families to Selected Taxonomy Subclasses and Elements
Table 3: Mapping of Taxonomy Subclasses and Elements to NIST Controls


Acknowledgements

We wish to acknowledge David Mundie, James Stevens, and Richard Caralli of the Software Engineering Institute's CERT® Program for their thorough review and input to this document.


Abstract

This report presents a taxonomy of operational cyber security risks that attempts to identify and organize the sources of operational cyber security risk into four classes: (1) actions of people, (2) systems and technology failures, (3) failed internal processes, and (4) external events. Each class is broken down into subclasses, which are described by their elements. This report discusses the harmonization of the taxonomy with other risk and security activities, particularly those described by the Federal Information Security Management Act (FISMA), the National Institute of Standards and Technology (NIST) Special Publications, and the CERT Operationally Critical Threat, Asset, and Vulnerability EvaluationSM (OCTAVE®) method.


Introduction

Organizations of all sizes in both the public and private sectors are increasingly reliant on information and technology assets, supported by people and facility assets, to successfully execute business processes that, in turn, support the delivery of services. Failure of these assets has a direct, negative impact on the business processes they support. This, in turn, can cascade into an inability to deliver services, which ultimately impacts the organizational mission. Given these relationships, the management of risks to these assets is a key factor in positioning the organization for success.

Operational risks are defined as those arising due to the actions of people, systems and technology failures, failed internal processes, and external events. Within the cyber security space, the risk management focus is primarily on operational risks to information and technology assets. People and facility assets are also considered to the extent that they support information and technology assets. Operational cyber security risks are defined as operational risks to information and technology assets that have consequences affecting the confidentiality, availability, or integrity of information or information systems.

The CERT® Program, part of Carnegie Mellon University's Software Engineering Institute (SEI), developed these four classes of operational risk in the CERT Resilience Management Model [Caralli 2010b], which draws upon the definition of operational risk adopted by the banking sector in the Basel II framework [BIS 2006].

This report presents a taxonomy of operational cyber security risks that attempts to identify and organize the sources of operational cyber security risk into four classes: (1) actions of people, (2) systems and technology failures, (3) failed internal processes, and (4) external events. Each class is broken down into subclasses, which are described by their elements. This taxonomy can be used as a tool to assist in the identification of all applicable operational cyber security risks in an organization. Toward that end, this report also discusses the harmonization of the taxonomy with other risk identification and analysis activities such as those described by the Federal Information Security Management Act of 2002 [FISMA 2002], security guidance contained within the National Institute of Standards and Technology (NIST) Special Publications series, and the threat profile concept contained within the CERT Operationally Critical Threat, Asset, and Vulnerability EvaluationSM (OCTAVE®) method.

® CERT and OCTAVE are registered marks owned by Carnegie Mellon University.
SM Operationally Critical Threat, Asset, and Vulnerability Evaluation is a service mark of Carnegie Mellon University.

Taxonomy of Operational Cyber Security Risks

The taxonomy of operational cyber security risks, summarized in Table 1 and detailed in this section, is structured around a hierarchy of classes, subclasses, and elements. The taxonomy has four main classes:

• actions of people—action, or lack of action, taken by people either deliberately or accidentally that impact cyber security
• systems and technology failures—failure of hardware, software, and information systems
• failed internal processes—problems in the internal business processes that impact the ability to implement, manage, and sustain cyber security, such as process design, execution, and control
• external events—issues often outside the control of the organization, such as disasters, legal issues, business issues, and service provider dependencies

Each of these four classes is further decomposed into subclasses, and each subclass is described by its elements. The structure of this taxonomy is derived from risk taxonomies previously developed by the SEI in the engineered systems operations [Gallagher 2005] and high-performance computing software development [Kendall 2007] subject areas. Additionally, this taxonomy complements the Department of Homeland Security (DHS) Risk Lexicon [DHS 2008] by describing instances of operational cyber security risks in greater detail. These risks are a small subset of the universe of risks of concern to DHS and covered by its lexicon.

It is important to note that risks can cascade: risks in one class can trigger risks in another class. For example, a software failure due to improper security settings could be caused by any of the elements of inadvertent or deliberate actions of people. In this case, the analysis of a particular risk may involve several elements from different classes.
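Because the analysis of one risk may involve elements from several classes, a risk-register entry can carry multiple taxonomy tags. The sketch below is a minimal, hypothetical illustration (the `RiskEntry` type and its field names are invented for this example; the report prescribes no particular record format). It encodes the cascade example from the text: a software failure traced to improper security settings (element 2.2.4) that was itself caused by an inadvertent mistake (element 1.1.1).

```python
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    """One risk-register entry tagged with taxonomy element numbers."""
    description: str
    elements: list = field(default_factory=list)  # dotted ids from Table 1

    def classes(self):
        """Top-level taxonomy classes touched by this risk (cascade view)."""
        return sorted({e.split(".")[0] for e in self.elements})

# Cascade example from the text: improper security settings (2.2.4)
# caused by an inadvertent mistake (1.1.1).
risk = RiskEntry(
    description="Outage after a security setting was misconfigured",
    elements=["2.2.4", "1.1.1"],
)
print(risk.classes())  # ['1', '2'], i.e., the risk spans two classes
```

Grouping tags by top-level class makes it easy to see when a single incident record spans more than one class of operational risk.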

Table 1: Taxonomy of Operational Risk

1. Actions of People
    1.1 Inadvertent
        1.1.1 Mistakes
        1.1.2 Errors
        1.1.3 Omissions
    1.2 Deliberate
        1.2.1 Fraud
        1.2.2 Sabotage
        1.2.3 Theft
        1.2.4 Vandalism
    1.3 Inaction
        1.3.1 Skills
        1.3.2 Knowledge
        1.3.3 Guidance
        1.3.4 Availability
2. Systems and Technology Failures
    2.1 Hardware
        2.1.1 Capacity
        2.1.2 Performance
        2.1.3 Maintenance
        2.1.4 Obsolescence
    2.2 Software
        2.2.1 Compatibility
        2.2.2 Configuration management
        2.2.3 Change control
        2.2.4 Security settings
        2.2.5 Coding practices
        2.2.6 Testing
    2.3 Systems
        2.3.1 Design
        2.3.2 Specifications
        2.3.3 Integration
        2.3.4 Complexity
3. Failed Internal Processes
    3.1 Process design or execution
        3.1.1 Process flow
        3.1.2 Process documentation
        3.1.3 Roles and responsibilities
        3.1.4 Notifications and alerts
        3.1.5 Information flow
        3.1.6 Escalation of issues
        3.1.7 Service level agreements
        3.1.8 Task hand-off
    3.2 Process controls
        3.2.1 Status monitoring
        3.2.2 Metrics
        3.2.3 Periodic review
        3.2.4 Process ownership
    3.3 Supporting processes
        3.3.1 Staffing
        3.3.2 Funding
        3.3.3 Training and development
        3.3.4 Procurement
4. External Events
    4.1 Disasters
        4.1.1 Weather event
        4.1.2 Fire
        4.1.3 Flood
        4.1.4 Earthquake
        4.1.5 Unrest
        4.1.6 Pandemic
    4.2 Legal issues
        4.2.1 Regulatory compliance
        4.2.2 Legislation
        4.2.3 Litigation
    4.3 Business issues
        4.3.1 Supplier failure
        4.3.2 Market conditions
        4.3.3 Economic conditions
    4.4 Service dependencies
        4.4.1 Utilities
        4.4.2 Emergency services
        4.4.3 Fuel
        4.4.4 Transportation

Class 1 Actions of People

Actions of people describes a class of operational risk characterized by problems caused by the action taken or not taken by individuals in a given situation. This class covers actions by both insiders and outsiders. Its supporting subclasses include inadvertent actions (generally by insiders), deliberate actions (by insiders or outsiders), and inaction (generally by insiders).

Subclass 1.1 Inadvertent

The inadvertent subclass refers to unintentional actions taken without malicious or harmful intent. Inadvertent actions are usually, though not exclusively, associated with an individual internal to the organization. This subclass is composed of the elements mistakes, errors, and omissions.

1.1.1 mistake—individual with knowledge of the correct procedure accidentally taking incorrect action
1.1.2 error—individual without knowledge of the correct procedure taking incorrect action
1.1.3 omission—individual not taking a known correct action often due to hasty performance of a procedure

Subclass 1.2 Deliberate

The deliberate subclass of actions of people describes actions taken intentionally and with intent to do harm. Deliberate actions could be carried out by either insiders or outsiders. This subclass is described by the elements fraud, sabotage, theft, and vandalism.

1.2.1 fraud—a deliberate action taken to benefit oneself or a collaborator at the expense of the organization
1.2.2 sabotage—a deliberate action taken to cause a failure in an organizational asset or process, generally carried out against targeted key assets by someone possessing or with access to inside knowledge
1.2.3 theft—the intentional, unauthorized taking of organizational assets, in particular information assets
1.2.4 vandalism—the deliberate damaging of organizational assets, often at random

Subclass 1.3 Inaction

The inaction subclass describes a lack of action or failure to act upon a given situation. Elements of inaction include a failure to act because of a lack of appropriate skills, a lack of knowledge, a lack of guidance, and a lack of availability of the correct person to take action.

1.3.1 skills—an individual's lack of ability to undertake the necessary action
1.3.2 knowledge—an individual's ignorance of the need to take action
1.3.3 guidance—a knowledgeable individual lacking the proper guidance or direction to act
1.3.4 availability—the unavailability or nonexistence of the appropriate resource needed to carry out the action

Class 2 Systems and Technology Failures

Systems and technology failures describes a class of operational risk characterized by problematic abnormal or unexpected functioning of technology assets. Its supporting subclasses include failures of hardware, software, and integrated systems.

Subclass 2.1 Hardware

The hardware subclass addresses risks traceable to failures in physical equipment due to capacity, performance, maintenance, and obsolescence.

2.1.1 capacity—inability to handle a given load or volume of information
2.1.2 performance—inability to complete instructions or process information within acceptable parameters (speed, power consumption, heat load, etc.)
2.1.3 maintenance—failure to perform required or recommended upkeep of the equipment
2.1.4 obsolescence—operation of the equipment beyond its supported service life

Subclass 2.2 Software

The software subclass addresses risks stemming from software assets of all types, including programs, applications, and operating systems. The elements of software failures are compatibility, configuration management, change control, security settings, coding practices, and testing.

2.2.1 compatibility—inability of two or more pieces of software to work together as expected
2.2.2 configuration management—improper application and management of the appropriate settings and parameters for the intended use
2.2.3 change control—changes made to the application or its configuration by a process lacking appropriate authorization, review, and rigor
2.2.4 security settings—improper application of security settings, either too relaxed or too restrictive, within the program or application
2.2.5 coding practices—failures due to programming errors, including syntax and logic problems and failure to follow secure coding practices
2.2.6 testing—inadequate or atypical testing of the software application or configuration

Subclass 2.3 Systems

The systems subclass deals with failures of integrated systems to perform as expected. Systems failures are described by the elements design, specifications, integration, and complexity.

2.3.1 design—improper fitness of the system for the intended application or use
2.3.2 specifications—improper or inadequate definition of requirements or failure to adhere to the requirements during system construction
2.3.3 integration—failure of various components of the system to function together or interface correctly; also includes inadequate testing of the system
2.3.4 complexity—system intricacy or a large number of interrelationships between components

Class 3 Failed Internal Processes

Failed internal processes describes a class of operational risk associated with problematic failures of internal processes to perform as needed or expected. Its supporting subclasses include process design or execution, process controls, and supporting processes.

Subclass 3.1 Process Design or Execution

The process design or execution subclass deals with failures of processes to achieve their desired outcomes due to process design that is improper for the task or due to poor execution of a properly designed process. The elements of process design or execution are process flow, process documentation, roles and responsibilities, notifications and alerts, information flow, escalation of issues, service level agreements, and task hand-off.

3.1.1 process flow—poor design of the movement of process outputs to their intended consumers
3.1.2 process documentation—inadequate documentation of the process inputs, outputs, flow, and stakeholders

3.1.3 roles and responsibilities—insufficient definition and understanding of process stakeholder roles and responsibilities
3.1.4 notifications and alerts—inadequate notification regarding a potential process problem or issue
3.1.5 information flow—poor design of the movement of process information to interested parties and stakeholders
3.1.6 escalation of issues—the inadequate or nonexistent ability to escalate abnormal or unexpected conditions for action by appropriate personnel
3.1.7 service level agreements—the lack of agreement among process stakeholders on service expectations that causes a failure to complete expected actions
3.1.8 task hand-off—"dropping the ball" due to the inefficient handing off of a task in progress from one responsible party to another

Subclass 3.2 Process Controls

The process controls subclass addresses process failures due to inadequate controls on the operation of the process. The elements of this subclass are status monitoring, metrics, periodic review, and process ownership.

3.2.1 status monitoring—failure to review and respond to routine information about the operation of a process
3.2.2 metrics—failure to review process measurements over time for the purpose of determining performance trends
3.2.3 periodic review—failure to review the end-to-end operation of the process on a periodic basis and make any needed changes
3.2.4 process ownership—failure of a process to deliver the expected outcome because of poor definition of its ownership or poor governance practices

Subclass 3.3 Supporting Processes

The supporting processes subclass deals with operational risks introduced due to failure of organizational supporting processes to deliver the appropriate resources. The supporting processes of concern are the elements staffing, funding, training and development, and procurement.

3.3.1 staffing—failure to provide appropriate human resources to support its operations
3.3.2 funding—failure to provide appropriate financial resources to support its operations
3.3.3 training and development—failure to maintain the appropriate skills within the workforce
3.3.4 procurement—failure to provide the proper purchased service and goods necessary to support operations

Class 4 External Events

External events describes a class of operational risk associated with events generally outside the organization's control. Often the timing or occurrence of such events cannot be planned or predicted. The supporting subclasses of this class include disasters, legal issues, business issues, and service dependencies.

Subclass 4.1 Hazards

The hazards subclass deals with risks owing to events, both natural and of human origin, over which the organization has no control and that can occur without notice. The elements supporting this subclass include weather event, fire, flood, earthquake, unrest, and pandemic.

4.1.1 weather event—adverse weather situations such as rain, snow, tornado, or hurricane
4.1.2 fire—fire within a facility or disruption caused by a fire external to a facility
4.1.3 flood—flooding within a facility or disruption caused by a flood external to a facility
4.1.4 earthquake—disruption of organizational operations due to an earthquake
4.1.5 unrest—disruption of operations due to civil disorder, riot, or terrorist acts
4.1.6 pandemic—widespread medical conditions that disrupt organizational operations

Subclass 4.2 Legal Issues

The legal issues subclass deals with risks potentially impacting the organization due to the elements regulatory compliance, legislation, and litigation.

4.2.1 regulatory compliance—new governmental regulation or failure to comply with existing regulation
4.2.2 legislation—new legislation that impacts the organization
4.2.3 litigation—legal action taken against the organization by any stakeholder, including employees and customers

Subclass 4.3 Business Issues

The business issues subclass deals with operational risks arising from changes in the business environment of the organization, described by the elements of supplier failure, market conditions, and economic conditions.

4.3.1 supplier failure—the temporary or permanent inability of a supplier to deliver needed products or services to the organization
4.3.2 market conditions—the diminished ability of the organization to sell its products and services in the market
4.3.3 economic conditions—the inability of the organization to obtain needed funding for its operations

Subclass 4.4 Service Dependencies

The service dependencies subclass deals with risks arising from the organization's dependence on external parties to continue operations. The subclass is associated with the elements of utilities, emergency services, fuel, and transportation.

4.4.1 utilities—failure of the organization's electric power supply, water supply, or telecommunications services
4.4.2 emergency services—dependencies on public response services such as fire, police, and emergency medical services
4.4.3 fuel—failure of external fuel supplies, for example to power a backup generator

4. inability of employees to report to work and inability to make and receive deliveries CMU/SEI-2010-TN-028 | 8 .4 transportation—failures in external transportation systems.4. for example.

Harmonization with Other Risk Practices

The taxonomy can be used as a tool to help identify all applicable operational cyber security risks in an organization. To provide context and prioritize and manage these risks in a structured manner, a basic understanding of the relationships among assets, business processes, and services needs to be established. Assets are the basic units of value in the organization. There are four primary types of assets: people, information, technology, and facilities. In the cyber security arena, the primary focus is on operational risks to information and technology assets, although people and facility assets are also considered. Assets are the building blocks of business processes. Business processes are the activities that support the organization's delivery of services. The relationships among assets, business processes, and services are shown in Figure 1.

Figure 1: Relationships Among Assets, Business Processes, and Services [Caralli 2010a]

Failure of these assets can have a direct, negative impact on the business processes that they support. This, in turn, cascades into an inability to deliver services and ultimately impacts the mission of the organization. The taxonomy can assist in identifying operational risks in all four classes (actions of people, systems and technology failures, failed internal processes, and external events) to each of the four asset types. As part of a risk management strategy, protective and sustaining controls are applied to assets, as shown in Figure 2. Risk management involves a balance between risk conditions (such as threats and vulnerabilities) and risk consequences.
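The asset-to-process-to-service relationships in Figure 1 imply that an asset failure can be traced forward to the services it ultimately supports. A minimal sketch of that traversal, using invented asset, process, and service names purely for illustration:

```python
# Which business processes rely on which assets, and which services
# rely on which processes (hypothetical example data).
PROCESS_ASSETS = {
    "order handling": ["customer database", "web servers", "operations staff"],
    "billing": ["customer database", "billing software"],
}
SERVICE_PROCESSES = {
    "online store": ["order handling", "billing"],
    "invoicing": ["billing"],
}

def impacted_services(failed_asset: str):
    """Trace an asset failure through processes up to the services it supports."""
    hit = {p for p, assets in PROCESS_ASSETS.items() if failed_asset in assets}
    return sorted(s for s, procs in SERVICE_PROCESSES.items()
                  if hit.intersection(procs))

print(impacted_services("billing software"))  # ['invoicing', 'online store']
```

Walking the dependency chain in this way makes the cascade described in the text concrete: one failed asset can surface as degradation in several services.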

Figure 2: Protection, Sustainability, and Risk [Caralli 2010a]

Protective controls are intended to help manage risk conditions, while sustaining controls are intended to help manage risk consequences. In both cases, controls are applied at the asset level.

FISMA

The taxonomy provides a structured set of terms that covers all of the significant risk elements that could impact cyber security operations. The Federal Information Security Management Act of 2002 (FISMA), which applies to U.S. federal government agencies, provides a working definition of information security. The FISMA definition of information security reads as follows:

The term "information security" means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide—(A) integrity, which means guarding against improper information modification or destruction, and includes ensuring information nonrepudiation and authenticity; (B) confidentiality, which means preserving authorized restrictions on access and disclosure, including means for protecting personal privacy and proprietary information; and (C) availability, which means ensuring timely and reliable access to and use of information.

This definition links the identified operational cyber security risks to specific examples of consequences impacting confidentiality, integrity, and availability. This is an important building block in the control selection and risk mitigation process.

NIST Special Publications

In addition to providing the definition of information security described above, the FISMA legislation also tasked the National Institute of Standards and Technology (NIST) with developing information security guidelines for use by federal agencies. These guidelines are known as the NIST Special Publications (SP).

Of particular interest is NIST SP 800-53 rev. 3 [NIST 2009], which provides a control catalog to be applied to federal information systems based on an analysis of the system's relative importance and consequence of loss. The controls specified in NIST SP 800-53 are primarily protective in nature and are applied tactically at the information-system level. In general, the controls specified in NIST SP 800-53 are at a lower level than the elements in the taxonomy. The taxonomy can be used as a tool to link the application of these controls into a broader risk management strategy.

A mapping of the control catalog in NIST SP 800-53 rev. 3 to the risk subclasses identified in the taxonomy is provided in Appendix A. This appendix can be used to match NIST control families to types of operational cyber security risk. Appendix B provides the reverse: a mapping of taxonomy subclasses to the NIST SP 800-53 rev. 3 control catalog. Appendix B can be used to determine which NIST controls to consider in order to mitigate specific operational cyber security risks.

SEI OCTAVE Threat Profiles

The OCTAVE method, developed by the SEI [Alberts 2001a], provides a process for an organization to perform a comprehensive security risk evaluation. Phase 1 of the OCTAVE method uses the concept of asset-based threat profiles [Alberts 2001b]. While it is not the intent of this report to provide a detailed discussion of OCTAVE, the threat profiles are introduced here as a useful, graphical vehicle to link assets to risks and consequences, in line with the definition of operational security risks. The taxonomy and the techniques described in OCTAVE can serve as cross-checks to each other to ensure coverage of all classes of operational cyber security risk.

OCTAVE uses four standard threat categories: (1) human actors using network access, (2) human actors using physical access, (3) system problems, and (4) other problems. These generic categories can easily be extended or tailored to suit the particular need. In general, the threat categories from OCTAVE align with the classes in the risk taxonomy as follows:

• humans with network access – actions of people class
• humans with physical access – actions of people class
• system problems – systems and technology failures class
• other problems – failed internal processes and external events classes

The threat profiles are represented graphically in a tree structure. Figure 3 through Figure 6 below illustrate the OCTAVE generic threat profiles for the four threat categories.

Figure 3: OCTAVE Generic Threat Profile for Human Actors Using Network Access [Alberts 2001b]

Figure 4: OCTAVE Generic Threat Profile for Human Actors Using Physical Access [Alberts 2001b]

Figure 5: OCTAVE Generic Threat Profile for System Problems [Alberts 2001b]

Figure 6: OCTAVE Generic Threat Profile for Other Problems [Alberts 2001b]
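As described in the NIST Special Publications discussion above, Appendices A and B are inverse views of a single control-to-taxonomy mapping, so a tool need only maintain one direction and can derive the other. A sketch with two illustrative entries (the control numbers AT-2 and CM-6 are real SP 800-53 rev. 3 controls, but the taxonomy assignments shown here are invented examples; the authoritative assignments are in Appendix A):

```python
# Forward mapping (Appendix A direction): control -> taxonomy items.
# The two entries below are illustrative only.
CONTROL_TO_TAXONOMY = {
    "AT-2": ["1.1", "1.3.2"],    # Security Awareness
    "CM-6": ["2.2.2", "2.2.4"],  # Configuration Settings
}

def taxonomy_to_controls(mapping):
    """Invert the mapping, Appendix B style: taxonomy item -> controls."""
    inverse = {}
    for control, items in mapping.items():
        for item in items:
            inverse.setdefault(item, []).append(control)
    return inverse

print(taxonomy_to_controls(CONTROL_TO_TAXONOMY)["2.2.4"])  # ['CM-6']
```

Deriving the reverse view automatically keeps the two appendices consistent when the forward mapping is revised.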

Conclusion

This report presents a taxonomy of operational cyber security risks and also discusses the relationship of the taxonomy to other risk and security activities. The taxonomy organizes the definition of operational risk into the following four classes: (1) actions of people, (2) systems and technology failures, (3) failed internal processes, and (4) external events. Each of these four classes is further decomposed into subclasses and elements. Operational cyber security risks are defined as operational risks to information and technology assets that have consequences affecting the confidentiality, availability, and integrity of information and information systems. The relationship of operational risks to consequences is discussed in the context of FISMA, the NIST Special Publications, and the OCTAVE method.

The present taxonomy is being validated through fieldwork with organizations under varying levels of regulatory compliance obligation and risk tolerance. The results of this fieldwork will inform taxonomy revisions. We anticipate that revisions to the taxonomy will be necessary to account for changes in the cyber risk landscape.

Appendix A: Mapping of NIST SP 800-53 Rev. 3 Controls to Selected Taxonomy Subclasses and Elements

Table 2 can be used to match NIST control families to types of operational cyber security risk. References to taxonomy subclasses and elements refer to the numbering scheme shown in Table 1 and the body of this report. For example, item 1.1 refers to the subclass inadvertent actions of people (all elements apply), and item 2.2.2 refers to the configuration management element of systems and technology failures – software.

Table 2: Mapping of NIST Control Families to Selected Taxonomy Subclasses and Elements

NIST SP 800-53 Rev. 3 Control Number | Control Description | Taxonomy Subclasses and Elements
AC-1 Access Control Policy and Procedures
AC-2 Account Management
AC-3 Access Enforcement
AC-4 Information Flow Enforcement
AC-5 Separation of Duties
AC-6 Least Privilege
AC-7 Unsuccessful Login Attempts
AC-8 System Use Notification
AC-9 Previous Logon (Access) Notification
AC-10 Concurrent Session Control
AC-11 Session Lock
AC-12 Withdrawn (N/A)
AC-13 Withdrawn (N/A)
AC-14 Permitted Actions Without Identification or Authentication
AC-15 Withdrawn (N/A)
AC-16 Security Attributes
AC-17 Remote Access

3 1.2. 3.3. 3.2 1. 1.2. and Reporting Audit Reduction and Report Generation Time Stamps Protection of Audit Information Non-Repudiation Audit Record Retention Audit Generation Control Description Taxonomy Subclasses and Elements 1.3. 3. 3. 2.2.2.3.2.3. 3.1.3 1. 1.2 1. 3.2 2.2.1. 4.1 2.4.3 1. 3.1.1.2.2. 1.2 1.2. 3.1 CMU/SEI-2010-TN-028 | 18 .3.3. 2.3 1.1. 4. 3.1.1. 1.1. 4. 3.2 3. 4.3. 3. 3. 2.2. 4.2.3.1 1.3 3.1. 3. 2. 3.1.2 2.2.1. 3. 2. 1. 1. 3.1. 3.2.3 1.3. 3. 4.3. 3.3.2.1.2.3.2.2 1.3.3. 2. 1. 1. 3.1. 3.1.2.2. 2.NIST SP 800-53 Rev. 3.1. 2. 3. 3.3. 1. 3.2 1.3. 1. 3 Control Number AC-18 AC-19 AC-20 AC-21 Wireless Access Access Control for Mobile Devices Use of External Information Systems User-Based Collaboration and Information Sharing AC-22 AT-1 Publicly Accessible Content Security Awareness and Training Policy and Procedures AT-2 AT-3 AT-4 AT-5 Security Awareness Security Training Security Training Records Contacts with Security Groups and Associations AU-1 Audit and Accountability Policy and Procedures AU-2 AU-3 AU-4 AU-5 AU-6 AU-7 AU-8 AU-9 AU-10 AU-11 AU-12 Auditable Events Content of Audit Records Audit Storage Capacity Response to Audit Processing Failures Audit Review.1 1.3. 1.2 3.1. Analysis.1.2.3.2.2.2. 3. 4.3 1. 2.2 1.3.1.1. 3.2. 2.1.2 2.1. 3. 3.1.1.1.1.2. 2.2.2.1. 1.2.1.3.1. 1.1.

2.2. 3.2.1.2 2.1. 1.2.4.2. 4.1.1. 2. 2.3 3.3. 4.2.5.3 1.1. 2.3.2. 2.3.2.3. 4.4 N/A CMU/SEI-2010-TN-028 | 19 .1.2. 2. 4.4.2.1.1.2 1.1 2.2. 3. 3.2.1.2.2.2. 2. 3.1. 3.2.2.2.3.1.1. 3.2. 3.2.3. 3. 4. 4.NIST SP 800-53 Rev. 2.2.2.4. 2. 4.2.2. 1.3.2.2.1 1.1. 2. 3. 3.2 1.3.1.2 2. 3.2. 2.2. 2. 2. 3. 2.3.2. 4. 3.1.1.1. 4. 3. 1.5.1. 4.2. 3.2.1. 2.4 1. 2. 3. 3.3. 3.2 2. 4.2 2.3. 3. 1.2. 3.2.1 2. 4. 3.2.2 2.3.4. 3.3.2.2. 2.3. 4.2. 3. 2.2.2.2.1.1 N/A 1.2.3. 3.4 1. 2. 1. 2. 3. 3 Control Number AU-13 AU-14 CA-1 Monitoring for Information Disclosure Session Audit Security Assessment and Authorization Policies and Procedures CA-2 CA-3 CA-4 CA-5 CA-6 CA-7 CM-1 Security Assessments Information System Connections Withdrawn Plan of Action and Milestones Security Authorization Continuous Monitoring Configuration Management Policy and Procedures CM-2 CM-3 CM-4 CM-5 CM-6 CM-7 CM-8 CM-9 CP-1 Baseline Configuration Configuration Change Control Security Impact Analysis Access Restrictions for Change Configuration Settings Least Functionality Information System Component Inventory Configuration Management Plan Contingency Planning Policy and Procedures CP-2 CP-3 CP-4 CP-5 Contingency Plan Contingency Training Contingency Plan Testing and Exercises Withdrawn Control Description Taxonomy Subclasses and Elements 3. 3.1.3.3.2.3.1.3.4 1.4.3.2. 3.2.2 2. 2. 1. 3. 4.3 3. 3.3.4.2 2. 2.2.2 1. 2.

1.1. 1.3. 3. 2.2 1. 1. 1.3.3. 2.2.2.1. 1. 1. 4.2.2.1. 3.2. 2.1. 3. 4. 1. 3. 2. 1.3. 2.2 1.NIST SP 800-53 Rev. 1. 3. 3.2 2. 4.3 1.1.1.1.3.3. 3. 1. 2.1 1.2.1.1.1. 3. 4.2. 3.3.3 1.2.1.2.3.1.2 1.1. 3. 3.1.2. 1.4 1. 4.3. 2.2.2 1.3.2. 3.3.1 3. 2. 3.2.2. 4.3 2.2. 3.1.1.3.2.3 1. 1. 2.1. 1. 2.2.3.3. 2. 2.1. 1.1. 2.2. 2. 3.3. 4.2.1.1.2.3. 1. 4. 4.4 2.1. 4.2. 3. 1.2 1.1.3.1.1. 3.2 1.1. 3. 2. 4.3.2.1. 2. 2. 3. 3.1. 1. 3. 3 Control Number CP-6 CP-7 CP-8 CP-9 CP-10 Alternate Storage Site Alternate Processing Site Telecommunications Services Information System Backup Information System Recovery and Reconstitution IA-1 Identification and Authentication Policy and Procedures IA-2 Identification and Authentication (Organizational Users) IA-3 IA-4 IA-5 IA-6 IA-7 IA-8 Device Identification and Authentication Identifier Management Authenticator Management Authenticator Feedback Cryptographic Module Authentication Identification and Authentication (Non-Organizational Users) IR-1 IR-2 IR-3 IR-4 IR-5 IR-6 IR-7 IR-8 MA-1 Incident Response Policy and Procedures Incident Response Training Incident Response Testing and Exercises Incident Handling Incident Monitoring Incident Reporting Incident Response Assistance Incident Response Plan System Maintenance Policy and Procedures Control Description Taxonomy Subclasses and Elements 2.1. 3. 1.2.1.2.3 2.2.2.2 1.3.1. 3.3. 1. 3. 1.1.1.2 1.2 1. 3. 3. 4.2.2 1. 3.1. 3.1.2.2 1. 3. 3. 2.2.1. 1. 2.2. 2.3. 3. 1. 4. 3. 2. 3. 1.2.3.1. 3. 2.2. 3.3.2 CMU/SEI-2010-TN-028 | 20 .1.3.

2. 2.2.1. 2. 3.1.1.1. 3.3. 3.1.2. 1.1. 4.1. 2. 3.1. 4.1.1.2.2.1.2. 3. 3.2. 3.2 1. 1. 3. 3.1.1. 4.2. 3. 3.3.1. 1. 4. 1. 3.2 1.2.1.2. 1.3. 4.2. 3. 1.3.1. 3. 2. 1. 3.1. 3. 3.2 1.1. 2. 1. 3.1.1. 1.3. 2.1. 3. 1.1. 4. 4.1. 1.3.1. 2.2. 3.3. 2. 2.1.1. 3.1.2 1.2. 1. 3.1. 2. 2. 4.2.2.1. 4. 2.1. 3.1.NIST SP 800-53 Rev.4 1.1.1. 3.1.2.3. 3.3 1.1.1. 2. 3.1.1.2.2. 2.1. 2.4 1. 3.1. 2. 3.3.1.2.1.1. 2. 2. 2. 3.3. 1. 1.1. 4. 3.1.1.2. 1. 3. 3.1. 3. 3 Control Number MA-2 MA-3 MA-4 MA-5 MA-6 MP-1 MP-2 MP-3 MP-4 MP-5 MP-6 PE-1 Controlled Maintenance Maintenance Tools Non-Local Maintenance Maintenance Personnel Timely Maintenance Media Protection Policy and Procedures Media Access Media Marking Media Storage Media Transport Media Sanitization Physical and Environmental Protection Policy and Procedures PE-2 PE-3 PE-4 PE-5 PE-6 PE-7 PE-8 PE-9 PE-10 PE-11 PE-12 PE-13 Physical Access Authorizations Physical Access Control Access Control for Transmission Medium Access Control for Output Devices Monitoring Physical Access Visitor Control Access Records Power Equipment and Power Cabling Emergency Shutoff Emergency Power Emergency Lighting Fire Protection Control Description Taxonomy Subclasses and Elements 1.3. 4.1. 2. 3. 3. 3. 3.2.1. 2. 2. 1.3. 1. 1.2.1. 3.2. 3.2 1.2. 3. 1.2 1. 2.1.1.3 1. 2. 4. 3. 2. 3. 3. 4. 1.1. 2.3 1.2 1.2. 3.1.1.3. 3.1.2 1. 4. 1.1.1. 2.1.1.2. 3.2.3. 3.2.2.1.1.3 1.2 1.1.2.2 1. 3.2 1.2.2. 1. 2.1.2. 2. 3.2.2.1. 3. 4.1.3.1.4 1. 3.2. 2. 2.1.1.2 1.4 CMU/SEI-2010-TN-028 | 21 . 2. 2.3.2.2.2 1.1.2. 3. 1.1.3 1.1. 3.1.2 1.2.1.2. 1.1.2 1.2.1. 2.

1.3.2 1.3.2. 1.1. 1. 1. 3. 1.3.1. 3.1. 3.3 N/A 1. 1.2.2.1. 3.1.3.3.2 1.2.1.1. 3.2 1.1.1.2. 4. 2. 3.2 CMU/SEI-2010-TN-028 | 22 .1. 1.1.2.1. 4.3.1. 3.2. 1.2 3. 4. 1. 2. 4.1.1.3.3.2 1. 4. 3.3. 3. 4. 3.1. 3.3.2 1.3 1.3.2.3.3 1.3. 2. 4.1.4 1.3.4 1. 1.2. 4. 3. 3. 3.1.2.2. 3. 3.1.3.1.1.4. 4.2.3. 2.1.2. 1. 1. 3.3. 4. 1. 4. 1. 3.2. 1. 1.2.NIST SP 800-53 Rev. 3.1. 4.4 1. 3.1.2.2. 4. 3.3. 2. 2. 4.1.2.1. 4.3 N/A 1. 1. 4.2. 1.3.2.3 4.3. 1.2.3.2. 3.1.2 1.2 3.1.2. 3.3 3.2. 4. 3.2 1.2.1.3 3.2.2 3.4 1. 3.2. 4. 3. 1.3.1. 1. 1. 3. 3.3. 1. 3.2. 4. 3 Control Number PE-14 PE-15 PE-16 PE-17 PE-18 PE-19 PL-1 PL-2 PL-3 PL-4 PL-5 PL-6 PS-1 PS-2 PS-3 PS-4 PS-5 PS-6 PS-7 PS-8 RA-1 RA-2 RA-3 RA-4 RA-5 Temperature and Humidity Controls Water Damage Protection Delivery and Removal Alternate Work Site Location of Information System Components Information Leakage Security Planning Policy and Procedures System Security Plan Withdrawn Rules of Behavior Privacy Impact Assessment Security-Related Activity Planning Personnel Security Policy and Procedures Position Categorization Personnel Screening Personnel Termination Personnel Transfer Access Agreements Third-Party Personnel Security Personnel Sanctions Risk Assessment Policy and Procedures Security Categorization Risk Assessment Withdrawn Vulnerability Scanning Control Description Taxonomy Subclasses and Elements 1.1. 3.1.1.1.3. 3. 1.1. 4. 1. 4.3. 4.1. 1.2.2.3.3.1. 1. 2. 3.3.3 3.1.

2.2. 3. 4.2.3 2.2. 3.2.1. 2. 4.1. 3. 2. 4.2. 2.2.3 2. 1.2. 4. 4.3 1.2. 1. 4.2.1.2.3.4 2.3 2. 2.1. 2. 2. 2.3 2. 3.2.3.2. 3.3 1.1.3.2. 2. 2. 2. 4.3 1. 3.1. 4.3. 2.3 1. 4. 3.2.1.3. 4. 2.2. 2.2.3 2. 4.2.2.3. 4. 2.2.3.1.1. 3. 3 Control Number SA-1 System and Services Acquisition Policy and Procedures SA-2 SA-3 SA-4 SA-5 SA-6 SA-7 SA-8 SA-9 SA-10 SA-11 SA-12 SA-13 SA-14 SC-1 Allocation of Resources Life Cycle Support Acquisitions Information System Documentation Software Usage Restrictions User-Installed Software Security Engineering Principles External Information System Services Developer Configuration Management Developer Security Testing Supply Chain Protections Trustworthiness Critical Information System Components System and Communications Protection Policy and Procedures SC-2 SC-3 SC-4 SC-5 SC-6 SC-7 SC-8 SC-9 SC-10 Application Partitioning Security Function Isolation Information in Shared Resources Denial of Service Protection Resource Priority Boundary Protection Transmission Integrity Transmission Confidentiality Network Disconnect Control Description Taxonomy Subclasses and Elements 2. 4.3.3 1. 2.4. 2.3 1. 2. 2. 2. 2.3.2. 4.1.3. 2. 2.3.1.3.1. 4.2.3 2.3.3 2. 2.2.3 CMU/SEI-2010-TN-028 | 23 . 2. 2. 2.1. 3.1. 2. 4. 2.2. 2.1.2. 2. 2. 2.1. 4.2.2.2. 4. 3. 2.3.3 2.2.1.3.3.3 1.3 2.NIST SP 800-53 Rev. 2. 4. 3.3.2. 2.2. 2. 1.4 2.3.2.3 1.3. 2. 1. 2. 1.3.2.1. 1. 4. 4. 1.1. 2.3.3.1.2.1. 3. 4.1. 2.1.3 2. 2.4.2. 2. 1. 4. 2. 2.3.3 2.1. 4.2. 2. 3. 2.3.3.1.1.2.1.2.3.1. 2. 2.3 1. 4.2.1.3. 2. 4.2. 2.3 1. 2.3. 1.1. 3.2. 1.1.1. 2.2.2. 2.2. 2.

2. 2.1. 2.2.2. 2. 2. 1. 2.2. 2.2.1. 2. 1. 2.3 1.2. 2. 2.2. 1. 1. 1.2.NIST SP 800-53 Rev.1. 2.3 1.1.1.3 1. 2.2. 1. 2.2.3 1. 2. 1.2.1. 2.1.1. 1.1. 2.2. 1. 2. 2.2. 1. 1. 2. 2.1.1. 1. 2.3 1. 2.1. 1. 2.2.1.3 1.1. 2.1.1.3 1. 1. 2.2. 1. 1.1. 1. 2.2.2. 2. 2. 1. 2.2. 2.2.3 1.2. 2.2. 2. 2.3 CMU/SEI-2010-TN-028 | 24 . 3 Control Number SC-11 SC-12 Trusted Path Cryptographic Key Establishment and Management SC-13 SC-14 SC-15 SC-16 SC-17 SC-18 SC-19 SC-20 Use of Cryptography Public Access Protections Collaborative Computing Devices Transmission of Security Elements Public Key Infrastructure Certificates Mobile Code Voice Over Internet Protocol Secure Name/Address Resolution Service (Authoritative Source) SC-21 Secure Name/Address Resolution Service (Recursive or Caching Resolver) SC-22 Architecture and Provisioning for Name/Address Resolution Service SC-23 SC-24 SC-25 SC-26 SC-27 SC-28 SC-29 SC-30 SC-31 SC-32 Session Authenticity Fail in Known State Thin Nodes Honeypots Operating System-Independent Applications Protection of Information at Rest Heterogeneity Virtualization Techniques Covert Channel Analysis Information System Partitioning Control Description Taxonomy Subclasses and Elements 1. 2.1.2.2.3 1.1.3 1. 2. 2. 2.3 1. 2. 1. 2. 1. 2.1.2.2.2.3 1.2.1. 2.2.2.2. 2.2.3 1.2.1. 2.2. 2.1.1.1.3 1.1.2. 2.2. 2. 2.3 1. 2. 2. 1.3 1.1.1.2. 2.1. 2.3 1.3 1. 2.2.2. 2. 1.1.1. 2.1.2.3 1.3 1.1. 2.2.2. 2.1. 2. 2.2.3 1. 2.1.1. 2.1. 2.1.1.2. 2.1.2.1. 2. 2.1. 2.2.1.1.

3 3. 3. 3.1.1.3 CMU/SEI-2010-TN-028 | 25 . 3.1.2 1.1. 1.1.2.1. 2.2. 2.1. 3. 2.2.3 3. 3. 2. 3.2.2.3.2. 2. 2.2. 3. 3.1.2 1. and Directives Security Functionality Verification Software and Information Integrity Spam Protection Information Input Restrictions Information Input Validation Error Handling Information Output Handling and Retention Predictable Failure Prevention Information Security Program Plan Senior Information Security Officer Information Security Resources Plan of Action and Milestones Process Information System Inventory Information Security Measures of Performance PM-7 PM-8 PM-9 Enterprise Architecture Critical Infrastructure Plan Risk Management Strategy Control Description Taxonomy Subclasses and Elements 1.2. 3. Advisories.3 3. 4.1. 3.2 1. 3. 2. 1.2.1. 1. 2.1. 2. 3.1.3.NIST SP 800-53 Rev. 3.2 1.2.1.1. 2.2. 1.2.2 1. 2. 1. 2.2. 2.2. 1. 3. 2.2 1.1. 3.3 1.1.1.1. 3.3.2.3.2.2.1. 3. 2.1. 2.1. 3. 2.2.3. 2.3. 3.2.1. 3.3.2.1.1.3. 2.2 1. 3. 2. 3.2. 2. 3 Control Number SC-33 SC-34 SI-1 Transmission Preparation Integrity Non-Modifiable Executable Programs System and Information Integrity Policy and Procedures SI-2 SI-3 SI-4 SI-5 SI-6 SI-7 SI-8 SI-9 SI-10 SI-11 SI-12 SI-13 PM-1 PM-2 PM-3 PM-4 PM-5 PM-6 Flaw Remediation Malicious Code Protection Information System Monitoring Security Alerts. 3.2.2 1. 3. 2. 3.2. 3.1.2. 3.1. 3.1. 1.2 1.2.1.2. 2.2.1.2 3. 1. 1.3.1. 3.1. 3. 3. 3.1.2 1. 3. 2.2.3.2. 1. 1. 3.1. 2. 2.2.1.2. 3. 3. 1. 3. 3.3. 2.1.2 1. 2.2.3 3. 3.2. 1. 3. 3. 3.2. 2.2. 3.2.3 3. 2.1.1. 1.3.3 1.3 3. 3.2.3. 2.2.3 3.2.1.2.3 3.2 1.1. 1. 2.

3.1.3 3.NIST SP 800-53 Rev. 3. 3.3 CMU/SEI-2010-TN-028 | 26 .2. 3 Control Number PM-10 PM-11 Security Authorization Process Mission/Business Process Definition Control Description Taxonomy Subclasses and Elements 3.1. 3.2.
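A Table 2-style mapping can be encoded as a lookup from control identifiers to the taxonomy element numbers they address. The sketch below is hypothetical: the three sample rows are illustrative placeholders in the table's format, not the report's actual mappings.

```python
# Hypothetical sketch of a Table 2-style lookup: NIST SP 800-53 control
# identifiers mapped to taxonomy subclass/element IDs.  The sample rows
# are illustrative placeholders, not the report's actual mappings.
CONTROL_TO_ELEMENTS = {
    "AC-2": {"1.1", "1.2"},    # Account Management (illustrative)
    "CP-2": {"2.1.3", "4.1"},  # Contingency Plan (illustrative)
    "PE-13": {"4.1.1"},        # Fire Protection (illustrative)
}

def controls_for(element_id: str) -> list[str]:
    """Reverse lookup: which controls reference a given taxonomy ID?"""
    return sorted(c for c, ids in CONTROL_TO_ELEMENTS.items()
                  if element_id in ids)

print(controls_for("4.1.1"))
```

Keeping the mapping in one direction (control to elements) and deriving the reverse lookup, as above, avoids maintaining two copies of the same table.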

Appendix B: Mapping of Selected Taxonomy Subclasses and Elements to NIST SP 800-53 Rev. 3 Controls

The following table can be used to determine which NIST controls to consider in order to mitigate a specific operational cyber security risk.

Table 3: Mapping of Taxonomy Subclasses and Elements to NIST Controls

[Table 3: each taxonomy class, subclass, and element, from class 1 (actions of people: inadvertent, deliberate, inaction), through class 2 (systems and technology failures: hardware, software, systems), class 3 (failed internal processes: process design and/or execution, process controls, supporting processes), and class 4 (external events: hazards, legal issues, business issues, service dependencies), with the applicable NIST SP 800-53 Rev. 3 controls listed by family and number (e.g., AC – 2-6, 14, 16-22).]
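Table 3 lists applicable controls in a compact family-plus-range notation (e.g., AC – 2-6, 14, 16-22). A small helper can expand that notation into individual control identifiers; the example string below mirrors the notation's style rather than quoting a specific table row.

```python
# Expand Table 3's range notation (e.g. "2-6, 14, 16-22" for family
# "AC") into a list of individual control IDs.  The input string is an
# illustration of the notation, not a particular row of the table.
def expand(family: str, spec: str) -> list[str]:
    """Expand '2-6, 14' into ['AC-2', ..., 'AC-6', 'AC-14']."""
    ids = []
    for part in spec.split(","):
        part = part.strip()
        if "-" in part:
            lo, hi = (int(x) for x in part.split("-"))
            ids.extend(f"{family}-{n}" for n in range(lo, hi + 1))
        else:
            ids.append(f"{family}-{int(part)}")
    return ids

print(expand("AC", "2-6, 14, 16-22"))
```

Expanding the ranges this way makes the table's entries directly comparable with the per-control mapping in Appendix A.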


References

URLs are valid as of the publication date of this document.

[Alberts 2001a] Alberts, Christopher & Dorofee, Audrey. Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE) Method Implementation Guide, v2.0. Software Engineering Institute, Carnegie Mellon University, 2001. http://www.cert.org/octave/

[Alberts 2001b] Alberts, Christopher & Dorofee, Audrey. OCTAVE Threat Profiles. Software Engineering Institute, Carnegie Mellon University, 2001. http://www.cert.org/archive/pdf/OCTAVEthreatProfiles.pdf

[BIS 2006] Bank for International Settlements (BIS). International Convergence of Capital Measurement and Capital Standards: A Revised Framework, Comprehensive Version. 2006. http://www.bis.org/publ/bcbs128.pdf

[Caralli 2010a] Caralli, Richard A., Allen, Julia H., Curtis, Pamela D., White, David W., & Young, Lisa R. CERT® Resilience Management Model, v1.0 (CMU/SEI-2010-TR-012). Software Engineering Institute, Carnegie Mellon University, 2010. http://www.sei.cmu.edu/library/abstracts/reports/10tr012.cfm

[Caralli 2010b] Caralli, Richard A., Allen, Julia H., Curtis, Pamela D., White, David W., & Young, Lisa R. CERT® Resilience Management Model, v1.0 – Risk Management (RISK). Software Engineering Institute, Carnegie Mellon University, 2010. http://www.cert.org/resilience/rmm.cfm

[DHS 2008] Department of Homeland Security (DHS) Risk Steering Committee. DHS Risk Lexicon. Department of Homeland Security, September 2008. http://www.dhs.gov/xlibrary/assets/dhs_risk_lexicon.pdf

[FISMA 2002] Federal Information Security Management Act of 2002, 44 U.S.C. § 3542(b)(1). Office of the Law Revision Counsel, 2002. http://uscode.house.gov/download/pls/44C35.txt

[Gallagher 2005] Gallagher, Brian P., Case, Pamela J., Creel, Rita C., Kushner, Susan, & Williams, Ray C. A Taxonomy of Operational Risks (CMU/SEI-2005-TN-036). Software Engineering Institute, Carnegie Mellon University, 2005. http://www.sei.cmu.edu/library/abstracts/reports/05tn036.cfm

[Kendall 2007] Kendall, Richard P., Post, Douglass E., Carver, Jeffrey C., Henderson, Dale B., & Fisher, David A. A Proposed Taxonomy for Software Development Risks for High-Performance Computing (HPC) Scientific/Engineering Applications (CMU/SEI-2006-TN-039). Software Engineering Institute, Carnegie Mellon University, 2007. http://www.sei.cmu.edu/library/abstracts/reports/06tn039.cfm

[NIST 2009] National Institute of Standards and Technology. Recommended Security Controls for Federal Information Systems and Organizations (NIST Special Publication 800-53, Revision 3). U.S. Department of Commerce, 2009. http://csrc.nist.gov/publications/nistpubs/800-53-Rev3/sp800-53-rev3-final_updated-errata_05-01-2010.pdf

LIMITATION OF ABSTRACT Unclassified NSN 7540-01-280-5500 Unclassified Unclassified UL Standard Form 298 (Rev. ABSTRACT (MAXIMUM 200 WORDS) 12B DISTRIBUTION CODE This report presents a taxonomy of operational cyber security risks that attempts to identify and organize the sources of operational cyber security risk into four classes: (1) actions of people. NUMBER OF PAGES 47 18. particularly those described by the Federal Information Security Management Act (FISMA). Asset. Washington. PRICE CODE 17. DTIC. 14. Directorate for information Operations and Reports. REPORT DATE 3. DC 20503. NTIS 13. gathering and maintaining the data needed. 1. (3) failed internal processes. and completing and reviewing the collection of information. VA 22202-4302. OCTAVE. searching existing data sources. cyber security. TITLE AND SUBTITLE December 2010 5. 1215 Jefferson Davis Highway. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES) PERFORMING ORGANIZATION REPORT NUMBER CMU/SEI-2010-TN-028 10. Arlington. Send comments regarding this burden estimate or any other aspect of this collection of information. Lisa R. Young PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) 8. Cebula. SPONSORING/MONITORING AGENCY REPORT NUMBER HQ ESC/XPK 5 Eglin Street Hanscom AFB. the National Institute of Standards and Technology (NIST) Special Publications. REPORT TYPE AND DATES COVERED Final FUNDING NUMBERS A Taxonomy of Operational Cyber Security Risks AUTHOR(S) FA8721-05-C-0003 James J. operational risk. SECURITY CLASSIFICATION OF ABSTRACT 20. AGENCY USE ONLY 2. Each class is broken down into subclasses. This report discusses the harmonization of the taxonomy with other risk and security activities. 0704-0188 Public reporting burden for this collection of information is estimated to average 1 hour per response. resilience 16. Suite 1204. NIST. SUBJECT TERMS taxonomy. 2-89) Prescribed by ANSI Std. 6. and the CERT Operationally Critical Threat. including suggestions for reducing this burden. (Leave Blank) 4. 
which are described by their elements. and to the Office of Management and Budget. Software Engineering Institute Carnegie Mellon University Pittsburgh. SUPPLEMENTARY NOTES 12A DISTRIBUTION/AVAILABILITY STATEMENT Unclassified/Unlimited. 7.REPORT DOCUMENTATION PAGE Form Approved OMB No. including the time for reviewing instructions. PA 15213 9. to Washington Headquarters Services. Z39-18 298-102 . (2) systems and technology failures. FISMA. MA 01731-2116 11. SECURITY CLASSIFICATION OF THIS PAGE 19. and (4) external events. Paperwork Reduction Project (0704-0188). SECURITY CLASSIFICATION OF REPORT 15. and Vulnerability EvaluationSM (OCTAVE®) method.
