
Proceedings of the Human Factors and Ergonomics Society 2019 Annual Meeting 437

OBSERVING CYBER SECURITY INCIDENT RESPONSE:
QUALITATIVE THEMES FROM FIELD RESEARCH
Megan Nyre-Yu, Purdue University
Robert S. Gutzwiller, PhD, Arizona State University
Barrett S. Caldwell, PhD, Purdue University

Cyber security increasingly focuses on the challenges faced by network defenders. Cultural and security-
driven sentiments about external observation, as well as publication concerns, limit the ability of
researchers to understand the context surrounding incident response. Context awareness is crucial to inform
design and engineering. Furthermore, these perspectives can be heavily influenced by the targeted sector or
industry of the research. Together, a lack of broad contextual understanding may be biasing approaches to
improving operations, and driving faulty assumptions in cyber teams. A qualitative field study was
conducted in three computer security incident response teams (CSIRTs) and included perspectives of
government, academia, and private sector teams. Themes emerged that provide insights across multiple aspects of incident response, including information sharing, organization, learning, and automation. The
need to focus on vertical integration of issues at different levels of the incident response system is also
discussed. Future research will build upon these results, using them to inform technology advancement in
CSIR settings.

Copyright 2019 by Human Factors and Ergonomics Society. DOI 10.1177/1071181319631016

INTRODUCTION

Research in the intersection of cybersecurity and human performance is a growing focus within the human factors research community and at large (e.g., Aggarwal, Gonzalez, & Dutt, 2018; Gutzwiller, Fugate, Sawyer, & Hancock, 2015; Mancuso et al., 2014; Martin, Dubé, & Coovert, 2018; Rajivan, Janssen, & Cooke, 2013; Vieane et al., 2016). Considering the increasing importance of cybersecurity in society, this timely development is critical to address projected shortages in cybersecurity professionals and increasing fears of cybersecurity threats (Bureau of Labor Statistics, 2016; Coats, 2017).

As cyber is a dense socio-technical system, influences on global cyber (and cyber-physical) system performance come in many different forms and have many actors behind them. From the intended users of commercialized software and hardware, to the information technologists managing a corporate environment, to a watch officer on board a ship, each is part of a broader cyber ecosystem. Most users are on the naïve end of a spectrum of control and responsibility for cybersecurity; these users can attempt vigilance, carefully use email, and protect their information and credentials through various means such as encryption, complexity, and proper handling. As both a major, continuing point of vulnerability (Silberner, 2009) and an easily simulated, familiar task requiring little expertise, phishing exploits are the subject of numerous research studies (Kumaraguru et al., 2010; Martin et al., 2018; Sawyer & Hancock, 2018; Vishwanath, Harrison, & Ng, 2016). Yet elsewhere, in another part of the system, cybersecurity professionals are dealing with cognitive issues in performance as well (Bos et al., 2016; Conti, Ahamad, & Stasko, 2005; Lemay & Leblanc, 2018; Libicki & Pfleeger, 2004). Security is not a zero-sum game between these groups, yet they each have different responsibilities; end users are not directly responsible for a network, its defensive posture, or responding to various cyber attacks.

Professional analysts in cyber defense face a wide variety of human factors challenges (Gutzwiller et al., 2015). Research addressing the human performance of analysts, however, is sparse though increasing (Gutzwiller, Hunt, & Lange, 2016; Mancuso, Funke, Strang, & Eckold, 2015; Vieane et al., 2016, 2017). Renewed emphasis on observation and contextualization of analyst performance for cybersecurity (Paul & Whitley, 2013) mostly focuses on intrusion detection, but interest in studying analyst team activity is also emerging.

Computer Security Incident Response (CSIR) is based on the need for a functioning team of professional cybersecurity analysts to respond to cyber threats and handle their aftermath. CSIR originated in the late 1980s following the spread of a computer worm that seriously disrupted connected systems (Schmidt & Darby, 2001) and imposed large costs. In response, DARPA funded the establishment of the first CERT - Computer Emergency Response Team (Goodwyn et al., 1997). The team was tasked specifically with detecting and responding to information security issues affecting networks of computers. In the last 30 years, the concept has caught on: CSIR teams (CSIRTs) are now present internationally in large numbers, across governments, industry, and academic institutions. These teams can be defined, as in Alberts et al. (2004), as "an organization or team that provides services and support to a defined constituency for preventing, handling, and responding to computer security incidents" (p. 1).

Prior Research

A small body of research in CSIR was reviewed in preparation for a series of interviews and on-site visits to CSIRTs. The general roles of CSIRTs have been described in detail elsewhere (e.g., Huis et al., 2017; Ruefle et al., 2014), but generally include analysts performing proactive, reactive, and maintenance tasks within an organization, and coordinating and sharing information to balance the wide array of skills and expertise in a given team. In general, the

major CSIRT processes include preparation, detection, analysis, containment, eradication, recovery, and post-incident actions (Huis et al., 2017). The study of CSIRTs is different from cognitive expertise studies of individual analysts because CSIR is a distributed, team-based activity. A team-focused perspective was pervasive in the literature, including the study of situation awareness (e.g., Tyworth, Giacobe, & Mancuso, 2012). Work has examined CSIRTs in terms of large-scale issues, such as workforce development, team effectiveness, and the social maturity of teams (Hoffman, Burley, & Toregas, 2012; Steinke et al., 2015; Tetrick et al., 2016). Additionally, others have used ethnographic and anthropological approaches for studying CSIRTs (Sundaramurthy et al., 2014), though they encountered access problems in reaching their subjects. Others have applied more traditional methods in examining knowledge structures in cybersecurity, including cognitive task analysis for examining team effectiveness and mental models (Chen et al., 2014), and for understanding the diagnostic work process (Werlinger et al., 2010).

CSIR capabilities are struggling in many areas, including staffing of analysts and full-cycle response. The available literature does not always provide context across CSIR settings to focus design and improvement within these organizations, or of the technology used in CSIR activities. Where literature exists, it tends to focus on specific aspects of CSIR operations and does not adequately illustrate the vertical integration of issues affecting CSIR functions (such as interpersonal, organizational, and policy issues). Borrowed from the management literature, the term vertical integration in this context is the understanding that, within the CSIRT, each issue at each level of the organizational hierarchy must be dealt with successfully, as it affects the others.

Figure 1. Vertical Integration of CSIR Issues (levels, from top to bottom: Mission, Policy, Organization, Process, Interpersonal, Technology).

Human factors has long considered hierarchical and systemic influences, particularly with respect to safety (Shappell & Wiegmann, 2000). The vertical integration of issues within CSIR (e.g., Figure 1) presents similar elements and illustrates the complexity of this sociotechnical system as more than just a set of usability issues in software, or of technology development and deployment. Each of these components represents perspectives and considerations for human factors research.

Qualitative research can provide context and reveal new aspects affecting security operations. Combined methods and topics around ethnographic data collection can help create richer contextual understanding (Schwartzman, 1993). This study aimed to provide context-driven insights for analysts and managers working in this area, and to allow researchers a better understanding of vertical issues in CSIR operations. This article is the output of one analysis method used in a multi-study dissertation by the first author.

Research Goals

The goal of this research was to gather context-informed insights in CSIRTs to address both the contextual and vertical assessment gaps. A long-term goal of the first author was to determine technology design requirements to improve human-system integration in CSIRT environments. A variety of CSIR settings were chosen to capture variation between environments. Methods included observation and on-site interviews designed to capture insights with high ecological validity, and are described below.

METHOD

Ethnographic methods included observations and on-site interviews with three active incident response teams. Semi-structured observation (Gillham, 2008) and interview techniques (Galletta, 2013) allowed the researcher to gain insights, and provided high ecological validity and a level of consistency across participant teams.

CSIR teams. The study included three CSIRTs: one each from a state government, a large company, and an academic institution. Each team performed incident response functions, and all teams had a tiered organization in which different levels of response were broken into different tiers of responders. In general, higher tiers correlated with higher levels of operator expertise and specialization.

Data collection. The researcher spent 3-5 business days with each team, observing daily operations, interviewing analysts and managers, and walking through simulated incidents with employees. Observations included sitting with individual analysts during their daily activities and observing shift handoff meetings. When direct observation was not available, the researcher conducted simulated incident response exercises with different tiers of analysts, using manager-approved historical incident data from within the respective organization. Simulations included Tier 1 (novice) and Tier 2 (experienced generalist) analysts, as well as a lead or manager if they were involved in the historical incident being reconstructed.

Interviews with managers informed key attributes for each team (Pelto, 2016), including general work context, work environment, mission-driven goals, operations, organization, and general procedures during incident response. Interviews with analysts further explored these topics, adding information on specific practices regarding information sharing during incident response, automated tools, and the roles and responsibilities of different tiers of analysts.

As many security operations centers (SOCs) were secure and sensitive to the potential of a data breach, the researcher was unable to use any recording devices. Data were generated solely from observations, interviews, and interactions with participants (recorded in the researcher's handwritten notes).
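Field notes gathered this way are later reduced to codes and then to ranked themes. As a rough sketch of that bookkeeping - the codes, theme labels, and counts below are invented for illustration, not drawn from the study's data - the final tallying and ranking step might look like:

```python
from collections import defaultdict

# Hypothetical first-round codes mapped to the theme each was eventually
# grouped under in later rounds (labels invented for illustration only).
code_to_theme = {
    "analyst confirms handoff received": "Communication and feedback",
    "no acknowledgment after escalation": "Communication and feedback",
    "wiki used for incident notes": "Continuity of documentation",
    "tier 2 updates shared ticket": "Continuity of documentation",
    "scripts triage low-level alerts": "Automation as potential solution",
}

def rank_themes(mapping):
    """Count how many codes each theme subsumes, then rank by volume."""
    counts = defaultdict(int)
    for theme in mapping.values():
        counts[theme] += 1
    # Highest code volume first; ties keep first-seen order (stable sort).
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

for theme, n in rank_themes(code_to_theme):
    print(f"{theme}: {n} codes")
```

Ranking themes by the number of subsumed codes is what orders the rows of Table 1; in the study itself the grouping judgments were made manually by the researcher in NVivo and Excel, not computed.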

Analysis

Coding. Generated data from the field studies were transcribed digitally for analysis. The researcher employed a bottom-up qualitative coding procedure (Auerbach & Silverstein, 2003), conducting three rounds of this synthesizing technique. This process started with line-by-line coding of all data generated from the ethnographic cases using NVivo 11 software, resulting in 229 unique codes. The second round used Microsoft Excel to group similar or related statements, resulting in 99 second-round codes.

Theme development. In the third round, second-round codes were compared across all teams to understand higher-level concepts. The researcher systematically and iteratively grouped related or repeated codes into categories. This process resulted in 11 themes. Developing themes using qualitative methods (Merriam & Tisdell, 2016) is similar to developing affinity diagrams (Holtzblatt, 2016) used in contextual inquiry. An example of the theme development from two levels of codes is depicted in Figure 2.

Figure 2. Example of Theme Development. The theme "Automation is seen as a potential solution for low-level tasks and coordination, but considered out of reach for teams who don't have the support resources" (20 codes) is built from three second-round codes - "Automation as a solution" (10), "Automation needs additional support" (6), and "Automation has disadvantages" (4) - which in turn subsume first-round codes such as "Automation of low level competes with training," "Automation considered more work," "Automation not prevalent or common," and "Automation can create more noise."

RESULTS

The results of the qualitative data analysis included eleven themes grouped around various aspects of incident response settings (Table 1). Each of the thematic statements is ranked by the volume of codes subsumed under them, with the number of included codes indicated on the right side. Specific thematic concepts addressed in the Discussion subsections are shared awareness, organizational influences, and expertise.

Table 1. Themes developed from interviews in three CSIRTs

Rank 1 (38 codes): Communication, feedback and accountability are necessary for IR, awareness, and learning; if lacking within or between levels of the organization, issues arise.
Rank 2 (36 codes): Organizational alignment on security priorities and awareness of IR issues is important for a "full-cycle" IR process.
Rank 3 (35 codes): Continuity of awareness and documentation around incidents is important.
Rank 4 (23 codes): IR requires a wide range of skills and flexibility; the workforce may not be able to maintain these if not designed to do so.
Rank 5 (22 codes): IR requires a wide range of activities, including filtering and decision making; these can be split based on expertise or authority of analysts.
Rank 6 (20 codes): Automation is seen as a potential solution for low-level tasks and coordination, but considered out of reach for teams who don't have the support resources.
Rank 7 (14 codes): Knowledge sharing (in a repository, in person, or through other channels) may be important for learning and shared awareness.
Rank 8 (12 codes): Formal and informal roles emerge to meet an organizational need for management, communication, and decision making.
Rank 9 (11 codes): Identity and culture of the team affect communications and responsibilities.
Rank 10 (10 codes): Handoffs are varied in terms of procedure, formality, and documentation; in whatever form, they are important for continuity in several contexts.
Rank 11 (8 codes): Incident handling methods may be indicators of organizational maturity; maturity as a focus may drive incident handling methods.

DISCUSSION

The themes presented suggest a series of issues spanning different levels of the security organizations, all of which impact incident response operations. One challenge in understanding CSIR operations is a lack of focus on system-level analysis. Many other approaches consider improvements in machine learning / artificial intelligence algorithms, or enhancements to cybersecurity analyst education and skillsets, as solutions - yet rarely address the need for effective integration and interaction between the software and human components of CSIRTs. Again, these are not part of a zero-sum game, and these gaps should be addressed; the findings in this paper make strides toward a better, systems-level understanding that includes context.

Creating shared and consistent awareness

Shared awareness has often been identified as a key component of effective cyber defense operations (Champion et al., 2012; Gutzwiller et al., 2015; Mahoney et al., 2010; Rajivan & Cooke, 2018). The data confirmed and identified several factors that contribute to shared awareness, moving the need in cybersecurity operations toward team-based, distributed situation awareness (e.g., Salmon et al., 2009; Stanton et al., 2006) when coupled with the geographic and temporal distribution of analysts commonly seen in CSIR. Facilitated by information sharing, the process of establishing shared awareness is not a one-way information flow, but rather a cycle that needs to include a feedback flow sometimes lacking in organizations. Feedback ensures the information

was received and understood; it can also be a channel for new inputs from stakeholders in the communication loop. That is, feedback can further establish an operational picture, guide incident response decision making, and enable learning between analysts. This finding also applies across the larger organization. Points of contact in other parts of the organization that receive ongoing incidents for further action may not acknowledge receipt of the incident, which can create uncertainty regarding action and resolution. Feedback is how the analyst sending or escalating an incident knows that the handoff was completed.

Mid- and senior-level managers, chief information security officers (CISOs), and security personnel who liaise with other divisions or groups are all included in the list of individuals who may be notified during an incident to create shared awareness among management. For the most part, their role is to know (not necessarily acknowledge receipt of) the information being shared, and to use it for strategic decision making or incident mitigation. Thus, much communication to this class of people is one-way. Shared awareness may be assumed at the risk of security.

One caveat is that shared awareness must remain consistent throughout the incident response process, and is typically maintained through documentation. Documentation, as an additional and unique channel for information sharing, plays a critical role in both real-time and historical analysis of incident data, including threat data, key metrics of response, who is or was involved, and so on. From this perspective, documentation can ensure persistent awareness (Caldwell, 2011), especially if kept in a centrally accessible location, with adequate mechanisms for real-time data input from multiple parties. However, if not sufficiently maintained, documentation (and access thereto) can inhibit accurate and consistent awareness of a developing incident and hobble the capability of a CSIRT.

Organizational influences and maturity

Consistent with systems approaches to human factors (Proctor, 2008), recognizing the CSIRT as part of a larger system widens the scope of potential influences on performance. The findings indicate that structure and policy within an organization may greatly affect a group's ability to perform. In part, this is due to the constraints placed on the agency and communication within and between subgroups. The researcher noted that, in two of the three teams observed, the authority of the security team to address incidents was limited. Either the team was instructed to escalate the incident to specialists (effectively making the security team a triage step), or the team was instructed to escalate to the highest-ranking security officer, who could manage the potential political fallout of a security-driven decision. Either way, actions driving decisions out of the hands of the CSIRT may limit the team's efficacy.

The organizational mission was an immediate and significant driver in determining whether or not security was an organizational priority. Prioritization at the organizational level is critical, as security activities can directly compete with operational uptime. In short, the goals of the organization (such as security) can greatly affect the growth and maturity of the security team in terms of human capital (how much the team is allowed to expand in size and skill) and technology (what investments can be made to advance the current state) (and see Tetrick et al., 2016).

Expertise distribution and sharing

Many security operations centers are tiered organizations across which expertise is distributed; capability is structured into different levels of incident response. Lower-tier (Tier 1) analysts are typically responsible for filtering and triaging incidents as they are detected, helping to determine whether an alert is signal or noise, then directing the incident to another analyst at a higher tier (Tier 2). Tier 2 analysts typically handle the incident by conducting a short investigation and taking actions to contain and even remediate the threat. Should the incident require further investigation or remediation actions, it is again escalated to a higher tier (Tier 3) of specialists who perform deeper analysis of the incident to better understand origins and effects, then determine or develop remediation actions and prevention activities.

The tiered nature of security operations creates significant differences in the expertise available at each level, which can diminish development and decision making at lower levels of response. Exacerbating this effect was the geographic, and sometimes temporal, separation of the tier operators. Many teams had higher-tier analysts who worked at a different site or worked from home. The separation of expertise, coupled with different levels of authority and clearance, was observed to hinder expertise sharing between analysts. The different tiers sometimes maintain separate knowledge-sharing platforms that limit access for team members of a different tier level. While there may be mission-relevant reasons for these separations, including security itself, when CSIR is examined vertically, the separations reveal themselves as serious impediments to team-based response action. Essentially, these factors can disrupt collaboration and knowledge sharing between tiers during incident response, potentially degrading security.

CONCLUSIONS

Computer security incident response is an increasingly important domain in providing defensive and offensive support to firms around the globe. As the field advances technologically, it is imperative that human factors efforts keep pace in representing the human half of the human-machine system. Research to strengthen contextual understanding of this complex environment provides balance. The work presented here reveals valuable insights grounded in context from three different CSIRTs, and the findings indicate that examining the vertical integration of issues uncovers connections between factors across a variety of perspectives and disciplines.

The findings in this study indicate there are a multitude of areas in which research can expand to help build a more complete understanding of CSIR environments. Further investigation regarding distributed (but shared) situation

awareness is warranted to better define the information needs of analysts, and the development of strategies and interventions for strengthening security operations may help alleviate issues regarding staffing and process inefficiencies. Finally, this paper encourages additional studies in defining distributed expertise in CSIRTs, and how analyst teams gain and share knowledge in secure, mission-critical task contexts.

References

Aggarwal, P., Gonzalez, C., & Dutt, V. (2018). HackIt: A real-time simulation tool for studying real-world cyber-attacks in the laboratory. CNCS 2019, (September).
Alberts, C., Dorofee, A., Killcrece, G., Ruefle, R., & Zajicek, M. (2004). Management processes for CSIRTs: A work in progress. Carnegie Mellon Technical Report CMU/SEI-2004-TR-015.
Auerbach, C., & Silverstein, L. B. (2003). Qualitative data: An introduction to coding and analysis. New York University Press.
Bos, N., Paul, C. L., Gersh, J. R., Greenberg, A., Piatko, C., Sperling, S., … Burtner, R. (2016). Effects of gain/loss framing in cyber defense decision-making. Proceedings of the Human Factors and Ergonomics Society, 168–172.
Bureau of Labor Statistics. (2016). Information security analysts.
Caldwell, B. S. (2011). Connection, coupling, and persistence in online social networks. In D. Haftor & A. Mirijamdotter (Eds.), Information and communication technologies, society and human beings: Theory and framework (Festschrift in honor of Gunilla Bradley) (pp. 346–354). Hershey, PA: IGI Global.
Champion, M., Rajivan, P., Cooke, N. J., & Jariwala, S. (2012). Team-based cyber defense analysis. Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), 218–221.
Chen, T. R., Shore, D. B., Zaccaro, S. J., Dalal, R. S., Tetrick, L. E., & Gorab, A. K. (2014). An organizational psychology perspective to examining computer security incident response teams. IEEE Security & Privacy, 12(5), 61–67.
Coats, D. R. (2017). Worldwide threat assessment of the US intelligence community. Washington, DC, USA.
Conti, G., Ahamad, M., & Stasko, J. (2005). Attacking information visualization system usability: Overloading and deceiving the human. Proceedings of the 2005 Symposium on Usable Privacy and Security - SOUPS '05, 89–100.
Galletta, A. (2013). Mastering the semi-structured interview and beyond: From research design to analysis and publication. New York: NYU Press.
Gillham, B. (2008). Observation techniques: Structured to unstructured. London; New York: Continuum International Pub.
Goodwyn, J. C., Moore, R. A., Jordan, W., Richardson, J., Roosild, S., Worch, P., … Adams, D. (1997). Defense Advanced Research Projects Agency technology transition. DARPA, 1–185.
Gutzwiller, R. S., Hunt, S. M., & Lange, D. S. (2016). A task analysis toward characterizing cyber-cognitive situation awareness (CCSA) in cyber defense analysts. In 2016 IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support, CogSIMA 2016.
Gutzwiller, R. S., Fugate, S., Sawyer, B. D., & Hancock, P. A. (2015). The human factors of cyber network defense. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 59(1), 322–326.
Hoffman, L., Burley, D., & Toregas, C. (2012). Holistically building the cybersecurity workforce. IEEE Security & Privacy, 10(2), 33–39.
Holtzblatt, K. (2016). Contextual design: Design for life (2nd ed.). Elsevier.
Huis, M., van der Kleij, R., Kleinhaus, G., de Koning, L., Kort, J., Meiler, P., … Young, H. (2017). Human factors in cyber incident response: Needs, collaboration and the reporter. TNO Report 2017-R11575, 1–47.
Kumaraguru, P., Sheng, S., Acquisti, A., Cranor, L. F., & Hong, J. (2010). Teaching Johnny not to fall for phish. ACM Transactions on Internet Technology, 10(2), 1–31.
Lemay, A., & Leblanc, S. (2018). Cognitive biases in cyber decision-making. Proceedings of the 13th International Conference on Cyber Warfare and Security, 395.
Libicki, M., & Pfleeger, S. (2004). Collecting the dots: Problem formulation and solution elements. Santa Monica, CA: RAND Corporation.
Mahoney, S., Roth, E., Steinke, K., Pfautz, J., Wu, C., & Farry, M. (2010). A cognitive task analysis for cyber situational awareness. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 54, 279–283.
Mancuso, V. F., Christensen, J. C., Cowley, J., Finomore, V., Gonzalez, C., & Knott, B. (2014). Human factors in cyber warfare II: Emerging perspectives. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 58(1), 415–418.
Mancuso, V. F., Funke, G. J., Strang, A. J., & Eckold, M. B. (2015). Capturing performance in cyber human supervisory control. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 59(1), 317–321.
Martin, J., Dubé, C., & Coovert, M. D. (2018). Signal detection theory (SDT) is effective for modeling user behavior toward phishing and spear-phishing attacks. Human Factors, 60(8), 1179–1191.
Merriam, S. B., & Tisdell, E. (2016). Qualitative research: A guide to design and implementation (4th ed.). Jossey-Bass.
Paul, C. L., & Whitley, K. (2013). A taxonomy of cyber awareness questions for the user-centered design of cyber situation awareness. HCII 2013, 436–448.
Pelto, P. J. (2016). Applied ethnography: Guidelines for field research. Routledge.
Proctor, R. W. (2008). Human factors in simple and complex systems (2nd ed.). Boca Raton, FL: CRC Press.
Rajivan, P., & Cooke, N. J. (2018). Information-pooling bias in collaborative security incident correlation analysis. Human Factors, 60(5), 626–639.
Rajivan, P., Janssen, M. A., & Cooke, N. J. (2013). Agent-based model of a cyber security defense analyst team. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 57(1), 314–318.
Ruefle, R., Dorofee, A., Mundie, D., Householder, A. D., Murray, M., & Perl, S. J. (2014). Computer security incident response team development and evolution. IEEE Security & Privacy, 12(5), 16–26.
Salmon, P. M., Stanton, N. A., Walker, G. H., & Jenkins, D. P. (2009). Distributed situation awareness: Theory, measurement and application to teamwork. Boca Raton, FL: CRC Press.
Sawyer, B. D., & Hancock, P. A. (2018). Hacking the human: The prevalence paradox in cybersecurity. Human Factors, 60(5), 597–609.
Schmidt, C., & Darby, T. (2001). The what, why, and how of the 1988 Internet worm.
Schwartzman, H. B. (1993). Ethnography in organizations (Vol. 27). Sage.
Shappell, S., & Wiegmann, D. A. (2000). The Human Factors Analysis and Classification System (HFACS). Federal Aviation Administration, 1–19.
Silberner, J. (2009, December). Don't fall for CDC swine flu phishing scam. NPR.
Stanton, N. A., Stewart, R., Harris, D., Houghton, R. J., Baber, C., McMaster, R., … Green, D. (2006). Distributed situation awareness in dynamic systems: Theoretical development and application of an ergonomics methodology. Ergonomics, 49(12–13), 1288–1311.
Steinke, J., Bolunmez, B., Fletcher, L., Wang, V., Tomassetti, A. J., Repchick, K. M., … Tetrick, L. E. (2015). Improving cybersecurity incident response team effectiveness using teams-based research. IEEE Security and Privacy, 13(4), 20–29.
Sundaramurthy, S. C., McHugh, J., Ou, X., Rajagopalan, S. R., & Wesch, M. (2014). An anthropological approach to studying CSIRTs. IEEE Security & Privacy, 12(5), 52–60.
Tetrick, L. E., Zaccaro, S. J., Dalal, R. S., Steinke, J. A., Repchick, K. M., Hargrove, A. K., … Wang, V. (2016). Improving social maturity of cybersecurity incident response teams. Fairfax, VA: George Mason University.
Tyworth, M., Giacobe, N. A., & Mancuso, V. (2012). Cyber situation awareness as distributed socio-cognitive work. In Cyber Sensing 2012 (Vol. 8408, p. 84080F). International Society for Optics and Photonics.
Vieane, A. Z., Funke, G., Greenlee, E., Mancuso, V., Borghetti, B., Miller, B., … Boehm-Davis, D. (2017). Task interruptions undermine cyber defense. Proceedings of the Human Factors and Ergonomics Society, 375–379.
Vieane, A. Z., Funke, G., Mancuso, V., Greenlee, E., Dye, G., Borghetti, B., … Brown, R. (2016). Coordinated displays to assist cyber defenders. Proceedings of the Human Factors and Ergonomics Society, 344–348.
Vieane, A. Z., Funke, G. J., Gutzwiller, R. S., Mancuso, V. F., Sawyer, B. D., & Wickens, C. D. (2016). Addressing human factors gaps in cyber defense. Proceedings of the Human Factors and Ergonomics Society, 60, 770–773.
Vishwanath, A., Harrison, B., & Ng, Y. J. (2016). Suspicion, cognition, and automaticity model of phishing susceptibility. Communication Research, 45(8), 1146–1166.
Werlinger, R., Muldner, K., Hawkey, K., & Beznosov, K. (2010). Preparation, detection, and analysis: The diagnostic work of IT security incident response. Information Management & Computer Security, 18(1), 26–42.
