
The current issue and full text archive of this journal is available on Emerald Insight at:
www.emeraldinsight.com/2056-4961.htm

Mitigating cyber attacks through the measurement of non-IT professionals' cybersecurity skills

Melissa Carlton
College of Engineering, Houston Baptist University, Houston, Texas, USA

Yair Levy
College of Engineering and Computing, Nova Southeastern University (NSU), Ft. Lauderdale, Florida, USA

Michelle Ramim
School of Information Technology, Center for Cybersecurity Education and Applied Research, Middle Georgia State University, Macon, Georgia, USA

Information & Computer Security, Vol. 27 No. 1, 2019, pp. 101-121. DOI 10.1108/ICS-11-2016-0088
Received 25 November 2016; Revised 9 September 2017, 9 March 2018, 25 April 2018, 9 July 2018; Accepted 10 July 2018

Abstract
Purpose – Users’ mistakes due to poor cybersecurity skills result in up to 95 per cent of cyber threats to
organizations. Threats to organizational information systems continue to result in substantial financial and
intellectual property losses. This paper aims to design, develop and empirically test a set of scenarios-based
hands-on tasks to measure the cybersecurity skills of non-information technology (IT) professionals.
Design/methodology/approach – This study was classified as developmental in nature and used a
sequential qualitative and quantitative method to validate the reliability of the Cybersecurity Skills Index
(CSI) as a prototype-benchmarking tool. Next, the prototype was used to empirically test the demonstrated
observable hands-on skills level of 173 non-IT professionals.
Findings – The importance of skills and hands-on assessment appears applicable to cybersecurity skills of
non-IT professionals. Therefore, by using an expert-validated set of cybersecurity skills and scenario-driven
tasks, this study established and validated a set of hands-on tasks that measure observable cybersecurity
skills of non-IT professionals without bias or the high-stakes risk to IT.
Research limitations/implications – Data collection was limited to the southeastern USA and while
the sample size of 173 non-IT professionals is valid, further studies are required to increase validation of the
results and generalizability.
Originality/value – The validated and reliable CSI was operationalized as a tool that measures the cybersecurity skills of non-IT professionals. This benchmarking tool could assist organizations in mitigating the vulnerabilities and breaches that result from employees' poor cybersecurity skills.
Keywords Cyber threat mitigation, Cybersecurity assessment of non-IT professionals,
Cybersecurity skills index
Paper type Research paper

1. Introduction
The threat to organizational information systems (IS) due to vulnerabilities and breaches caused by employees continues to result in substantial financial and information losses (Carlton et al., 2016; Hovav and Gray, 2014). About 72-95 per cent of cyber threats to organizations result from users' mistakes stemming from poor cybersecurity skills. Embedded security features of information technology (IT) products and services are relied upon to protect organizations and individuals (Peha, 2013). However, the protection of IS is only as strong as its weakest link, and individuals are usually seen as the weakest link in an organization's IS security chain (Mitnick and Simon, 2002; D'Arcy et al., 2009; Whitman and Mattord, 2018). Even with sophisticated intrusion detection systems, organizations are still at risk of cyber attacks. Human errors and social engineering (phishing attacks, drive-by-downloads, etc.) appear to prevail in circumventing such IT protections (Siponen and Vance, 2010). Even with the best intentions, an employee may work in an insecure manner or under stress and cause a threat, whether because of poor security tool usability, lack of skills or human error (PricewaterhouseCoopers (PwC), 2013; Sasse et al., 2016; Sombatruang et al., 2016). Because cyber attacks have intensified over time, organizations are increasing the priority of cybersecurity awareness programs, but must also include training and education to enhance their employees' cybersecurity skills (Anti-Phishing Working Group (APWG), 2016; PwC, 2016; Whitman and Mattord, 2018).
Cybersecurity is "the prevention of damage to, unauthorized use or exploitation of, and, if needed, the restoration of electronic information and communications systems to ensure confidentiality, integrity, and availability" (Axelrod, 2006, p. 1). Skill is defined as "a combination of ability, knowledge, and experience that enables a person to do something well" (Boyatzis and Kolb, 1991, p. 280). Therefore, cybersecurity skills (e.g. preventing malware, personally identifiable information (PII) theft and work information system (WIS) breaches) correspond to an individual's technical knowledge, ability and experience surrounding the hardware and software required to execute IS security to mitigate cyber attacks (Carlton et al., 2016).
This research study addresses the need for additional empirical investigation and
measurement of cybersecurity skills, especially of non-IT professionals (Carlton and Levy,
2017; Choi et al., 2013; Thomson and von Solms, 2005). The US Department of Homeland
Security (2012) recognized the great escalation of the nation’s cyber threat in recent years.
Recommendations were made to develop and advance technical cybersecurity skills as a
way to encourage qualified candidates (US Department of Homeland Security, 2012). Those
users armed with the skills needed to quickly identify and report possible cyber-espionage
occurrences “discovered more breaches than any other internal process or technology”
(Verizon Enterprise Solutions, 2014, p. 42). “Yet we ignore the human factor in corporate
security at our peril, as it is all too clear that technology alone can’t guarantee security”
(Kaspersky Lab, 2013, p. 15). Thus, it appears that additional empirical investigation on
cybersecurity skills of non-IT professionals is warranted (Burley et al., 2014). Prior research (D'Arcy et al., 2009; Hu et al., 2011; Vassiliou et al., 2014) found that the use of observable hands-on skills provides unbiased evidence of competence and participant willingness to report actual behavior without harm to systems or people. Therefore, the main goal of this research study was to design, develop and empirically test a set of scenarios-based, hands-on tasks to measure the cybersecurity skills of non-IT professionals.

2. Literature review
To ensure breadth, depth, rigor, consistency, clarity, brevity and effective analysis and
synthesis, an extensive search of the literature domain was conducted using
interdisciplinary fields including aviation, computer security, IS, medical and transportation
(Levy and Ellis, 2006; Silber-Varod et al., 2016). The theoretical foundation for the
cybersecurity skills index (CSI) used in this study emerged from existing knowledge
discovered during the literature review. This theoretical foundation served as the starting
point for the design, development and empirical testing of the CSI. Information concerning
cybersecurity skills shortages, risk mitigation and tools was also an output of the literature review, as was the information needed to operationalize the CSI and to design and develop the scenarios and hands-on tasks on which the CSI is based.
2.1 Skills defined
Boyatzis and Kolb (1991) and Levy (2005) defined skill as a combination of knowledge, experience and abilities that enables users to perform well. The acquisition of a skill is a learning process and generally follows three incremental stages (Anderson, 1982; Gravill et al., 2006). These stages begin with the initial acquisition of a skill, known as declarative knowledge (Stage 1). At this stage, instruction and information about a skill are given to the user (Anderson, 1982; Fitts, 1964). Moreover, Stage 1 allows the user to establish the fundamental knowledge needed as a foundation for later learning stages (Gravill et al., 2006). The second stage of skill acquisition (Stage 2) allows the learner to practice declarative knowledge and convert it to procedural knowledge (Fitts, 1964; Neves and Anderson, 1981). Knowledge becomes better organized and users start to connect the actions needed to complete an activity (Gravill et al., 2006). Next, at the third stage, comes automaticity (Fitts, 1964; Marcolin et al., 2000). Users progress beyond the initial acquisition stage into an efficient and autonomous stage (Stage 3) by increasing their experience level, practicing tasks repeatedly to develop their skills further (Anderson, 1982; Gravill et al., 2006; Kraiger et al., 1993). Experience positively influences a user's computer usage, which helps establish the needed experience of the skill (Gravill et al., 2006). The ability to generalize procedures and increase performance occurs during the acquisition of knowledge stages (Marcolin et al., 2000). Eschenbrenner and Nah (2014) identified that over time, skills are honed and competencies are acquired. The skill development stages are shown in Figure 1.

2.2 Competency
Hiring managers expect graduates to be prepared with the skill sets that are applicable to the job (Levy and Ramim, 2015; Torkzadeh and Lee, 2003). The need to include competencies, skills, knowledge and abilities in the classroom, so that students have the skills necessary for future employment, was indicated in prior research (Havelka and Merhout, 2009; Joint Task Force on Cybersecurity Education, 2017; Levy and Ramim, 2015; Rubin and Dierdorff, 2009). The skills learned during an academic degree program or professional training must be demonstrated by pilots, physicians, nurses, surgeons and therapists before graduating. Competency is that high level of demonstrable skill expected for certification and licensure (Levy and Ramim, 2015). Eschenbrenner and Nah (2014) discovered that the maturing of an individual's knowledge improves skills, which in turn develops the user's competency. Moreover, it was previously noted in the literature that knowledge gathered by users hones skills in a certain functional area and develops competencies (Toth and Klein, 2014). A user's computer competence is vital for an organization that relies on its employees to possess the skills (i.e. knowledge, experience and abilities) to complete technical tasks (Downey and Smith, 2011). Thus, it appears competency is acquired after a skill is practiced over time (Levy and Ramim, 2015). The relationship between skill level and competency is highlighted in Figure 2 (Levy and Ramim, 2015).

Figure 1. Skill development stages over time
2.3 Information technology skills
IT skills have predominantly been measured in research based on self-report survey instruments (Levy, 2005; Torkzadeh and Lee, 2003; Weigel and Hazen, 2014). Adomßent and Hoffman (2013) and Beaudoin et al. (2009) identified that competencies are important to accomplish something successfully and responsibly. Lerouge et al. (2005) and Goode and Levy (2017) defined IT skills as those skills that correspond to the technical knowledge regarding the hardware, software and programming features of IS. Weigel and Hazen (2014) concluded that increasing technology skills across users is important as IT becomes a mainstay in the daily lives of individuals. New technologies are adopted by organizations regularly, and competency with cybersecurity is one of the most important parts of IT skills (Burley et al., 2014). Moreover, competence with IT not only empowers users, it also affects their workplace productivity and the effectiveness of their leadership (Marcolin et al., 2000). To effectively use IT for the benefit of the organization, a user needs first to acquire skills in working with new IS and technologies (Eargle et al., 2014). IT skills are essential for an organization to gain competitive equality, but the management of those IT skills sustains an organization's competitive advantage (Downey and Smith, 2011). Thus, the importance of assessing those skills warrants additional research (Levy and Ramim, 2015; Weigel and Hazen, 2014).

2.4 Data breaches
The Identity Theft Resource Center (ITRC) (2015) identified that 25 per cent of the 64 data breaches reported in January 2015 resulted in the exposure of 455,337 individual records. About 75 per cent of the remaining data breaches resulted in an unknown number of exposed records (ITRC, 2015). By July 2017, nearly 35 per cent of data breach incidents resulted in over 14 million compromised records (ITRC, 2017). In the past, the government only became involved in egregious privacy breaches (Boritz and No, 2011). However, in response to the numerous data breaches occurring in the USA, former President Barack Obama signed Executive Order No. 13,681 (2014), which requires multiple-factor authentication when a US citizen's personal data are made available through digital applications. A survey by Verizon Enterprise Solutions (2014) revealed that more data breaches were discovered by users armed with appropriate identification and reporting skills than were discovered by any internal process or technology. Technology alone cannot guarantee security. In many instances, this is because users do not implement technical countermeasures when those countermeasures are perceived to interfere with work productivity (Choi et al., 2013; D'Arcy et al., 2009). Therefore, the human factor cannot be ignored in corporate information security without peril (Kaspersky Lab, 2013).
Figure 2. Skill-level and competency-level relationship

As further confirmation of the human factor, a 2013 PwC survey showed that most security incidents could be attributed to actions by current or former employees (PwC, 2013). Since 2003, four of the top nine security incident patterns (e.g. miscellaneous errors, crimeware, insider misuse, and physical theft/loss) involved human error or misuse (Verizon Enterprise Solutions, 2015). Not all insider threats are intentional; 84 per cent of insider-related data breaches reported were due to an unintentional act or failure to secure a computer, networking device, or drive (Symantec Corporation, 2015). Of the 1,670 data breaches reported in 2015 in one source by Gemalto (2016), 38 per cent were due to accidental loss or malicious insiders. Moreover, social engineering concerns are not new, as over a decade ago, it was noted that "unfortunately, even the best security mechanisms can be bypassed through social engineering" (Winkler and Dealy, 1995, p. 1). However, it still continues to be a major cybersecurity problem, as Algarni et al. (2014) noted that social engineering "is now considered the great security threat to people and organizations" (p. 1). The findings of Kvedar et al. (2010) suggested that a well-implemented social engineering ploy is capable of succeeding even with those who classified themselves as being aware of social engineering techniques.
Even with the continued importance of social engineering, it was malware, the use of stolen credentials and phishing that were identified as the top three cyber threats by Verizon Enterprise Solutions (2016). Moreover, the skills needed to mitigate such cybersecurity threats and to protect corporate IT systems from such data breach attacks can mean the difference between experiencing a breach and mitigating it, as it is clear that nowadays no enterprise is immune from a cyber attack (Verizon Enterprise Solutions, 2016). Therefore, the ability to assess the level of cybersecurity skills possessed by organizational employees, especially non-IT professionals, is critical for business continuity.

2.5 Cybersecurity risk mitigation and tools
Cybersecurity involves both technical and human ability "to protect or defend against cyber attacks" (CNSS, 2010, p. 22). Risk is defined as a measure of the extent to which an entity is threatened by a potential circumstance or event, and typically is a function of:
• the adverse impacts that would arise if the circumstance or event occurs; and
• the likelihood of occurrence (National Institute of Standards and Technology (NIST), Computer Security Division, 2006).

Therefore, cybersecurity risk describes any disruption of business and monetary loss
caused by a malicious cyber event (Mukhopadhyay et al., 2013; NIST, 2014). Risk mitigation
is the process of “prioritizing, evaluating, and implementing the appropriate risk-reducing
controls/countermeasures recommended from the risk management process” by an
organization or individual (CNSS, 2010, p. 62). Maxion and Reeder (2005) identified risk
mitigation as necessary to protect systems in response to human errors and compromises in
IS security. Human errors include unprotected sensitive files, erroneously configured
systems, and mistakenly sending clear text to correspondents (Maxion and Reeder, 2005).
As new technologies (e.g. social media, cloud computing, critical infrastructure, embedded
systems and sensors, etc.) emerge, the need to mitigate cyber threats and vulnerabilities
increases (Jang-Jaccard and Nepal, 2014; Ransbotham et al., 2012). However, the role of observable cybersecurity skills in the risk mitigation of cyber threats is relatively under-researched.
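The NIST definition above treats risk as a function of adverse impact and likelihood, without prescribing a formula. As a minimal illustration only, the sketch below combines the two factors using a simple multiplicative risk-matrix convention; the category names and thresholds are assumptions, not part of the NIST definition.

```python
# Minimal qualitative risk-matrix sketch (illustrative only; NIST defines risk
# as a function of impact and likelihood but prescribes no specific formula).
IMPACT = {"low": 1, "moderate": 2, "high": 3}
LIKELIHOOD = {"unlikely": 1, "possible": 2, "likely": 3}

def risk_level(impact: str, likelihood: str) -> str:
    """Combine qualitative impact and likelihood into a coarse risk rating."""
    score = IMPACT[impact] * LIKELIHOOD[likelihood]
    if score >= 6:
        return "high"
    if score >= 3:
        return "moderate"
    return "low"

print(risk_level("high", "likely"))    # -> high
print(risk_level("low", "unlikely"))   # -> low
```

Any monotone combination of the two factors would serve equally well here; the point is only that both inputs are required before a mitigation priority can be assigned.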
Through government and private sector collaboration, NIST (2014) created a common-language, cost-effective, non-regulatory "Cybersecurity Framework" that addresses and manages cybersecurity risk. The "Framework" is to complement, not replace, an organization's risk mitigation and cybersecurity program (NIST, 2014, 2017). Based on existing standards, guidelines and practices, the "Cybersecurity Framework" is scalable and evolves as technology advances and business requires (NIST, 2014, 2017). It is technology neutral to provide a flexible and risk-based implementation that may be used with a broad array of cybersecurity risk management processes (NIST, 2014). The "Cybersecurity Framework Core" consists of five functions identified by industry as helpful in managing cybersecurity risk (NIST, 2014). These functions (identify, protect, detect, respond and recover) assist management of cybersecurity activities at their highest level (NIST, 2014). As described in NIST (2014), the functions may be performed simultaneously and continuously to build an operational culture that tackles the dynamic cybersecurity risk to ensure business continuity. To enable discussion on how this research fits within the NIST (2014) Cybersecurity Framework, each of the framework functions is defined:
(1) Identify: Develop the organizational understanding to manage cybersecurity risk to systems, assets, data and capabilities, i.e. asset management, business environment, governance, risk assessment and risk management strategy.
(2) Protect: Develop and implement the appropriate safeguards to ensure delivery of critical infrastructure services, i.e. access control, awareness and training, data security, information protection processes and procedures, maintenance and protective technology.
(3) Detect: Develop and implement the appropriate activities to identify the occurrence of a cybersecurity event, i.e. anomalies and events, security continuous monitoring and detection processes.
(4) Respond: Develop and implement the appropriate activities to take action regarding a detected cybersecurity event, i.e. response planning, communications, analysis, mitigation and improvements.
(5) Recover: Develop and implement the appropriate activities to maintain plans for resilience and restore any capabilities or services that were impaired due to a cybersecurity event, i.e. recovery planning, improvements and communications (NIST, 2014, pp. 8-9).

The five core functions' concurrent and continual cyclical nature is represented in Figure 3.

Figure 3. NIST's Cybersecurity Framework functions

A human element exists in each of the NIST (2014) Cybersecurity Framework functions. Thus, the CSI benchmarking index assists in identifying the cybersecurity skills of non-IT professionals by providing scenarios-based, hands-on tasks to measure those skills. This identification assists in protecting the organizational infrastructure and mitigating cyber threats to it. Individuals not scoring at an acceptable competency threshold level may be granted restricted access until they demonstrate that their competency levels exceed the minimum threshold, as measured by the CSI. An individual with cybersecurity skills demonstrates, through the CSI benchmarking index, the skills necessary to detect anomalies and malicious cybersecurity events in a timely manner. Furthermore, the CSI benchmarking index documents the individual's existing skill and competency levels in responding to a set of cybersecurity tasks. Over time, as an individual obtains additional knowledge, experience and ability, the individual's skill levels increase, and ultimately full competency is achieved (Eschenbrenner and Nah, 2014; Marcolin et al., 2000). Therefore, the CSI assists in the continuous monitoring of the cybersecurity skills needed to mitigate and recover from cybersecurity incidents, which encourages a strong critical infrastructure. Moreover, the CSI benchmarking index promotes the adoption of the NIST (2014) Cybersecurity Framework.

3. Methodology
Classified as developmental research, this study sought to answer the question of how an artifact could be constructed to address a problem (Ellis and Levy, 2009). Richey and Klein (2014, p. 1) defined developmental research as a way to "create knowledge grounded in data systematically derived from practice". Developmental research comprises three major elements:
(1) Product criteria are established and validated.
(2) A process for product development is accepted and formalized.
(3) Determination that the product's criteria are met through a formalized, accepted process (Ellis and Levy, 2009).

Tracey and Richey (2007) used a systematic process to develop and then validate their model using the Delphi technique. In this technique, an expert panel analyzes the model and offers feedback on the proposed design. The model is then revised in accordance with the feedback and the suggested revisions are incorporated (Ramim and Lichvar, 2014; Tracey, 2009).
Figure 4 illustrates the research design this study followed. To begin the three-phased research, Institutional Review Board (IRB) and data collection site approvals were obtained prior to utilizing the Delphi technique to design and validate the scenarios-based, hands-on benchmarking index for measuring cybersecurity skills (Ramim and Lichvar, 2014). IRB review is an official academic and research process at research organizations (universities, research centers, government agencies, etc.) in the USA, in which the study methodology is proposed to a review committee, prior to approval of data collection, to ensure the welfare and rights of the study participants (Nova Southeastern University, 2018). The privacy and ethical concerns were reviewed and approved by the IRB prior to initiating the data collection process. Specifically, approval and exemption from further review were granted, given that data collection in all phases of this study, including from the subject matter experts (SMEs) and the non-IT professional study participants, was conducted using anonymous Web-based tools that do not capture any personal information. Phase two of this study operationalized the previously developed and validated scenarios-based, hands-on benchmarking index into an application prototype that was used to assess the cybersecurity skills of non-IT professionals. Phase three of this research study used the previously developed and validated application prototype to conduct a quantitative empirical study by collecting data from 173 non-IT professionals and documenting the measured results.

Figure 4. Overview of the three-phased research design process

4. Findings
The results of this study were gathered in three phases. Phase 1 used the Delphi technique as a way of consensus-building with a number of SMEs from industry, the Florida chapter of the FBI/InfraGard, the US Secret Service's (USSS) Electronic Crimes Task Force (ECTF) team and other federal agencies. A review of literature involving current cyber attacks (MacMillan and Yadron, 2014; Yadron et al., 2014) identified the initial 12 cybersecurity threats and corresponding skills. These were then presented to the SMEs. After round one, the 12 threats were reduced to 10. At the end of Round 2, cybersecurity threat and corresponding skill number 10 was identified as an outlier, and the SMEs highly recommended removing it from the list. Thus, a consensus of the SMEs' opinion identified the top nine cybersecurity skills needed for non-IT professionals to mitigate the respective top nine cybersecurity threats. The top nine platform-independent cybersecurity skills, their respective categories, SME rankings, number of SME responses, ranked weighted totals, ranked averages and skill importance weights are presented in Table I. Weighted totals were based on the number of SME responses multiplied by the score for each assigned ranking. Each weighted total was then divided by the total number of SME responses to determine the final rank of one through nine. Each skill importance weight was based on the individual skill weighted average divided by the total weighted average.
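The weighted-scoring arithmetic described above can be sketched in code. This is a hypothetical reconstruction, not the authors' implementation: it assumes each ranking position r (of the ten positions used in Round 2) earns a score of 11 − r points, which reproduces the weighted totals reported in Table I; the ranking counts are transcribed from that table.

```python
# Hypothetical reconstruction of the Table I Delphi scoring.
# Assumption: ranking position r (1..10) earns a score of 11 - r points.
rank_counts = {  # skill number -> SMEs assigning each ranking position 1..10
    1: [8, 2, 0, 0, 3, 0, 1, 2, 0, 2],
    2: [1, 4, 4, 3, 0, 4, 1, 0, 0, 1],
    3: [4, 1, 2, 3, 2, 2, 2, 0, 2, 0],
    4: [1, 3, 1, 2, 2, 3, 2, 3, 1, 0],
    5: [2, 0, 6, 1, 2, 0, 2, 0, 3, 2],
    6: [1, 1, 1, 3, 2, 2, 1, 6, 1, 0],
    7: [0, 3, 1, 2, 1, 2, 3, 3, 2, 1],
    8: [1, 0, 1, 4, 3, 2, 1, 1, 3, 2],
    9: [0, 2, 2, 0, 3, 2, 3, 2, 3, 1],
}
scores = [11 - r for r in range(1, 11)]  # rank 1 -> 10 points ... rank 10 -> 1 point

weighted_total = {k: sum(c * s for c, s in zip(v, scores)) for k, v in rank_counts.items()}
weighted_avg = {k: t / sum(rank_counts[k]) for k, t in weighted_total.items()}
grand_avg = sum(weighted_avg.values())          # 52.278 in Table I
importance = {k: a / grand_avg for k, a in weighted_avg.items()}

print(weighted_total[1], round(weighted_avg[1], 3), round(importance[1], 3))
# -> 128 7.111 0.136, matching the first row of Table I
```

Running this reproduces every weighted total, weighted average and importance weight reported in Table I, which supports the assumed 10-down-to-1 scoring convention.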
Carlton and Levy (2015) reported how these results were then used to design a set of
observable hands-on, scenarios-based tasks that measure cybersecurity skills of non-IT
professionals without the bias of or need for self-assessment. A comprehensive, developed and validated scenarios-based, hands-on benchmarking index is fundamental, but the challenge lies in the process of implementing it to actually measure such skills. Thus, the
scenarios and tasks were developed to represent published cyber attacks (MacMillan and
Yadron, 2014; PwC, 2013) experienced by individuals, organizations, and government. The
work of Carlton et al. (2016) describes how the prototype tool operationalized the previously
Table I. Rankings of the top nine cybersecurity skills. For each skill, the counts of SMEs assigning it to each ranking position (1 through 10) are listed, followed by the number of SME responses, the weighted total, the weighted average and the skill importance weight:

1. Preventing the leaking of confidential digital information to unauthorized individuals: rankings 8, 2, 0, 0, 3, 0, 1, 2, 0, 2; responses 18; weighted total 128; weighted average 7.111; importance weight 0.136.
2. Preventing malware via non-secure Websites: rankings 1, 4, 4, 3, 0, 4, 1, 0, 0, 1; responses 18; weighted total 124; weighted average 6.889; importance weight 0.132.
3. Preventing personally identifiable information (PII) theft via access to non-secure networks: rankings 4, 1, 2, 3, 2, 2, 2, 0, 2, 0; responses 18; weighted total 120; weighted average 6.667; importance weight 0.127.
4. Preventing PII theft via e-mail phishing: rankings 1, 3, 1, 2, 2, 3, 2, 3, 1, 0; responses 18; weighted total 105; weighted average 5.833; importance weight 0.112.
5. Preventing malware via e-mail: rankings 2, 0, 6, 1, 2, 0, 2, 0, 3, 2; responses 18; weighted total 103; weighted average 5.722; importance weight 0.109.
6. Preventing credit card information theft by purchasing from non-secured Websites: rankings 1, 1, 1, 3, 2, 2, 1, 6, 1, 0; responses 18; weighted total 94; weighted average 5.222; importance weight 0.100.
7. Preventing information system compromise via USB or storage drive/device exploitations: rankings 0, 3, 1, 2, 1, 2, 3, 3, 2, 1; responses 18; weighted total 91; weighted average 5.056; importance weight 0.097.
8. Preventing unauthorized information system access via password exploitations: rankings 1, 0, 1, 4, 3, 2, 1, 1, 3, 2; responses 18; weighted total 89; weighted average 4.944; importance weight 0.095.
9. Preventing PII theft via social networks: rankings 0, 2, 2, 0, 3, 2, 3, 2, 3, 1; responses 18; weighted total 87; weighted average 4.833; importance weight 0.092.

Totals: weighted total 941; weighted average 52.278; importance weight 1.000.

Source: Carlton and Levy (2015)
developed and validated CSI that assesses the cybersecurity skills of non-IT professionals. Figure 5 displays the design of the CSI as presented within the prototype tool.
The written scenarios-based, hands-on tasks from phase one were transformed into a digital presentation that was presented to a panel of eight SMEs for their qualitative and quantitative feedback on the scenarios, tasks and scoring of the prototype. Each of the cybersecurity skills was presented individually in the prototype. As described in the work of Carlton et al. (2016), the four tasks in each of the cybersecurity skills scenarios were of incremental difficulty. Once Task 1.1 (for example) was completed, the more difficult Task 1.2 was presented for completion (following Figure 5), and so on until all four tasks (Tasks 1.1 through 1.4) had been completed. This enabled the model to evaluate the highest skill level of the participant, in relation to that cybersecurity skill, by identifying the task which exceeded the participant's highest skill level (Schwartz and Fischer, 2004). The process of presenting the set of tasks of incrementing difficulty continued until a response was received for each task. The sum of all nine skill scores, each multiplied by its respective weight (wi), is then multiplied by a coefficient of 25, resulting in the non-IT professional's CSI score of 0 to 100.
To validate the developed scenarios-based, hands-on tasks for each skill, a consensus of the SMEs' opinion was reached within two rounds of the Delphi technique. Rigorous testing was completed throughout the development process to ensure the prototype tool was valid and reliable. Furthermore, a pilot study was conducted to ensure the correct score was recorded by the prototype tool. The scores of 21 non-IT professionals were recorded using a manual assessment process and then compared with the results from the same participants using the prototype tool. This resulted in a reliable and validated prototype tool that could be used to gather results in Phase 3 of the study (Sheng et al., 2007; Terrell, 2012). Thus, the validated and reliable prototype was the tool used for collecting data from 173 non-IT professionals and documenting the results of the measure.
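Under the scoring just described, the CSI computation reduces to a weighted sum. The sketch below is a hypothetical illustration, not the authors' implementation: it assumes each skill is scored by the number of its four tasks completed (0 through 4), uses the importance weights from Table I, and applies the coefficient of 25 to map the weighted sum onto the 0-100 range.

```python
# Hypothetical CSI computation sketch (not the authors' implementation).
# Assumption: per-skill score = number of tasks completed (0..4).
WEIGHTS = [0.136, 0.132, 0.127, 0.112, 0.109, 0.100, 0.097, 0.095, 0.092]  # Table I

def csi_score(task_levels):
    """task_levels: nine per-skill scores, each 0..4 (highest task completed)."""
    if len(task_levels) != len(WEIGHTS):
        raise ValueError("expected nine skill scores")
    return 25 * sum(w * s for w, s in zip(WEIGHTS, task_levels))

print(round(csi_score([4] * 9), 1))  # -> 100.0 (weights sum to 1.000)
```

Because the nine importance weights sum to 1.000, a participant completing all four tasks for every skill scores exactly 100, which is consistent with the stated 0-100 range.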
To complete Phase 3, 975 employees in a number of organizations in the southeastern
USA, such as restaurants and medical offices, were invited to attend a cybersecurity
workshop, where the prototype tool was used to assess the participants’ cybersecurity skills.
Those that did wish to attend a cybersecurity workshop were offered a $5 Starbucks gift
card for using the prototype tool to assess their cybersecurity skills.
Figure 5. Design of the CSI operationalized within the prototype tool

A significant correlation between demographic factors and victims of cybersecurity
threats was identified in prior literature (Algarni et al., 2015; Sheng et al., 2010). Therefore,
before using the prototype to assess their cybersecurity skills, participants were asked
demographic and technology usage questions. Pre-analysis data screening included
elimination of cases with the same score for all questions, verification of missing data, and
addressing extreme cases or outliers (Levy, 2006). Of the 245 responses, 57 participants
began the study but did not complete the prototype tool, so no usable data was recorded for
them. An additional 15 responses were eliminated because the participants identified their
primary job function as IT. This yielded 173 (17.7 per cent of the 975 invited) non-IT
professional responses for analysis.
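The screening steps above can be sketched as a simple filter over the raw responses: drop incomplete attempts, drop respondents whose primary job function is IT, and drop straight-lined cases with the same score on every task. The field names and records are illustrative assumptions, not the study's actual data.

```python
# Pre-analysis screening of raw responses: drop incomplete attempts, drop
# IT respondents, and drop cases with an identical score on every task.
# Field names and sample records are illustrative, not study data.

def screen(responses):
    usable = []
    for r in responses:
        if not r["completed"]:
            continue                      # began but never finished the tool
        if r["job_function"] == "IT":
            continue                      # study targets non-IT professionals
        if len(set(r["scores"])) == 1:
            continue                      # same score for all questions
        usable.append(r)
    return usable

raw = [
    {"completed": True,  "job_function": "nurse",   "scores": [80, 50, 45]},
    {"completed": False, "job_function": "cashier", "scores": []},
    {"completed": True,  "job_function": "IT",      "scores": [90, 88, 92]},
    {"completed": True,  "job_function": "teacher", "scores": [70, 70, 70]},
]
print(len(screen(raw)))  # 1: only the first record survives all three checks
```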
Data accuracy was not an issue, as the prototype was designed to allow only a single
valid answer for each task. Additionally, completed responses were downloaded into a
Google form and imported into the Statistical Package for the Social Sciences (SPSS) for
further pre-analysis data screening. To further ensure data accuracy, descriptive
statistics were used to identify the minimum and maximum value for each skill score to
determine whether responses were within the expected value range and had not been
accidentally corrupted during the transfer of data. All responses were within the expected
ranges and none were removed.
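The range check described above can be sketched as a minimum/maximum scan per skill, assuming scores are percentages expected to fall between 0 and 100. The skill labels and values are illustrative.

```python
# Range check mirroring the descriptive-statistics step: confirm each skill's
# scores fall inside the expected 0-100 per cent range after data transfer.
# Skill labels and values are illustrative, not study data.

EXPECTED_LOW, EXPECTED_HIGH = 0.0, 100.0

def out_of_range(skill_scores, low=EXPECTED_LOW, high=EXPECTED_HIGH):
    """Return skills whose minimum or maximum falls outside the range."""
    bad = {}
    for skill, values in skill_scores.items():
        lo, hi = min(values), max(values)
        if lo < low or hi > high:
            bad[skill] = (lo, hi)
    return bad

scores = {"SK1": [81.0, 79.5, 92.0], "SK3": [45.1, 0.0, 100.0]}
print(out_of_range(scores))  # {}: nothing outside 0-100
```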
Of the 173 analyzed responses, 103 (59.5 per cent) were from women and 70 (40.5 per cent)
were completed by men. An analysis of the participants’ ages revealed that 144 (83.2 per
cent) were 20 to 64 years of age. Overall, 139 (80.3 per cent) identified a primary activity of
work-related tasks involving the use of the internet, use of social networks (Facebook,
Twitter, Instagram, etc.), or search engines (Google, Yahoo, Bing, etc.), and 118 (68.2 per cent)
accessed the internet between 6 and 30 hours per week. While nearly 37 per cent of the
participants held administrative staff, managerial, or executive job functions, over 56 per
cent of participants responded with the job function of “other”. Given that recruitment took
place in public places, a response of “other” could include occupations such as nurses,
teachers, dental assistants, cashiers, and wait staff.
Next, the means and standard deviations for skills (SKi) one to nine, each of the three
skill categories, and the overall CSI for the 173 non-IT professionals were calculated. Table II
highlights the means and standard deviations for the population. The participants appeared
most skilled in preventing the leaking of confidential information (SK1), with a mean of 81.0
per cent, and least skilled in preventing PII theft via non-secure Websites (SK3), with a mean
of 45.1 per cent. When comparing the means of the three categories, the mean of the Work
Information Systems (WIS) category was the highest at 72.5 per cent, the mean of the PII
category was 56.1 per cent, and the malware category was the lowest at 51.3 per cent.

Table II. Means and standard deviations for the population (N = 173)

Item                                                                                         Mean (%)  SD
Individual skills
SK1  Preventing the leaking of confidential digital information to unauthorized individuals  81.0     0.144
SK2  Preventing malware via non-secure Websites                                               49.9     0.189
SK3  Preventing PII theft via access to non-secure Websites                                   45.1     0.282
SK4  Preventing PII theft via email phishing                                                  59.7     0.198
SK5  Preventing malware via email                                                             46.4     0.183
SK6  Preventing credit card theft by purchasing from non-secure Websites                      57.5     0.157
SK7  Preventing information system compromise via USB or storage device exploitations         64.9     0.193
SK8  Preventing unauthorized information system access via password exploitations             71.7     0.175
SK9  Preventing PII theft via social networks                                                 63.6     0.210
Categories
Work Information Systems (SK1, SK7, SK8)                                                      72.5     0.119
Malware (SK2, SK5, SK6)                                                                       51.3     0.118
PII (SK3, SK4, SK9)                                                                           56.1     0.155
Overall Cybersecurity Skills Index                                                            59.8     0.098
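The aggregates reported in Table II can be reproduced from per-participant skill scores as sketched below, assuming each category mean is the mean of its three member skills and the overall CSI is the unweighted mean of the nine skill scores (an assumption, but one consistent with the values in Table II). The sample participants are illustrative.

```python
# Compute Table II style aggregates from per-participant per-skill scores:
# per-skill means, category means, and an overall CSI taken as the unweighted
# mean of the nine skills (assumption consistent with Table II's figures).

from statistics import mean

CATEGORIES = {
    "Work Information Systems": ["SK1", "SK7", "SK8"],
    "Malware": ["SK2", "SK5", "SK6"],
    "PII": ["SK3", "SK4", "SK9"],
}

def summarize(participants):
    """participants: list of {skill: score in [0, 1]} dicts."""
    skills = sorted(participants[0])
    skill_means = {s: mean(p[s] for p in participants) for s in skills}
    category_means = {
        cat: mean(skill_means[s] for s in members)
        for cat, members in CATEGORIES.items()
    }
    overall_csi = mean(skill_means.values())
    return skill_means, category_means, overall_csi

# Two hypothetical participants: one scoring 0.6 and one 0.8 on every skill.
participants = [
    {f"SK{i}": 0.6 for i in range(1, 10)},
    {f"SK{i}": 0.8 for i in range(1, 10)},
]
skill_means, category_means, csi = summarize(participants)
print(round(csi, 2))  # 0.7
```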
The useful responses were further analyzed using descriptive statistics to calculate the
individual skills and overall CSI by age group, gender, education, hours accessing the
internet, primary activity, job function, experience using technology, comfort level using
technology, and the learning methods of training and reading manuals. Figure 6 displays the
means for each skill and overall CSI by age group. Overall, SK1, preventing
the leaking of confidential information, revealed the highest mean scores, while SK2, preventing
malware via non-secure Websites, and SK3, preventing PII theft via non-secure Websites,
revealed the lowest mean scores. As presented in Figure 7, the means for each skill and
overall CSI by gender revealed that men scored higher means than women for the overall CSI,
SK2, SK4, SK5, SK7 and SK8. A review of the means for each skill and overall CSI by job
function revealed SK1 with the highest means, and SK2 and SK3 with the lowest. The means
for each skill and overall CSI by job function are presented in Figure 8. Additionally, the

Figure 6. Means for each skill and overall CSI by age group (N = 173)

Figure 7. Means for each skill and overall CSI by gender (N = 173)
pattern of high mean scores for SK1 and low mean scores for SK2 and SK3 revealed itself in
the means for each skill and overall CSI by education (Figure 9).
A review of the means for each skill and overall CSI by the number of hours of accessing
the internet (Table III) revealed higher mean scores for preventing the leaking of
confidential information (SK1) than for the remaining skills and the overall CSI. Figure 10
displays the means for each skill and overall CSI by the participants’ self-identified level of
experience using technology. The participants indicating “somewhat no experience using
technology” scored higher means for SK2 and SK6 than those indicating a higher experience
level using technology. Furthermore, the participants reporting “absolutely no experience
using technology” attained higher means for SK4, preventing PII theft via email phishing.
Figure 11 reveals the highest means for SK1, SK6, and SK7 among participants strongly
disagreeing that they were comfortable using technology.
The participants indicating that they gained knowledge through technical training
demonstrated a greater skill range for each individual skill than the participants indicating
they “learn to use technology by reading manuals” (Table IV). Furthermore, the overall skill
range for those with technical training was wider (8.7 percentage points) than for those who
gained knowledge by reading manuals (6.4 percentage points). A one-way analysis of

Figure 8. Means for each skill and overall CSI by job function (N = 173)

Figure 9. Means for each skill and overall CSI by education (N = 173)
variance (ANOVA) was conducted to assess statistically significant mean differences in
the individual skills and overall CSI between age group, gender, education, hours accessing
the internet, primary activity, job function, experience using technology, comfort level using
technology, and learning method (i.e. reading manuals,

Table III. Means and standard deviations for the study participants by the number of hours
of accessing the internet (N = 173)

No. of hours of          Overall   SK1    SK2    SK3    SK4    SK5    SK6    SK7    SK8    SK9
accessing the internet   CSI (%)   (%)    (%)    (%)    (%)    (%)    (%)    (%)    (%)    (%)
0-5 h (n = 9)            59.03     80.56  55.00  36.67  50.35  44.72  61.39  67.78  69.44  68.89
6-10 h (n = 41)          62.74     80.00  49.76  47.80  65.47  50.85  60.46  69.76  74.63  70.06
11-15 h (n = 18)         61.93     81.39  53.33  47.78  62.57  48.82  59.38  60.83  72.22  73.06
16-20 h (n = 18)         57.33     81.39  54.72  35.83  58.33  37.78  55.35  61.39  73.06  58.75
21-25 h (n = 22)         56.86     82.61  45.45  42.27  56.05  42.90  50.63  63.64  69.32  59.55
26-30 h (n = 19)         56.69     78.03  48.68  40.26  50.76  49.54  56.97  62.63  67.63  57.11
31+ h (n = 46)           60.26     82.34  48.70  50.33  61.37  45.90  57.91  64.57  71.63  59.84

Figure 10. Means for each skill and overall CSI by experience using technology

Figure 11. Means for each skill and overall CSI by comfort level using technology
Table IV. Comparison of means by method of learning for the study participants (N = 173)

Method of learning           Overall CSI  SK1   SK2   SK3   SK4   SK5   SK6   SK7   SK8   SK9
(all values are means in %)
Training as a method of learning
Strongly agree (n = 70)      60.8         81.6  49.9  52.1  59.1  49.2  54.8  65.5  69.5  69.4
Agree (n = 33)               59.6         80.6  47.9  47.0  56.1  43.8  57.9  61.4  73.3  73.9
Somewhat agree (n = 29)      57.6         81.0  51.9  39.7  61.8  46.1  58.3  67.8  66.3  50.0
Neutral (n = 27)             59.8         83.1  47.2  37.2  61.9  49.4  68.1  75.6  86.7  69.2
Somewhat disagree (n = 8)    61.5         74.4  63.1  28.1  66.5  46.3  76.3  70.0  51.7  73.1
Disagree (n = 3)             62.2         61.7  50.0  45.0  42.5  27.1  67.1  71.7  64.9  68.0
Strongly disagree (n = 3)    53.5         91.7  46.7  31.7  45.1  61.0  47.0  56.0  63.3  57.7
Reading as a method of learning
Strongly agree (n = 70)      59.6         82.3  48.9  46.1  63.8  46.3  53.3  62.9  72.1  61.2
Agree (n = 33)               60.8         80.0  51.1  52.0  60.1  47.8  55.3  62.3  74.5  64.6
Somewhat agree (n = 29)      59.6         80.7  54.7  40.9  57.0  45.5  58.6  66.3  72.3  62.1
Neutral (n = 27)             61.6         79.3  47.3  54.0  57.5  51.0  61.7  67.3  68.3  71.0
Somewhat disagree (n = 8)    58.7         82.8  48.2  33.7  61.1  43.8  59.9  70.5  69.2  63.2
Disagree (n = 3)             55.2         85.0  43.3  34.4  53.5  38.6  56.4  62.2  69.4  54.7
Strongly disagree (n = 3)    60.0         79.8  49.3  42.1  61.5  47.1  62.0  63.6  72.5  65.0

training). The analysis showed significant differences for SK1 (unauthorized leakage of
confidential information) by age group (p < 0.05), by education (p < 0.01), and by primary
activity (p < 0.05), and for SK9 (preventing PII theft on social networks) by hours spent
accessing the internet (p < 0.05). No other significant differences were noted.
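The one-way ANOVA described above can be sketched in pure Python as a computation of the F statistic across groups. This is a minimal illustrative implementation, not the study's SPSS procedure; the education groups and scores are hypothetical, and the p-value (which requires the F distribution) is not computed here.

```python
# Minimal one-way ANOVA F statistic: ratio of between-group to within-group
# mean squares. Groups and scores are hypothetical; a p-value would be
# obtained from the F distribution with (df_between, df_within).

from statistics import mean

def one_way_anova_f(groups):
    """groups: list of lists of scores; returns (F, df_between, df_within)."""
    all_scores = [x for g in groups for x in g]
    grand = mean(all_scores)
    k, n = len(groups), len(all_scores)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical SK1 scores for three education groups.
high_school = [0.70, 0.65, 0.72, 0.68]
college = [0.82, 0.85, 0.80, 0.83]
graduate = [0.88, 0.86, 0.90, 0.85]

f_stat, df_b, df_w = one_way_anova_f([high_school, college, graduate])
print(round(f_stat, 1), df_b, df_w)  # a large F suggests group means differ
```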
A Tukey post hoc test revealed that SK1 skill levels for the participants reporting 20 to 24
years of age were statistically lower than for those reporting ages of 25-34 (82.1 per cent ± 16.1
per cent, p = 0.019), 35-44 (82.8 per cent ± 13.3 per cent, p = 0.022), 45-54 (85.0 per cent ±
12.1 per cent, p = 0.002) and 55-64 (82.2 per cent ± 13.9 per cent, p = 0.021). However, there
were no significant differences for those less than 20 years of age (81.5 per cent ± 8.5 per cent, p =
0.245) or those reporting greater than 64 years of age (79.6 per cent ± 11.5 per cent, p =
0.223). A Tukey post hoc test also revealed that SK1 skill levels for the participants reporting
“social network” as their primary online activity were statistically lower than for those reporting
“work related tasks” as their primary online activity (83.9 per cent ± 13.7 per cent, p =
0.036). However, there were no significant differences for those reporting their primary
online activity as “search engine” (82.8 per cent ± 13.4 per cent, p = 0.263), “shopping or
auctions” (79.0 per cent ± 5.5 per cent, p = 0.997), “personal finances” (80.7 per cent ± 14.3
per cent, p = 0.919), “entertainment” (76.4 per cent ± 12.0 per cent, p = 1.000), or “personal
communication” (93.0 per cent ± 6.7 per cent, p = 0.111). Tukey post hoc test results further
revealed that SK1 skill levels for the participants reporting “high school” as their education
level were statistically lower than for those reporting “college” (82.4 per cent ± 12.7 per cent, p =
0.019) or “graduate” (87.1 per cent ± 14.5 per cent, p = 0.002) as their education level. There
was no significant difference for those reporting “other” (83.2 per cent ± 12.3 per cent,
p = 0.307) as their education level for SK1. The Tukey post hoc test revealed no significant
differences in SK9 by hours accessing the internet.
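The pairwise comparison underlying a Tukey post hoc test can be sketched as computing each group's mean ± SD (the reporting format used above) and the pairwise mean differences. The studentized-range critical value needed to declare a difference significant is not computed in this stdlib-only sketch (SPSS handles that step); the groups and values are illustrative.

```python
# Group summaries (mean, SD) and pairwise mean differences, the quantities a
# Tukey post hoc test compares against a studentized-range critical value.
# Groups and scores are illustrative, not study data.

from itertools import combinations
from statistics import mean, stdev

def group_summary(groups):
    """groups: {label: [scores]} -> ({label: (mean, sd)}, pairwise diffs)."""
    summary = {g: (mean(v), stdev(v)) for g, v in groups.items()}
    diffs = {
        (a, b): summary[a][0] - summary[b][0]
        for a, b in combinations(groups, 2)
    }
    return summary, diffs

ages = {
    "20-24": [0.70, 0.68, 0.74],
    "45-54": [0.84, 0.86, 0.85],
}
summary, diffs = group_summary(ages)
print(summary["45-54"][0] > summary["20-24"][0])  # True: older group higher
```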

5. Conclusion
This study utilized scenarios with observable hands-on tasks to measure the cybersecurity
skills of non-IT professionals. The results provide information security researchers and
practitioners with insight into the cybersecurity skills level of non-IT professionals, while
also outlining a developmental research approach in the context of cybersecurity.
Additionally, the results promise to influence industry practices in mitigating the
vulnerabilities and threats associated with cyber attacks that are due to the limited
cybersecurity skills of non-IT professionals. The study is also novel in that it moves away
from prior survey-based measures toward an actual scenario-based demonstration of skills.
This research has demonstrated the need to ensure that non-IT professionals are provided
with the opportunity to develop appropriate cybersecurity skills, as well as the value of
having a tool to assess the level of the skills that people possess. Once the level of skill is
measured, organizations or decision makers may focus their training on the skills where an
individual scored lowest, or direct it to the departments that demonstrated the lowest
overall scores.
As shown in this paper, the prototype of the cybersecurity skills measurement tool was
designed and implemented. As evident in the results of this study, use of this prototype
generated results that demonstrate the general need for a cybersecurity skills assessment
tool. Gender, hours accessing the internet, job function, and other demographic
determinants were not found to be significant factors influencing the level of cybersecurity
skills of the population in this study. However, the measured skills connected with the
reduction of unauthorized leakage of confidential information were significantly higher with
increased age, a more advanced education level, and use of IT systems during the course of
employment. These results are not surprising, given that increasing age, educational
attainment, and professionalism are all likely to increase the need to protect the
confidentiality of information. The only other skill showing a significant difference was that
connected with preventing PII theft on social networks; unsurprisingly, skill in this area was
influenced by hours spent accessing the internet.
Understanding an employee’s cybersecurity skills level is critical to securing
information and the systems that store it, as organizations continue to rely on the internet
for conducting business while people remain the weakest link in cybersecurity. The CSI
tool described in this paper could assist organizations by helping them to assess the
cybersecurity skills of their non-IT staff. This assessment will provide insight into what
steps the organization needs to take to further mitigate the risk of data breaches caused by
human error or social engineering.
The further development and wider use of the CSI prototype tool described in this paper
may help practitioners to map the cybersecurity skills of non-IT staff to the relevant
education and training materials needed to ensure that their skill levels are improved in such
a way as to provide better protection for the organization against cybersecurity risks.
Furthermore, with the increased use of cybersecurity awareness programs alone, as opposed
to Security Education, Training, and Awareness (SETA) programs, organizations fail to
emphasize the criticality of skills in cybersecurity workforce readiness. Specifically,
according to Whitman and Mattord (2018), while cybersecurity awareness provides
employees with exposure, it is highly limited and short term. Cybersecurity training and
education sessions that provide hands-on activities with scenario-building exercises deliver
the knowledge, skills, and deeper understanding needed to allow employees to have the
right cybersecurity skills to react to cyber attacks (Whitman and Mattord, 2018). As such,
the CSI prototype tool is significant in that it can provide organizations with an assessment
tool that measures how well their employees are able to mitigate social engineering and
other cyber threats.
References
Adomßent, M. and Hoffman, T. (2013), “The concept of competencies in the context of education for
sustainable development (ESD)”, available at: http://esd-expert.net/assets/130314-Concept-
Paper-ESD-Competencies.pdf (accessed 22 September 2016).
Algarni, A., Xu, Y. and Chan, T. (2015), “Susceptibility to social engineering in social networking site:
the case of Facebook”, Proceedings of the 36th International Conference on Information Systems
(ICIS) 2015, Ft. Worth, TX.
Algarni, A., Xu, Y., Chan, T. and Tian, Y.-C. (2014), “Social engineering in social networking sites: how
good become evil”, Proceedings of the 18th Pacific Asia Conference on Information Systems
(PACIS 2014), Paper 271.
Anderson, J.R. (1982), “Acquisition of cognitive skill”, Psychological Review, Vol. 89 No. 4, pp. 369-406.
Anti-Phishing Working Group (APWG) (2016), “Phishing activity trends report (1st quarter 2016)”,
available at: http://docs.apwg.org/reports/apwg_trends_report_q1_2016.pdf (accessed 22 September
2016).
Axelrod, C.W. (2006), “Cybersecurity and the critical infrastructure: looking beyond the perimeter”,
Information Systems Control Journal, Vol. 6 No. 3, available at: www.isaca.org/Journal/Past-
Issues/2006/Volume-3/Pages/Cybersecurity-and-the-Critical-Infrastructure-Looking-Beyond-the-
Perimeter1.aspx (accessed 22 May 2014).
Beaudoin, M.F., Kurtz, G. and Eden, S. (2009), “Experiences and opinions of e-learners: what works,
what are the challenges, and what competencies ensure successful online learning”,
Interdisciplinary Journal of E-Learning & Learning Objects, Vol. 5, pp. 275-289.
Boritz, J.E. and No, W.G. (2011), “E-commerce and privacy: exploring what we know and
opportunities for future discovery”, Journal of Information Systems, Vol. 25 No. 2, pp. 11-45.
Boyatzis, R.E. and Kolb, D.A. (1991), “Assessing individuality in learning: the learning skills profile”,
Educational Psychology, Vol. 11 Nos 3/4, pp. 279-295.
Burley, D.L., Eisenberg, J. and Goodman, S.E. (2014), “Would cybersecurity professionalization help
address the cybersecurity crises?”, Communications of the ACM, Vol. 57 No. 2, pp. 24-27.
Carlton, M. and Levy, Y. (2015), “Expert assessment of the top platform independent cybersecurity
skills of non-IT professionals”, Proceedings of the 2015 IEEE SoutheastCon, Ft. Lauderdale, FL,
pp. 1-6, doi:10.1109/SECON.2015.7132932, available at: www.xcdsystem.com/SoutheastCon/
abstract/finalpapers/192.pdf
Carlton, M. and Levy, Y. (2017), “Cybersecurity skills: the cornerstone of advanced persistent threats
(APTs) mitigation”, Online Journal of Applied Knowledge Management, Vol. 5 No. 2, pp. 16-28,
available at: www.iiakm.org/ojakm/articles/2017/volume5_2/OJAKM_Volume5_2pp16-28.pdf
Carlton, M., Levy, Y., Ramim, M.M. and Terrell, S.R. (2016), “Development of the MyCyberSkillsTM iPad
app: a scenarios-based, hands-on measure of non-IT professionals’ cybersecurity skills”,
Proceedings of the Pre-International Conference of Information Systems (ICIS) SIGSEC –
Workshop on Information Security and Privacy (WISP) 2015, Ft. Worth, TX.
Choi, M.S., Levy, Y. and Hovav, A. (2013), “The role of user computer self-efficacy, cybersecurity
countermeasures awareness, and cybersecurity skills influence on computer misuse”,
Proceedings of the Pre-International Conference of Information Systems (ICIS) SIGSEC –
Workshop on Information Security and Privacy (WISP) 2013, Paper 29, Milan, available at:
http://aisel.aisnet.org/wisp2012/29 (accessed 22 May 2016).
Committee on National Security Systems (CNSS) (2010), “National information assurance (IA) glossary
(Instruction no. 4009)”, available at: www.ncsc.gov/nittf/docs/CNSSI-4009_National_Information_
Assurance.pdf (accessed 22 September 2016).
D’Arcy, J., Hovav, A. and Galletta, D. (2009), “User awareness of security countermeasures and its
impact on information systems misuse: a deterrence approach”, Information Systems Research,
Vol. 20 No. 1, pp. 79-98.
Downey, J.P. and Smith, L.A. (2011), “The role of computer attitudes in enhancing computer
competence in training”, Journal of Organizational and End User Computing, Vol. 23 No. 3,
pp. 81-100.
Eargle, D., Taylor, R., Sawyer, L. and Gaskin, J. (2014), “Acquiring IS skill through habitual use”, 47th
Hawaii International Conference on System Sciences (HICSS), pp. 3-12.
Ellis, T.J. and Levy, Y. (2009), “Towards a guide for novice researchers on research methodology:
review and proposed methods”, Issues in Informing Science and Information Technology, Vol. 6,
pp. 323-337.
Eschenbrenner, B. and Nah, F.F.-H. (2014), “Information systems user competency: a conceptual
foundation”, Communications of the Association for Information Systems, Vol. 34, pp. 1363-1378.
Fitts, P.M. (1964), “Perceptual-motor skill learning”, in Melton, A.W. (Ed.), Categories of Human
Learning, Academic Press, New York, NY, pp. 243-292.
Gemalto (2016), “Breach level index: Data breach database and risk assessment calculator”, available
at: www.breachlevelindex.com/ (accessed 22 September 2016).
Goode, J. and Levy, Y. (2017), “Towards empirical exploration of employee’s cybersecurity
countermeasures awareness and skills: differences in training delivery method and program
type”, Proceeding of the Knowledge Management (KM) 2017 Conference, Faculty of Information
Studies (FiS), Novo Mesto, pp. 18-30.
Gravill, J.I., Compeau, D.R. and Marcolin, B.I. (2006), “Experience effects on the accuracy of self-
assessed user competence”, Information and Management, Vol. 43 No. 3, pp. 378-394.
Havelka, D. and Merhout, J.W. (2009), “Toward a theory of information technology professional
competence”, The Journal of Computer Information Systems, Vol. 50, pp. 2106-2116.
Hovav, A. and Gray, P. (2014), “The ripple effect of an information security breach event: a
stakeholder analysis”, Communications of the Associations for Information Systems, Vol. 34,
pp. 893-912.
Hu, Q., Xu, Z., Dinev, T. and Ling, H. (2011), “Does deterrence work in reducing information security
policy abuse by employees?”, Communications of the ACM, Vol. 54 No. 6, pp. 54-60.
Identity Theft Resource Center (ITRC) (2015), ITRC Breach Report, available at: www.idtheftcenter.org/
images/breach/ITRCBreachReport2015.pdf (accessed 22 September 2016).
Identity Theft Resource Center (ITRC) (2017), ITRC Data Breach Stats Report, available at: www.
idtheftcenter.org/images/breach/2017Breaches/ITRCBreachStatsReport_2017.pdf (accessed 22
July 2017).
Jang-Jaccard, J. and Nepal, S. (2014), “A survey of emerging threats in cybersecurity”, Journal of
Computer and System Sciences, Vol. 80 No. 5, pp. 973-993.
Joint Task Force on Cybersecurity Education (2017), “Curriculum guidelines for post-secondary degree
programs in cybersecurity”, available at: http://cybered.acm.org/
Kaspersky Lab (2013), “Kaspersky security bulletin 2013”, Global Research and Analysis Team
(GREAT), available at: http://media.kaspersky.com/pdf/KSB_2013_EN.pdf (accessed 22
September 2016).
Kraiger, K., Ford, J.K. and Salas, E. (1993), “Application of cognitive, skill-based, and affective theories
of learning outcomes to new methods of training evaluation”, Journal of Applied Psychology
Monograph, Vol. 78 No. 2, pp. 311-328.
Kvedar, D., Nettis, M. and Fulton, S.P. (2010), “The use of formal social engineering techniques to
identify weaknesses during a computer vulnerability competition”, Journal of Computing
Sciences in Colleges, Vol. 26 No. 2, pp. 80-87.
Lerouge, C., Newton, S. and Blanton, J.E. (2005), “Exploring the systems analyst skill set: perceptions,
preferences, age, and gender”, The Journal of Computer Information Systems, Vol. 45 No. 3,
pp. 12-23.
Levy, Y. (2005), “A case study of management skills comparison in online MBA programs”,
International Journal of Information and Communication Technology Education, Vol. 1 No. 3,
pp. 1-20.
Levy, Y. (2006), Assessing the Value of e-Learning Systems, Information Science Publishing, Hershey,
PA.
Levy, Y. and Ellis, T.J. (2006), “A systems approach to conduct an effective literature review in support
of information systems research”, Informing Science Journal, Vol. 9, pp. 181-212, available at:
http://inform.nu/Articles/Vol9/V9p181-212Levy99.pdf
Levy, Y. and Ramim, M.M. (2015), “An assessment of competency-based simulations on e-learners’
management skills enhancements”, Interdisciplinary Journal of E-Learning and Lifelong
Learning, Vol. 11, pp. 179-190, available at: www.ijello.org/Volume11/IJELLv11p179-
190Levy1958.pdf
MacMillan, D. and Yadron, D. (2014), “Dropbox blames security breach on password reuse”, The Wall
Street Journal, Digits, available at: http://blogs.wsj.com/digits/2014/10/14/dropbox-blames-
security-breach-on-password-reuse/
Marcolin, B.L., Compeau, D.R., Munro, M.C. and Huff, S.L. (2000), “Assessing user competence:
conceptualization and measurement”, Information Systems Research, Vol. 11 No. 1, pp. 37-60.
Maxion, R.A. and Reeder, R.W. (2005), “Improving user-interface dependability through mitigation
of human error”, International Journal of Human-Computer Studies, Vol. 63 Nos 1/2,
pp. 25-50.
Mitnick, K.D. and Simon, W.L. (2002), The Art of Deception: controlling the Human Element of Security,
Wiley Publishing, Indianapolis, IN.
Mukhopadhyay, A., Chatterjee, S., Saha, D., Mahanti, A. and Sadhukhan, S.K. (2013), “Cyber-risk
decision models: to insure IT or not?”, Decision Support Systems, Vol. 56, pp. 11-26.
National Institute of Standards and Technology (NIST) (2014), “Framework for improving critical
infrastructure cybersecurity (version 1.0)”, available at: www.nist.gov/cyberframework/upload/
cybersecurity-framework-021214.pdf (accessed 22 September 2016).
National Institute of Standards and Technology (NIST) (2017), “Framework for improving
critical infrastructure cybersecurity (version 1.1)”, available at: www.nist.gov/sites/
default/files/documents/2017/01/30/draft-cybersecurity-framework-v1.1.pdf (accessed 17
February 2017).
National Institute of Standards and Technology (NIST), Computer Security Division (2006), “Federal
information processing standards publication: minimum security requirements for federal
information and information systems (FIPS PUB 200)”, available at: http://csrc.nist.gov/
publications/fips/fips200/FIPS-200-final-march.pdf (accessed 22 September 2016).
Neves, D.M. and Anderson, J.R. (1981), “Knowledge compilation: mechanisms for the automatization of
cognitive skills”, in Anderson, J.R. (Ed.), Cognitive Skills and Their Acquisition, Lawrence
Erlbaum Associates, Hillsdale, NJ, pp. 57-84.
Nova Southeastern University (2018), “Institutional review board”, available at: www.nova.edu/irb/
Peha, J.M. (2013), “The dangerous policy of weakening security to facilitate surveillance”, available at:
http://users.ece.cmu.edu/peha/Peha_on_weakened_secuirty_for_surveillance.pdf (accessed 22
September 2016).
PricewaterhouseCoopers (PwC) (2013), “Defending yesterday: key findings from the global state of
information security survey 2014”, available at: www.pwc.com/gx/en/consulting-services/
information-security-survey/pwc-gsiss-2014-key-findings-report.pdf (accessed 22 September
2016).
PricewaterhouseCoopers (PwC) (2016), “Turnaround and transformation in cybersecurity: key findings
from the global state of information security survey 2016”, available at: www.pwc.com/gsiss
(accessed 22 September 2016).
Ramim, M.M. and Lichvar, B.T. (2014), “Eliciting expert panel perspective on effective collaboration in
system development projects”, Online Journal of Applied Knowledge Management, Vol. 2 No. 1,
pp. 122-136.
Ransbotham, S., Mitra, S. and Ramsey, J. (2012), “Are markets for vulnerabilities effective?”, MIS
Quarterly, Vol. 36 No. 1, pp. 43-64.
Richey, R.C. and Klein, J.D. (2014), “Design and development research”, in Spector, J.M., Merrill, M.D.,
Elen, J. and Bishop, M.J. (Eds), Handbook of Research on Educational Communications and
Technology, Springer, New York, NY, pp. 141-150.
Rubin, R.S. and Dierdorff, E.C. (2009), “How relevant is the MBA? Assessing the alignment of required
curricula and required managerial competencies”, Academy of Management Learning and
Education, Vol. 8 No. 2, pp. 208-224.
Sasse, M.A., Smith, M., Herley, C., Lipford, H. and Vaniea, K. (2016), “Debunking security-usability
tradeoff myths”, IEEE Security and Privacy, Vol. 14 No. 5, pp. 33-39.
Schwartz, M.S. and Fischer, K.W. (2004), “Building general knowledge and skill: cognition and
microdevelopment in science learning”, in Demetriou, A. and Raftopoulos, A. (Eds), Cognitive
Developmental Change: Theories, models, and Measurement, Cambridge University Press,
Cambridge, pp. 157-185.
Sheng, S., Holbrook, M., Kumaraguru, P., Cranor, L.F. and Downs, J. (2010), “Who falls for phish? A
demographic analysis of phishing susceptibility and effectiveness of intervention”,
Proceedings of the 2010 SIGCHI Conference on Human Factors in Computing Systems,
pp. 373-382.
Sheng, S., Magnien, B., Kumaraguru, P., Acquisti, A. and Cranor, L.F. (2007), “Anti-Phishing Phil: the
design and evaluation of a game that teaches people not to fall for phish”, available at: http://
repository.cmu.edu/isr/22/ (accessed 22 September 2016).
Silber-Varod, V., Eshet-Alkalai, Y. and Geri, N. (2016), “Culturomics: reflections on the potential of big
data discourse analysis methods for identifying research trends”, The Online Journal of Applied
Knowledge Management, Vol. 4 No. 1, pp. 82-98.
Siponen, M. and Vance, A. (2010), “Neutralization: new insights into the problem of employee
information systems security policy violations”, MIS Quarterly, Vol. 34 No. 3, pp. 487-A12.
Sombatruang, N., Sasse, M.A. and Baddeley, M. (2016), “Why do people use unsecure public Wi-Fi? An
investigation of behaviour and factors driving decisions”, Proceedings of the 6th Workshop on
Socio-Technical Aspects in Security and Trust, pp. 61-72.
Symantec Corporation (2015), Internet Security Threat Report: Appendices, available at: http://know.
symantec.com/LP=1233 (accessed 22 September 2016).
Terrell, S. (2012), Statistics Translated: A Step-by-step Guide to Analyzing and Interpreting Data, The
Guilford Press, New York, NY.
Thomson, K.-L. and von Solms, R. (2005), “Information security obedience: a definition”, Computers and
Security, Vol. 24 No. 1, pp. 69-75.
Torkzadeh, G. and Lee, J. (2003), “Measures of perceived end-user computing skills”, Information and
Management, Vol. 40 No. 7, pp. 607-615.
Toth, P. and Klein, P. (2014), “A role-based model for federal information technology/cyber security
training (NIST special publication 800-16 revision 1, 3rd draft)”, available at: http://csrc.nist.
gov/publications/drafts/800-16-rev1/sp800_16_rev1_3rd-draft.pdf (accessed 22 September
2016).
Tracey, M.W. (2009), “Design and development research: a model validation”, Educational Technology
Research and Development, Vol. 57 No. 4, pp. 553-571.
Tracey, M.W. and Richey, R.C. (2007), “ID model construction and validation: a multiple intelligences
case”, Educational Technology Research and Development, Vol. 55 No. 4, pp. 369-390.
U.S. Department of Homeland Security (2012), “CyberSkills task force report”, US Department of
Homeland Security, Homeland Security Advisory Council, available at: www.dhs.gov/sites/
default/files/publications/HSAC%20CyberSkills%20Report%20-%20Final.pdf
Vassiliou, M.C., Dunkin, B.J., Fried, G.M., Mellinger, J.D., Trus, T., Kaneva, P., Lyons, C., Korndorffer, J.
R., Ujiki, M., Velanovich, V. and Kochman, M.L. (2014), “Fundamentals of endoscopic surgery:
creation and validation of the hands-on test”, Surgical Endoscopy, Vol. 28 No. 3, pp. 704-711, doi:
10.1007/s00464-013-3298-4.
Verizon Enterprise Solutions (2014), “Verizon 2014 data breach investigations report”, available at:
www.verizonenterprise.com/DBIR/2014/ (accessed 22 September 2016).
Verizon Enterprise Solutions (2015), “Verizon 2015 data breach investigations report”, available at:
www.verizonenterprise.com/DBIR/2015/ (accessed 22 September 2016).
Verizon Enterprise Solutions (2016), “Verizon 2016 data breach investigations report”, available at:
www.verizonenterprise.com/verizon-insights-lab/dbir/2016/ (accessed 22 September 2016).
Weigel, F.K. and Hazen, B.T. (2014), “Technical proficiency for IS success”, Computer in Human
Behavior, Vol. 31, pp. 27-36.
Whitman, M.E. and Mattord, H.J. (2018), Principles of Information Security, 6th ed., Cengage Learning,
Boston, MA.
Winkler, S. and Dealy, B. (1995), “Information security technology? Don’t rely on it: a case study in social
engineering”, Proceedings of the Fifth USENIX UNIX Security Symposium, available at: www.
usenix.org/legacy/publications/library/proceedings/security95/full_papers/winkler.pdf (accessed
22 September 2016).
Yadron, D., Ziobro, P. and Devlin, B. (2014), “Target staff had warnings”, The Wall Street Journal,
Eastern Edition, B.1.

Further reading
Phish Tank (2009), “Statistics about phishing activity and Phish tank usage: November, 2009”,
available at: www.phishtank.com/stats/2009/11/ (accessed 22 September 2016).
PricewaterhouseCoopers (PwC) (2014), “Why you should adopt the NIST cybersecurity framework”,
available at: www.pwc.com/en_US/us/increasing-it-effectiveness/publications/assets/adopt-the-
nist.pdf (accessed 22 September 2016).

Corresponding author
Melissa Carlton can be contacted at: melissa.carlton.phd@gmail.com

