ISACA®
With 95,000 constituents in 160 countries, ISACA (www.isaca.org) is a leading global provider
of knowledge, certifications, community, advocacy and education on information systems (IS)
assurance and security, enterprise governance and management of IT, and IT-related risk and
compliance. Founded in 1969, the nonprofit, independent ISACA hosts international conferences,
publishes the ISACA® Journal, and develops international IS auditing and control standards,
which help its constituents ensure trust in, and value from, information systems. It also advances
and attests IT skills and knowledge through the globally respected Certified Information Systems
Auditor® (CISA®), Certified Information Security Manager® (CISM®), Certified in the Governance
of Enterprise IT® (CGEIT®) and Certified in Risk and Information Systems Control™ (CRISC™)
designations. ISACA continually updates COBIT®, which helps IT professionals and enterprise
leaders fulfill their IT governance and management responsibilities, particularly in the areas of
assurance, security, risk and control, and deliver value to the business.
Disclaimer
ISACA has designed and created Creating a Culture of Security (the “Work”) primarily as an
educational resource for security professionals. ISACA makes no claim that use of any of the
Work will assure a successful outcome. The Work should not be considered inclusive of any
proper information, procedures and tests or exclusive of other information, procedures and tests
that are reasonably directed to obtaining the same results. In determining the propriety of any
specific information, procedure or test, security professionals should apply their own professional
judgment to the specific circumstances presented by the particular systems or information
technology environment.
Reservation of Rights
© 2011 ISACA. All rights reserved. No part of this publication may be used, copied, reproduced,
modified, distributed, displayed, stored in a retrieval system or transmitted in any form by any
means (electronic, mechanical, photocopying, recording or otherwise) without the prior written
authorization of ISACA. Reproduction and use of all or portions of this publication are permitted
solely for academic, internal and noncommercial use and for consulting/advisory engagements and
must include full attribution of the material’s source. No other right or permission is granted with
respect to this work.
ISACA
3701 Algonquin Road, Suite 1010
Rolling Meadows, IL 60008 USA
Phone: +1.847.253.1545
Fax: +1.847.253.1443
E-mail: info@isaca.org
Web site: www.isaca.org
ISBN 978-1-60420-183-3
Creating a Culture of Security
CRISC is a trademark/service mark of ISACA. The mark has been applied for or registered in
countries throughout the world.
© 2011 ISACA. All Rights Reserved.
Acknowledgments
ISACA wishes to recognize:
Development Team
Steven J. Ross, CISA, CBCP, CISSP, Risk Masters, Inc., USA, Author
Jo Stewart-Rattray, CISA, CISM, CGEIT, CSEPS, RSM Bird Cameron, Australia, Chair
Christos Dimitriadis, Ph.D., CISA, CISM, INTRALOT S.A., Greece
Wendy Goucher, Idrach Ltd., UK
Norman Kromberg, CISA, CGEIT, Alliance Data, USA
Finn Olav Sveen, Ph.D., Gjøvik University College, Norway
Vernon Poole, CISM, CGEIT, Sapphire, UK
Rinki Sethi, CISA, eBay, USA
Expert Reviewers
Sanjay Bahl, CISM, Microsoft Corp. (India) Pvt. Ltd., India
Garry Barnes, CISA, CISM, CGEIT, Commonwealth Bank of Australia, Australia
Krag Brotby, CISM, CGEIT, NextStepInfoSec, USA
Meenu Gupta, CISA, CISM, CBP, CIPP, CISSP, Mittal Technologies, USA
Mark Lobel, CISA, CISM, CISSP, PricewaterhouseCoopers LLP, USA
Naiden Nedelchev, CISM, CGEIT, Mobiltel EAD, Bulgaria
Ramesan Ramani, CISM, CGEIT, Paramount Computer Systems, UAE
Christophe Veltsos, Ph.D., CISA, CIPP, CISSP, GCFA, Minnesota State University, Mankato, USA
Acknowledgments (cont.)
ISACA and IT Governance Institute® (ITGI®) Affiliates and Sponsors
American Institute of Certified Public Accountants
ASIS International
The Center for Internet Security
Commonwealth Association for Corporate Governance Inc.
FIDA Inform
Information Security Forum
Information Systems Security Association
Institut de la Gouvernance des Systèmes d’Information
Institute of Management Accountants Inc.
ISACA chapters
ITGI Japan
Norwich University
Solvay Brussels School of Economics and Management
University of Antwerp Management School
ASI System Integration
Hewlett-Packard
IBM
SOAProjects Inc.
Symantec Corp.
TruArx Inc.
Contents
Preface
    Endnotes
1.0 Introduction
    1.1 Hard and Soft Security
    1.2 Whose Culture Is It?
    Endnotes
Preface
In October 2010, ISACA published The Business Model for Information Security
(BMIS). The model takes a business-oriented approach to managing information
security, building on the foundational concepts developed by the association. It
utilizes systems thinking to clarify complex relationships within the enterprise and,
thus, to more effectively manage security.1
One of the findings of the BMIS study was that an intentional culture of security2
was the primary objective for the model, as applied to information security.3 The
intentionality of security must be emphasized. Implicit in the use of “intentional”
is that enterprises—companies in the private sector, agencies in the public
sector—do not, for the most part, have an effective culture of security, one
that supports the protection of information while also supporting the broader
aims of the enterprise. They must take active, directed steps to improve it. All
enterprises have a culture of security. In most cases, it lacks intentionality and is
inconsistent to the extent that it exists at all; in others, it is robust and guides the
daily activities of employees and others who come in contact with the enterprise.
Most important, those enterprises with a stronger culture of security may not have
created it purposefully; the existence of meaningful security is so clearly aligned
with the mission of the business that management did not need to apply intentional
measures. Understanding whether the culture was created in a purposeful manner or
by “accident” is critical to sustaining the culture in the long run.
This volume is dedicated to all those who recognize the importance of security
and who strive to achieve it. They may feel that their enterprises have given lip
service to security, but do not actually have the firmness and resolution of purpose
to receive the full value of the investments made in security. The people they work
with say the right things, often do the right things and even pay for the right things,
but the information with which they carry out their responsibilities is not really
secure. They want to achieve a meaningful, intentional security culture. It is the
purpose of this volume to suggest the way to do it.
Endnotes
1 ISACA, The Business Model for Information Security (BMIS), USA, 2010
2 Throughout this volume, it is assumed, but not stated, that “security” refers to the security of information resources. If a differentiation is required, it is specified, e.g., physical, personnel or operations security.
3 ISACA, An Introduction to the Business Model for Information Security, 2009, p. 12
1.0 Introduction
My management just does not “get” information security!
All the information security people do is say “no.” They should learn
the way this business really works.
All of these comments, and many more like them, are heard in enterprise after
enterprise around the world. Often enough, all of these statements may be heard
in the same enterprise, although they do seem mutually contradictory. How can
it be that senior management funds an information security function; provides it
with the latest, most effective tools; and backs those tools with a definitive security
policy, but still does not feel that the enterprise’s information is secure? In fact,
there is sufficient evidence that it is not secure. Somewhere within the workings
of the company or government agency, something that should be happening is not
happening. Someone—or many people—is not effectively supporting security.
Note the importance of the terms used in the definition from BMIS:
• Pattern—Not just intermittent, but continuous
• Behaviors—The way people act and not what they say they intend to do
• Beliefs—The core principles that people bring to the world of business
• Assumptions—The personal and societal expectations about information and its
protection that bind beliefs and behaviors
• Attitudes—The perspectives on security that are ingrained in people based on
previous experience
• Ways of doing things—The security procedures embedded in day-to-day
operations
A culture arises whenever two or more people are engaged in a common endeavor.
In a business setting, there is a pattern of behaviors, beliefs, assumptions, attitudes
and ways of doing things that constitute a corporate culture. To the extent that
information is a part of that business, there is a component that is a security
culture. It may be weak, ineffective, disorganized, contradictory, unrecognized and
haphazard, but it exists. A security culture exists in every enterprise. It is preferable
that a culture of security be strong, effective, well-organized, consistent and
supportive of the intentions of those in an enterprise who recognize that security is
a strategic attribute and contributes to the overall health of the enterprise. That is,
supposedly, the intention of management.
An enterprise that recognizes that it does not operate with an effective culture of
security, but that wishes to create one, must establish a systemic viewpoint across
the enterprise with regard to the protection of information. It must reconcile all the
contradictory impulses within the enterprise that inhibit the growth of security. A
culture cannot be effected as quickly as the mechanics of security can be. There is no
appliance to install or software to implement. It involves the creation of a mindset
among the people who make up the enterprise and among those with whom it
comes in contact—vendors, customers, other stakeholders and the society at large.
That mindset, the outlook and attitudes that drive behavior, is the substance of a
culture, one that must be implanted, nurtured and accepted gradually. It cannot be
imposed from above, although organizational leadership can lead the way.
• Technology may seem firm and unbending: A software package will always do
as it is told and a piece of equipment will always work the same way—until they
do not. Software and hardware are engineered artifacts, all of which have defects.
Of course, a culture may be defective as well, but it is much more likely to bend
and adapt than to break.
• Similarly, technology is not always factual. Just because a machine produces a
result does not mean that it is the right result. A sound culture must be based on
facts: the way people actually work, the value of the information with which they
work and their contradictory impulses that must be accommodated.
• It is odd to think of facts as hard and opinions and emotions as soft when the
reality is that many, if not most, people act on the stimuli of their emotions and
opinions and not of the harsh reality before them. A culture of security can be
used to align opinions across an enterprise more realistically with the risks of
its business and the environment in which it operates.
• A culture of security is precisely as strict as a given enterprise wants it to be.
There are some types of enterprises, such as national intelligence agencies or
banks, in which security is strictly observed. This did not occur haphazardly, but
was a natural consequence of business drivers that include profit and customer
service, to be sure, but also managed risk, achievement of organizational mission
and ethics.
There are other connotations of security that a culture must be established to
counter. Security should not be oppressive, unrelenting, resented or troublesome.
Security must not be allowed to be considered adverse to mission achievement;
where that is so, there is clear evidence that security is a weak part of the overall
corporate culture. It has allowed security to be seen as prohibition rather than
enablement. Among the rationales for a culture of security is the alignment of
security with the business as a whole. The negativity often associated with
security—locks, barricades, punishment, etc.—undermines its effectiveness. A
culture of security is necessary to overcome obstacles of those sorts.
A culture of security may be seen as soft because it is less tangible, but fuzziness
should not be confused with inaccuracy. Culture deals with perceptions, estimations,
preponderances and directions and not with the orderly array of numbers that is
found, for example, in accounting or finance. However, perceptions and directions
are often the indicators of reality, more so than the seemingly hard numbers that on
closer inspection—or revelation—may be seen as a smokescreen designed to obscure
reality. A culture determines what an enterprise actually does about security (or any
other objective, for that matter) and not what it says that it intends to do.
Endnotes
1 Ibid., p. 16
The answer to these and many more questions that are raised in this volume is
context. A culture of security fits within a much broader context of how a society
interacts; how an enterprise works; and the moral, ethical, political and economic
belief systems of the individual who is a part of that culture. No one set of behaviors
can be extracted from its context and shown to be secure, or insecure, for that
matter. Even more bedeviling, certain patterns of behavior may be secure in routine
circumstances, but become less so in times of crisis. For example, a help desk may
usually respond to callers on a first-come, first-served basis, but needs to react
aggressively without respect to the order of calls when a network is under attack.
Does a person have to believe in security to act securely? What, in fact, does it
mean to believe in security? Security is not a religion, so where does belief enter the
discussion? It may be fair to ask for two lists, one of secure practices and another of
dangerous ones. If everyone followed the first and eschewed the second, would that
not create security? Indeed, there is a place for those lists: They are called policy,
standards, guidelines and procedures, which relate the way an enterprise is to go
about its mission. Rules have exceptions, and the people who follow them are not
robots. Judgment; comprehension; and, yes, beliefs enter into the way things actually
work, as opposed to how they are supposed to work.
communicate a systematic viewpoint that drives the policies and that, in turn, is so
conditioned by the way the policy writers have internalized the need for security that
it may accurately be described as a pattern of beliefs.
Thus, a culture of security does not guarantee an absence of breaches nor freedom
from error. It is not the cause of security, but rather a necessary context in which
security can be fostered and accepted. It is foundational to the achievement of
security without any preceding understanding of what security is or demands. The
culture does not create security, but true security cannot be created in the absence
of a supportive culture.
Enterprises are organic. That is, all enterprises, beyond the most rudimentary, are
a systematic coordination of many discrete and interacting parts. For example, in a
commercial business, there are those who create a product, those who sell it, those
who record and administer the money made, those who control the process, and
those who manage the enterprise as a whole. Each is driven by its own dynamics;
all are working together toward a common goal—or at least should be doing so.
The values, assumptions and attitudes that underlie those goals are referred to as the
“corporate culture” that forms a unifying whole for the enterprise and provides the
context within which events are viewed and understood.
All of these cultures have a place in any enterprise; they need not be contradictory
to one another. They need to be balanced. Selling is good, but not at all costs.
Customer service is good, but not to the exclusion of profit. Growth is good, but
not if existing customers are dissatisfied and take their business elsewhere. Recent
news has shown that when a company allows one aspect of its culture to become so
dominant that others are crowded out, bad results follow. Unbalanced companies
face devastated morale among employees; poor financial results; mass exodus of
staff; and, ultimately, the extinction of the company itself.
The culture of security is the focus of this volume not because it should be
dominant, but because it appears that, in many enterprises, it is unnecessarily
overlooked. Many enterprises today have a function that oversees security. In
fact, they have many such functions, each one focused on the security of physical
assets; personnel; operational processes; personal information; data; or, indeed,
information in all its forms. Collectively, they may foster a culture of security, but
without active, deliberate, intentional management support, that culture can be so
fractionalized that it is ineffective in the broader enterprise. These functions may
not even recognize that their competing perspectives on security are undermining
the very culture of which they would want to be a part. This, in turn, makes it
difficult for those supportive of security to balance it with competing cultures.
but for it to be integrated into a unifying whole within the corporate culture. An
intentional culture of security does not create values, but heightens them among the
tensions that act on each individual who lives within it.
There are international standards for security. Most notably, the 27000 series from the
International Organization for Standardization (ISO)5 is the DNA, style guide, metric
system and scoreboard of security6 and is generally accepted to be definitive about its
management. Even in this case, there is a caveat to universality: “within the context
of the organization’s overall business risks.”7 What, then, of the context of societal
norms and expectations that differ from nation to nation and region to region? For
example, the primary control statement for “Data protection and privacy of personal
information” is “Data protection and privacy shall be ensured as required in relevant
legislation, regulations, and, if applicable, contractual clauses.”8 Thus, explicitly,
there is no universal meaning for privacy—and, by extension, for confidentiality and
the rest of security—but rather reliance on necessarily local laws, regulations and
contracts. The contrast is very clear, for example, across the Atlantic. In the US,
privacy is limited to industry verticals, primarily financial services9 and health care.10
In most of Europe, privacy is a clearly stated fundamental right across society.11
This is, again, the concept of a culture of security in context. If there are no
universals in security, who is to say that one culture is superior to another, or is it
true that there are no universals at all? At some elemental level, there are things
that must be achieved or there is no security: access to resources restricted to
authorized people, breaches detected and repaired, data backed up, and people
held accountable for their actions with regard to information. The thoroughness
with which these are accomplished rests, in part, within an enterprise’s culture of
security, but not entirely so. Even the best intentioned and motivated employees
may make mistakes, and these, on occasion, have disastrous security consequences.
An organizational culture sets the limits of acceptable behavior within that entity.
It may be said that policy, not culture, is the driving force for establishing those
boundaries, but conceptually and legally, what an enterprise does is its policy,
not what its management claims that the company does. Policy is a reflection of
aspirational goals; in the case of security, it describes security as management
wants security to be. The way in which people actually work and protect
information (i.e., the culture of security) is reality, and absent any explicit moves
by management to provide disincentives (i.e., punishment) for policy breaches, the
culture of security is, in actuality, policy as well.
In some industries, there are regulatory requirements for security that, in a broad
manner, set the context of a culture of security. That does not mean that all banks,
insurers, brokerages or hospitals have internalized security in the same way or to the
same extent. To much the same degree as unregulated companies, the extent and impact
of a culture of security are dependent on management’s perception of the risk to the
enterprise’s information resources and its willingness (or ability) to fund initiatives that
would strengthen either the culture or specific security measures—or both.
People are not tabulae rasae (blank slates). As they enter an enterprise and become
a part of its culture, they bring their own set of cultural expectations derived from
their parents, siblings, schools and houses of worship, friends and relations, and
companies for which they worked previously. It is fair to say that some are imbued
with a deep respect for security and that others are not. The reason for this is a
subject for sociologists, psychologists and anthropologists; for most others, it is
sufficient to accept and recognize differences of outlook and to find the ways to
incorporate them all under the cultural tent.
People can and do change their cultural perceptions, even if they do not realize it
or even realize that they have cultural perceptions. The accepted norms of behavior
are transmitted by many means, and not all of them are intentional. Rules of
confidentiality, for instance, may be documented, but a disapproving glance when
a customer’s name is mentioned can communicate more effectively than an entire
volume of policies. It is less clear how a weak culture affects a person whose mores
are more supportive of security than those of the enterprise. Does someone who is
inclined to respect access rights become unconcerned simply because others do not
share that outlook?
If no one is opposed to security, it is equally true that there are many who are not
vocal in its support. A culture of security needs its evangelists and champions, those
who are eager to speak up and set examples for others. True, there are always going
to be those whose moral outlook is clouded by pride, avarice and sloth, to say nothing
of stupidity. However, if people are rational actors, they will do the right thing—or at
least the most utilitarian thing—most of the time. People hold their beliefs privately,
whether they concern religion, morals or politics. When individuals encounter what
they see (or think they see) as a majority holding different viewpoints, they descend
into a “spiral of silence,” becoming less and less likely to speak up for their beliefs
and, thus, reinforcing their minority status and giving credence to the majority. This
is how culture is formed.12 Only those willing to break out of the spiral are able
to change culture; only those committed to secure behavior are able to drive the
transformation toward a culture of security.
Information may be represented in digital form (e.g., data files stored on electronic
or optical media), in material form (e.g., on paper) and “unrepresented” in the form of
employee knowledge. Information may be transmitted by various means, including
courier, electronic or verbal communication. “Whatever form information takes,
or the means by which the information is transmitted, it always needs appropriate
protection” (emphasis added).16 The quote makes a broad statement that all
information always needs protection, albeit at an appropriate level. Security, in
the context of culture, cannot be so dogmatic. It is fair to question, first, whether
all information requires protection and, second, how appropriateness is to be
determined. In practice, security is whatever level of protection the culture will
allow with cognizance of differences in approach dependent on the risk related to
different forms, representations, communications, storage and disposal of varying
sorts of information.
Security may be (and often is) defined solely as confidentiality, integrity and
availability (CIA). Without diminishing these characteristics, security may also be
understood to include privacy (different from confidentiality), authenticity, accuracy,
completeness, recoverability (different from availability) and currency. Yet, this is
not the view of security in the popular imagination. When (or if) the public at large
thinks of security, it is in terms of preventing computer hacking and viruses. From
War Games in 1983 to The Taking of Pelham 123 in 2009, it is always the bad guys
hacking the system who are stopped, always at the last moment, by the student;
simple, but honest worker; or old codger relying on the tried-and-true methods.
These popular entertainments are not important in themselves, but they mightily
affect the perception of those who make up the populations of enterprises. The less
these people understand the reality of either computers or security, the higher the
barrier to achieving a culture of security. In short, the culture of security is affected
by the overall culture.
Prevention of malicious attacks and malware is indeed a part, but only a part, of
security, and much, but not all, information is stored and manipulated on computer
systems. For the purposes of creating a culture of security, information must be
addressed in all its forms and secured against all its risks, in context. Inherent in
such a culture is the recognition of value in the information. It is value that is the
limiting factor for the appropriate level of protection. The cost of security cannot
exceed the value of the information to be protected; do not build a $20 fence for a
$10 horse.
There may be good operational reasons, or even organizational benefits, for treating
some data as of a higher value than is apparent. For example, a medium-size business
that deals with data spanning a range of security classifications may find economies
of scale by treating the security of all of it in the same way (either all high or
all low) rather than giving separate consideration to each classification.
Thus, the value of information is driven by its use, not by threats that may afflict
it. The same information encapsulated in bits on magnetic storage may require
more security than that same information printed on a piece of paper because
the electronic form is used in more processes than the written. Conversely, the
printed report may be more valuable if it is used for strategic decisions, not routine
transactions. It is the people who are using the information, holding it in custody
or acting as its owner who must make the decisions about its value and, thus, its
security, and those people both make up and are affected by a culture of security.
Trust is necessary in a functional workplace, but for the most part, it is something
that develops slowly. Its necessity is best illustrated by an environment in which
trust is absent: a prison. In a prison, no one trusts anyone, so there are locked
cells within locked cell blocks within locked subsections within locked prison
doors. There are armed guards and rigorously enforced procedures to restrict freedom
of movement. No business could work effectively in that manner. The
heavy-handedness of prison security must be transformed to a structure in which trust
is rewarded. Security is the catalyst of reliability. Trust evolves over time from repeated
displays of consistency between words (e.g., policies, standards, management
pronouncements) and behavior (e.g., access privileges, rewards, punishments). It
also comes about when all the participants in an enterprise who share resources,
such as information, have an accurate perception of one another’s interests. One
hopes that all those interests align to the benefit of the enterprise as a whole.17
US President Ronald Reagan was noted for saying, “Trust, but verify.” Trust alone
is insufficient; it must be accompanied by controls, among which is verification,
the underpinning of security. A security system should provide the means to verify
the authenticity of information so that all parties using it can assume that it has
not been modified, destroyed or disclosed except by those with the authorization
to do so. It removes the burden of validating each item of information by placing
reliance—that is to say, trust—on the security system that protects it. That reliance is
necessary not only because dishonest people may try to steal, manipulate or destroy
information. No matter what processes are established, what values are instilled,
or how open and transparent management practices are, people will make mistakes
and do things that are not right.18 Security is just as necessary to protect information
against people who are well intentioned, but overzealous, lazy, sloppy or just plain
stupid as from those who are dishonest.
There is more to trust and security than the exchange of information within an
enterprise. Enterprises and government agencies need security to establish trust with
their customers and citizens. Good security is perceived by many enterprises as a
prerequisite for doing business. They must ensure service availability, protection
of customer information and the secure operation of systems that manage customer
information. Without the basic ingredient of trust, founded on security, customers
would simply turn to competitors. Those enterprises that provide goods and
services on the World Wide Web have perhaps the purest vision of the relationship
between trust and security. If the information on a site or the processes by which it
got there are not trusted, the enterprise will probably lose more than a sale—it will
lose a customer. Where trust is strategic, as in these online companies (or in any
business), security becomes a strategic necessity as well.
Security may be seen as a competitive advantage if the enterprise has a high degree
of trust in its information systems; such trust drives customer acquisition and
retention and preserves the value of a brand. However, enterprises are, in the main,
loath to trumpet their security because they do not want to publicize their protective
measures, they do not want their reputations to be hostage to criminals or they have
not been able to sufficiently verify the integrity of their security systems to publicly
base customer trust on them.
Among the elements leading to trust between enterprises and their customers
is effective management communication of security goals and objectives.
Management must make customers aware that, within its enterprise, there are
incentives for awareness and reporting of security incidents, a broad understanding
that identified security problems will be dealt with openly and without
retribution, and personal recognition for those who act in support of security.
Trust is not an absolute; it changes and grows (or diminishes) over time.
Unfortunately, trust can be destroyed far more quickly than it can be established
or repaired. As enterprises themselves grow and change, the basis for trust must
be restated and proven anew. Every security incident that attracts public notice
undermines trust, but if management is shown to be nimble, sensitive, compassionate,
honest and courageous, it can often manage to overcome the negative impact caused
by a seemingly random event.19 Of course, the requirement of honesty calls for
immediate acknowledgment of a security-related problem and its rapid repair.
There is another facet of trust based on security that is between enterprises that
do business jointly. Each is a separate entity with its own proprietary information
and information systems. In some cases, they are even competitors (so-called
“coopetition”). They need to share some information for their joint ventures, but
more important, they need to protect most information from their partners. They
cannot do business together without trust, and they can only base that trust on
security and, to a great extent, on mutual respect for ethical behavior. Of course,
ethics is a part of security as well.
Trust is a shared cultural experience. To the extent that enterprises are able to create
a culture of security, they will be able to enjoy the benefits of trust with their staff
and customers and with other enterprises. Those benefits mostly accrue in the form
of smooth working relationships, respect and a certain naiveté that allows managers
to proceed with their business without constantly having to check and recheck the
validity of the information with which they work. These “soft” benefits manifest
themselves, over time, in reduced costs and increased profits.
The focus was subsequently enlarged to the prevention of misuse of data resources,
which includes fraud, but also includes a variety of other concerns such as the
disclosure of secret information (a major concern of governments), unauthorized
manipulation of data (more of a commercial concern)20 and destruction of
computing capacity. As noted, confidentiality, integrity and availability (CIA)
has become, for many, the basic definition of security itself. The purpose of
such elemental features of security as encryption and access control was to
prevent unauthorized and, therefore, potentially malign use of information.21 Both
encryption and access control developed from methods of information protection
that predated computer systems. When computers began to emerge in the business
world, these concepts were already familiar and, therefore, were more
readily accepted.
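The access-control idea described above can be reduced to a small sketch: each item of information carries a list of authorized principals and permitted actions, and every use is checked against it. The resource names, user names and permissions below are invented for illustration.

```python
# Minimal sketch of access control: information carries an access
# control list (ACL), and every use is checked against it.
ACL = {
    "payroll.xlsx": {"alice": {"read"}, "bob": {"read", "write"}},
}

def is_authorized(user: str, resource: str, action: str) -> bool:
    """Return True only if the user holds the permission on the resource."""
    return action in ACL.get(resource, {}).get(user, set())

assert is_authorized("bob", "payroll.xlsx", "write")
assert not is_authorized("alice", "payroll.xlsx", "write")   # read-only user
assert not is_authorized("mallory", "payroll.xlsx", "read")  # unknown user
```

Unauthorized, and therefore potentially malign, use is prevented by default: anything not explicitly granted is denied.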
Security will always have fraud and misuse as parts of its domain. Until and unless
that domain is seen as having greater application and value, a culture of security
will be difficult to create.
What are the risks that security is able to manage, and for that matter, what is
“risk”? While one definition of “risk” is that it is the “potential that a given threat
will exploit vulnerabilities of an asset or group of assets and thereby cause harm to
the organization,”24 others define it as the “uncertainty of harm to an asset or group
of assets.”25 The common element is that there is an asset that is subject to harm.
Implicitly, the asset must have value or it would not be an asset. Thus, the harm in
question is the loss of value. The risk is often stated in terms of the causes of the
harm, but this only states threats, not risks. There are indeed threats to information
such as disclosure, manipulation and destruction. The risks arise from the business
of which that information is a part.
Confidentiality Risks
It is true that computerization has changed the nature of information. Information is
hidden from view, and it is in a form that is unintelligible to human beings without
the aid of technical devices. To that extent, it is a great deal easier to prevent the
dissemination of information to unauthorized recipients. At the same time, it is so
compact that vast amounts of information can be stored in very small spaces. It is
easily copied and transmitted, and the copies are very difficult to control. When
information was primarily in printed or written form and stored in bulky containers,
it was relatively easy for someone to gain access to a small amount of information
and, for example, photograph or steal a few sheets of paper. Now, access is
invisible and entire files can be illicitly obtained. Hence, the scale of unauthorized
disclosure has been enlarged exponentially.
In some cases, the results for an enterprise whose information was disclosed or
stolen have been quite onerous. For example, a US pharmaceutical manufacturer
sent an e-mail to all participants in an online service for users of certain drugs,
inadvertently disclosing all the names of the people using the service. As a result,
regulatory authorities forced the company to incur the costs for establishing and
maintaining a security program for the protection of its collected personally
identifiable information (PII). The company was further required to perform an
annual third-party audit of its program and have external oversight of the relevant
record keeping.26 In another case, a US-based retailer was hacked and the credit
card information of more than 45.7 million individuals was stolen. As a result, the
retailer paid credit card companies tens of millions of dollars to cover the cost of
the fraud and faced extensive expense to improve its security systems.27
Availability Risks
The same factors that make it easier to steal information also raise the risk of
destruction of information. In electronic form, information does not need to
be destroyed to be made unavailable. The fact that electronic equipment and
information systems are required to read the information means that any failure
of those devices or systems renders information as useless as if it were physically
destroyed. For enterprises dependent on information to function (perhaps a
majority of businesses in the world today), the absence of current information is
tantamount to a cessation of operations. It is no longer possible to go back to the
older, nonautomated way of doing things; the information in file cabinets no longer
exists—indeed, neither do the file cabinets.
Hence, plans and preparations for conducting affairs in the absence of current
information are a necessary part of security. It is a broader question as to whether
interruptions to business caused by factors other than information unavailability are
equally a part of information security. The important point is that an enterprise’s
tolerance for information loss must be gauged and preparations put in place either
to recover the data in a predetermined length of time or to replicate the data at
regular (perhaps instantaneous) intervals so that they will never be lost at all.
Integrity Risks
One of the more significant risks inherent in information is that decisions may be
made on the basis of erroneous information. There are numerous causes and effects.
If someone changes information so that actions will be taken to that person’s
benefit, it is termed fraud. If the same thing happens because of inadvertent or
foolish mistakes, then it is error, but in either case, the value of the information in
question has been diminished. That is the harm; that is the risk.
The risk management decisions that must be made can be reduced to assuming the
worst case, the most likely case or differential cases based on the perceived value of
the information. Assuming the worse case leads to the application of the highest level
of security to all information, which is extremely secure, but difficult to cost justify.
Either of the other two approaches opens the door to an unpredictable level of harm,
the very definition of risk. Information thought to be of low value may be extremely
important in the wrong hands or when combined with other, seemingly unimportant
information. The event thought to be unlikely may just happen. Security “within the
context of the enterprise’s overall business activities” is the answer to risk, but the
context also establishes the fact that some harm will occur. Intolerance for harm that
is greater than expected is a foundation of a culture of security.
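The "differential cases" approach described above can be sketched as a simple mapping from the perceived value of information to a set of controls. The tiers and control names below are illustrative assumptions, not prescriptions from the text; the worst-case stance would apply the highest tier to everything.

```python
# Sketch of differential protection: controls scale with the value
# assigned to the information. Tiers and control sets are illustrative.
CONTROLS = {
    "low":    ["access control"],
    "medium": ["access control", "encryption at rest"],
    "high":   ["access control", "encryption at rest", "integrity monitoring"],
}

def controls_for(value_tier: str) -> list:
    # A worst-case stance would return CONTROLS["high"] for everything:
    # extremely secure, but difficult to cost justify.
    return CONTROLS[value_tier]

assert controls_for("low") == ["access control"]
assert "integrity monitoring" in controls_for("high")
```

The residual risk the text notes remains: information classified "low" may prove valuable in the wrong hands or in combination, so some harm must be expected under any differential scheme.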
It is important to realize that the creation of a culture of security does not imply
that security must be a primary concern for an enterprise. Rather, the objective of
a culture of security is to ensure that security is set at an appropriate level in the
context of an enterprise’s overall operations. In some cases, the minimum level
may be both prudent and appropriate. It is certainly appropriate if organizational
management, in possession of all the relevant facts, makes a conscious decision that
no more than the minimum is required. However, for many industries and for the
information-intensive activities of all, the minimum is insufficient.
Levels of Security
Figure 1 illustrates varying levels of security that may be encountered in an enterprise.
[Figure 1 is not reproduced here. It depicts ascending levels of security: policies that exist but are not enforced; virus protection and firewalls in use; an information security function in place; and, at the highest levels, intrusion detection, identity management and similar measures, with exercises conducted for critical personnel.]
An enterprise should be conscious of its own strategic requirements and incorporate
the appropriate level of security into them. To the degree that the confidentiality,
integrity and trustworthiness of information are necessities for the success of an
enterprise, the enterprise will find that security is a strategic necessity as well.
To be sure, some enterprises have lower security requirements than others. One
whose products are material goods needs to focus on production and sale, but even
within these enterprises, there are functions that require secure information. If
financial information is not secure, there may be compliance violations. If formulas
are not kept secret, competitive advantage may be lost. If orders are changed or
manipulated, profits may be undermined. If senior management does not receive
accurate information, initiatives may be stunted. Thought of in these terms, security
may be considered as much of a strategic driver for a manufacturer as for an
online service.
Security may be one of many strategic drivers, but only rarely the primary one.
The achievement of the others may be enabled by the effective implementation of
security. It allows management and staff to focus on the business without constant
concern for the possibility of negative events and security incidents. Such a state
must come from intentional acts by management, but once established, security can
be transparent and implicit.
In practical terms, this implies a balance within security in parallel with the balance
of security among other strategic drivers. An enterprise cannot feel secure simply
because it enforces so-called “hard” passwords, has a security policy manual or
filters viruses. All the pieces of an information security management system need
to be addressed—if not equally, at least to the extent that no link is so weak as to
invalidate the integrity of the chain, i.e., the security program. This requires not
only that security be a system, but that the approach to security be thought
of systemically.
As noted, enterprises are organic wholes that are the amalgamation of interacting
systems and are, thus, supersystems in themselves. It would be erroneous and
fruitless to think of security in isolation from other aspects of an enterprise’s
systems, just as it would be fallacious to think of each individual characteristic of
security without reference to the others. In the absence of a holistic understanding
of security, enterprises see the absence or duplication of security efforts, the
emergence of silos of responsibility for security, and difficulty in restraining costs.
It leads to deviations from accepted practice and, ultimately, to breaches of security.
For example, some enterprises invest disproportionately in incident response and not
enough in policy or prevention. Thus, they invite the very penetrations they would
presumably prefer not to encounter.
Endnotes
11 Directive 95/46/EC of the European Parliament and of the Council, 1995, better…
… 2003, p. 213
13 See ISO/IEC 27000:2009, in which "information asset" is defined as "knowledge or…"
… www.ibm.com/ibm/responsibility/trust.shtml, 2008
19 Knight, Rory; Deborah Pretty; "Protecting Value in the Face of Mass Fatality Events," Oxford Metrica, Oxford, UK, 2005. This document is a bit off-topic, but a mass-fatality event is the ultimate security crisis and Knight and Pretty's observations have broader applicability.
20 In the US, the Department of Defense published a series of papers with covers of various colors that became known, subsequently, as the Rainbow Series. The first, Trusted Computer Security Evaluation Criteria (CSC-STD-001-83, 1983), placed the emphasis of security squarely on confidentiality. In 1989, David Clark of Ernst & Whinney (now Ernst & Young) and David Wilson of the Massachusetts Institute of Technology (US) published "A Comparison of Commercial and Military Computer Security Policies" (http://groups.csail.mit.edu/ana/Publications/PubPDFs/A%20Comparison%20of%20Commercial%20and%20Military%20Computer%20Security%20Policies.pdf), subsequently known as the Clark-Wilson Model, which placed the emphasis in the commercial sphere on data integrity.
21 Ross, Steven J.; "IS Security Matters?," ISACA Journal, vol. 2, USA, 2010, p. 4
… importance or even the relevance of the ISO 27000 standards. It is even more debatable whether what ISO describes as an Information Security Management System (ISMS) is equivalent to a culture of information security. Nonetheless, the standards do represent a framework and a lexicon for security that are accepted internationally and must be respected if not always observed.
24 Ibid., p. 1
25 Ross, Steven J.; "Four Little Words," ISACA Journal, vol. 1, 2009. See also Taleb, Nassim Nicholas; The Black Swan, Random House, USA, 2007, p. xvii-xviv, passim.
26 Eisenhauer, Margaret P.; The Privacy Case Book: A Global Survey of Privacy…
27 … New York Times, USA, 29 March 2007, and "Swiped, Stolen and Sold," New York Times (online edition), USA, 6 August 2008, http://topics.blogs.nytimes.com/2008/08/06/swiped-stolen-and-sold/
28 ISACA, An Introduction to BMIS, op. cit., p. 10
29 Ibid., p. 11
If a culture of security does not, by itself, create greater security, then why should
anyone champion it? The answer is that a culture is a necessary precondition in
which to establish an appropriate level of security. By analogy, good soil and
climate, the terroir, do not make great wine, but without a fine terroir, the best
grape clones, skilled winemakers and the latest equipment will, at best, result in
mediocre wine. In the same way, without a culture of security, the most advanced
techniques, dedicated security professionals and the finest technology will lead
to a middling level of security, at best. Some security practices may exist,
perhaps supported by technology, but “weaknesses that result from inappropriate
governance, inadequate management, a dysfunctional culture or unready staff
cannot be fixed with technology.”2
Thus, a culture of security is not an end in itself, but a pathway to achieve and
maintain other objectives. In one sense, the primary objective is assurance that
information will not be misused. This assumes that there is consensus on the
appropriate use of information, with everything else being misuse. This sort of
consensus is hard to come by and is the reason why the implementation of security
can be contentious. A culture of security will not eliminate discord, but it will
establish a basis on which accord can be reached.
There can be security in the absence of a robust culture supporting it. It may even
be appropriate to the needs of the enterprise and a rational understanding of the
information in question, if only by happenstance. However, security without culture
cannot be sustained over time. All of the other organizational dynamics will distort
security to the point that it is unrecognizable. Tough economic times will lead to
crippling budget cuts. Competitive pressures will dissipate access controls and
separation of duties. Otherwise benign motivations to serve customers will become
the pretext to lower barriers, and as the enterprise morphs and changes, security
will fall by the wayside. In the same manner, a security culture itself needs to be
sustained over time. Creating it is one thing, strengthening it another, and keeping
it going and growing is definitely a third. Cultures, too, morph as the people who
constitute them come and go. There need to be self-sustaining elements to a culture
or the same security battles will be fought over and over again.
The greatest benefit of a culture of security is the effect it has on other dynamic
interconnections within an enterprise. It leads to greater internal and external trust,
consistency of results, easier compliance with laws and regulations and greater
value in the enterprise as a whole. In short, a secure enterprise, one that is as secure
as it needs to be, is a better enterprise. The payoff comes from all the areas
in an enterprise where reliability, open communication and cooperation among
individuals and departments work smoothly together to achieve overall goals.
All that is required is a common commitment to values and behaviors that enable
mutual trust. (See section 6.1.)
As is often the case, the presence of an intentional security culture is harder to
discern than its absence. The most common indicator of a weak culture of security is
an information security function (if one exists) that is underfunded, demoralized and
so far down the chain of command that it is unlikely to have any influence on
organizational decision making. At the same time, it is common for security
professionals, in their zeal to protect everything as much as possible, to feel that
they never have a large enough budget, are never fully appreciated and are always
overruled on important business decisions. The fact that an information security
function exists does not, in itself, indicate a robust, effective security culture.
Security is the basis for trust, as stated previously. At the same time, there is the
sign often encountered in American diners: “In God we trust. All others pay cash.”
Absent security, there is little or no way to know whether a person has manipulated,
disclosed or destroyed information without authorization to do so. Thus, security
fosters trust by providing assurance that information is being used as allowed or
else that misuse would be noted. Security does not engender trust of others. It
only allows each person to have confidence in the information being used. The
degree of trust should not be overstated; one of the parties may use the information
to undermine a colleague, spread rumors or make silly mistakes. However, the
information that is used to do so can be trusted.
There is a chain of causation here: A culture fosters security, which, in turn, leads
to trust. In turn, trust leads to reliable expectations in the decisions and actions
made on the basis of the information in question. Ultimately, reliability is a benefit
that a culture of security provides. This is true in the workplace, but also in the
home. Spouses must trust the word of spouses, and children that of their parents. Readers
must trust their press or the advantages of an informed citizenry are lost. Citizens
must be able to trust the information provided to them by their governments or
there is chaos, despotism or both. There is an appropriate, wonderful feedback loop:
Where security fosters trust, it also increases the strength of the culture. Forced
security, as in a prison, may protect resources, but it will not create trust.
How, then, does trust manifest itself in a supportive culture? It shows up in speed.
Where the recipient or user of information trusts it, there is no need for repetitive
checking and verification. Thus, transactions move through a system quickly and
decisions can be made rapidly (or even be automated based on a reliable set of
rules). In physics, distance traveled is a function of velocity and time. The forward
motion of an enterprise is, likewise, a function of speed and time; in other words, of
organizational nimbleness.
Nimbleness is the ability to adapt speedily and easily to changing conditions and
continue to perform at a high level. It is characterized by, among other things, quick
and effective decision making, a marked degree of autonomy among the employees
and managers, high-performing teams, and an ability to work through ambiguity
quickly and correctly.4 All of these are enabled by trusted information or, viewed
the other way round, are impossible without trusted information.
There are varying levels of trust within an enterprise. Departments may share
information freely without a need for independent validation until certain decision
points are reached. For example, all the information needed to construct a purchase
order (PO) can be gained from a variety of sources, such as a vendor database,
contracts and procurement policies. However, at the point that a PO is to be issued,
it is prudent to have someone, usually the department head who will have to pay for
it, review and validate all the information.
In another sense of varying levels of trust, the confidence that can be applied to
information in normal conditions is sharply different than that needed in periods of
change, disruption or chaos. As information moves through an enterprise, there are
systems—information systems—that preserve its integrity as it moves from process
to process. Thus, an order becomes an inventory selection that becomes a sale that
becomes a delivery that becomes a payment. The information must stay unchanged,
except through authorized modification procedures, throughout the journey.
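One way to preserve integrity as information moves from process to process is to link each authorized step to its predecessor with a cryptographic hash, so that any change made outside the authorized procedure breaks every later link. This is a sketch of the idea, not a method the text prescribes; the stage names and record fields are illustrative.

```python
import hashlib
import json

def step(prev_hash: str, stage: str, data: dict) -> dict:
    """Record one authorized process step, linked to the previous one."""
    payload = json.dumps({"stage": stage, "data": data, "prev": prev_hash},
                         sort_keys=True)
    return {"stage": stage, "data": data, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(chain: list) -> bool:
    """An unauthorized change to any step breaks every subsequent link."""
    prev = ""
    for rec in chain:
        payload = json.dumps({"stage": rec["stage"], "data": rec["data"],
                              "prev": rec["prev"]}, sort_keys=True)
        if rec["prev"] != prev or \
           rec["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = rec["hash"]
    return True

# The journey described above: order -> inventory -> sale -> delivery -> payment.
chain, prev = [], ""
for stage in ["order", "inventory", "sale", "delivery", "payment"]:
    rec = step(prev, stage, {"item": "widget", "qty": 10})
    chain.append(rec)
    prev = rec["hash"]

assert verify(chain)
chain[2]["data"]["qty"] = 10000  # tampering outside the authorized procedure
assert not verify(chain)
```

An authorized modification, by contrast, would be recorded as a new linked step rather than an edit to an existing one.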
Moreover, even banks that do not routinely verify signatures do so when the value
of a check passes a certain threshold. Viewed systemically, the level of security
required increases in a direct relationship with the value of the information. The
heritage of banking is such that trust of customers is only slowly gained; a long
history of timely payments and large deposits has been the basis of lending in the
past. With the vast expansion of banking activity following World War II, it was
necessary for financial institutions to broaden their “know your customer” policies
and introduce automation of the information in banking transactions, notably
including checks. Interestingly, in 1998, “know your customer” rules were proposed
as addenda to bank regulations in the US, but they were withdrawn because of
privacy concerns—yet another example of level setting within the broader culture
of security affecting businesses.
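The banks' practice of verifying signatures only above a certain check value illustrates the systemic relationship noted above: the level of security applied rises with the value at stake. A minimal sketch, with an invented threshold and check names:

```python
# Value-scaled verification, in the spirit of signature checks on
# high-value checks. The threshold and check names are illustrative.
VERIFY_THRESHOLD = 10_000

def required_checks(amount: float) -> list:
    """More valuable transactions attract more verification."""
    checks = ["account exists", "funds available"]
    if amount >= VERIFY_THRESHOLD:
        checks.append("signature verification")
    return checks

assert "signature verification" not in required_checks(250)
assert "signature verification" in required_checks(25_000)
```

In practice there may be several thresholds and several added controls, but the direct relationship between value and required security is the same.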
Security should be boring. House keys are boring. Smoke detectors are boring.
When security is working, it is unobtrusive, functional and pervasive. Security is
noticeable only when it is not there when needed. It becomes quite exciting when
there is a security breach: People scramble to find every copy of a report; computer
emergency response teams (CERTs) mobilize to restore systems and networks;
datacenter staff travel to distant recovery sites; etc. Good security is virtually
invisible. Poor security is very evident indeed.
The essence of boredom is repetitiveness and continuity. The same key unlocks
the same door every time. A smoke detector sits in place, with a small light always
glowing. When information is secure, only authorized people have access to
The culture of a successful enterprise must recognize that there are exceptions to
every rule. If there is a standard procedure for adjudicating differences between
those responsible for security and those with a legitimate business need to deviate
from prescribed measures, that enterprise’s culture of security is typified by
flexibility and reasonableness. It may seem paradoxical, but consistent application
of exception processes actually creates a more effective culture of security than one
that is authoritarian and unbending.
Of course, it is possible to have too many exceptions, thereby invalidating the rules.
In such a case, the security culture becomes one typified by a “whatever you can
get away with” outlook. This is one instance in which there are metrics for security.
If a given enterprise’s security is based on all exceptions and no application of
standards, then the exception is the rule.
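The metric suggested above can be made concrete as an exception rate: the share of security decisions resolved by granted exception rather than by application of the standard. The figures, and the notion of what counts as a healthy rate, are illustrative assumptions.

```python
# A simple security-culture metric: how often does the exception,
# rather than the standard, decide the outcome?
def exception_rate(exceptions_granted: int, standard_applications: int) -> float:
    """Fraction of decisions resolved by exception (0.0 to 1.0)."""
    total = exceptions_granted + standard_applications
    return exceptions_granted / total if total else 0.0

# A flexible but effective culture: exceptions exist but are rare.
assert exception_rate(5, 95) == 0.05
# "The exception is the rule": standards no longer govern.
assert exception_rate(90, 10) == 0.9
```

Tracking this rate over time would show whether a consistent exception process is preserving the rules or quietly replacing them.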
An enterprise with a viable security culture is more likely to see the necessary actions followed
through. The benefit of consistency in risk management is that all participants can
respect the decisions made, even if they disagree with them.9
3.2.4 Predictability
One of the byproducts of consistency is a reasonable expectation that the same
process, tool or procedure will render the same results, time after time. This may
be particularly important for security purposes. If security is operative today,
all involved have a right to anticipate that it will work the same way tomorrow,
i.e., that permissible access will be allowed, with all others denied. Information
systems need to be built in the expectation of a certain level of overall security
so that developers can accurately gauge on what their systems can rely outside
an application and what they must build into it. Without predictability, each new
application development would require revisiting security from the top.
3.2.5 Standardization
If consistency requires security to do the same things time and again, then it ought
to do them the same way. To that end, enterprises settle on standard operating
procedures (SOPs); standard tools; and, well, standards. An important benefit of
standardization is that doing anything in a standardized manner leads not only
to effectiveness, but also efficiency. People within an enterprise do not have to
reinvent methods and practices, but can simply apply those that have worked in the
past to future endeavors. (There is a risk, which should not be overlooked, of
standards becoming hidebound and restricting progress.)
There are two types of standards that contribute to a culture of security. Some are
internal to an enterprise, and others are generalized across an industry (e.g., the PCI
Data Security Standards [PCI DSS]), a nation (e.g., those of the American National
Standards Institute [ANSI] or British Standards Institution [BSI]) or the globe (e.g.,
the ISO standards referred to previously). The benefit of external standards is that
they lend themselves to certification, as is the case with ISO 27001¹¹ for security
generally or British Standard (BS) 25999¹² for business continuity management
(BCM). Certification itself has two benefits: It promotes a culture of security
and builds internal support for it. Within an enterprise, who would want to be the
cause of it losing its certification? Moreover, certification inspires trust among
business partners. If unable to look deeply into one another’s security practices,
they can place reliance on an independent third-party’s assurance that, at minimum,
a publicly described standard of security has been attained. Certification is a
benchmark of trustworthiness that may be a part of the glue that holds an extended
enterprise together. By the same token, certification is not an unbounded assurance
of security or recoverability. Any individual or enterprise wishing to place reliance
on an associated business’s certificate must understand exactly what is certified.13
In other enterprises, the risk management function is little more than buying
insurance. It is true that risk transfer is one acceptable response to risk; nonetheless,
where risk management is thought of as purely insurance, it fails in its overall
responsibility to the enterprise also to mitigate, control or formally accept risk.
Some of these risks are stated in standards and textbooks. Focusing on those risks
results in standard, textbook responses. Risk management has the greatest benefit
in dealing with evident, but unexpected, possibilities, so-called “black swans.”
Both the term and the concept were popularized by Nassim Nicholas Taleb,17 and
these sorts of risks are very much in the consciousness of many risk managers
today. Black swans are credible risks that are so far out of common experience
that they are not given the credence they deserve. Recently, the new term “white
swans” has arisen and means, in general, the ability to see and respond to risks
that are before everyone’s eyes and to take action against them with a view beyond
short-term financial interests. The economy of an enterprise whose business is
based on a sound understanding of the value of its resources will be significantly
healthier than the economy of an enterprise whose business lacks a strong,
value-based approach.18 A value-based approach to risk is more than a white swan;
in the context of information, it is a distinct part of a culture of security.
A culture of security does not, in itself, result in lower risk, although it should
contribute to lowering it. Where such a culture exists, personnel will be more
attuned to risk, better able to see the risks in the way that information is managed
and more likely to use information securely because they comprehend the risks to
themselves and to the enterprise if they do not.
Perhaps one of the reasons that a culture of security is thought of as the “soft” side
of security is that there is nothing to go out and buy. A pattern of behaviors, beliefs,
assumptions, attitudes and ways of doing things does not show up in a box, but that
does not mean that such a culture is free. It costs money to change attitudes, behaviors and operating procedures, and to instill a sense within an enterprise that security adds value to the business that supports it.
This subject is too dense and too well explored to be expanded on further here. The
relevant matter is whether a culture of security affects ROSI and whether the impact
is positive or negative. On the surface, it would seem that an enterprise with a more
pronounced culture of security would invest more in security than one without such
a culture. It would buy more security products, hire more security professionals,
and have more complex procedures for implementing and maintaining security.
However, that only looks at one side of the ledger. It does not factor in the expected
returns from the expenditures, the net present value of information resources
over time, annual loss expectations or the cost of recovery if security is breached.
Indeed, there are many ways of calculating ROSI, each with inherent biases in
favor of maximizing security or minimizing the effect of breaches.20 These are
not necessarily the opposite sides of the same coin, inasmuch as the reputational,
social, moral and professional value of security cannot always be stated in purely
monetary terms. There is no way to balance the recognized cost of prevention with
the unrecognizable savings from events that do not happen because the preventive
measures are in place.
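The trade-off described above, between the recognized cost of prevention and the unrecognizable savings from events that do not happen, is usually approached through annual loss expectancy. A minimal sketch of one common ALE-based ROSI formulation follows; the function names and figures are hypothetical, chosen only for illustration:

```python
# Illustrative ROSI sketch based on the common ALE formulation:
#   ROSI = (risk reduction - cost of controls) / cost of controls
# All figures below are hypothetical.

def annual_loss_expectancy(single_loss_expectancy: float,
                           annual_rate_of_occurrence: float) -> float:
    """ALE = SLE x ARO: the expected yearly loss from one risk."""
    return single_loss_expectancy * annual_rate_of_occurrence

def rosi(ale_before: float, ale_after: float, control_cost: float) -> float:
    """Return on security investment, as a fraction of the control cost."""
    risk_reduction = ale_before - ale_after
    return (risk_reduction - control_cost) / control_cost

# Example: a breach costing 200,000 once every 2 years; controls costing
# 40,000 per year cut the expected frequency to once every 10 years.
before = annual_loss_expectancy(200_000, 0.5)  # 100,000 per year
after = annual_loss_expectancy(200_000, 0.1)   # 20,000 per year
print(rosi(before, after, 40_000))             # 1.0, i.e., a 100% return
```

Note how every input is an estimate; the biases mentioned above enter through the choice of SLE and ARO long before the arithmetic is done.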
Implicitly, then, an enterprise with a culture of security places a value on the frauds
that do not occur, the audit reports that do not need responses, the remediation and
recovery costs that do not have to be incurred, or the disasters that do not disrupt
business operations. The question is whether the ROSI attributable to a culture
of security is an article of faith or whether it can be demonstrated. There are
calculations intended to optimize the investment in security,21 but it remains to be
seen whether they can be tied to a cultural impetus.
One of the difficulties in linking ROSI and culture is that there are many
investments in security that are nondiscretionary. They are mandated by laws
and regulations, so even the most risk-tolerant, security-averse management must
implement some level of security. For example, privacy is required across the board
in some areas of the world and is specified by law in certain industries in other
nations. Additionally, there is a level of prudence that leads to expenditures that no
sensible manager would dispute. For example, who today would operate without
virus filters or firewalls?
A culture of security comes into play in the middle ground between required security
and clearly excessive spending. These categories are not clearly defined, but fuzzy sets
of values lend themselves to conveying a large amount of information with a very few
words. They make it possible for people to manage uncertainty as characterized by
structures that lack sharp, well-defined boundaries. To that extent, a culture of security
is rightly “soft” in the sense of a soft focus, one that provides a fuzzy, nondiscrete,
poorly defined view of the requirements for security. It is better to be approximately
right than to be wrong with mathematical precision. Numbers, after all, are as subject to
(mis)interpretation as are behaviors. Those looking for “hard” facts and figures would
do well to look elsewhere than into the culture of an enterprise, but they would fail to
take advantage of real forces that guide, steer and push enterprises to the right decisions.
Whatever the view, enterprises comply unless they deliberately court penalties
with disobedience. Some do so grudgingly, to the least extent possible, while
others embrace the framework that regulations imply. It is not necessary to say
that enterprises with strong security cultures are compliant to recognize that those
without such a culture are more likely to find regulations burdensome.
Those seeking to build a culture of security can use laws and regulations to their
advantage. Where security is legally mandated, external rules can be used to force
internal action. However, imposing a culture in the name of compliance may
change behaviors, but it can also harden negative attitudes toward security. Moreover,
those enterprises that operate globally must meet so many regulations that all
investments in security may seem arduous.
The greatest benefit of a culture of security is that those enterprises that have
one are simply better enterprises than those that do not. Private companies with a
culture of security create greater value for their shareholders. Government agencies
deliver greater value to their citizens. Real value is derived from profits and mission
accomplishment in the short term. The ability to continue to create value is based in
an enterprise’s culture.
As noted previously, this is all the more the case for those enterprises whose
business is the provision of information. For these businesses, service, growth and
productivity—to say nothing of profitability—are directly linked to the quality
of the information provided, and security is an attribute of quality. A culture
of security allows all stakeholders, from the lowest-paid employee to the chief
executive officer (CEO) to the investors, to see the alignment of business and
security objectives.
Even in enterprises whose products are tangible goods, there is valuable information
behind the products. Recipes, formulas, production processes, research, etc., are all
information that leads to the items such companies sell. They cannot take or satisfy
an order, find a product in a warehouse, or track delivery and payment without
information. Again, the quality of the information is a major contributing factor in
the success of the products.
Endnotes
1 Ibid., p. 12
2 Ibid., p. 8
3 ISO, ISO/IEC 27002, op. cit., p. 23
4 Powers, Burke; “Strategic Nimbleness as a Business Culture,” 2 August 2005, http://strategicchange.blogspot.com/2005/08/strategic-nimbleness-as-business.html
5 Conner, Daryl R.; How to Create the Nimble Organization, John Wiley & Sons, USA, 1998, p. 68-69. Note that Conner is referring to a culture of nimbleness, not security. The point here is that one cannot exist without the other.
6 Ross, Steven J.; “The Vanished Perimeter,” Information Systems Control Journal, vol. 5, USA, 2003. See also van Wyk, Kenneth; “How to Protect a Vanishing Perimeter,” eSecurity Planet, 4 April 2005, www.esecurityplanet.com/views/article.php/3494991/How-to-Protect-a-Vanishing-Perimeter.htm.
In some societies, little children are taught that the police officer is their friend.1
Why is that message even necessary? Is it globally true? The message perhaps
unintentionally transmitted is that there is a reason to consider the police officer not
to be a friend. At best, the police officer is a crime fighter—at worst, someone who
is menacing to average citizens. If the imagery of policing dominates the portrayal of security of any sort, not least information security, it evokes the thought processes
imparted directly and indirectly from a very young age. The police officer is your
friend … and so is the information security officer.
Complacent forgetfulness leaves an opening for thought patterns that are perhaps
the most destructive to a culture of security, one that is, to a degree, prevalent
in many societies: Information wants to be free.2 People in many societies have
become used to having the greatest compendium of information ever assembled
available to them only a few keystrokes away. That expectation may be extended
in their minds to all information, on the Internet and everywhere. Of course, all
societies keep a great deal of information secret and inaccessible. In dictatorial
nations, much political and economic content is censored and constrained. Even
in more liberal countries, information needed for national security is kept closely
guarded. If every country has the government it deserves,3 then those governments
impose security on the information they do not want their people and enemies to see.
If the government’s or the corporation’s leadership is understood to be repressive,
there is a powerful incentive to bypass any protection and controls that seem to get
in the way of whatever people want to do with the information they seek.
Where security is seen as the instrument of repression, there will indeed be a culture
of security, but that culture will be seen as a malignant one. Security in a society that
values freedom is not the enemy, but where the institutions of society are perceived
as existing for the purpose of limiting freedom, it is very difficult to build the trust
that a culture of security is supposed to nurture. The values of a society may propel
or inhibit the development of a culture of security as a positive contributor to an
enterprise’s success. Where security is viewed with suspicion in everyday life, it may
well be contrary to self-interest to assume that security is beneficial.
Just as the values of a society affect its culture of security, so do the values of an
enterprise influence its perspective on security. Commercial companies succeed
in the marketplace based on one or several strategies, such as better products,
lower prices, better customer service or greater fashionability. Organizational
imperatives follow from these strategies: Make it better, cut costs, be more nimble
or get ahead of the curve. There have been numerous cases reported in recent years
in which “make it safer” has been an imperative, often after a product has been
shown to be unsafe.
For companies whose primary products are tangible and personally delivered
services, not information itself, the arguments for security become more tenuous.
Every investment in the security of information is one not made in new production
facilities, training or personnel. Even if information is intrinsic to making money,
it is often difficult to see how securing information is connected to the money
that is made. Where that connection is unclear, there is unlikely to be any clear
organizational priority for the security of information. In part, the purpose of a
culture of security is to make the identification of security with success more
evident. With perfect circular reasoning, there can be no culture of security where
security’s relationship with organizational success cannot be readily demonstrated
and understood, while the demonstrability and understanding of the link depends on
there being a culture of security.
companies may differ on how far their requirements must take them. In all
enterprises—regulated or not, public or private, or profit-making or charitable—
there must be an executive commitment that necessitates the protection of
information by everyone if there is to be a culture of security.
Therefore, the need for security is left unspoken in many instances. Silence is not
motivational. A culture would make explicit what is often assumed, but in the
absence of a vocal, compelling imperative for security from the top, it is difficult to
mobilize the enthusiasm and support of people throughout an enterprise. A culture
of security cannot grow in quiet darkness.
Even if the overall imperative for security were distinctly understood within an
enterprise, the specific requirements to fulfill the implied obligations are often
unclear, at least to those who must articulate them, to say nothing of those who must
carry them out. In part, this is because the only clear “owner” of security is the head
of the information security department, often called the chief information security
officer (CISO). This individual may be responsible for the security of all information
in whatever form it may be, but in practice, the CISO focuses on information only in
electronic form. In most instances, the CISO reports within the business group of the
chief information officer (CIO) and is often oriented toward protective mechanisms
such as firewalls and virus filters, identity management software, encryption, and
access control. These are unquestionably important tools for data security, but
paradoxically, they may stand in the way of a culture of security.
The role of the CISO and all the measures put in place by the CISO to protect
information on computer systems and networks create the impression, in some
quarters, that security is the domain of the CISO and that no others need concern
themselves about it. The focus on the CISO security role is so intense that others
have no clarity about their participation and contributions to it. The users of
information look to the CISO to protect their information, as do executives,
operators, clerks, staff personnel, salespeople and the janitor. Security becomes
invisible because someone else, i.e., the CISO, is taking care of it. In the immortal
words of Douglas Adams, the way to make something disappear is to declare it to
be someone else’s problem (SEP). “An SEP is something we can’t see, or don’t see,
or our brain doesn’t let us see, because we think that it’s somebody else’s problem. ...
The brain just edits it out, it’s like a blind spot.”4 This, specifically, is a major
inhibitor to a culture of security.
Rarely will an executive make clear what is required of all parties within an
enterprise to protect its information. At best, there may be a broad goal, often stated
as “security is everyone’s job.” If it is everyone’s job, it is no one’s in particular.
Worse, “everyone” has no tasks assigned, no product to deliver and no metrics to
achieve. General business users of information may comprehend their roles and
keep their passwords secret, desks clean and lips shut, but most people simply
trust their fellow workers to use information responsibly. As previously stated, a
culture of security fosters trust, but trust, in the absence of a culture that would
produce adequate security, is simply blind faith. Faith has its place in the world, but
information security is not that place.
Finally, and perhaps most important for the development of a culture, security
awareness implies a political argument: that security is actually a good thing for
the individual, an enterprise and society as a whole. For those already attuned to
security, it may be difficult not to see the validity of all these points. The challenge
in developing a culture of security is to communicate all of the concepts raised here
to people who do not yet comprehend some or all of them.
There are places in the world where people do not lock their doors—they may not
even have locks—and only in exceptional circumstances do they feel unsafe. Outside
of the exceptions, they are right. They are not unaware of security; they feel secure
in the context of their own environments. Heightening their awareness of potential
threats—far away—would do little or nothing to alter their beliefs or behaviors.
Many enterprises feel like small villages to those who work there—where everyone
knows everyone; desks are left unlocked with papers strewn atop them; and no one
looks at other people’s computer screens, much less impersonates others by using
their passwords. Management encourages a sense of common purpose, togetherness
and trust. The cafeteria, company basketball team and holiday party all conspire to
make business feel like high school. It is not that people are unaware of threats to
the security of information; it is just that they cannot internalize a belief that they
themselves are at risk. A culture of security will not arise by raising their security
awareness. Their everyday experience will tell them that the person trying to do so
is a mad “Cassandra” (even if Cassandra was right).5
Security awareness does have a place within a culture of security, but reliance
on awareness to create such a culture is misplaced. It may be that an appropriate
security culture can be maintained by a good awareness program, but to change
a culture, all existing cultural measures must be reengineered.6 Thus, reliance on awareness alone to create a culture of security inhibits the very culture
desired. To return to the village where people do not lock their doors, the likeliest
reaction to a threat that affects the community would be to seize pitchforks and
torches and hunt the monster down. Once that particular threat is taken care of, the
villagers can return to their peaceful, trusting lives.
On the other hand, where there is a culture of security, appropriate action will
generally be taken against the most relevant risks, not just specific threats at a
particular moment. That is because the people who are the vessels for the culture
take risk seriously and internalize it into a system of behavior, not just a reaction to
individual perceived threats. There is no conscious decision on the city-dweller’s
part as to whether to lock the door. It is an ingrained action because the possibility
of harm is (or so it seems) self-evident. The countryside is not crime-free nor is
every building in every city under siege, but the rural and urban cultures form
attitudes and behaviors that, over time, prove themselves. People who live in rural
areas are often shocked when crime occurs; urban folk may live without crimes
affecting their own lives for years. Neither unexpected crime nor unanticipated
safety changes attitudes and cultures, at least not in the short term.
Even where a threat is well documented and apparent to all—such as with computer
viruses—awareness, by itself, does not lead to routine action. Antivirus filters
must be continually enhanced to recognize and erase new variants, but experience
has shown that people do not regularly download updates by themselves. They
know that they should back up their files in case a virus does strike, but that, too,
occurs only irregularly. They are not acculturated to the risk, even if they are aware
of it. Thus, modern antivirus filters update themselves automatically, and many
enterprises choose to employ systems to back up hard drives whenever users log on
to their central networks, without the intervention of the information owner or user.
The inhibition that awareness places on the creation of a culture is often caused
by the way in which the need for security is communicated. If the insecurity of
information is seen as harming the enterprise, then it would seem sensible that
members of the enterprise should take responsibility for it. However, when security
becomes the domain of the IT function and the CISO (SEP), others feel absolved
of responsibility. Security is not seen as something that should be addressed by
the individuals because they are not the ones who will suffer the harm, or so
individuals can come to believe. In fact, each staff member faces the possibility
of considerable personal damage, ranging from the disdain of colleagues through
career limitation to loss of employment (or worse). The punishment would be
justified, it may be thought, if the misuse of information were intentional, but errors
and omissions—“I lost my laptop” or “I left the disk on the plane”—while scary,
are not seen as necessitating the same drastic responses. A culture of security has
been created when people say “I need to protect the information I have” instead of
“IT needs to protect the information I use.”
To get to that point, people must see the personal benefit of security, that security
has a personal payback. Unfortunately, many people’s experience with security,
in the broader sense, may lead them to believe that security is not good for them.
If people’s only interaction with security is having someone on the CISO’s staff
tell them that they cannot do something they want to do, they start to see security
as a problem for them to overcome. If the process for gaining access privileges is
cumbersome and bureaucratic, they will look for ways to circumvent the system.
In short, if the interactions people have with the CISO and the information security
function are, in general, negative, then it is doubtful that they will accept an
awareness program from that same source. They will not participate in a culture
that imposes the burden for security on themselves rather than on the security staff
that is paid to build security.
It is important to reemphasize that all the foregoing does not mean that there is no
place for awareness of security in a culture of security; in fact, it is an attribute of
such a culture. However, mere awareness is not the same as a culture, and reliance
on it alone will simply stand in the way of creating one. It is necessary to bridge the
gap between perception of the problem of security and acceptance that the problem
is a personal one that requires personal action and involvement in the solutions.
Of course, the security of the information comes from preventing those unauthorized
to see it or change it from doing so. However, a system is incapable of distinguishing
who is authorized to see and use what items of information. A system is established
as a means of carrying out specific operations according to specified rules. A
system does not have the capacity to know the value of the information or the
authority of a would-be user to access it. The system only “knows” that there is
data within it that can be transformed into other data in different forms by means of
prescribed processes. To people, the data may be transformed into information; the
transformative processes are transactions and programs. A system may be the means
of imposing security over the information by equating users with identifiers and the
ability to use information with access control lists.
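The mechanism described above, in which users are equated with identifiers and the ability to use information with access control lists, can be sketched minimally. All identifiers and resource names below are hypothetical:

```python
# Minimal sketch of the mechanism described above: the system "knows"
# only identifiers and an access control list, not the value of the
# information or the real-world authority of the user.
# All identifiers and resource names are hypothetical.

# ACL: resource -> set of identifiers permitted to read it
acl = {
    "payroll.db": {"u_hr01", "u_cfo"},
    "recipes.doc": {"u_rnd01"},
}

def may_read(user_id: str, resource: str) -> bool:
    """The system's entire 'knowledge' of authorization."""
    return user_id in acl.get(resource, set())

print(may_read("u_hr01", "payroll.db"))   # True: identifier is listed
print(may_read("u_rnd01", "payroll.db"))  # False: identifier is not listed
```

The sketch makes the point in the text concrete: nothing in the lookup reflects the value of `payroll.db` or whether `u_hr01` should be trusted today; people created the list, and people keep it aligned (or not) with reality.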
It is people, though, who create the lists, and the lists are (or should be) consistent
with rules as to how the system works. A system is a mindless, mechanical
vehicle to store and transport information until and unless people are involved.
People are a part of information systems, whether they recognize and accept this
fact or not. To the extent that people conceive of themselves as distinct from an
information system, they will not be able to see themselves as a part of securing it.
In short, there must be an equation among users, usages and resources used for an
information system to have coherence.
The semiotics of identity take on a generally unexpected reality. Public and private
sector institutions implement identity management as a way to bridge the gaps
among the users and the resources used. It is a way of controlling the interests of
the enterprise above those of the individual who may be trusted most of the time,
but not always, to use the information as intended. The seemingly dispassionate
granting or denying access to an information resource underscores the authority of
someone—who?—to make the decision. Where identity management is thus
pursued within a security matrix of controlled process and property, it becomes essentially identical to access control.7 The control remains with the enterprise, and the
individual remains detached from the system. A culture of security cannot grow out
of such detachment.
The prescriptions of policy are easily seen. They are printed in manuals, displayed
on login screens and reinforced by management briefings. The gap between aspired
security and reality is less easily observed. Those who bypass the rules (that is,
those who violate stated policy) are, in fact, expressing their disaffection with the
authority of an enterprise to serve their interests. It does not seem to matter
whether the disparity of interests is factually supported; the tension occurs from
complexity, i.e., a lack of transparency. In defiance, some individuals “solve”
this by taking control of their personal identifier(s) and the identification of
what they “own” themselves. However much an enterprise wishes to state its
ownership of information and the importance of securing it, it is nonetheless stating
a relationship: The enterprise not only owns the information, but owns those
who would use it. The one-sidedness of rule making, i.e., an enterprise’s formal
domination of the relationship with correspondingly biased rules for identification
and access, undermines a healthy culture of security.8
The club culture is, to a degree, self-enforcing. Even though the culture is not always
observed, there is tolerance for unacceptable behavior if it is discreet. Those who are
blatant in breaking the unspoken code can be frozen out of social circles or asked to
leave the club. Business enterprises are similar in that if people truly do not fit into a
culture, they may be fired or, more likely, encouraged to quit first.
Where the culture in question is one of security, it is very clear that people can
suffer penalties for flagrant disregard if a security incident causing harm to an
enterprise can be traced to them. Is that really a culture of security, though? Surely,
actively harmful activities lead to explicit sanctions. Can the same be said for
giving insufficient consideration to security when the emphasis is placed on sales,
profit or growth instead?
A culture cannot be enforced in the same way that a policy is enforced. Culture is
not a law, but a shared way of behaving based on assumptions, expectations,
attitudes and beliefs. Therefore, it must be self-enforcing. If one does not behave as
expected, it will be clear that that person is not part of the group. The individual’s
thinking and behavior will be seen as different from what is expected by the group.
Enforcement of a security culture is dependent on how important that aspect of the
culture is. If a culture says to do what has to be done to close deals and also says to
follow security rules, sales may win out because it is perceived to be more strongly
valued within the culture. All elements of culture are not weighed the same.
The value of powerful champions for security (other than the security professionals)
is that they provide the background for enforcement of a culture. There need not
be a valid, intellectualized rationale for heightened security if one can say, “Do it
because the boss wants it this way.” However, the personal perspective of a senior
manager is a weak method for enforcing a culture, especially if the champion does
not have the support of peers or leaves an enterprise.
One of the aspects of a culture of security is informed risk acceptance. On the other
hand, uninformed risk acceptance—in actuality, intentional ignorance of risk—can be
used to justify any security shortcoming in advance of a loss. Worse for the culture,
it is impossible to prove that risk was imprudently accepted when a security breach
occurs. Thus, those who favor a strong security posture must justify investments in
countermeasures by demonstrating risk avoidance. Those who blindly and
thoughtlessly accept risk rarely have to justify their decision—after all, they have
accepted not only risk but accountability—and they do not incur any costs. Their
bottom lines look better (until an incident occurs, which may be years later). The
security-conscious take all the personal risk up front. A culture of security is difficult
to build when people perceive that they can lose, but cannot win.
It is clear that those who bring in more sales or greater profit can be rewarded in
higher pay and bonuses, but what accolades and benefits come to the person whose
actions prevent an otherwise undetected security flaw from turning into a breach? It
can be shown that certain heroic efforts repelled an explicit attack, but not that regular
vigilance prevented that attack from occurring in the first place.10
To the degree that security professionals receive the credit for an enterprise’s
overall security posture, there is less incentive available for others who are focused
on operations, sales, production, distribution, etc., to act in a secure manner. To be
sure, there are penalties for insecure behavior, but the best that most personnel can
expect is that they break even with security.
In effect, the most widely accepted method for measuring risk multiplies the
unknown (probability) times the unknowable (impact). Why multiplication and
not, say, exponentiation? Why omit other factors, such as credibility, resources,
scale, duration, mean time to repair or mean time to recurrence? In short, current
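A minimal sketch of the multiplication being questioned here, under the usual annualized formulation; all figures are hypothetical:

```python
# The "probability times impact" model questioned above, in its usual
# annualized form (expected loss = probability per year x impact).
# All figures are hypothetical.

def expected_loss(probability_per_year: float, impact: float) -> float:
    """The unknown (probability) times the unknowable (impact)."""
    return probability_per_year * impact

# Two very different risks collapse to the same single number, which is
# one reason the model invites the questions raised above about omitted
# factors such as scale, duration and time to repair.
frequent_small = expected_loss(10.0, 1_000)     # 10 incidents/yr at 1,000 each
rare_large = expected_loss(0.01, 1_000_000)     # 1-in-100-years at 1,000,000
print(frequent_small, rare_large)               # both 10000.0
```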
This is the metrics issue in another guise, but it does raise a specific dilemma.
Security can be justified not only on pure cost avoidance, but on the basis of
prudence and due diligence. However, prudence and diligence are baseline
objectives, the bare minimum of security. A culture of security is, in great measure,
an unwillingness to settle for just the minimum, insisting instead on the appropriate level
of security. The baseline may not need justification, but everything beyond it does.
Marketing personnel can conduct studies to show that an increase in price will not
significantly reduce sales and, therefore, that the increase will go to the bottom line.
The champions of security cannot demonstrate that an investment of x will bring y
security, much less that spending 2x will bring 3y security.
The foregoing section dealt with the rewards to the individuals who may support
a culture of security, with the focus on monetary compensation. There are other
sorts of rewards that an enterprise can bestow that are important to the creation
of a security culture. These do not directly involve money that goes to the
individual, but rather organizational advancement in the form of budget, influence,
management attention and the regard of one’s fellow staff members. Some people
are motivated solely by their remuneration, but others also find incentive in these
other sorts of rewards. When someone asks of a champion of security, “What is in
it for me?” (WIFM), that individual asks a core cultural question that may entail
seeking monetary compensation, but may also be much more.
4.7.1 Budget
In many enterprises, security is an unfunded mandate. It is simply assumed that all
personnel will conduct themselves and their business activities in a secure manner.
Where the work performed or the resources used are considered sensitive or at risk,
the workplace may be specially protected (as in a datacenter or a vault). Managers
may have offices and file cabinets so that they can conduct their activities literally
behind closed doors. However, many people who come in routine contact with
information resources work in open areas or cubicles or do not work in a business
environment at all.
There is a cost for the tools and techniques to allow them all to work securely, such
as virtual private networks (VPNs), remote access devices, encrypted hard drives,
privacy screens and content filters. Generally, these tools are purchased centrally
and distributed to all relevant personnel. The costs are often charged back. Even if
they are absorbed as a corporate expense, line management has little involvement in
the selection of products or their applicability to each manager’s business function.
If one size does not fit all, managers must either use the tools selected for them or
find budget to obtain better tools.
It is not unusual for a manager to ask WIFM when budget for security must be
balanced with money for salaries, business equipment or travel. The manager’s
department must bear the additional cost without seeing the direct benefit to the
department’s function.
4.7.2 Influence
The most important characteristic of cultural champions is their ability to influence
the decisions of their enterprises. Implicitly, the cause that they advocate for is one
that they believe is not appropriately valued. They put their political capital behind
something—in this case, security—to achieve an objective that they consider worth
backing. They do so in an environment in which many people may be championing
many causes, such as new products, higher pay, environmental protection or social
consciousness, all of which have merit.
Potential champions of security may well ask WIFM if the effort to build a culture
of security entails a cost that would reduce their influence on other matters. There
is a payback in prestige both within and outside enterprises for those who promote
good causes, which can expand a person’s influence. The proponents of security
rarely get hearty congratulations for a security breach that does not occur. No
matter how important security may be, it requires an investment in personal clout.
Worse, they may see the accolades bestowed on someone who achieved a short-term
goal by circumventing security. At that moment, WIFM is a very human attitude.
Endnotes
1 http://politedissent.com/images/jul08/policeman.html
2 Attributed originally to Stewart Brand. See Clarke, Roger; “Information Wants to Be Free…,” 24 February 2000, www.rogerclarke.com/II/IWtbF.html. The complete quote attributed to Brand is more nuanced: “On the one hand information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other.” Recognizing the value of information and using “free” as “without cost” rather than as “liberated” is quite different from the context in which the expression is usually used.
3 Joseph Marie de Maistre (French diplomat, writer, philosopher and politician, 1753-1821)
4 Adams, Douglas; Life, the Universe and Everything, UK, 1982, p. 29
5 A prophet in Greek mythology who was cursed so that her prophecies, though true, were never to be believed
6 Schlienger, Thomas; Stephanie Teufel; “Information Security Culture—From Analysis to Change,” International Institute of Management in Telecommunications, University of Fribourg, Germany, 2003
7 Wisse, Pieter; “Semiotics of Identity Management,” Sprouts Working Papers on Information Systems, http://sprouts.aisnet.org/81/1/2006-02.pdf, 2006, p. 4
8 Ibid., p. 33
9 The uncertainty principle in quantum mechanics, formulated by Heisenberg: the accurate measurement of one of two related, observable quantities, such as position and momentum or energy and time, produces uncertainty in the measurement of the other, such that the product of the uncertainties of both quantities is equal to or greater than h/4π, where h equals Planck’s constant.
10 See Taleb, op. cit., p. xxii-xxiv
11 Peacock, Marissa; “GRC Roll-up: The Mistakes and Rewards of IT Security Compliance,” CMS Wire, 10 February 2010, www.cmswire.com/cms/enterprise-cms/grc-rollup-the-mistakes-and-rewards-of-it-security-compliance-006652.php
12 Cameron, Kim; “A Process for Changing Organizational Culture,” University of Michigan, USA, 2004, p. 9
13 www.hpcnet.org/cgi-bin/global/a_bus_card.cgi?SiteID=410037#x
14 Ibid.
15 Ross, Steven; “Effective Techniques for Risk Measurement,” SearchCompliance.com, 22 July 2009, http://searchcompliance.techtarget.com/tip/0,289483,sid195_gci1362498_mem1,00.html
Of course, a large enterprise that comprises many people in some sort of hierarchy
has a much more complex, nuanced culture than one between two individuals.
They do not create the culture in which they operate; it exists as a function of them
coming together with a shared (or overlapping) purpose. The existence of a culture
precedes a determination of whether it is strong or weak, beneficial or malign, or
good or bad. Therefore, a culture of security exists. The objective of those who
support it is not to create it, but to strengthen it within the broader confines of an
enterprise’s corporate culture.
The question remains as to how, once the deficiencies of a culture are known, to change
it in a positive direction. This is not so much a matter of piling on more safeguards as of
changing minds, outlooks, attitudes and beliefs. Influencing decision making is one
thing; altering the framework of the decisions is quite another. The creation, if that is the
word, of a culture of security is to accomplish the latter.
The first and perhaps most important step in strengthening a culture of security is to
erase the negativism often associated with the subject. Security is often thought of,
at best, as the prevention of the occurrence of bad things such as fraud, disclosure
of private information or viruses. The imagery is of a police officer, a guard or a
locked door. Unfortunately, in many societies police officers, guards and locked
doors are emblems of repression and not very likely to inspire support for a culture
enshrining these images. Even in freer societies, the only contact most people
have with the police is when a crime occurs or when they are pulled over for
speeding. Guards and locked doors may keep valuable things safe, but they also are
impediments to free access. Most people do not enjoy being told what they cannot
do, even if they know they should not do some things.
Success in creating a security culture begins with altering the perception that
security is about negative events and, instead, associating it with the benefits to
people of moving freely, having access to everything they should have and knowing
(or having the ability to know) all that they would have a right to know. Security is
a positive attribute to those living without it. Security of information is also quite
positive for its owners; they care about who sees it and what is done with it. The
boundary between positive and negative is the decisions on who should and should
not do what. In information terms, those decisions are termed “access control,”
living by the rules of what information one can see or use.
Part of the difficulty with the perception of security, of living by the rules, is that the
rule breaker is often romanticized as a rebel, a pirate or even as an outlaw. These may
seem like dashing figures on the silver screen, but people are not nearly so impressed
by rebels, pirates and outlaws when they actually encounter them. The challenge is
to marshal the positive reality of security in support of a culture that values it. When
security is framed as trust, consistency, reliability, predictability and productivity, it
becomes easier to enlist others in a culture-strengthening exercise.
As stated, security has established a very negative brand, which is not effective in
developing a positive identity for security. However, there have been successful
efforts to rebrand security as a friendly if ever-vigilant force for good. In the US,
the National Crime Prevention Council has adopted a hound dog dressed up like a
detective as their logo. His name is McGruff the Crime Dog®, famous for his advice
on how to stop crime before it happens and for his great sense of humor, as his
web site3 proclaims. The marketing of crime prevention is intentionally made
people-friendly to take the harshness away from this form of security.
People need to be educated about security and their role in it, which is a great deal
more than being aware. However, education does not come simply in a classroom.
In fact, classroom training is useful for transferring skills, but not attitudes. People
may be educated in meetings, especially one-on-one, face-to-face meetings. It is
clearly infeasible to have personal meetings with every employee of a large
enterprise; what is necessary is to have such educational sessions with those who
show evidence of being potential champions and those whose positions should
require them to champion security.
One reason to educate people about security, rather than simply make them aware
of it, is that security awareness programs are unprovable. It cannot be shown
that awareness reduces the incidence of security breaches or lowers the cost of
countermeasures. So, when the inevitable attack does occur, some may feel that
the promise of security was not kept, which undermines security’s brand. It may be
expected that, after people have been educated (with regular reinforcement), they
should know about security, what its objectives and tools are, and what should be
their own responsibilities.
The people in an enterprise make the culture, and hence, there is a need for strong
human resource practices and management. Enterprises should be able to attract
and train the right people, develop them, engage them and help them perform,
inspire them, and ensure that they are committed. As stated in the beginning of
this volume, a culture, in general, has been defined to include shared attitudes
and beliefs and a way of doing things that is common within an enterprise. In
particular, a culture of security is shown in BMIS to be transformational, a shift
from functional security (what people do) to intentional security (how people think
and behave). The transformation has four primary areas of application: technology,
process, people and enterprise. In one case study, the movement toward an
intentional security culture was shown in figure 5.
5.2.1 Intentionality
The defining factor in the transformation is that it is intentional. The term raises
the question whether the holder of the intent is the subject or the object. In the
first connotation, the intent is on the part of those who would create or strengthen
a security culture. In the other, it implies that the result will be to turn individuals
into the participants in the culture. The distinction may seem to be unimportant and
unnecessary in that both are required not only to create a culture of security, but to
see it take root. However, it does point the way as to who should champion an
intentional security culture and who should be involved in the transformation to it.
The problem with the first sense of intentionality is that it implies an actor,
someone who has an intention and acts on it. The act and the consequences are
closely tied. Something occurs because someone made it so. However, what
someone intends to do may not always work out as planned; a program to make
security more comprehensive may become so structured and bureaucratic that it
frustrates the original objective. On the other hand, it is impossible for the result of
an action to be intentional without the initial cause being intentional as well. Pulled
together, as was first seen by the British philosopher Jeremy Bentham,6 the intent
must be to produce something utilitarian—in this case, an enterprise that is and
behaves securely in a self-sustaining manner.
Thus, while these three candidates may have supporting roles in developing a
culture, they need others at a senior level to be the focal point. There is no one
position that is able to claim the mantle of security champion apart from those
mentioned previously. Much depends on the personality, seniority, political skill
and professional concern among those in the executive suite. Depending on the
enterprise, the champion may be the chief operating officer (COO), chief financial
officer (CFO), CIO, general counsel, or even the head of human resources (HR). Of
course, if several of these are already vying to drive their enterprise into a culture of
security, then much progress has been made already.
Many executives only have time for security matters when there is a regularly
scheduled update or if there is a serious security breach. They are well attuned to
the need for security of the information in their own hands; by the time it gets to
their level, it is either so concentrated or so sensitive that the need for security is
self-evident. There is very little for them to do as a group, apart from having one
of their members champion security. As individuals, they need only think about
security a little more, consider how it affects them and what they could do more
securely. They do not need to be persuaded that security is a positive value to
their enterprises; they need to be convinced to convince others.
• Middle management is often the greatest stumbling block to a security culture.
Again, it is not because middle managers are opposed to it, but because they are
the ones who must formulate budgets and meet senior management’s demands.
They hear clearly, “Sell more, grow bigger and make more profits.” “Be secure”
often gets drowned out. Moreover, it is they who transmit their understanding
of what their management wants to their own staffs. If they feel the heat for
objectives other than security, they transfer it downward.
Looking at the hierarchy in this manner, it becomes evident that, even if senior
managers believe in a culture of security, the message will not reach the staff if the
middle managers are not similarly supportive. The staff must be led to believe that
management champions are as sincere when they urge security as they are when
they urge sales. If the staff carries the substance of a culture, middle managers are
the catalysts who make the substance react.
As stated, asking the question is half the battle, which leaves another half to be
won. It is quite important that the same question be asked repeatedly, initiative
after initiative, project after project, until it no longer needs to be asked. Everyone
involved will think about security without being asked. In this way, a culture of
security goes from being intentional to being unintentional—so natural that it is no
longer thought about, but just done. It is even better if more than just the original
champion asks the question. That would indicate that the culture of security is
catching on at the highest levels.
All of this conversation occurs at the highest levels, but champions must also
communicate it downward. They can invigorate those in their line of authority, but
the message must also be conveyed down the chains of the champions’ peers, some
of whom are not yet involved in a culture of security. As noted previously, senior
management proposes, but middle management disposes.
There are more nuanced views of the correlation of budget and culture. For one
thing, the issue is not so much the money allocated for security, but how well it is
spent. If one enterprise’s security objectives can be met with less funding, then its
culture may, in fact, be superior to another that simply throws money at problems.7
It is also a matter of when the money is spent. When an enterprise first becomes
aware that its security is insufficient, at the onset of a demonstrable security
culture, it needs to spend more just to correct prior shortfalls. Since security is not
achieved overnight, an enterprise may spend more, but, at the time, be less secure.
The culture of security is more evident in the trajectory of improvement than in the
current state.
There is also the factor of how budgets are calculated and aggregated. One way, of
course, is to measure the money specifically allocated to an information security
function. This is meaningful, but incomplete. If security is pervasive within an
enterprise, there will be a security component to HR, facilities, operations, finance
and many other functions. It is true to say that an indicator of the strength of
a security culture is how widely an enterprise spreads its security investments.
Paradoxically, the stronger the culture, the harder it is to trace the money spent on
it. Measuring the total cost of security in an enterprise is a fascinating subject for
future research.
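A minimal sketch of that measurement problem follows. The departments and figures are entirely hypothetical, and the premise, taken from the text above, is that each function is able to tag its own security-related line items:

```python
# Hypothetical sketch: estimating total security spend when it is spread
# across many functions rather than held in a single security budget line.
# All department names and amounts are invented for illustration.
departmental_security_items = {
    "information_security": {"tools": 400_000, "staff": 900_000},
    "hr": {"background_checks": 60_000, "security_training": 25_000},
    "facilities": {"badge_access": 80_000, "guards": 150_000},
    "operations": {"patch_management": 120_000},
}

# Aggregate every tagged line item across every department.
total_security_spend = sum(
    cost
    for items in departmental_security_items.values()
    for cost in items.values()
)

print(f"Total security spend: {total_security_spend:,}")  # 1,735,000
```

The difficulty the text notes is precisely that, in a strong security culture, the tagging itself becomes unreliable: security work is so embedded in each function that no one records it as a separate line item.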
Making people aware of the parts that they are to play in securing an enterprise’s
information resources and holding them accountable for their roles are essential
for an intentional security culture. There is a positive decision to be made in
assigning responsibility to a particular function. Awareness occurs on multiple
levels. Someone in a position of relatively high authority must conclude that a
given function has a set of responsibilities. This person may be the aforementioned
champion or someone influenced by the champion. Managers of the functions that
receive the mandate must accept that they bear the designated responsibility, and
staff members should also be consulted on their roles. The gap between grudging
and wholehearted acceptance is filled by a security culture. It is the culture that
creates awareness and not the other way around.
Of course, just because someone has a responsibility does not mean that the
individual knows how to fulfill it; therefore, the person must be educated. This may
be achieved in a number of ways, including formal training, professional literature,
coaching, the use of consultants or delegation to specialists in the assigned roles. Most
likely, the educational process will incorporate all of these learning alternatives.
(While it is difficult for CISOs to be the champions of a security culture, that is not
to say that they have no role in its strengthening. As stated previously, the function
For example, a standard such as “symmetric encryption systems that utilize shared
secret keys for authentication and encryption must change these keys on at least
an annual basis” cannot be issued without explanation and education. What is
a symmetric encryption system? Is there an asymmetric system, and how does
it differ? What is an encryption system? What are shared secret keys (are there
unshared ones?), and why must they be changed?
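The properties the standard takes for granted can be made concrete with a toy sketch. The XOR "cipher" below is not real cryptography and the names are invented; it only illustrates what "symmetric" means (the same shared secret key encrypts and decrypts) and what rotation entails (periodically replacing that key):

```python
import secrets

def toy_symmetric_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' for illustration only: applying the same key twice
    restores the data. That symmetry - one shared key for both directions -
    is what makes a system 'symmetric'; an asymmetric system instead uses
    a public/private key pair."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)                       # the shared secret key
ciphertext = toy_symmetric_cipher(b"quarterly results", key)
assert toy_symmetric_cipher(ciphertext, key) == b"quarterly results"

# "Changing keys on at least an annual basis" means generating a fresh
# shared secret and re-protecting data under it, so that a compromised
# key exposes at most one rotation period's worth of information.
new_key = secrets.token_bytes(16)
assert new_key != key  # data must be re-encrypted under the new key
```

Even this trivial sketch surfaces the questions the standard raises; without education in the vocabulary, the requirement reads as jargon.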
The standard helps build security; the education in its meaning and use builds a
culture of security. The challenge is not whether the information security function
has the ability to draft policies, standards and guidelines, but whether it has the
communications skill to sell them.
This is not a power to be used lightly, and as with so much of a culture of security,
it must be applied in context. It is insufficient for a CISO to state by fiat, “This
shall not pass.” There must be a broadly accepted framework within which that
power may be exercised. Fortunately, that context is provided by the policies,
standards and guidelines that were agreed on prior to the decision in
question. If a product or project does not live up to them, it should not be allowed.
Policies should be the least malleable; there should be little, if any, cause to go
forward if policies are violated. Standards usually contain waiver mechanisms that
apply to cases in which the business or a technology cannot support a requirement.
Guidelines, by their nature, are most open to interpretation. Thus, the mechanisms
are there for security to be the deciding factor for or against an initiative. Senior
leaders, if well informed, have the right to make decisions contrary to security
interests, but they also inherently accept accountability if overruling security
concerns backfires. Attitudes, not platitudes, are the stuff of a security culture,
and those attitudes manifest themselves when tough decisions need to be made.
From a cultural perspective, it is sufficient that security be grounds for not doing
something, with the blessing of management.
5.3.7 Rewards
If management must take the blame for poor decisions, it must also be rewarded for
good ones. As noted in section 4.6, the absence of rewards is an inhibitor of a security
culture. It is difficult, as noted, to equate the compensation for sales, growth or profits
with those of security. They cannot be measured on the same scale.
Security breaches come in many forms. If one is the result of external forces (e.g.,
hacks or denial-of-service [DoS] attacks), it pays to be very public in taking action against them.9
If nothing else, it would demonstrate to customers and staff alike that a company is
serious about security if it makes strategic or tactical changes to its business model
in the face of attacks. Equally important is to be visible in responding to internal
violations of security. When these are criminal matters, an enterprise should seek
prosecution. If they are breaches of trust or propriety, they should not be swept under
the corporate carpet. If security is seen as a part of an enterprise’s business, then it
needs to show that it means business when it comes to security.
There is little doubt that laws and regulations have helped enterprises improve their
systems of internal control and, in turn, their security cultures. However, an equally
important factor has been the interaction of companies in what has been termed an
“extended enterprise.”10 If there is to be an active collaboration among business
partners, each must be satisfied that the other has achieved at least a comparable
level of security as its own. Mutual interdependence breeds a joint concern for
security. If any party to a transaction feels that it is exposed, there is little chance of
success. Thus, each seeks assurance from the other so that together they may reap a
“variety of business benefits (e.g., enhanced customer loyalty, increased revenues,
reduced inventory, reduced time to market for new products, more effective
business processes, reduced costs, and/or increased profits).”11
Note the reference to customers in the preceding quotation. When customers entrust their (or their enterprises’) information to others, they do not demand security, but simply expect it. It is, or should be, a routine matter that those who hold information need
to protect it. This may be backed up by law and regulation (e.g., the European Data
Protection Directive of 1995), but it is a manifestation of a culture of security among
customers that they ask for security and among vendors that they supply it. After all,
vendors are also somebody’s customers. It is quite clear that customers dissatisfied with
security will bring action (and strengthen a security culture). When there are satisfied
customers, the tie between security and revenues and profits is more demonstrable, and
thus, it is easier to tie rewards to security.
Endnotes
1 Sartre, Jean-Paul; Existentialism and Human Emotions, Citadel Press, USA, 1957, p. 15
2 For example, the volume 2, 2010 issue of the ISACA Journal, dedicated to security, has on its cover a stylized personal computer looking like a vault door, with two combination locks and a large vault bolt.
3 www.mcgruff.org
4 Knowles, Malcolm S.; Elwood F. Holton III; Richard A. Swanson; How Adults Learn, Elsevier, UK, 2005
5 ISACA, An Introduction to BMIS, op. cit., p. 21
6 Bentham, Jeremy; An Introduction to the Principles of Morals and Legislation, UK, 1780, p. 82-83
7 Boesen, Thomas; “New Tools for a Corporate Culture,” Balanced Scorecard Report, Harvard Business School Publishing, USA, November-December 2000
8 Discussions of risk management in society at large are too numerous to cite. One that is indicative of the public consciousness of risk management may be found in Brooks, David; “Drilling for Certainty,” New York Times, USA, 28 May 2010
9 For example, Google received much positive publicity about its decision to change its business plans in the face of perceived security attacks. (See “Google, Inc.,” New York Times, USA, 20 April 2010, http://topics.nytimes.com/top/news/business/companies/google_inc/index.html?scp=15&sq=Google+security&st=cse.) Less discussed was the effect of the decision on morale within the company.
10 See Davis, Edward Wilson; Robert E. Spekman; Extended Enterprise: Gaining Competitive Advantage Through Collaborative Supply Chains, FT Press, UK, 2004
11 Ibid., p. 132-133
The ultimate positive reinforcement, as stated previously, is the rewards that come
to the individual for treating information securely. There is an important distinction
here: Remuneration, advancement and influence come to people for what they
do to protect information resources and there can be no culture of security where
security is ignored. However, reinforcing a culture is different. It necessitates
actions to inculcate attitudes and beliefs, an organizational vision of how security
fits into its behavior, and a way of doing things.
The objective is to fuse the interests of the enterprise, the individual and security
into one organic whole. The enterprise may face circumstances in which security
seemingly runs counter to short-term goals, such as speed or flexibility in
responding to customer demands. An executive may think, “Who will know or care
if the rules are bent—not broken, to be sure—just a little?” An individual may see
security as an impediment, slowing things down and generating more bureaucracy.
A security culture reinforces itself by getting all within an enterprise to see that
security makes things better. This is the heartbeat that must be felt throughout an
enterprise: Secure is better.
Why is it better? If “secure is better” is the heartbeat of a security culture, then the
reasons it is better are a culture’s lifeblood. Secure is better because an enterprise’s
business depends on it. It is better because customers expect reliability. It is better
because it lives within tolerable risks. It is better because secure resources will not
be misused and will be there when needed. It is better because a secure anything is
better than an insecure anything.
The challenge for management is to build security into the way it thinks about and
runs an enterprise and by reinforcing all the positive attributes of a security culture
listed previously. It would be enough to make all personnel behave in a secure
manner, but the real goal of a culture is to convince them to think about what
they do for the business in a certain way, placing security, if not first, at least not
last. Thought patterns are best directed by emphasizing organizational goodness,
not the punishments that will be meted out for bad behavior. It calls for a liberal
application of honey with a dose of vinegar in reserve to spice it up.
Unfortunately, many people do not see that a secure enterprise is a better enterprise.
That is because they see their own business requirements going in one direction and
security’s going in another, if not blocking their business objectives altogether. It is
insufficient to point out the risks these people are taking. They have accommodated
risk in their own minds. Whether they are rationalizing their ambitions or not, they
believe that there is something they want to do, generally to make more money for
a private-sector enterprise or improve service in the public sector, that they would
and could do except for the “silly” demands of security. For them, security is an
obstruction to overcome. By the time security becomes a consideration, it is already
too late. The information security function may prevail in stopping an insecure
initiative, but that will only deepen these employees’ suspicion of security and the
professionals who expound it.
This is largely because security does not deal in hard, documented facts such as laws
or insurance policies. Security looks to the future: what may happen, but has not
happened yet. Sometimes, security does deal with incidents, but once these have
been repaired, the next incident still lies in the future. Everyone has to accept
laws, standards and regulations that have already been issued, but people can hold
different opinions2 about what may happen in the future. Where a CISO may see
only the prospect of harm, a salesperson may see profit and discount the possibility
of an incident as so remote as to be dismissible.
While other staff functions manage alignment of a business with things that have
happened in the past, information security must manage the future. In reality, all
management should be oriented toward the future: the sales to be made, bonuses to
be paid and mistakes not to be made. In the same way that lawyers help their clients
to do what they want to do legally, security professionals can help their colleagues
to do what they want to do securely. The challenge to these professionals and the
secret of creating a security culture is to transform themselves from naysayers to
problem-solvers. They can demonstrate their alignment with the overall business by
assisting their colleagues to meet their own business objectives.
The CISOs of various industries cannot all achieve the same level of security; once
again, context enters the discussion. Security professionals can measure themselves
against others in their own industries. Using conferences, literature and personal
networks, they can learn what others are accomplishing in security and raise the
levels within their own enterprises. (A special challenge comes for those who lead the
pack in their own industries, but do not feel they have done enough. Perhaps they are
reaching for too much.) The interesting question that should be posed and answered is
whether the different appetites for risk, company to company, justify different degrees
of security. If so, security professionals must adjust their sights accordingly to remain
in alignment with their own enterprise’s goals. Not to do so is to oppose security to
the corporate culture, which is hardly conducive to reinforcing a security culture.
Security professionals in the private sector need to understand how their enterprises
make money and focus their attention there. Clearly, this entails security for the
systems that take orders, manage inventory and ship products. There are other
types of information such as formulas, trade secrets and new product developments
that are money makers also. A culture of security most frequently exists around
this latter sort of information. People accept that secret recipes and information
about the latest models need to be secure. They have bought into a culture of
security, at least that far. That culture can be reinforced by extending it to other
types of information.
It is important that security professionals help their enterprises to recognize the real
extent of the risks they face. Simply put, some enterprises are greater targets for misuse
of information than others. For example, information in the military and intelligence agencies
is more likely to be sought and misused than in civilian agencies. As mentioned, the
information in banks and other financial institutions is constantly under attack because
it has monetary value. A culture of security suffuses these types of enterprises
because the threats are clear. No one at the US Central Intelligence Agency (CIA);
UK Military Intelligence, Section 5 (MI5); or the Russian military intelligence
agency Glavnoye Razvedyvatel’noye Upravleniye (GRU) thinks for a moment that
security is a nonissue. In enterprises in which threats are not so evident, it may be
hard to see why anyone would misuse information about the seemingly uninteresting
products they make. However, if enterprises make revenue on their products and the
information about them supports profits, there will be someone who is interested in
stealing, revealing, modifying or destroying that information. The culture must rise to
the level of the threat for the safeguards to be appropriate.
Of course, there is a cost side to the risk equation. The best investments in security
are those that cost little and protect a lot. The best contribution of a security culture
to overall business objectives is the understanding that the right level of security,
in context, is a parallel objective of the business. Security devices and software can
be costly and protect against only a limited range of threats. A culture that leads to
understanding those threats costs very little and can be applied against the full array
of risks an enterprise faces.
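The cost-versus-protection tradeoff described above is often quantified as annualized loss expectancy (ALE): expected yearly loss equals the cost of a single incident times its yearly rate of occurrence. The sketch below illustrates the arithmetic; all figures are hypothetical, chosen purely for demonstration:

```python
# Annualized loss expectancy (ALE) sketch. All figures are
# hypothetical, for illustration only.

def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Expected yearly loss: cost per incident times incidents per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

# A breach costing $200,000 per incident, expected once every 4 years.
ale_without_control = ale(200_000, 0.25)   # 50000.0 per year

# A safeguard cutting the rate to once every 20 years, at $15,000/year.
ale_with_control = ale(200_000, 0.05)      # 10000.0 per year
safeguard_cost = 15_000

net_benefit = (ale_without_control - ale_with_control) - safeguard_cost
print(net_benefit)  # 25000.0: the control costs little and protects a lot
```

A control is economically justified when the reduction in ALE it buys exceeds its own annual cost; the cultural question is whether decision makers actually run this comparison.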
Bad things that may happen do not always occur. The frequency of occurrence
(not the probability) is the measure of risk acceptability. Something that goes
wrong once in 10 times is surely unacceptable. Something that goes wrong once in
a million times may be acceptable, except for enterprises that perform millions of
risk-bearing transactions every year. Even for those who do not face risk as often, it
must be accepted that the one-in-a-million occurrence could happen today.
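The one-in-a-million point can be made precise: over n independent transactions, the chance of at least one failure is 1 − (1 − p)^n. A small sketch, with hypothetical transaction volumes:

```python
# Chance of at least one failure over n independent transactions,
# given a per-transaction failure probability p. Volumes are hypothetical.

def prob_at_least_one(p, n):
    return 1 - (1 - p) ** n

p = 1e-6  # "one in a million"
print(prob_at_least_one(p, 1))           # ~1e-06: negligible for one transaction
print(prob_at_least_one(p, 5_000_000))   # ~0.993: near certainty at 5 million/year
```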
There is a virtuous cycle between risk management and a security culture: the more
people in an enterprise appreciate the nature of the risks they face, the more
likely they are to incorporate security into their attitudes toward their business;
and the more acculturated they are toward security, the better they will appreciate
the appropriate amount of risk they can take. In short, risk management reinforces a
culture of security. Nonetheless, in many enterprises, risk management is no stronger
than its security culture, so they both need to be elevated. In fact, it may be axiomatic
that where a security culture is strong, risk management is also strong.
In recent years, so many seemingly “smart” risks have proved to be foolish that
there is greater acceptance of risk management in enterprises around the world.
For the most part, the areas of risk that have caused the most harm do not concern
information. However, gross miscalculations in such diverse fields as warfare,
finance, petroleum and construction have heightened overall awareness of risk,
which can be leveraged for security’s sake. It is notable that the term “information
risk management” (IRM) is gaining currency. Some CISOs style themselves—or
report to—information risk managers. Much of the literature on the subject of IRM
addresses the same points as have been made about a security culture: alignment
with organizational goals, senior management support, visibility for security and
incentives for secure behavior. One study even states that the risk mindset—much
the same as a culture—must change for “information risk [to be] part of every
business discussion.”4 In these terms, it is easy to see the symbiosis between a
security culture and risk management.
For example, in the financial services industry, one group of people consists of
traders and another performs all the posttrade activities to execute the trades. The
functions are incompatible. Were a trader to carry out the posttrade activities, there
would be a significant breakdown in separation of duties. No one would cross
that line without realizing that to do so would be to commit a fraud.5
The information that the two groups use is the same, but they use it at different
stages of a trading life cycle. Security is enforced by an access control system
that permits traders and operations personnel to see and act on the information
only at the required stages. Also, the access control system may be backed up by
an identity management system that recognizes all the people in a group called
“Traders” and another called “Trade Executors.” The point of this little treatise
on trade processing is that no one in a financial institution gives a second thought
about whether to carry out activities securely. This is simply the way in which the
job is done. Despite the occasional fraudster who finds a way around the system,
thousands (perhaps millions) of people are involved in trading every day, with
attitudes and behaviors enveloped in security without their even thinking about it.
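The stage-based access control described above can be sketched as a simple role-to-action mapping. The role names, users and actions below are illustrative, not taken from any real trading system:

```python
# Stage-based separation of duties for trade processing.
# Roles, users and actions are hypothetical.

ROLE_PERMISSIONS = {
    "Traders":         {"enter_trade"},
    "Trade Executors": {"confirm", "settle", "report"},
}

USER_ROLES = {"alice": "Traders", "bob": "Trade Executors"}

def is_permitted(user, action):
    """Allow an action only at the life-cycle stage the user's role covers."""
    role = USER_ROLES.get(user)
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_permitted("alice", "enter_trade"))  # True: traders enter trades
print(is_permitted("alice", "settle"))       # False: settlement is posttrade
print(is_permitted("bob", "settle"))         # True: executors settle
```

A production system would drive the same check from an identity management directory, but the principle is identical: the role, not the individual, determines which stage of the trade life cycle is accessible.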
Trading activities have been going on since the dawn of humanity; they were
formalized during the Renaissance and passed down to the present day. A notable
turning point occurred when trading systems were automated in the second half of
the 20th century. At that time, security had to be built into a series of programs and
user operating procedures. Not all security decisions were as simple as the separation
of duties described previously. Limits, approvals, reporting, correction, controls
and many other aspects of business life have to be encapsulated in programs and
procedures. Even in an activity such as trading in which the need for security is so
evident, decisions can and have been made that strengthen or weaken overall security.
The manner in which those decisions are made is indicative of the strength of a
security culture. Including security professionals as advisors or even as designers
reinforces a culture of security. The same may be said of auditors, although they
would rarely design controls they may have to audit, and of risk managers, who may
be called upon for their insights into the risks in a business function.
In other words, to be rewarded in pay, promotion, respect and clout, managers must
not only do the right things with regard to security, but also be seen to be doing so
willingly, supportively and intentionally. They must be proactive in considering
security as a part of their jobs, insisting on secure solutions to day-to-day problems.
They should find themselves in accord with security professionals on most matters
and should not be constantly negotiating for less security in each new project and
system. Senior management should be aware of the attitudes and approaches taken
by middle managers and reward them accordingly.
If a business manager and a CISO disagree on the extent of security needed, should
the CISO always be considered to be correct, and how does a senior manager know
whether a middle manager is being obstructive to security or standing firm for the
appropriate level of security? To the first question, it is clear that CISOs are not
always right and that they sometimes are more extreme in their drive for the most
secure operations possible, losing sight of the business context in which security
is to be implemented. It is as incumbent on CISOs to learn to think like business
managers as it is on business managers to think like CISOs. That said, most CISOs
are not overly extreme all of the time (or else they will not be in their positions for
very long). A particular disagreement between the two means little, especially if the
difference of opinion is conducted in a collegial manner, but a pattern of conflict
is another matter entirely. Senior managers should be attuned to such behavior and
reward or reprimand accordingly.
One of the roles of senior managers is the resolution of disputes among their
subordinates. They can tell who is involved the most often and how often those
people are supported or denied in their arguments over security. (To be sure,
a senior manager may also act outside a security culture, but if the culture is
to be built around security in context, it is senior management that bears the
responsibility of establishing that context.) Senior managers do not need to rely
solely on their own judgment in determining who is and is not supportive of a
security culture. They know who is criticized in audits on a regular basis, who is
out of tune with colleagues and who is running the greatest risks.
Establishing the reward structure for a culture of security does not need to take the
form of a written set of metrics. In fact, to define a culture solely on the basis of a list
of dos and don’ts would unnecessarily constrain it. A quiet word in the corridor, a
note on an annual review or a supportive e-mail may do quite as well. The important
thing is for everyone to know that someone above is aware of the cultural temperature
of an enterprise and will take active measures to reward those trying to raise it.
6.2 Balance
Tightrope walkers have many skills and attributes: style, courage, determination,
showmanship and a little bit of magic. They definitely have a culture of security
that consists of balancing poles, nets and years of practice. What they have most
of all is balance. If building a security culture is not quite so treacherous as
tightrope walking, it calls just as much for balance. It requires some organizational
acrobatics, the ability to change direction and overcome inertia, and a solid central
position that does not shift when conditions do.
The ability to see matters from another person’s viewpoint is a difficult skill to
master. It is the essence of organizational balance, especially for those who are entrusted
with the responsibility to keep information safe. In the information security functions of
many enterprises, there is a history of battles won, but mostly lost, and of slow progress
to push the front forward to overcome the forces that would sacrifice safety for meager
gains. The use of military terms in the previous sentence is intentional; for those
security professionals who think in such terms, that mindset is destructive of a security culture.
Security of business information is not a war, and implementing security is not a matter
of wins and losses. Those who think back on the introduction of a new application or
technology and are still upset because management did not support a particular security
initiative have to put all that behind them.
Each issue has to stand on its own merits in context and with balance. The imagery
of battle is counterproductive precisely because, even if security were a war, one’s
fellow employees would be allies, not adversaries. Security professionals need to ask themselves
what would be a “win” for the others in their enterprise who have its welfare at heart
just as much as they do. Security can be achieved by forcing a particular safeguard or
restriction to be put in place, but the imposition of organizational force undermines a
security culture.
To some extent, the problem lies with security professionals’ lack of understanding
(and some would say, of interest) in how a business actually works and how a
private enterprise makes money. This is, in most cases, overstated; most security
professionals have a very good comprehension of the workings of their enterprises,
gained through business impact analyses and risk assessments. What many lack is,
in the words of the poet, the ability to see themselves as others see them.6 In too
many enterprises, the information security function is not well liked, even where it
is respected. As explained in section 4.0, security professionals are sometimes seen
as organizational cops, not as friends. They still have the obligation to do the right
things for the security of an enterprise’s information; they must simply learn to get
their way, but nicely.
In short, information truly is at risk and enterprises are at risk of not controlling
their precious information. This recognition needs to be engrained in every
company, government agency and charitable institution today. Technology has
altered the balance of security everywhere. The sheer amount of data is growing at
incredible rates, more than 50 percent year over year.7 In previous years, data were
measured in megabytes, then gigabytes and then terabytes. Industry analysts now
talk in terms of petabytes—that is, thousands of trillions of bytes of information.
The information flows quickly as well. It is routine for large enterprises to have
communications lines of megabits per second. Even home users of the Internet are
seeing speeds of many megabits per second.
Information does not just fly, it walks as well. In many enterprises, there is a
growing understanding of the vast amount of data that move through society on
laptop computers; compact disc read-only memory (CD-ROM) discs; Universal Serial
Bus (USB) drives; backup tapes; and, yes, paper. In many cases, data do not
leave the protected perimeter of an enterprise’s data processing systems through a
security breach, in which someone accesses data without authorization, but, rather,
through transportation of data accessed in an authorized manner. This is not a new
concern, but the increased ubiquity and capacity of readily transportable media
have magnified the problem.8 Enterprises must be cognizant of the change in the
dynamic of securing all that information.
This also raises the urgency for strengthening a culture of security. Business leaders
cannot sit idly by, waiting for the CISO in their enterprise—if there is one—to
save the day. Enough has been said already about the need for champions who
are aware of the problem of securing so much information. The CISO’s task is
to communicate the magnitude of the problem and to present solutions that the
business can accommodate, and it is up to the leadership of each enterprise to heed
the warning.
In other words, if the balance point for security has changed, so, too, has the center
of gravity of a security culture. There are issues on which security professionals
can bend and others in which any bending will lead to rupture. Although there is
room for accommodation in some matters, the center must still hold. It is up to
security professionals everywhere to identify the breaking point for the information
in their enterprises. A security culture, too, is a fabric that can bend in some
places, but must be stiffened so as not to tear. For example, there is much room for
compromise as to which roles need access to which information to perform at an
optimum level, but there is no space for a breakdown in separation of duties. Some
latitude may be given to system administrators to have privileged access to many
servers and operating systems, but the ability to bypass supervision and control with
regard to programs and data is not negotiable. Information owners can deliberate
on how critical their information is to their business and, therefore, how quickly it
needs to be restored after a disruption. They cannot scrimp on cost by forgoing
recoverability altogether. Also, once they have decided on the sensitivity, criticality
and risk of their information, they cannot quarrel with the cost of the necessary
safeguards and controls to protect it to the level they have determined.
In many enterprises, it is felt that information owners are all in favor of security,
recoverability and control—until they hear the price of achieving it. If there is to be
a culture of security in an enterprise, it must be based on openness and cooperation
in finding the balance between the need and the cost of security. Security
professionals generally strive to provide decision makers with accurate and relevant
information about risks and costs, and information owners must not adjust their
risk-related decisions based purely on cost. That is not to say that affordability
should be taken out of the assessment of the appropriate level of security; cost is a
factor of appropriateness. However, the risk does not change just because the price of
safety is high. By analogy, many people buy the maximum amount of insurance
they can afford and accept the fact that, if the insured event occurs, they may not
be fully recompensed. Enterprises with a serious security culture make the right
choices, not always the ones that provide the highest level of security.
The distinction between security professionals and business leaders is, of course, a
false one. Those in the information security function are part of the business, and
many parts of an enterprise participate in security. As BMIS notes:
However, there are many others who play a part in information security, whether
they have a security title or not. (See section 5.3.) Each participant must be in
close contact with the others to build, in ISO’s terms, an information security
management system—not an information security department. Risk management,
HR, physical security, datacenter operations, corporate communications,
telecommunications, general counsel, internal audit, compliance, privacy and BCM
all have a part to play in securing information. It would be easy to think that all
these different organizational functions taken together would form the nucleus of a
security culture.
They would, if they recognized one another for their different roles in security.
Unfortunately, in all too many cases, they do not. A notable example in many
enterprises is the divergence between what should be closely entwined functions:
information security and BCM. Continuity and security are simply two points on
a spectrum of risk management. If the risk involved to information is one or more
events or conditions that create losses (financial, surely, but data losses as well), all
sources of those events or conditions should be understood as being the same, or at
least closely related. “Business continuity” is generally used for a loss caused by
a physical event (e.g., a disaster), and “information security” is generally used for
a loss caused by a logical event (e.g., a virus). As long as confidentiality, integrity
and availability are used as a definition of security, then business continuity must
be included. There is a definite convergence of interest between the two. Moreover,
the risk to availability stems from more than the possibility of disasters, which
occur rarely, but with enormous impact. Losses are caused by fires and earthquakes,
but they are also caused by downtime of any sort.10
Why do these two functions not work more seamlessly together? Why, for that
matter, are the different components of security not more closely aligned? The
problem is politics; the solution is a culture of security, which would, as BMIS puts
it, allow “for the convergence of security strategies,”11 operations, supervision and
reporting. The most useful contribution senior management can make to a security
culture, aside from intentionally championing its existence, is to ensure that all
those with converging security responsibilities reinforce one another rather than
needlessly, heedlessly fighting for their own “turf” at the expense of one another
and the detriment of the security cultures in their enterprises.
There is a process that enterprises can follow that will allow them to approach
complete absorption of security into a corporate culture even if they never
completely get there. It implies that the cultural environment is not static; at
the same time as requirements are issued (or become obsolete), systems are
being introduced, upgraded or discarded. The process calls for vigilance and
responsiveness. When an enterprise needs to respond to an internal or external
stimulus (e.g., a reorganization, an acquisition, or a new law or regulation), it needs
to instigate action regarding security. The first step is to analyze the requirement,
which can rarely be done by security professionals alone. It necessitates
involvement by those who own or use the information in question; often by legal
counsel; and, in some instances, by senior management.
The culture then needs to adapt to fit the requirement to the enterprise. Depending
on what it is, the CISO may lead the way, or perhaps someone in the enterprise
closer to the impact of the change. Analysis must be performed to determine
whether an enterprise is already doing what is required everywhere and without
exception. At this point, the change is subject to automated tools for project
management, reporting, budgetary impact and role management. None of these are
automation of the culture as such, but together, they influence what the culture of
security is and what it is to become.
[Figure: identity management example showing an authoritative identity source, business events/triggers, and business partner access management and protection]
The point of the previous example is that there are technologies that have cultural
impact on security. In implementing technical tools, enterprises come face to face
with the fact that they need to tailor their cultures, security being not the least
aspect, to work effectively with their technologies. At the same time, technical tools
become instruments for the development of a security culture.
Repository Layer
The repository is more than just a data warehouse of laws and regulations. It
consists of components that interact in such a way as to provide the raw data that
need to be acted on by the business logic to make a culture comprehensible. Tools
that make up the repository include:
• Databases of relevant information about the enterprise, including the management
structure, identity management, infrastructure and applications, location and use
of information, and networks
• A compliance requirements database that contains the complete range of laws,
regulations, policies, standards, guidelines and directives to which an enterprise
is subject. There must be a normalizing structure to enable users to learn about all
the requirements for access control, identification, recoverability, etc.
• A database of cultural documents, closely linked to (perhaps the same as) the
compliance requirements database, containing the actual language (translated
as necessary) of the policies, standards, guidelines, management dictates, laws,
regulations, etc.
• A policy interface that harmonizes naming standards, control processes, metadata,
technology elements, etc. This interface is needed to apply all the other information
in the repository to a given matter at hand: making all the pieces and parts of an
enterprise and its technology fit together in a manner that supports a security culture.
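One minimal way to picture how these repository components interlock is a requirements store normalized by topic, so that users can query all the requirements for access control, recoverability, etc., at once. The classes, identifiers and records below are illustrative, not drawn from any GRC product:

```python
# Sketch of a GRC-style repository: requirements from many sources,
# normalized by topic through a policy interface. Records are invented.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    topic: str    # normalized topic, e.g. "access_control"
    source: str   # law, regulation, policy, standard, ...
    text: str

@dataclass
class Repository:
    requirements: list = field(default_factory=list)

    def add(self, req):
        self.requirements.append(req)

    def by_topic(self, topic):
        """The normalizing interface: every requirement on one topic,
        whichever law, regulation or policy imposed it."""
        return [r for r in self.requirements if r.topic == topic]

repo = Repository()
repo.add(Requirement("LAW-404", "access_control", "law", "..."))
repo.add(Requirement("POL-12", "access_control", "policy", "..."))
repo.add(Requirement("STD-A12", "recoverability", "standard", "..."))

print(len(repo.by_topic("access_control")))  # 2
```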
Presentation Layer
As may be deduced from what has been said previously, it is no easy task to put
together all these components in a way that can be readily used by people. The data
generated by the business logic have to be interpreted and digested to a level that
makes appropriate actions clear and measures whether those actions have been or
are being taken in a timely fashion. The presentation layer has two components:
• A portal to enable users to navigate through all the data, services and reports
offered by the foregoing components. This portal should also have connections
to related systems such as configuration management, access control and identity
management.
• A dashboard that will enable managers at all levels to monitor the security culture.
It should facilitate ad hoc queries and reporting.
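The dashboard’s ad hoc queries amount to rolling raw repository data up to a level managers can act on. A sketch, with invented departments and figures:

```python
# Dashboard-style ad hoc query: roll incident counts and training
# coverage up by department. All data are invented.

records = [
    {"dept": "trading", "incidents": 2, "trained": 40, "staff": 45},
    {"dept": "ops",     "incidents": 5, "trained": 30, "staff": 60},
    {"dept": "trading", "incidents": 1, "trained": 4,  "staff": 5},
]

def rollup(rows):
    totals = {}
    for r in rows:
        d = totals.setdefault(r["dept"], {"incidents": 0, "trained": 0, "staff": 0})
        for k in ("incidents", "trained", "staff"):
            d[k] += r[k]
    return totals

summary = rollup(records)
for dept in sorted(summary):
    d = summary[dept]
    print(dept, d["incidents"], f"{d['trained'] / d['staff']:.0%}")
# ops 5 50%
# trading 3 88%
```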
The cultural tool kit described does not stand alone. It needs to fit within the
process, enterprise and governance addressed previously. The set of tools needed
in one industry is not necessarily the same as that needed in others.
Manufacturing companies, for example, may be more concerned with personnel
safety than would be banks, which, in turn, may be more focused on uninterrupted
availability of IT systems. Just as surely as there is a body of law and regulation in
each industry around these issues, there is a broader set of attributes (see section
5.3) that should be the lens through which management views its security culture.
As with any set of tools, the quality of use is more important than the quality of the
tools themselves. The uses of automated tools are limited only by the imagination
of the user, but these can be categorized in such a way as to lead to effective growth
of a security culture:
• Managing a culture begins with recognizing that a culture exists. This requires
identification and documentation in the repository of all the attributes of a security
culture. It also implicitly requires that missing elements be identified and filled
in over time, which means local personnel must recognize new and altered attributes
and requirements and enter them into the repository. Logging the entry should
initiate a workflow of evaluation, prioritization and assignment of responsibility.
• A cultural tool kit consists of more than programs and databases that labor to fix
things that no one recognizes as broken. When done correctly, the tool kit can
be used to reinforce the assimilation of the culture of the enterprise through the
details of security in system development, configuration management, datacenter
operations, vital records management and other aspects of using information.
This programmatic aspect of a culture of security is a reflection of enterprise and
governance as recognized through the evidence it leaves behind.
• Finally, a security culture is not an objective unto itself, but the inclusion of attitudes
and behaviors into the full array of day-to-day activities that make up a business.
The tool kit makes it possible to scan, monitor, anticipate, respond and learn over
time. It should be obvious that a cultural tool kit can be used to make an enterprise
more in tune with its internal and external obligations, but not all at once.
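The intake workflow mentioned in the first bullet (log, evaluate, prioritize, assign) can be sketched as a simple queue; the statuses, impact scale and owners here are assumptions made for illustration:

```python
# Intake workflow sketch: logging a requirement triggers evaluation,
# prioritization and assignment. Impact scale and owners are invented.

from collections import deque

workflow = deque()

def log_requirement(name, impact):
    """Logging an entry initiates the evaluation workflow."""
    workflow.append({"name": name, "impact": impact, "status": "evaluate"})

def process_next():
    item = workflow.popleft()
    item["priority"] = "high" if item["impact"] >= 3 else "normal"
    item["owner"] = "CISO" if item["priority"] == "high" else "security analyst"
    item["status"] = "assigned"
    return item

log_requirement("new privacy regulation", impact=4)
item = process_next()
print(item["priority"], item["owner"])  # high CISO
```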
This is a generic description of software that is often labeled as governance, risk and
compliance (GRC). GRC itself is not a security culture, but it is impossible to have
a functioning culture of security if GRC is not managed. As a general statement, the
vendors of commercial GRC software products do not emphasize the cultural aspects
of what they are selling. However, their products cannot be implemented without
considering their cultural implications, and an enterprise’s security culture can be
ratcheted up by using such tools.
Former New York (USA) mayor Ed Koch was famous for asking “How am I
doing?” of every citizen he encountered. As an elected official responsible to the
people of the city, he was right to ask such a question. So, too, this
question may be asked of those who would support a security culture. Interestingly,
it is not clear who should ask and who should answer. There are cultural and
political ramifications in both the query and the response.
The question demands an answer. Internal auditors and, to a lesser extent, security
professionals are empowered to answer, but the best that they can do with regard
to culture is observe and comment on patterns of behavior. Unless they are mind
readers, they have no ability to determine the beliefs, assumptions and attitudes of
others. Yet, a state of mind can be read through what people say, how they say it, to
whom they say it, under what circumstances they say it, and at what potential cost
in influence and respect it is said. In short, if a person wants an answer to “How am
I doing?” with regard to a security culture, that person should not only act securely,
but speak up about it.
“How am I doing?” is a very different question from “How are we doing?” The latter
is a question that should be asked foremost by boards of directors, which have a
fiduciary interest in the security of the enterprises they serve. They should, and often
do, inquire about the state of security, but less often about how security is viewed,
spoken of and acculturated. In one industry in one country, the responsibility is
clearly stated: “Information security should be supported throughout the institution,
including the board of directors, senior management, information security officers,
employees, auditors, service providers and contractors.”15 All of these people should
have a firm grasp of what the stakeholders are doing with regard to security and
should attempt to impute intent from actions.
Stakeholder feedback should be sought out, but in a culture in which all are
participants, who is not a stakeholder? Reactions should be sought from the
ultimate stakeholders, the customers (or the citizenry, in the case of a public sector
enterprise). There is some information that is purely proprietary to an enterprise,
such as strategic plans, financial reports and product evaluations, but a great deal of
information in every business is actually someone else’s orders, records, personal
data, medical history, literary preferences, travel plans, etc. The people in question
have a very real stake in the security of their information and the way in which it is
used by the people in enterprises that have been provided that information. Today,
many enterprises are seeking assurance on security and recoverability from their
vendors, and the nature of their questions should, in part, inform their own security
culture and those of the respondents. However, senior managers, especially those
who elect to be champions of a security culture, should be asking their business
partners, “How are we doing?” Security should be a part of the question.
Endnotes
1. “Business: Concerning Morgan,” Time Magazine, USA, 21 March 1927, www.time.com/time/magazine/article/0,9171,730161,00.html
2. The American sociologist and senator Daniel Patrick Moynihan said, “Everyone is entitled to his own opinion but not his own facts.”
3. Willie “The Actor” Sutton was a small-time American bank robber. He was not very good at his trade and kept getting caught. When he was asked why he kept robbing banks, he replied, “Because that’s where the money is.” See US Federal Bureau of Investigation, www.fbi.gov/libref/historic/famcases/sutton/sutton.htm.
4. Johnson, M. Eric; Eric Goetz; Shari Lawrence Pfleeger; “Security Through Information Risk Management,” Dartmouth College, USA, http://mba.tuck.dartmouth.edu/digital/Research/ResearchProjects/JohnsonRiskManagement_Finald.pdf
5. This was precisely what occurred in the massive fraud that brought down the Barings banking firm in 1995. (See “Bank of England Cites Fraud in Barings Collapse,” New York Times, USA, 19 July 1995.) The same thing is alleged in the case at Société Générale in 2008.
6. Burns, Robert; “To a Louse,” Scotland, 1786; the actual line is “Oh would some power the giftie gie us, to see ourselves as others see us.”
7. “Disk Storage Systems Market Rebounds to Double-Digit Growth Across All Segments in First Quarter, According to IDC,” press release, 4 June 2010, www.idc.com/about/viewpressrelease.jsp?containerId=prUS22368310&sectionId=null&elementId=null&pageType=SYNOPSIS
8. Ross, Steven; “Data Plumbing,” ISACA Journal, vol. 6, USA, 2009
From the cultural perspective, there is a need to eliminate attitudes and behaviors
that are harmful to security. It is easier to manage actions than thoughts, and if all
people always acted in a secure manner, no matter what they thought, the issue
of the culture would be moot. However, that is not human nature; the thought
instigates the deed. Both need to be addressed, and a security culture must be
advanced by counteracting insecurity in both word and action. At some point,
negative reinforcement involves discipline, but that cannot be the only basis of
an effective security culture. This simply reinforces the perception of security as
a negative force. Rather, just as positive reinforcement consists of management
practices to promote desired attitudes and ways of doing things, negative
reinforcement entails prevention of unwanted behaviors and thinking.
It is important to note that all managers need to be involved, not just a CISO. It is
tempting to use security professionals to frighten people into surly submission to
security requirements. In the long run, though, it undermines a security culture, as
noted previously, and no security professional should accept that role. Culturally,
it is quite different to say that people should behave securely so as not to run
afoul of the information security department as opposed to doing so because it is
management’s expectation—a part of their jobs. This is the difference between
functional and intentional security, as described in section 5.2.
Perverse incentives are “measures that have unintended and undesirable effects which
go against the interest of the incentive makers. They become counterproductive in the
end.”1 It is not unusual to read about, or even experience, this sort of conundrum; in
security, as in many other endeavors, the possibility of an action having an equal and
opposite, but very much undesired, reaction is ever-present. Humans being human,
these things will occur; it is up to those who manage within a security culture to be
alert to such incentives and to stamp them out when they arise.
The most common example of a perverse security incentive deals with passwords.
In many instances, enterprises attempt to make passwords “tougher” by making
them longer. It is not unusual to see a system-enforced requirement for passwords
to be eight characters long.2 So-called “hard” passwords also may call for
capitalized letters, special characters and numerals. However, research has shown
that such conglomerations of digits, letters and symbols are very difficult to
remember, so people write the passwords down and sometimes post them near their
workspaces so that they will not be locked out and have to call a help desk, which
incurs both lost productivity and a cost for a password reset.3 The very purpose
of passwords, to authenticate user identities, is, thus, completely undermined by
attempts to enhance them.
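The kind of system-enforced rule described above can be sketched as follows. This is a hypothetical illustration of such a policy check, not a recommendation; the passage's point is precisely that these rules can backfire:

```python
import re

def meets_complexity_policy(password: str) -> bool:
    """Check a password against a hypothetical 'hard' password policy:
    at least eight characters, with a capitalized letter, a numeral
    and a special character, as described in the text."""
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[0-9]", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

# A string that satisfies every rule is exactly the sort of conglomeration
# of digits, letters and symbols that users struggle to remember and end
# up writing down near their workspaces.
print(meets_complexity_policy("Tr0ub4d&r"))  # satisfies every rule
print(meets_complexity_policy("password"))   # fails: no capital, numeral or symbol
```

The stronger the mechanical rule, the more likely the workaround; the culture, not the pattern match, determines whether the password stays secret.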
This has the destructive cultural effect of encouraging people to act in a manner that
is clearly beneficial to themselves (i.e., higher productivity), directly at the expense
of security. Saying one thing while doing another does create a way of doing things,
but that pattern is the exact opposite of the one on which a security culture should rest.
Most parents know the “do as I say, not as I do” trap. This makes it especially
important that management at all levels in an enterprise avoids the temptation to
bypass security measures for its own convenience. In many, if not all, enterprises,
there is already a cultural divide between management and staff. Demonstrating that
security is for the “little people” while the leadership can ignore it without penalty
sends a defeatist message about that enterprise’s culture of security. The necessary
negative reinforcement is to hold executives to the highest standards of compliance.
In particular, security professionals should never bypass security.4
Bruce Schneier, the noted information security specialist, has also addressed the
topic of perverse incentives.
7.2 Vigilance
The saying long attributed to US president Thomas Jefferson holds that "the price
of freedom is eternal vigilance." Although an enterprise's security culture is
important, it cannot be compared with the concept of freedom; still, vigilance is the price of both. Not
only do enterprises need to watch for shortcomings in security itself, but they must
also constantly observe themselves and look for signs of reversion in the culture.
As difficult as it is to strengthen a security culture, it is all too easy to backslide and
return to old, bad habits for all the reasons expressed in section 4 and more.
Such backsliding threatens the enterprise as a whole while, at the same time, being an opportunity for improved management.
Unfortunately, cultural weaknesses are rarely self-correcting. It is impossible
to break out of a downward slide without deliberate management attention. An
enterprise with an intentional security culture must constantly manage it.7
The first things for which to look are those that an enterprise does not have:
a dedicated information security function, security policies and standards,
enforcement mechanisms, automated security safeguards, or tight access controls.
(Of course, if these do not exist, it is difficult to substantiate the claim that an
intentional security culture exists at all.) Assuming that these are present, a leading
indicator of cultural weakness is the number of disputes concerning security that
must be resolved by senior management. If each new initiative, system or product
causes a confrontation between security professionals and business leaders, there is
clearly something amiss.
The matter of balance comes into play. CISOs are not always right, but they are
not always wrong, either. If the majority of disagreements are settled in favor of
greater security, it signifies that greater awareness and training are required in
certain business areas. Training should not be seen as negative reinforcement, but
when it is remedial training, it clearly has a negative impact. However, if senior
management regularly overrides security, then perhaps the security professionals
have lost sight of the overall business objectives, are out of step with management’s
directions, or have simply lost balance and underestimated the enterprise’s risk
appetite. Here the negative reinforcement is much more direct; always being
overruled generally leads away from compensation; promotion; influence; and,
ultimately, employment.
Much more difficult to deal with, but perhaps more insidious, are casual
conversations that downplay the importance of security. If people discuss
information concerning a customer, client or patient in a public space, they are not
only violating implicit policy, but they are undermining the enterprise’s security
culture. If they carry home sensitive information on a CD-ROM or thumb drive,
they are not only leaking data, but diminishing the culture of security. When
someone shares a password, the security culture suffers. As bad as these sorts
of activities are, from a cultural perspective, it is even worse when someone—
especially a manager—approves such behavior.
This sabotage of a security culture can be stopped, but only if managers and
colleagues are vigilant about doing so. It may seem difficult for an enterprise to
stop such attitudes and actions, but in recent years, it has been done with regard to
differences in race, gender and sexual orientation. Through training; admonition;
and, above all else, negative reinforcement, cultures around the world have
changed. If unacceptable behavior and speech have not been eliminated, they have
been reduced. If enterprise cultures have been transformed with regard to long-held
prejudices, changing minds about security should be even easier.
If poor security is indicative of a poor security culture, then security failures are
leading indicators. These would not be outright security breaches, but all the little
(and some not so little) deficiencies in security that, if left uncorrected, may result
in a true breach over time. For example, an incorrectly entered password means
nothing by itself, but with a culture defined, in part, as a pattern of behavior, then
a pattern of incorrectly entered passwords may mean that a culture of security has
failed to reach into some part of an enterprise. The same may be said of standards
violations, unapplied patches, sensitive papers left on desktops and ill-considered
conversations in public places. At the very least, these show that someone was not
paying attention at the awareness sessions. At worst, such patterns may indicate a
deliberate ignorance of security requirements.
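A pattern of incorrectly entered passwords, as distinct from a single slip, can be surfaced with a simple tally. The log format and the threshold below are assumptions for illustration only:

```python
from collections import Counter

def flag_failure_patterns(events, threshold=5):
    """Given (department, outcome) login events, return the departments
    whose failed-attempt counts reach an assumed threshold -- a crude
    leading indicator that the security culture has not reached them."""
    failures = Counter(dept for dept, outcome in events if outcome == "failure")
    return {dept for dept, count in failures.items() if count >= threshold}

# Hypothetical events: one area shows a pattern, the other an isolated slip.
events = ([("research", "failure")] * 6
          + [("finance", "failure")] * 2
          + [("finance", "success")] * 10)
print(flag_failure_patterns(events))  # only the pattern, not the slip
```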
If the root cause proves to be ineffective training, then the security education
program can be enhanced. If certain individuals are ignoring or bypassing
security, they should be shown the possible repercussions of their actions and be
reprimanded. If it is management that is at fault, then deliberate reinforcement
of the tenets of security should be made. Moreover, managers must be shown
the correlation between the so-called “silly” requirements of security and an
enterprise’s overall risk profile.
While automated tools can assist in the vigilance required to maintain an effective
security culture, there is more to it than reviewing computer-generated reports.
Management9 at all levels must be attuned to the possibilities for weakening the
culture and be prepared to take appropriate action. To do so, managers need to be
aware of the indicators of backsliding. Some should raise concern, others should
instigate corrective action and still others should be a routine part of assuring that a
culture of security remains strong.
7.4.1 Alerts
Audit comments on a security culture are an effective means to keep management’s
level of awareness of the culture high, but audits are periodic affairs and occur only
after the fact. There should be indicators produced within a culture that can show
management where trouble spots may be arising and what to do about them before
they become troublesome. One means of doing so would be a “dashboard” related
to a security culture. (See section 6.4.1). The concept of a dashboard relates to that
of key performance indicators (KPIs). These are quantifiable measurements that can
be traced over time to show progress or regression.
What, then, are those goals and values that can be measured quantitatively?
[Figure: example dashboard tracing security-culture KPIs for several projects (Projects 2 through 6).]
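A dashboard built on such KPIs needs little machinery. A minimal sketch, with invented measurement names and values, might trace each indicator over time and classify its direction:

```python
def kpi_trend(series):
    """Classify a KPI time series as 'progress', 'regression' or 'flat'
    by comparing the latest value with the earliest. This sketch assumes
    lower is better (e.g., policy violations per month)."""
    first, last = series[0], series[-1]
    if last < first:
        return "progress"
    if last > first:
        return "regression"
    return "flat"

# Hypothetical monthly violation counts for two of the dashboard's projects
dashboard = {
    "Project 2": [14, 11, 9, 6],   # steadily improving
    "Project 3": [3, 4, 7, 8],     # quietly backsliding
}
for project, series in dashboard.items():
    print(project, kpi_trend(series))
```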
7.4.2 Alarms
In some cases, action must be taken at once to correct a growing problem. The
clearest case is the existence of actual security violations. These should be tracked
and reported to determine whether there is a pattern to the attacks and inappropriate
actions. Combating actual security weakness is the objective of an information
security function, but by itself, is not a cultural KPI. A measure of a security
culture would be the budget in terms of staff time and capital outlay to eliminate
a weakness once it is exposed and the time taken between an alarm and the
corresponding response.
Another sort of alarm that serves as a KPI of a security culture is the disputed
security requirement that is elevated to a senior level (see section 7.2), in which
case management must actually make a security-related decision. A dispute in and of
itself means little, but there are several matters to be aware of: how often these
disputes occur, from where they stem and how difficult it is to resolve them. If
nearly every project results in management intervention, then it is indicative of
problems, not only with the process for resolution, but in the culture itself. A
different story is told if disagreements arise routinely in certain sectors and not
in others or if disagreements regularly occur with the same type of information
(research, financials, personnel records, etc.). These should be alarms for
management to take a closer look at certain policies and people regarding those
sorts of information.
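The questions above (how often disputes occur, where they stem from and what information they concern) can be answered with a simple tally; the dispute records here are hypothetical:

```python
from collections import Counter

def dispute_patterns(disputes):
    """Count escalated security disputes by business sector and by
    information type so that clusters stand out for management review."""
    by_sector = Counter(d["sector"] for d in disputes)
    by_info_type = Counter(d["info_type"] for d in disputes)
    return by_sector, by_info_type

disputes = [
    {"sector": "research", "info_type": "research data"},
    {"sector": "research", "info_type": "research data"},
    {"sector": "sales", "info_type": "financials"},
]
by_sector, by_info = dispute_patterns(disputes)
print(by_sector.most_common(1))  # the sector where disagreements cluster
print(by_info.most_common(1))    # the information type most often disputed
```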
7.4.3 Triggers
There are planned and unplanned cultural triggers that necessitate management
action. The former are generally associated with the passage of time. For example,
management should consider an assessment of an enterprise’s security culture on
a regular basis, perhaps every few years (more or less dependent on the overall
corporate culture and the sensitivity of the information involved). This implies that
management is aware of a culture of security, recognizes its importance and supports
maintaining it at a high level. Where that is not the case, some party (i.e., information
security, the security champion, internal audit) may perform such an assessment
on its own, presenting the results to management and, one would hope, triggering
both further analysis and, ultimately, decisive action. A formal assessment is not a
necessity for creating an intentional culture of security (see section 5), but depending
on its findings, it may provide a badly needed “wake-up call.”
Not every issue is a momentous problem; many are simply the result of one-time
human error. These are easily corrected. The first step is to determine which
anomalies are triggers for root cause analysis and what frequency of occurrence constitutes a pattern worth investigating.
Typically, a root cause analysis will not result in an explanation that “A occurred
because of B.” Often, B was caused by C, which was caused by D, etc. For that
matter, it is not always the case that an event has a single cause. Therefore, it is
important to clarify the chain of causation leading to a defect. It is always necessary
to validate whether the negative KPIs in question are accurate; it may be that the
problem is in the measurement, not the process itself. The responsible person or
people should be interviewed to gain an understanding of the reason for the
cultural shortcoming.
The causes of a cultural weakness may not be clear, especially if it requires working
backward from effects to causes. Essentially, this necessitates asking, “If A had not
occurred, would B have happened?” in an iterative fashion. The causal chain should
be examined at each step to see whether the action suspected of causing the ultimate
cultural weakness did, in fact, contribute. At the end of the root cause analysis
process, it is necessary to reach conclusions as to the underlying causes of cultural
weaknesses. While it is important not to allow root cause analysis to become a search
for someone to blame, it is equally important to find and fix existential problems.
Affixing blame is self-defeating; it leads to the conclusion that the problem is caused
by ineffective people rather than dysfunctional processes or technology.
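The iterative questioning described above, "If A had not occurred, would B have happened?", can be sketched as a walk back along a recorded map of causes. The map assumes a single immediate cause per event, which, as the text notes, is not always the case; the event names are invented for illustration:

```python
def causal_chain(effect, causes):
    """Walk backward from an observed defect through a cause map
    (effect -> immediate cause) until no earlier cause is recorded,
    returning the full chain; the last entry is the candidate root cause."""
    chain = [effect]
    while chain[-1] in causes:
        chain.append(causes[chain[-1]])
    return chain

# Hypothetical chain: A occurred because of B, B because of C, C because of D
causes = {
    "sensitive papers left on desktops": "clean desk policy not enforced",
    "clean desk policy not enforced": "managers unaware of the policy",
    "managers unaware of the policy": "awareness training skipped",
}
print(causal_chain("sensitive papers left on desktops", causes))
```

Fixing the last link in the chain (the skipped training) rather than blaming the individual at the first link is the leverage the text describes.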
It is important to note that identification of a root cause provides the opportunity for
leverage. The resolution of a problem at its source may have wider ramifications
further down the line, but there are times when the cause really is individuals acting
in ways that are counter to an intentional security culture. In those cases, negative
reinforcement is called for, rather than simply blame.
If a security culture is to have any meaning, there are times when it is necessary to
take decisive and punitive action against those who determinedly defy everything
for which the culture stands. Notwithstanding all that has been said before in this
volume, security cannot always be positive, upbeat, supportive and reassuring.
Those who run counter to the culture must be warned of the consequences of their
actions and receive those consequences if behavior does not change. Behavioral
modification is sufficient; it is unrealistic to enforce changes in attitude. However,
generally, when people act differently, their hearts and minds do follow.
There are two analogous cultural changes in the business world that can be used
as examples of the effective use of negative reinforcement: smoking and sexual
harassment. It is true that these have not developed in the same way and at the
same speed in all parts of the world. There are still nations where people routinely
smoke in the office, and the treatment of coworkers is not the same everywhere.
Nonetheless, in many countries, management has made a commitment to the
health of employees and banned smoking in the workplace. Those who were told
to change their actions were, in the most literal sense, addicted to a behavior that
was harmful to themselves and those around them. There was resistance at first,
but change has been accomplished. In Spain, where there is a higher percentage of
smokers than in the rest of Europe, the smoking ban has been effective as applied
to public buildings and workplaces. Nonsmokers have no problems in government
buildings, airports, offices and so on.11 In Germany, some still smoke at work,
but only behind closed doors in their private offices.12 In China, progress against
smoking has been slower, “but even there, the smoking ban is mostly targeting
offices and public working areas.”13 The same may be said of all corners of the
globe: culture can be changed if the change is enforced.
As a cultural issue in the workplace, sexual harassment is in some ways different
from that of security and in others the same.
The point is that cultural attitudes and behaviors were forced to change by
public mores, to be sure, but also through certain managers saying that such
conduct was unacceptable. Despite the cultural differences as to what constitutes
permissible behavior, a consensus has arisen that certain behaviors and attitudes
are impermissible in the workplace.
This understanding did not simply occur haphazardly in businesses and government
agencies. The spirit of the times did change, but managers made it happen in
their enterprises by making it clear that certain behaviors and attitudes were
impermissible and taking forceful action to stop them.
7.5.1 Penalties
In some cases, it is not enough to reward good conduct. There are times when
penalties must be exacted for bad behavior. As applied to security, those penalties
range from reprimands to prosecution. The latter, of course, is reserved for very
bad behavior indeed: fraud, espionage, sabotage or theft. While criminal cases
are necessary when there are crimes, this actually does little to improve a security
culture. It is, in the minds of many, the exception that proves the rule. They do not
see themselves as criminals, and for the most part, they are not. To base security
on extreme cases leads to complacency for those in the middle. One of the tests of
criminality is intent, and most people who do not participate in a culture of security
intend no harm—they just cannot be bothered.
If a warning is insufficient, then more drastic action needs to be taken. People who
violate security policy by sharing passwords, disclosing sensitive information,
reading prohibited records or bypassing access restrictions should be told bluntly
that these actions are impermissible and that a memorandum will be added to
their personnel file or some such permanent record. Such a statement indicates
that the acts were noticed; that the offenders were rebuked; and, most important,
that the acts will have effect over time. The time in question may be when raises,
bonuses and promotions are given out. If people believe that they will be penalized
in material ways in the future, behavior (and maybe even attitudes) will change.
Regardless of what caused the violation, unless fired, the offenders need to be
reeducated in security policy and its importance.
7.5.2 Defiance
What of the people who deliberately refuse to follow security policy or to act in a
secure manner? There is no point in reminding them of the potential harm in failing
to secure information resources. They have indicated that they do not care. They,
too, are a part of an intentional culture of security; one that is dismissive, disdainful,
disobedient and defiant. These are not the people who violate security requirements
inadvertently. They see themselves apart from an enterprise’s expressed intent to
promote security, believing that it only interferes with some “higher purpose.”
These people are the most destructive of a security culture. If management fails to
counter their defiance, then management implicitly buys into their higher purpose. It
avails nothing to appeal to the organizational commitment of those who defy security
requirements; they have placed their own goals in front of the enterprise’s. The issue
is no longer security, but insubordination. That is exactly how smoking bans and
antiharassment policies have taken effect. Management no longer debates the relative
merits of a policy; it simply says that this is the policy and it must be observed.
This tough line is not drawn all at once. There should be a period of time in which
people learn how to behave under the security policies. As with nonsmoking
policies, the level of top management support is directly correlated with the speed
at which the enterprise becomes compliant. Where senior management strongly
supports security (or smoking or harassment prohibition) policies, they move ahead
without resistance. Once compliance with security policies is considered the norm,16
an intentional, positive security culture has been achieved. Those who refuse to
conform to normative behavior have to be removed from an enterprise.
The real proof of a security culture comes when otherwise valuable employees are
let go for refusing to protect the information with which they come in contact. This
is not a routine occurrence; many security cultures do need strengthening. However,
termination for cause does exist, and in the military and intelligence fields, it is
understood that compliance is mandatory. This attitude is also spreading to the
fields of education, health care and financial services.17
Being the person who causes a security problem should result in more than snide
comments at the water cooler. For example, reports of security and privacy
breaches due to loss of physical media (one aspect of data leakage) are so numerous
that they barely make the news.18 When they do happen, the people responsible
receive unwanted attention from their superiors. They also become participants in
an intentional culture of security, but only too late.
Endnotes
www.ehow.com/about_5142698_definition-key-performance-indicators.html
11. "Smoking Spain—A Really Tough Smoking Ban in Spain, or Not," Culture, www.bellaonline.com/articles/art16340.asp
13. "Future of China's Smoking Ban Looks Hazy," Wall Street Journal, USA, 26 July 2010, http://blogs.wsj.com/chinarealtime/2010/05/14/future-of-china%E2%80%99s-smoking-ban-looks-hazy/
14. Zembroff, Jennifer; "Cultural Differences in Perceptions of and Responses to Sexual
The Wright State document is notable for describing the particulars of just the sort of defiant behavior discussed here:
• Leaving a classified file or security container unlocked and unattended either during or after normal working hours
• Keeping classified material in a desk or unauthorized cabinet, container, or area
• Leaving classified material unsecured or unattended on desks, tables, cabinets or elsewhere in an unsecured area, either during or after normal working hours
• Reproducing or transmitting classified material without proper authorization
• Losing your security badge
• Removing classified material from the work area in order to work on it at home
• Granting a visitor, contractor, employee or any other person access to classified information without verifying both the individual's clearance level and need-to-know
• Discussing classified information over the telephone, other than a phone approved for classified discussion
• Discussing classified information in lobbies, cafeterias, corridors or any other public area where the discussion might be overheard
• Carrying safe combinations or computer passwords (identifiable as such) on one's person, writing them on calendar pads, keeping them in desk drawers, or otherwise failing to protect the security of a safe or computer
• Failing to mark classified documents properly
• Failing to follow appropriate procedures for destruction of classified material
18. A few recent cases have. See Wilson, Tim; "Two Major Breaches Caused
Management has the more difficult task of having to understand just how secure its
information needs to be to establish a culture at the right level. Clearly, information
in an intelligence agency calls for more security (and a tighter culture to protect it)
than in a bank, which may need more security than a pharmaceutical maker, which
may need more than a manufacturer of shoes, and so on. In short, managers, CISOs,
auditors and others need to confront the question: How good is good enough?
Figure 9 offers some metrics that may be applied to that decision. No enterprise
wants to have a culture that could be called “lagging” in security (or at least
no enterprise should want one). Yet, there are too many enterprises with senior
management that exhibit no support for security, middle managers who are
actively hostile to anything that limits their ability to do whatever they want, and
complacent staff and systems that show no evidence of security in their design or
operation. Although a security culture does exist in these enterprises, "lagging" is
too nice a word to describe it.
An enterprise that is aware of the need for security, but does not do enough to
achieve it, is little better. One could argue that having good intentions, but ignoring
them, is worse than having no intentions for security at all. If anything positive may
be said about an enterprise at this level of cultural maturity, it is that a champion is
more likely to emerge in one of these than in one totally oblivious to the need for
security and a culture supportive of it.
The metrics for a culture are not so well and sharply defined that anyone can say
that only this practice, this belief or this attitude would make the best culture of
security, as opposed to one that is merely good enough. In these circumstances, it
is too easy to make the best the enemy of the good. Yes, managers should want a
culture that will support the appropriate level of security in their enterprises, but it
is possible to overreach as well. The objective is not to keep building an intentional
security culture indefinitely nor to get to a certain point and then stop. Rather,
enterprises should always be aware of potential slippage, be vigilant and keep trying
to do better. In short, within the context of any business, the ideal security culture
will never be attained or, if it is, it will need to change with changing contexts.
As with any endeavor, there are distinct phases to the implementation of a security
culture. The first phase is always a dawning recognition that something should
be done. Then, there is the doing followed by the effort to sustain that which
was done. In the case of a security culture, it is a cyclical process because it is
never-ending.
[Figure: the implementation life cycle for a programme of cultural change, reconstructed from the diagram text. The phases are framed as questions around a continuous loop: 1 What are the drivers? (Initiate programme; Recognise need to act; Establish desire to change); 2 Where are we now? (Define problems and opportunities; Form implementation team; Assess current state); 3 Where do we want to be? (Define road map; Communicate outcome; Define target state); 4 What needs to be done? (Plan programme; Identify role players; Design and build improvements); 5 How do we get there? (Execute plan; Operate and use; Implement improvements); 6 Did we get there? (Realise benefits; Embed new approaches; Operate and measure); 7 How do we keep the momentum going? (Review effectiveness; Sustain; Monitor and evaluate). Source: ISACA, Implementing and Continually Improving IT Governance, USA, 2009, figure 5. The diagram's three rings represent the continual improvement life cycle (outer ring), change enablement (middle ring) and programme management (inner ring).]
The circle of champions will not constitute, in any real sense, an implementation
team. The champions will not hold regular meetings, produce any documents or come
up with a project plan. They will act as a group to build a new consensus within the
enterprise, and to that end, they must reach a consensus among themselves. Some will
be more aggressive and some more accepting of existing attitudes and ways of doing
things. To be effective, they must be able to articulate a common vision of what an
intentional culture would look like, how it would work in practice and how it would
affect the interests of others within that enterprise.
Priorities should balance the most important revisions against those most readily achieved. For example, it may
be thought that protecting data from unintended leakage, the greatest risk to an
enterprise, is the highest priority for implementation. Unfortunately, it may be quite
difficult to prevent well-meaning personnel from taking sensitive information home
to work on after hours, much less to stop deliberate theft of information. Therefore,
despite the priority of data leakage prevention, it may not be achievable in the short
term. Something less critical, such as a clean desk policy, may be more enforceable
and, hence, easier to attain in short order.
[Figure: the circle of champions reaches from senior management through middle management to staff. The surrounding steps read: Establish the need. Communicate the vision. Achieve initial objectives. Strike a balance. Sustain.]
8.2 Conclusion
Endnotes
1. ISACA, Implementing and Continually Improving IT Governance, USA, 2009, p. 35-36
2. Ibid., p. 36
3. Ibid.
BMIS-related Publication
• An Introduction to the Business Model for Information Security, 2009
COBIT-related Publications
• Aligning COBIT ® 4.1, ITIL® V3 and ISO/IEC 27002 for Business Benefit, 2008
• Building the Business Case for COBIT ® and Val ITTM: Executive Briefing, 2009
• COBIT ® and Application Controls, 2009
• COBIT ® Control Practices: Guidance to Achieve Control Objectives for Successful
IT Governance, 2nd Edition, 2007
• COBIT ® Mapping: Mapping of CMMI® for Development V1.2 With COBIT ® 4.1, 2011
• COBIT ® Mapping: Mapping of FFIEC With COBIT ® 4.1, 2010
• COBIT ® Mapping: Mapping of ISO/IEC 17799:2000 With COBIT ®, 2nd Edition, 2006
• COBIT ® Mapping: Mapping of ISO/IEC 17799:2005 With COBIT ® 4.0, 2006
• COBIT ® Mapping: Mapping of ISO/IEC 20000:2005 With COBIT ® 4.1, 2011
• COBIT ® Mapping: Mapping of ITIL® V3 With COBIT ® 4.1, 2008
• COBIT ® Mapping: Mapping of NIST SP 800-53 With COBIT ® 4.1, 2007
• COBIT ® Mapping: Mapping of PMBOK® With COBIT ® 4.0, 2006
• COBIT ® Mapping: Mapping of SEI’s CMM® for Software With COBIT ® 4.0, 2006
• COBIT ® Mapping: Mapping of TOGAF 8.1 With COBIT ® 4.0, 2007
• COBIT ® QuickstartTM, 2nd Edition, 2007
• COBIT ® Security BaselineTM, 2nd Edition, 2007
• COBIT ® User Guide for Service Managers, 2009
• Implementing and Continually Improving IT Governance, 2009
• IT Assurance Guide: Using COBIT ®, 2007
• IT Control Objectives for Basel II, 2007
• IT Control Objectives for Sarbanes-Oxley: The Role of IT in the Design and
Implementation of Internal Control Over Financial Reporting, 2nd Edition, 2006
• ITGI Enables ISO/IEC 38500:2008 Adoption, 2009
• SharePoint® Deployment and Governance Using COBIT ® 4.1: A Practical Approach, 2010
Academic Guidance
• IT Governance Using COBIT ® and Val ITTM:
– Student Book, 2nd Edition, 2007
– Caselets, 2nd Edition, and Teaching Notes, 2007
– TIBO Case Study, 2nd Edition, and Teaching Notes, 2007 (Spanish translation also available)
– Presentation, 2nd Edition, 2007 (35-slide PowerPoint deck on COBIT)
– Caselets, 3rd Edition, and Teaching Notes, 2010
– City Medical Center Case Study, 3rd Edition, and Teaching Notes, 2010
• Information Security Using the CISM® Review Manual and BMISTM:
– Caselets, 2010
– More4Less Foods Case Study, 2010
– Caselets and More4Less Foods Case Study—Teaching Notes, 2010
Practitioner Guidance
• Audit/Assurance Programs:
– ApacheTM Web Services Server Audit/Assurance Program, 2010
– Change Management Audit/Assurance Program, 2009
– Cloud Computing Management Audit/Assurance Program, 2010
– Crisis Management Audit/Assurance Program, 2010
– Generic Application Audit/Assurance Program, 2009
– Identity Management Audit/Assurance Program, 2009
– Information Security Management Audit/Assurance Program, 2010
– IT Continuity Planning Audit/Assurance Program, 2009
– Microsoft® Internet Information Services (IIS) 7 Web Services Server Audit/Assurance Program, 2011
– Mobile Computing Security Audit/Assurance Program, 2010
– MySQLTM Server Audit/Assurance Program, 2010
– Network Perimeter Security Audit/Assurance Program, 2009
– Outsourced IT Environments Audit/Assurance Program, 2009
– Security Incident Management Audit/Assurance Program, 2009
– Social Media Audit/Assurance Program, 2011
– Systems Development and Project Management Audit/Assurance Program, 2009
– UNIX/LINUX Operating System Security Audit/Assurance Program, 2009
– VMware® Server Virtualization Audit/Assurance Program, 2011
– Windows Active Directory Audit/Assurance Program, 2010
– z/OS Security Audit/Assurance Program, 2009
• Creating a Culture of Security, 2011
• Cybercrime: Incident Response and Digital Forensics, 2005
• Enterprise Identity Management: Managing Secure and Controllable Access in the
Extended Enterprise Environment, 2004
• Information Security Career Progression Survey Results, 2008
• Information Security Harmonisation—Classification of Global Guidance, 2005
• Monitoring Internal Control Systems and IT, 2010
• OS/390—z/OS: Security, Control and Audit Features, 2003
• Peer-to-peer Networking Security and Control, 2003
• Risks of Customer Relationship Management: A Security, Control and Audit Approach, 2003
• Security Awareness: Best Practices to Serve Your Enterprise, 2005
• Security Critical Issues, 2005
• Security Provisioning: Managing Access in Extended Enterprises, 2002
• Stepping Through the InfoSec Program, 2007
• Stepping Through the IS Audit, 2nd Edition, 2004
• Technical and Risk Management Reference Series:
– Security, Audit and Control Features Oracle® Database, 3rd Edition, 2009
– Security, Audit and Control Features Oracle® E-Business Suite, 3rd Edition, 2010
– Security, Audit and Control Features PeopleSoft, 2nd Edition, 2006
– Security, Audit and Control Features SAP® ERP, 3rd Edition, 2009
• Top Business/Technology Survey Results, 2008