DOI:10.1145/3282487

Cybersecurity design reduces the risk of system failure from cyberattack, aiming to maximize mission effectiveness.

BY O. SAMI SAYDJARI

Engineering Trustworthy Systems: A Principled Approach to Cybersecurity
CYBERATTACKS ARE INCREASING in frequency, severity, and sophistication. Target systems are becoming increasingly complex with a multitude of subtle dependencies. Designs and implementations continue to exhibit flaws that could be avoided with well-known computer-science and engineering techniques. Cybersecurity technology is advancing, but too slowly to keep pace with the threat. In short, cybersecurity is losing the escalation battle with cyberattack. The results include mounting damages in the hundreds of billions of dollars,4 erosion of trust in conducting business and collaboration in cyberspace, and risk of a series of catastrophic events that could cause crippling damage to companies and even entire countries. Cyberspace is unsafe and is becoming less safe every day.

The cybersecurity discipline has created useful technology against aspects of the expansive space of possible cyberattacks. Through many real-life engagements between cyberattackers and defenders, both sides have learned a great deal about how to

key insights
˲˲ Cybersecurity must be practiced as a principled engineering discipline.
˲˲ Many principles derive from insight into the nature of how cyberattacks succeed.
˲˲ Defense in depth and breadth is required to cover the spectrum of cyberattack classes.

JUNE 2019 | VOL. 62 | NO. 6 | COMMUNICATIONS OF THE ACM
contributed articles

design attacks and defenses. It is now time to begin abstracting and codifying this knowledge into principles of cybersecurity engineering. Such principles offer an opportunity to multiply the effectiveness of existing technology and mature the discipline so that new knowledge has a solid foundation on which to build.

Engineering Trustworthy Systems8 contains 223 principles organized into 25 chapters. This article will address 10 of the most fundamental principles that span several important categories and will offer rationale and some guidance on application of those principles to design. Under each primary principle, related principles are also included as part of the discussion.

For those so inclined to read more in Engineering Trustworthy Systems, after each stated principle is a reference of the form "{x.y}" where x is the chapter number in which it appears and y is the y-th principle listed in that chapter (the principles are not explicitly numbered in the book).

Motivation
Society has reached a point where it is inexorably dependent on trustworthy systems. Just-in-time manufacturing, while achieving great efficiencies, creates great fragility to cyberattack, amplifying risk by allowing effects to propagate to multiple systems {01.06}. This means that the potential harm from a cyberattack is increasing and now poses an existential threat to institutions. Cybersecurity is no longer the exclusive realm of the geeks and nerds, but now must be considered an essential risk to manage alongside other major risks to the existence of those institutions.

The need for trustworthy systems extends well beyond pure technology. Virtually everything is a system from some perspective. In particular, essential societal functions such as the military, law enforcement, courts, societal safety nets, and the election process are all systems. People and their beliefs are systems and form a component of larger societal systems, such as voting. In 2016, the world saw cyberattacks transcend technology targets to that of wetware—human beliefs and propensity to action. The notion of hacking democracy itself came to light,10 posing an existential threat to entire governments and ways of life through what is sometimes known by the military as influence operations {24.09}.6

Before launching into the principles, one more important point needs to be made: Engineers are responsible for the safety and security of the systems they build {19.13}. In a conversation with my mentor's mentor, I once made the mistake of using the word customer to refer to those using the cybersecurity systems we were designing. I will always remember him sharply cutting me off and telling me that they were "clients, not customers." He said, "Used-car salesmen have customers; we have clients." Like doctors and lawyers, engineers have a solemn and high moral responsibility to do the right thing and keep those who use our systems safe from harm to the maximum extent possible, while informing them of the risks they take when using our systems.

In The Thin Book of Naming Elephants,3 the authors describe how the National Aeronautics and Space Administration (NASA) shuttle-engineering culture slowly and unintentionally transmogrified from one adhering to a policy of "safety first" to "better, faster, cheaper." This change discouraged engineers from telling truth to power, including estimating the actual probability of shuttle-launch failure. Management needed the probability of launch failure to be less than 1 in 100,000 to allow launch. Any other answer was an annoyance and interfered with on-time and on-schedule launches. In an independent assessment, Richard Feynman found that when engineers were allowed to speak freely, they calculated the actual failure probability to be 1 in 100.5 The engineering cultural failure killed many great and brave souls in two separate shuttle accidents.

I wrote Engineering Trustworthy Systems and this article to help enable and encourage engineers to take full charge of explicitly and intentionally managing system risk, from the ground up, in partnership with management and other key stakeholders.

Principles
It was no easy task to choose only 5% of the principles to discuss. When in doubt, I chose principles that may be less obvious to the reader, to pique curiosity and to attract more computer scientists and engineers to this important problem area. The ordering here is completely different than in the book so as to provide a logical flow of the presented subset.

Each primary principle includes a description of what the principle entails, a rationale for the creation of the principle, and a brief discussion of the implications on the cybersecurity discipline and its practice.

˲˲ Cybersecurity's goal is to optimize mission effectiveness {03.01}.
Description. Systems have a primary purpose or mission—to sell widgets, manage money, control chemical plants, manufacture parts, connect people, defend countries, fly airplanes, and so on. Systems generate mission value at a rate that is affected by the probability of failure from a multitude of causes, including cyberattack. The purpose of cybersecurity design is to reduce the probability of failure from cyberattack so as to maximize mission effectiveness.
Rationale. Some cybersecurity engineers mistakenly believe that their goal is to maximize cybersecurity under a given budget constraint. This excessively narrow view misapprehends the nature of the engineering trade-offs with other aspects of system design and causes significant frustration among the cybersecurity designers, stakeholders in the mission system, and senior management (who must often adjudicate disputes between these teams). In reality, all teams are trying to optimize mission effectiveness. This realization places them in a collegial rather than an adversarial relationship.
Implications. Cybersecurity is always in a trade-off with mission functionality, performance, cost, ease-of-use and many other important factors. These trade-offs must be intentionally and explicitly managed. It is only in consideration of the bigger picture of optimizing mission that these trade-offs can be made in a reasoned manner.

˲˲ Cybersecurity is about understanding and mitigating risk {02.01}.
Description. Risk is the primary metric of cybersecurity. Therefore, understanding the nature and source of risk is key to applying and advancing the discipline. Risk measurement is foundational to improving cybersecurity {17.04}. Conceptually, cybersecurity risk is simply the probability of cyberattacks occurring multiplied by the potential damages that would result if they actually occurred. Estimating both of these quantities is challenging, but possible.
Rationale. Engineering disciplines require metrics to "characterize the nature of what is and why it is that way, evaluate the quality of a system, predict system performance under a variety of environments and situations, and compare and improve systems continuously."7 Without a metric, it is not possible to decide whether one system is better than another. Many fellow cybersecurity engineers complain that risk is difficult to measure and especially difficult to quantify, but proceeding without a metric is impossible. Thus, doing the hard work required to measure risk, with a reasonable uncertainty interval, is an essential part of the cybersecurity discipline. Sometimes, it seems that the cybersecurity community spends more energy complaining how difficult metrics are to create and measure accurately than getting on with creating and measuring them.
Implications. With risk as the primary metric, risk-reduction becomes the primary value and benefit from any cybersecurity measure—technological or otherwise. Total cost of cybersecurity, on the other hand, is calculated in terms of the direct cost of procuring, deploying, and maintaining the cybersecurity mechanism as well as the indirect costs of mission impacts such as performance degradation, delay to market, capacity reductions, and usability. With risk-reduction as a benefit metric and an understanding of total costs, one can then reasonably compare alternate cybersecurity approaches in terms of risk-reduction return on investment. For example, it is often the case that there are no-brainer actions such as properly configuring existing security mechanisms (for example, firewalls and intrusion detection systems) that cost very little but significantly reduce the probability of successful cyberattack. Picking such low-hanging fruit should be the first step that any organization takes to improving its operational cybersecurity posture.

˲˲ Theories of security come from theories of insecurity {02.03}.
Description. One of the most important yet subtle aspects of an engineering discipline is understanding how to think about it—the underlying attitude that feeds insight. In the same way that failure motivates and informs dependability principles, cyberattack motivates and informs cybersecurity principles. Ideas on how to effectively defend a system, both during design and operation, must come from an understanding of how cyberattacks succeed.
Rationale. How does one prevent attacks if one does not know the mechanism by which attacks succeed? How does one detect attacks without knowing how attacks manifest? It is not possible. Thus, students of cybersecurity must be students of cyberattacks and adversarial behavior.
Implications. Cybersecurity engineers and practitioners should take courses and read books on ethical hacking. They should study cyberattack and particularly the post-attack analysis performed by experts and published or spoken about at conferences such as Black Hat and DEF CON. They should perform attacks within lab environments designed specifically to allow for safe experimentation. Lastly, when successful attacks do occur, cybersecurity analysts must closely study them for root causes and the implications for improved component design, improved operations, improved architecture, and improved policy. "Understanding failure is the key to success" {07.04}. For example, the five-whys analysis technique used by the National Transportation Safety Board (NTSB) to investigate aviation accidents9 is useful to replicate and adapt to mining all the useful hard-earned defense information from the pain of a successful cyberattack.

˲˲ Espionage, sabotage, and influence are goals underlying cyberattack {06.02}.
Description. Understanding adversaries requires understanding their motivations and strategic goals. Adversaries have three basic categories of goals: espionage—stealing secrets to gain an unearned value or to destroy value by revealing stolen secrets; sabotage—hampering operations to slow progress, provide competitive advantage, or to destroy for ideological purposes; and influence—affecting decisions and outcomes to favor an adversary's interests and goals, usually at
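As an aside on the risk principle above, the arithmetic—expected loss as probability of attack times resulting damage, and countermeasures compared by risk-reduction return on investment—can be sketched in a few lines. All scenario names and figures below are hypothetical, invented purely for illustration.

```python
# Hedged sketch: comparing countermeasures by risk-reduction return on
# investment. Every scenario name and number here is hypothetical.

def annual_risk(p_attack_per_year, damage):
    """Expected annual loss: probability of a successful attack times damage."""
    return p_attack_per_year * damage

def risk_reduction_roi(baseline_risk, residual_risk, total_cost):
    """Risk reduction bought per unit of total cost (procure + deploy + mission impact)."""
    return (baseline_risk - residual_risk) / total_cost

# Baseline: a 30% annual chance of a $10M-damage attack (illustrative only).
baseline = annual_risk(0.30, 10_000_000)

# Two hypothetical candidate measures and their estimated effects.
candidates = {
    "reconfigure existing firewalls": {"residual_p": 0.20, "total_cost": 50_000},
    "new intrusion-prevention product": {"residual_p": 0.15, "total_cost": 1_500_000},
}

for name, c in candidates.items():
    residual = annual_risk(c["residual_p"], 10_000_000)
    roi = risk_reduction_roi(baseline, residual, c["total_cost"])
    print(f"{name}: residual risk ${residual:,.0f}, ROI {roi:.1f}x")
```

Under these invented numbers, the cheap reconfiguration buys far more risk reduction per dollar than the expensive product—exactly the "low-hanging fruit" comparison the principle calls for.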


the expense of those of the defender.
Rationale. Understanding the strategic goals of adversaries illuminates their value system. A value system suggests which attack goals a potential adversary might invest most heavily in, and perhaps gives insight into how they will pursue those goals. Different adversaries will place different weights on different goals within each of the three categories. Each will also be willing to spend different amounts to achieve their goals. Clearly, a nation-state intelligence organization, a transnational terrorist group, organized crime, a hacktivist and a misguided teenager trying to learn more about cyberattacks all have very different profiles with respect to these goals and their investment levels. These differences affect their respective behaviors with respect to different cybersecurity architectures.
Implications. In addition to informing the cybersecurity designer and operator (one who monitors status and controls the cybersecurity subsystem in real time), understanding attacker goals allows cybersecurity analysts to construct goal-oriented attack trees that are extraordinarily useful in guiding design and operation because they give insight into attack probability and attack sequencing. Attack sequencing, in turn, gives insight into getting ahead of attackers at interdiction points within the attack step sequencing {23.18}.

˲˲ Assume your adversary knows your system well and is inside it {06.05}.
Description. Secrecy is fleeting and thus should never be depended upon more than is absolutely necessary {03.05}. This is true of data but applies even more strongly with respect to the system itself {05.11}. It is unwise to make rash and unfounded assumptions that cannot be proven with regard to what a potential adversary may or may not know. It is much safer to assume they know at least as much as the designer does about the system. Beyond adversary knowledge of the system, a good designer makes the stronger assumption that an adversary has managed to co-opt at least part of the system sometime during its life cycle. It must be assumed that an adversary changed a component to have some degree of control over its function so as to operate as the adversary's inside agent.
Rationale. First, there are many opportunities for a system design and implementation to be exposed and subverted along its entire life cycle. Early development work is rarely protected very carefully. System components are often reused from previous projects or open source. Malicious changes can easily escape notice during system integration and testing because of the complexity of the software and hardware in modern systems. The maintenance and update phases are also vulnerable to both espionage and sabotage. The adversary also has an opportunity to stealthily study a system during operation by infiltrating and observing the system, learning how the system works in reality, not just how it was intended by the designer (which can be significantly different, especially after an appreciable time in operation). Second, the potential failure from making too weak an assumption could be catastrophic to the system's mission, whereas making strong assumptions merely could make the system more expensive. Clearly, both probability (driven by opportunity) and prudence suggest making the more conservative assumptions.
Implications. The implications of assuming the adversary knows the system at least as well as the designers and operators are significant. This principle means that cybersecurity designers must spend a substantial amount of resources: minimizing the probability of flaws in design and implementation through the design process itself, and performing extensive testing, including penetration and red-team testing focused specifically on looking at the system from an adversary perspective. The principle also implies a cybersecurity engineer must understand the residual risks in terms of any known weaknesses. The design must compensate for those weaknesses through architecture (for example, specifically focusing the intrusion detection system to monitor possible exploitation of those weaknesses), as opposed to hoping the adversary does not find them because they are "buried too deep" or, worse yet, because the defender believes that the attacker is "not that sophisticated." Underestimating the attacker is hubris. As the saying goes: pride comes before the fall {06.04}.
Assuming the attacker is (partially) inside the system requires the designer
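The goal-oriented attack trees mentioned above can be sketched as a small data structure: an OR node succeeds if any child attack step succeeds, an AND node only if every child does. This is a hedged sketch—the goals and probabilities are hypothetical, and the independence of steps is a simplifying assumption, not a claim about real adversaries.

```python
# Hedged sketch of a goal-oriented attack tree. Goals and probabilities are
# hypothetical; real trees are built from threat analysis, not guesses.
from dataclasses import dataclass, field

@dataclass
class AttackNode:
    goal: str
    kind: str = "leaf"          # "leaf", "and", or "or"
    p: float = 0.0              # success probability of a leaf attack step
    children: list = field(default_factory=list)

    def success_probability(self):
        if self.kind == "leaf":
            return self.p
        child_ps = [c.success_probability() for c in self.children]
        if self.kind == "and":  # adversary must achieve every sub-goal
            prob = 1.0
            for cp in child_ps:
                prob *= cp
            return prob
        # "or": adversary needs any one sub-goal; independence assumed
        fail = 1.0
        for cp in child_ps:
            fail *= (1.0 - cp)
        return 1.0 - fail

# Hypothetical espionage goal, decomposed into sequenced sub-goals.
steal_secrets = AttackNode("exfiltrate design documents", "and", children=[
    AttackNode("gain a foothold", "or", children=[
        AttackNode("phish an employee", p=0.4),
        AttackNode("exploit unpatched VPN", p=0.2),
    ]),
    AttackNode("escalate privilege", p=0.5),
])
print(f"rough success probability: {steal_secrets.success_probability():.2f}")
```

Even this toy version shows why such trees guide interdiction: lowering the probability of any step on an AND path (say, the privilege escalation) lowers the whole goal's probability.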


to create virtual bulkheads in the system and to detect and thwart attacks propagating from one part of the system (where the attacker may have a toehold) to the next. This is a wise approach because many sophisticated attacks, such as worms, often propagate within the system once they find their way in (for example, through a phishing attack on an unsuspecting user who clicked on an attacker's malicious link in an email message).

˲˲ Without integrity, no other cybersecurity properties matter {03.06}.
Description. Cybersecurity is sometimes characterized as having three pillars, using the mnemonic C-I-A: preserving confidentiality of data, ensuring the integrity of both the data and the system, and ensuring the availability of the system to provide the services for which it was designed. Sometimes, cybersecurity engineers become hyperfocused on one pillar to the exclusion of adequate attention to the others. This is particularly true of cybersecurity engineers who have their roots in U.S. Department of Defense (DoD) cybersecurity because confidentiality of classified data is a high-priority concern in the DoD. The reality is that all other system properties depend on system integrity, which therefore has primacy.
Rationale. System integrity is the single most important property because, without it, no other system properties are possible. No matter what properties a system may possess when deployed, they can be immediately subverted by the attacker altering the system to undo those properties and replace them with properties desirable to the attacker. This gives rise to the fundamental concept of the reference monitor {20.02}, which requires that the security-critical subsystem be correct (perform the required security functions), non-bypassable (so that the attacker cannot circumvent the correct controls to access protected resources), and tamperproof (so the system cannot be altered without authorization).
Implications. This primacy-of-integrity principle means that cybersecurity engineers must focus attention on access control to the system as a first priority, including heavy monitoring of the system for any unauthorized changes. This priority extends to the earlier stages of the system life cycle such as update distribution and maintenance.

˲˲ An attacker's priority target is the cybersecurity system {19.17}.
Description. Closely following from the primacy-of-integrity principle {03.06} is the criticality of the cybersecurity subsystem. To attack the mission, it is necessary first to disable any security controls that effectively defend against the adversary's attack path—including the security controls that defend the security subsystem itself. Great care must be taken to protect and monitor the cybersecurity subsystem carefully {23.12}.
Rationale. The security subsystem protects the mission system. Therefore, attempted attacks on the cybersecurity subsystem are harbingers of attacks on the mission system itself {22.08}. The cybersecurity system is therefore a prime target of the adversary because it is the key to attacking the mission system. Protection of the cybersecurity system is thus paramount {21.03}. For example, cybersecurity audit log integrity is important because attackers attempt to alter the log to hide evidence of their cyberattack activities.
Implications. The cybersecurity system must be carefully designed to itself be secure. The cybersecurity of the cybersecurity system cannot depend on any other less secure systems. Doing so creates an indirect avenue for attack. For example, if the identity and authentication process for the maintenance ports used to update the cybersecurity system uses simple passwords over remotely accessible network ports, that becomes the weakest link of the entire system. In addition, cybersecurity engineers cannot simply use the cybersecurity mechanism that the cybersecurity system provides to protect the mission systems. In other words, the cybersecurity system cannot use itself to protect itself; that creates a circular dependency that will almost certainly create an exploitable flaw an attacker can use. Lastly, the cybersecurity mechanisms are usually hosted on operating systems and underlying hardware, which become the underbelly of the cybersecurity system. That underbelly must be secured using different cybersecurity mechanisms, and it is best if those mechanisms can be as simple as possible. Complexity is the
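The reference-monitor concept above can be illustrated with a toy mediation sketch. It demonstrates only the non-bypassable idea—every access funnels through a single check—while correctness and tamperproofing require hardware support and assurance arguments far beyond any snippet. All subject and object names here are hypothetical.

```python
# Toy illustration of reference-monitor-style mediation. It shows only the
# "non-bypassable" property: there is no access path around the check.

class ReferenceMonitor:
    def __init__(self, policy):
        # policy maps (subject, object_name) -> set of allowed operations
        self._policy = policy

    def check(self, subject, obj, operation):
        return operation in self._policy.get((subject, obj), set())

class MediatedStore:
    """All reads and writes are funneled through the monitor."""
    def __init__(self, monitor):
        self._monitor = monitor
        self._data = {}

    def write(self, subject, name, value):
        if not self._monitor.check(subject, name, "write"):
            raise PermissionError(f"{subject} may not write {name}")
        self._data[name] = value

    def read(self, subject, name):
        if not self._monitor.check(subject, name, "read"):
            raise PermissionError(f"{subject} may not read {name}")
        return self._data[name]

# Hypothetical policy: alice administers the audit log; bob may only read it.
monitor = ReferenceMonitor({("alice", "audit-log"): {"read", "write"},
                            ("bob", "audit-log"): {"read"}})
store = MediatedStore(monitor)
store.write("alice", "audit-log", "login event")
print(store.read("bob", "audit-log"))
```

The audit-log example from the text maps directly onto this sketch: an attacker acting as "bob" cannot rewrite the log to hide evidence, because the write path does not exist outside the monitor's mediation.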


enemy of cybersecurity because of the difficulty of arguing that complex systems are correct {19.09}.

˲˲ Depth without breadth is useless; breadth without depth, weak {08.02}.
Description. Much ado has been made about the concept of defense in depth. The idea is often vaguely defined as layering cybersecurity approaches including people, diverse technology, and procedures to protect systems. Much more precision is needed for this concept to be truly useful to the cybersecurity design process. Layer how? With respect to what? The unspoken answer is the cyberattack space that covers the gamut of all possible attack classes as shown in the accompanying figure.

[Figure: Defense depth and breadth in a cyberattack. The attack space is depicted as a set of attack classes, where the size of each class corresponds to the number of attacks in it; shaded subsets mark the attack classes covered by a security control, yielding depths of 1, 2, and 3 in different regions.]

Rationale. One must achieve depth with respect to specified attack classes. Mechanisms that are useful against some attack classes are entirely useless against others. This focusing idea fosters an equally important companion principle: defense in breadth. If a cybersecurity designer creates excellent depth to the point of making a particular class of attack prohibitive to an adversary, the adversary may simply move to an alternative attack. Thus, one must cover the breadth of the attack space, in depth. Ideally, the depth will be such that all avenues of attack, for all attack classes, will be equally difficult, and above the cost and risk thresholds of the attackers.
Implications. This depth-and-breadth principle implies that the cybersecurity engineer must have a firm understanding of the entire spectrum of cyberattacks, not just a few attacks. More broadly, the principle suggests the cybersecurity community must develop better cyberattack taxonomies that capture the entire attack space, including hardware attacks, device controller attacks, operating system attacks, and cyberattacks used to affect the beliefs of people. Further, the principle also means that cybersecurity measures must be properly characterized in terms of their effectiveness against the various portions of the cyberattack space. Those who create or advocate for various measures or solutions will be responsible for creating specific claims about their cyberattack-space coverage, and analysts will be responsible for designing tests to thoroughly evaluate the validity of those claims. Lastly, cybersecurity architects will need to develop techniques for weaving together cybersecurity in ways that create true depth, measured by how the layers alter the probability of success an adversary will have for the targeted attack class. Said a different way, the effectiveness of depth could be measured by how miserable it makes an attacker's life.

˲˲ Failing to plan for failure guarantees catastrophic failure {20.06}.
Description. System failures are inevitable {19.01, 19.05}. Pretending otherwise is almost always catastrophic. This principle applies to both the mission system and the cybersecurity subsystem that protects the mission system. Cybersecurity engineers must understand that their systems, like all systems, are subject to failure. It is incumbent on those engineers to understand how their systems can possibly fail, including the failure of the underlying hardware and other systems on which they depend (for example, the microprocessors, the internal system bus, the network, memory, and external storage systems). A student of cybersecurity is a student of failure {07.01} and thus a student of dependability as a closely related discipline. Security requires reliability; reliability requires security {05.09}.
Rationale. Too many cybersecurity engineers forget that cybersecurity mechanisms are not endowed with magical powers of nonfailure. Requirements can be ambiguous and poorly interpreted, designs can be flawed, and implementation errors are no less likely in security code than in other code. Indeed, security code often has to handle complex timing issues and sometimes needs to be involved in hardware control. This involves significantly more complexity than normal systems and thus requires even more attention to failure avoidance, detection, and recovery {05.10}. Yet the average cybersecurity engineer today seems inadequately schooled in this important related discipline.
Implications. Cybersecurity engineering requires design using dependability engineering principles. This means that cybersecurity engineers must understand the nature and cause of faults, and how the activation of faults leads to errors, which can propagate and cause system failures.1 They must understand this not only with respect to the cybersecurity system they design, but all the systems on which the system depends and which depend on it, including the mission system itself.

˲˲ Strategy and tactics knowledge
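The depth-and-breadth principle above lends itself to simple bookkeeping: for each attack class, count the independent controls that cover it (depth), and flag the classes no control covers at all (breadth gaps). The attack classes and control names below are hypothetical examples, not a proposed taxonomy.

```python
# Hedged sketch: measuring defense depth per attack class and flagging
# breadth gaps. Attack classes and controls are hypothetical examples.

attack_classes = ["phishing", "malware", "hardware implant", "insider", "influence"]

# Which attack classes each deployed control covers (illustrative only).
controls = {
    "email filtering": {"phishing"},
    "endpoint detection": {"malware", "insider"},
    "security training": {"phishing", "influence"},
    "application allow-listing": {"malware"},
}

def depth_by_class(attack_classes, controls):
    """Depth = number of independent controls covering each attack class."""
    return {ac: sum(1 for covered in controls.values() if ac in covered)
            for ac in attack_classes}

depth = depth_by_class(attack_classes, controls)
gaps = [ac for ac, d in depth.items() if d == 0]
print("depth per class:", depth)
print("breadth gaps (no coverage):", gaps)
```

In this invented example the hardware-implant class has depth zero—the kind of breadth gap an adversary will happily move to once the well-defended classes become too expensive to attack.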


comes from attack encounters {01.09}.
Description. As important as good cybersecurity design is, good cybersecurity operations is at least as important. Each cybersecurity mechanism is usually highly configurable with hundreds, thousands, and even millions of possible settings (for example, the rule set of firewalls denying or permitting each combination of port, protocol, source address range, and destination address range). What are the optimal settings of all of these various mechanisms? The answer depends on variations in the mission and variations in the system environment, including attack attempts that may be ongoing. The settings are part of a trade-off space for addressing the entire spectrum of attacks. The reality is there is no static optimal setting for all cyberattack scenarios under all possible conditions {22.07}. Furthermore, dynamically setting the controls leads to a complex control-feedback problem {23.11}. Where does the knowledge come from regarding how to set the security control parameters according to the particulars of the current situation? It is extracted from the information that comes from analyzing cyberattack encounters, both real and simulated, both those that happen to one's own organization and those that happen to one's neighbors.
Rationale. There is certainly good theory, such as game-theory-based approaches,2 which one can develop about how to control the system effectively (for example, using standard control theory). On the other hand, practical experience plays an important role in learning how to effectively defend a system. This knowledge is called strategy (establishing high-level goals in a variety of different situations) and tactics (establishing effective near-term responses to attack steps the adversary takes).
Implications. Strategy and tactics knowledge must be actively sought, collected with intention (through analyzing real encounters, performing controlled experiments, and performing simulations {23.04}), curated, and effectively employed in the operations of a system. Cybersecurity systems must be designed to store, communicate, and use this knowledge effectively in the course of real operations. Plans based on this knowledge are sometimes called playbooks. They must be developed in advance of attacks {23.05} and must be broad enough {23.07} to handle a large variety of attack situations that are likely to occur in real-world operations. The process of thinking through responses to various cyberattack scenarios is, in itself, invaluable in the planning process {23.10}. Certain responses that may be contemplated during this process may need infrastructure (such as actuators) to execute the action accurately and quickly enough {23.15} to be effective. This insight will likely lead to design requirements for implementing such actuators as the system is improved.

The Future
Systematically extracting, presenting, and building the principles underlying trustworthy systems design is not the work of one cybersecurity engineer—not by a long shot. The task is difficult, daunting, complex, and never-ending. I mean here to present a beginning, not the last word on the matter. My goal is to encourage the formation of a community of cybersecurity and systems engineers strongly interested in maturing and advancing their discipline so that others may stand on their shoulders. This community is served by like-minded professionals sharing their thoughts, experiences, and results in papers, conferences, and over a beverage during informal gatherings. My book and this article are a call to action for this community to organize and work together toward the lofty goal of building the important underpinnings from a systems-engineering perspective.

Lastly, I will point out that cyberattack measures and cybersecurity countermeasures are in an eternal co-evolution and co-escalation {14.01}. Improvements to one discipline will inevitably create an evolutionary pressure on the other. This has at least two important implications. First, building the cybersecurity knowledge needed to build and operate trustworthy systems will require continuous and eternally vigilant attention. Second, communities on both sides need to be careful about where the co-evolution leads. Faster and faster cyberattacks will lead cybersecurity defenders to autonomic action and planning that may eventually be driven by artificial intelligence. Stronger and stronger cybersecurity measures that dynamically adapt to cyberattacks will similarly lead adversaries to more intelligent and autonomic adaptations in their cyberattacks. The road inevitably leads to machine-controlled autonomic action-counteraction and machine-driven adaptation and evolution of mechanisms. This may have surprising and potentially disastrous results for the system called humanity {25.02, 25.04}.

Acknowledgments
First and foremost, I acknowledge all of the formative conversations with my technical mentor, Brian Snow. He is a founding cybersecurity intellectual who has generously, gently, and wisely guided many minds throughout his illustrious career. Second, I thank the dozens of brilliant cybersecurity engineers and scientists with whom I have had the opportunity to work over the last three decades. Each has shone a light of insight from a different direction that helped me see the bigger picture of underlying principles.

References
1. Avizienis, A., Laprie, J.-C., and Randell, B. Fundamental concepts of dependability. In Proceedings of the 3rd IEEE Information Survivability Workshop (Boston, MA, Oct. 24–26). IEEE, 2000, 7–12.
2. Hamilton, S.N., Miller, W.L., Ott, A., and Saydjari, O.S. The role of game theory in information warfare. In Proceedings of the 4th Information Survivability Workshop, 2001.
3. Hammond, S.A. and Mayfield, A.B. The Thin Book of Naming Elephants: How to Surface Undiscussables for Greater Organizational Success. McGraw-Hill, New York, 2004, 290–292.
4. Morgan, S. Top 5 Cybersecurity Facts, Figures and Statistics for 2018. CSO Online; https://bit.ly/2KG6jJV
5. NASA. Report of the Presidential Commission on the Space Shuttle Challenger Accident. June 6, 1986; https://history.nasa.gov/rogersrep/genindex.htm
6. Rand Corporation. Foundations of Effective Influence Operations: A Framework for Enhancing Army Capabilities. Rand Corp., 2009; https://www.rand.org/content/dam/rand/pubs/monographs/2009/RAND_MG654.pdf
7. Saydjari, O.S. Why Measure? In Engineering Trustworthy Systems. McGraw-Hill, New York, 2018, 290–292.
8. Saydjari, O.S. Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time. McGraw-Hill Education, 2018.
9. Wiegmann, D. and Shappell, S.A. A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Ashgate Publishing, 2003.
10. Zarate, J.C. The Cyber Attacks on Democracy. The Catalyst 8 (Fall 2017); https://bit.ly/2IXttZr

O. Sami Saydjari (ssaydjari@gmail.com) is Founder and President of the Cyber Defense Agency, Inc., Clarksville, MD, USA.

Copyright held by author/owner. Publication rights licensed to ACM. $15.00.
