J Transp Secur (2013) 6:221–234

DOI 10.1007/s12198-013-0113-3

Developing immunity to flight security risk: prospective benefits
from considering aviation security as a socio-technical eco-system

Paul McFarlane & Mils Hills

Received: 6 March 2013 / Accepted: 11 March 2013 / Published online: 22 March 2013
© Springer Science+Business Media New York 2013

Abstract Since 9/11, preventing similar terrorist disasters has been the predominant goal of aviation security. Yet in this paper we seek to explore why, despite our increased knowledge of disaster causation, aviation security systems still remain vulnerable to future exploitation by adaptive terrorists and other threat groups. We adopt a novel approach and present early directions for applying the benefits of high-level appreciations of socio-technical and biological eco-systems to existing complex aviation transportation security systems. We propose that approaching aviation security as a complex socio-technical eco-system offers an opportunity to think beyond conventional methodologies and to improve system performance in ways that, hitherto, would not have been possible. The paper concerns itself with the capacity of aviation socio-technical eco-systems to proactively identify and mitigate pathogenic errors and violations. This narrow view is juxtaposed with identifying methods of reducing error creation before errors become system vulnerabilities. To address this problem, the paper concludes that a fresh approach, both conceptual and operational, is required: 'true' foresight of latent vulnerabilities can only be achieved by a system which is 'intelligent' and 'self-aware', in other words one able to identify and modify hostile pathogens before they are exploited. The development of true foresight in aviation security systems is critical to the prevention of future terrorist attacks.

P. McFarlane (*) · M. Hills
Northampton Business School, The University of Northampton,
Cottesbrooke 115, Boughton Green Road, Northampton NN2 7AL, UK
e-mail: paul.mcfarlane@northampton.ac.uk

M. Hills
e-mail: mils.hills@northampton.ac.uk

Keywords Transportation security · Aviation security · Socio-technical eco-system · Human error · Security technologies · Foresight

“Our world is fundamentally a socio-technical world, a world deeply characterised by human and technological interactions: human organisations are living systems and should be analysed accordingly” (Emery and Trist 1981).

“We cannot prevent the creation of latent failures; we can only make their adverse consequences visible before they combine with local triggers to breach the system’s defences” (Reason 1995).

Introduction

In this paper we present early directions in developing a novel approach that contributes to the established theoretical discourse around transportation security systems. This unconventional perspective seeks to present a new way of thinking about vulnerabilities in complex socio-technical transportation systems that are essential to global security. The system we are particularly concerned with is commercial aviation security: the complex and inter-dependent human and technological systems intended to facilitate the free flow of people and goods whilst at the same time being resilient to direct attack by terrorists and other threat groups. We approach this topic as practitioners in related fields with real-world experience, as well as academics with a commitment to improving transportation security in practical terms.
As a priority, we are particularly concerned with addressing the problem of how concealed (or pathogenic) human errors and vulnerabilities within aviation security systems can be discovered, and mitigated, before adaptive terrorist, organised crime or other threat groups skilfully exploit them. In doing so, we underscore the requirement to view such transportation security systems as socio-technical artefacts, and explain why the analogy of biological eco-systems that self-regulate against malign behaviour is of potential value in enhancing existing aviation security systems.
Situated in this context, the purpose of this paper is to stimulate comment about the challenges and opportunities presented by an alternative approach to developing aviation security systems, one that considers the benefit of applying high-level appreciations from socio-technical and adaptive eco-systems. The benefit is that pathogenic and human-mediated technical errors and vulnerabilities can be revealed and mitigated before they are exploited by adaptive terrorists. In the next section we discuss the necessity for a change of approach in the context of the constant challenge faced by aviation security systems. We then turn to explain how aviation security systems may become more effective, and introduce the concept of aviation security as a socio-technical eco-system. Finally, having identified an area that has not benefited from substantial or novel academic attention, the paper concludes by proposing that a radically fresh approach could yield significant improvements to aviation security systems.

Background: the constant challenge

The events of 11 September 2001 (henceforth, 9/11) presaged a new era of aviation terrorism and revealed that aviation security is not solely a matter of passenger safety: it is also fundamental to maintaining national security (Jenkins 2012). The prevention of further “terrorist disasters” (Perrow 2007, p.9) needs to be the predominant goal of global aviation security, because existing security systems continue to operate in a weakened state (Feakin 2011). And whilst the decapitation of al-Qaeda’s hierarchy appears to have reduced the threat, the interdiction of the 2012 plot to cause an explosion on board a U.S.-bound aircraft by a suicide bomber wearing a sophisticated “underpants bomb” (BBC 2012; MacAskill and Black 2012; Seper 2012) is a timely reminder that aviation remains firmly within the sights of terrorists, criminals and other threat groups.
Elwell (2008), amongst others, has clearly warned of the risk of recurrent attacks.
He states that:
“Terrorists will visit aviation again and again… [They] are learning from their
mistakes and from the mistakes of others … their attack methodologies [are]
chang[ing] quickly, adapting to new security measures” (p.11).
Yet despite this warning, we contend that aviation security systems have still not learned, and in particular have yet to develop the necessary foresight to overcome the capability of the adaptive and innovative terrorist. As far back as 1999, Wilkinson and Jenkins (1999, p.5) stressed that “we must never again allow our security to lag behind the tactics and weapons of the terrorist”; whilst this remains the case, the terrorists will always be one step ahead. We suggest that by developing and implementing a socio-technical, eco-system-based approach to aviation security, the ability to prevent this ‘security lag’ could be built in, improving existing aviation security systems.
We think it important to recognise that aviation (and indeed any form of transportation) security is, in part, only an illusory and transient deterrent, while at the same time claiming to be completely inviolable. This being the case, the adaptive terrorist has a relatively simple problem to resolve. They have only to look behind the security illusion, convinced that there must be weak points. In lifting this curtain, they identify system vulnerabilities and develop methodologies that could have a catastrophic effect or be used to transport a catastrophic effect elsewhere (e.g. if weapons, explosives, components or precursors are transported undetected to another destination). However, we are not pessimistic: in fact, we believe the issues are not insurmountable, because 9/11 and the other attempts to cause airborne explosions¹ on commercial aircraft were not improbable occurrences but series of carefully planned events which deliberately and skilfully exploited the inability of aviation security systems “to recognise the existence of ineffective countermeasures” (Woods 2006, p.24).

¹ See Richard Reid, ‘the shoe bomber’, 2001; Operation Overt, ‘the liquid bomb plot’, 2006; Umar Farouk Abdulmutallab, ‘the underpants bomber’, 2009; Operation Hemorrhage, ‘the printer cartridge plot’, 2010; and the 2012 plot to explode a U.S.-bound aircraft. All were attempts to cause explosions on commercial aircraft.

Because the 9/11 terrorists were innovative in their exploitation of ineffective countermeasures, the attacks were, for them, low cost. Why is it that outsiders with limited means and resources can find gaps in a system, yet system insiders cannot identify and close those gaps ahead of their malign exploitation? Why can system designers not apply this (available) knowledge and construct more reliable security systems which prevent the creation of embedded vulnerabilities, or neutralise them before they are available to, and asymmetrically exploited by, terrorists? The answer to how this state of affairs has arisen is provided, in part, by the evolution of aviation security. In just the same way that other complex socio-technical systems on which we rely have evolved, aviation security systems are the victim of “reactive and backward looking” (LaTourrette and Jenkins 2012, p.3) policies and decision-making. By endlessly adding new components to legacy systems, by bridging unanticipated gaps with new modules of technology, and by reverse-engineering from detected compromises to institute after-the-fact repairs and patches, this indispensable system, designed to preserve national and global security, continues to operate in a destabilised condition.
In addition, whilst describing and mitigating aviation terrorism has been the focus of significant academic writing, a review of the available literature reveals that current research is dominated by embryonic studies which still argue that the solution can be found in the development of ‘smarter’ security technologies, incorporating concepts such as the trusted traveller program (Riley 2011). In these studies, however, there is also a paucity of “detailed conceptual analysis” (Jenkins 2012, p.8) exploring how aviation security systems might move beyond reactively and retrospectively responding to intelligence on potential attack vectors, or to reports of failed attacks and near misses, by implementing new layers of technology (scanners, identity databases, behavioural profiling systems).

Applying socio-technical appreciations to aviation security systems

Peltu et al. explain that the “socio-technical approach is essentially about taking a holistic view of all relevant factors” (Peltu et al. 2008, p.21). In this case, the “joint optimisation of the social and technical systems” (Mumford 2006, p.321) encapsulates the interdependent relationships between humans and technology (Emery and Trist 1960). In the abstract, the technical subset comprises the processes, tasks and technology required to transform inputs into system outputs, while the social system relates to the attributes (attitudes, skills and values) and relationships of the people responsible for system processes and governance structures (Bostrom and Heinen 1977; Kaven et al. 1999; Margerison 1989). And whilst there are many examples of the application of socio-technical principles and appreciations in the organisational workplace, “socio-technical ideas are equally applicable and beneficial to other [unconventional] settings where technology is deployed” (Baxter and Sommerville 2011, p.6).
Above all, effective socio-technical system performance relies on the joint optimisation of the technical and social subsystems, because focusing upon one of these systems to the exclusion of the other is likely to lead to degraded system performance (Badham et al. 2000; Baxter and Sommerville 2011). With this in mind, the current asymmetry of investment in, and psychological reliance upon, complex technology makes it clear that aviation security has still not embraced how a socio-technical approach can assist our understanding of system vulnerabilities and threats.
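To make the notion of joint optimisation concrete, consider the following minimal sketch. It is our own illustration, not an established model: the numeric scores and the multiplicative form are expository assumptions, chosen so that strength in one subsystem cannot compensate for weakness in the other.

```python
# Toy model of joint optimisation (illustrative assumption, not an
# established formula): overall socio-technical performance is the product
# of the two subsystem scores, so over-investing in one subsystem while
# neglecting the other still yields a degraded whole.

def system_performance(social: float, technical: float) -> float:
    """Both scores lie in [0, 1]; the product penalises imbalance."""
    return social * technical

# Techno-centric configuration: heavy technology investment, neglected
# social subsystem (training, trust, procedures).
techno_centric = system_performance(social=0.30, technical=0.95)

# Jointly optimised configuration with the same total 'budget' of 1.25.
jointly_optimised = system_performance(social=0.625, technical=0.625)

print(f"techno-centric:    {techno_centric:.3f}")     # 0.285
print(f"jointly optimised: {jointly_optimised:.3f}")  # 0.391
```

Under this toy model, the balanced configuration outperforms the techno-centric one even though the total 'investment' is identical, which is the intuition behind joint optimisation.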
The literature offers no explanation of how this condition became orthodox, and there is no reason why aviation security systems cannot be contemplated as socio-technical systems; after all, other individual elements of the aviation industry, such as commercial aircraft (the adopted weapon and logistics network of the contemporary terrorist), are emphatically considered by the literature to be socio-technical systems.
What is most interesting is that the wider aviation industry is reliant upon intricate and autonomous technological solutions to “manage complex and ill structured risk problems” (Pidgeon and O’Leary 2000, p.16). And yet practitioners and academics have still not capitalised upon the benefits of adopting a socio-technical approach to aviation, or indeed to many other transportation security systems. Had the 9/11 Commission adopted a socio-technical approach during its inquiry, it would have revealed that, besides the failures of the intelligence agencies, human-mediated errors and vulnerabilities in system design and operation were significant in the success of the attacks. At this level, the modes of failure were in fact attributable to a biased reliance upon complex technologies and system processes (failures to search passengers after metal-detector alarm activations, and below-average use of metal detectors and x-ray machines) that were ineffective in the functions they sought to fulfil. As much as there have clearly been improvements post-9/11, aviation security systems are still constructed with layers of technological defences, each of which is a derivative of a previous mode of failure exploited by the terrorists. This biased reliance upon technology, and the retrospective application of further layers, does not consider the capability of the intelligent and adaptive terrorist to identify and exploit new latent vulnerabilities. And it is only because explosive devices have failed to detonate, or because of intelligence-led disruption of plots to cause explosions on commercial aircraft, that there have not been many more deaths in the air and on the ground. To understand how aviation security systems may become more effective, it is necessary to appreciate the system being examined.
The political and economic stability of many countries is dependent upon the effectiveness of these systems to safely facilitate the constant flow of people and products by commercial air transport. And, because of their vast complexity and scale, there is no doubt that aviation security systems can be colloquially described as the ‘mother of all socio-technical problems’. Amongst their many layers of complex technologies, these systems typically (and unavoidably) contain and conceal latent vulnerabilities. At all levels, there are a multitude of interactions between the human operators and the organisational processes which control their functioning. The global system encompasses hundreds of thousands of employees, in an array of operating environments, all having to comply with different and opposing organisational cultures and structures. Uncertainty requires the sharing of common information and data, which must have the capacity to permeate geographical, organisational and departmental boundaries. And, above all, the systems are required to interact with millions of citizens (the travelling public) whilst at the same time countering adaptive, motivated and intelligent terrorists and other threat groups who are constantly seeking to exploit the smallest of vulnerabilities in system technologies and operational processes.
So, to some extent it is understandable how aviation security systems become permeated with exploitable errors and vulnerabilities, and why it would be almost impossible to completely eradicate all of the modes of failure that are waiting to be discovered by hostile groups. But there is a way forward. Using a socio-technical approach, our analysis advances a series of propositions addressing three important areas which, in the opinion of the authors, form barriers to effective aviation security. The remainder of the paper is discussed from the perspective of the propositions set out below. Aviation security systems should:

i. Move away from an overly narrow focus on retrospective technological reliance to reduce the risk of future events;
ii. Establish the capability to embed active foresight during system design and modification; and
iii. Maintain system equilibrium by developing a better capacity to proactively identify and neutralise latent errors and vulnerabilities.

The paradoxical risk of technological reliance

The last 40 years of aviation security have created a bloated and unhealthily technology-dependent system. The automation of tasks through the use of technology (x-ray machines, full-body scanners and explosive detection devices) has sought to eradicate latent vulnerabilities which have been revealed and exploited by terrorists in the ways described earlier. Technology does have an important role in the system. But it is fundamental to socio-technical principles that technology is just one factor in the equation. And, when applying socio-technical appreciations to aviation security, it is clear that the existing system does in fact have a narrow techno-centric perspective. This itself creates potential for system vulnerabilities, because “Airport security based solely on technology is nearing its limits in thwarting the various physical threats to aviation” (Kirschenbaum et al. 2012b, p.374). The reason for this situation is twofold. Firstly, the human operator is unable to complete many of the complex security tasks which are now required. Secondly, to reduce cost, technology is employed to optimise security operations, making them more efficient at processing large volumes of passengers and freight. Nevertheless, balance must be applied, because these human-constructed systems create a paradox, and the risks set out by Bainbridge (1987) have been known for many years. Whilst this is the case in many other socio-technical systems, the pace of technological development in aviation security has blinded system designers to this paradoxical risk.
Bainbridge (1987) warns that, whilst modifying systems, designers will themselves contribute to the creation of further latent pathogens during the insertion of automated technologies. And the same designers will still leave the human operator to complete the tasks (e.g. bridging between technical systems) which they cannot automate. For the purpose of illustration, a timely reminder that Bainbridge’s propositions are still valid is set out in the report into the 2009 crash of Air France flight 447. In this socio-technical system, the report attributes blame to pilots being trained to place complete and unquestioning reliance upon automated technological systems. When these systems failed or offered competing data, latent system pathogens were revealed: the pilots were not sufficiently trained or experienced to recognise the dangerous situation the aircraft was in, and therefore did not apply corrective action (Bureau d’Enquetes et d’Analyses 2012). Technological reliance actually increases the risk of system failure because it generates further “vulnerabilities” and “error opportunities” (Dekker 2005, p.152) which are hidden, deeply buried in the intricate layers of the security system, and may only be discovered in extremis, when it is too late to act on the findings.
Similarly, in aviation security, the paradoxical risk of reliance upon these types of technologies is compounded by their relationship with the human system operators. The technologies require that the human operators trust, and have confidence in, their capability to perform automated functions. Nevertheless, Kirschenbaum et al. (2012a), in a recent study, found many examples of rule breaking and non-compliance with operating procedures because security staff did not have 100 % confidence in the automated technologies. Security-related decisions were often informal, avoiding the rules and regulations designed for the effective operation of the security system. And many automated threat warnings were simply ignored by security staff because they were assumed to be false alarms (Kirschenbaum et al. 2012b; Kirschenbaum and Mariani 2012). Put more simply, the decision to implement and use automated screening devices which, because of their incomplete design, have a propensity for false alarms increases the risk that the security operator will ignore the system when an alert is in fact genuine.
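The rationality of this distrust can be illustrated with a simple base-rate calculation. The figures below are illustrative assumptions, not data from the cited studies: when genuine threats are extremely rare, even a sensitive screening device produces overwhelmingly false alarms, and operators learn accordingly.

```python
# Illustrative base-rate calculation (all figures are assumptions, not data
# from Kirschenbaum et al.): probability that a given alarm is genuine.

p_threat = 1e-7        # prior probability a given passenger/bag is a threat
p_alarm_threat = 0.95  # detection rate (alarm given a genuine threat)
p_alarm_benign = 0.05  # false-alarm rate (alarm given a benign item)

# Bayes' theorem: P(threat | alarm)
p_alarm = p_alarm_threat * p_threat + p_alarm_benign * (1 - p_threat)
p_threat_given_alarm = p_alarm_threat * p_threat / p_alarm

print(f"P(threat | alarm) = {p_threat_given_alarm:.2e}")  # ~1.9e-06
```

Under these assumed rates, roughly two alarms in a million are genuine; an operator who discounts alarms is responding rationally to the base rate, which is precisely why device design cannot be considered in isolation from the social subsystem.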

Learning to develop system foresight

The next step, because we now know that terrorists have successfully exploited the limitations of technology in identifying particular explosive substances and component parts, is to ask what is known of the effectiveness of technological countermeasures before they are implemented into the security system. Just as with the transplantation of a donated organ into a patient’s body, what checks and tests are run against the new insertion? Are the technologies implemented in the knowledge that they have limitations (in the same way that companies produce commercial software and hardware and deal with known or potential problems through after-market patching and upgrades)? Or do the designers believe that they have developed technologies sufficient to mitigate the known risk, such that the vulnerability only becomes known once the terrorist overcomes it? If the terrorists can discover the vulnerabilities, why can system designers not apply the same knowledge to counter those vulnerabilities proactively? This, in part, is because technologies are not tested in their operational environment before they are implemented in aviation security systems. Operational testing should be mandatory, to identify and modify the modes of failure and vulnerabilities associated with the integration of technologies into the system (Hofer and Wetter 2012).
That said, the socio-technical principle of incompleteness has always emphasised that system design and modification should be a “reiterative” exercise (Cherns 1976, p.791) of constant review, and a “never ending process of improvement” (Peltu et al. 2008, p.20). Because of this, the socio-technical view is equally applicable to the retrospective analysis and understanding of socio-technical system failures. It allows the investigator to move between the domains and their relationships so that we are not drawn to focus upon “purely technical causes”, because we now know that “the majority of large scale [failures] arise from a combination of … factors” (Turner 1978, p.3). For investigating system failures and breakdowns, this methodology also offers a range of conceptual models that can be used to provide theoretical explanations of the causation of socio-technical disasters. Analysis of these current models (see Turner 1978; Shrivastava et al. 1988; Grabowski and Roberts 1996; Toft and Reynolds 2005; Ibrahim et al. 2002; Aini and Fakhrul-Razi 2010 for a comprehensive review) reveals, however, that their retrospective application contributes little to developing the necessary foresight in aviation security systems: foresight which could be used to modify and neutralise the vulnerabilities that are (or could be) so successfully exploited.
What is also interesting is that one of the most significant underlying problems of aviation security has been the failure to learn the lessons of previous terrorist attacks. This is not a failure to plug the holes in the system, but a failure to incorporate foresight into the system design when it is modified after an attack or detected risk event. A lack of foresight, or sheer neglect to include it, explains in simple terms why terrorists continue to be successful in their efforts to breach security mechanisms. In part, this is a consequence of aviation security being too narrowly focused on layered technological countermeasures as the panacea to prevent further attacks, rather than recognising them as its Achilles heel. Many of the difficulties encountered by aviation security systems could be resolved by adopting a socio-technical approach which incorporates not just learning from past events but also foresight to counter the terrorists’ asymmetric advantage.
The concept of “active foresight” (Toft and Reynolds 2005, p.126) is well
established in socio-technical theory. It neatly explains how organisations should
apply the learning from past experience to develop mitigation procedures, which
should reduce the risk of further events. Toft and Reynolds, in their exploration of the benefits of this notion, sought to overcome the “difficulty of imagining or predicting the relation[s] between people, technology and context” (Baxter and Sommerville 2011, p.9). To learn and adapt, aviation security does not have to rely upon waiting for further low-probability events. It can get ahead by active analysis of the terrorist
adversary in other operating environments where there is evidence of their modus
operandi being adapted to counter enhanced security measures (for example in
Afghanistan where Islamist insurgents constantly re-design their improvised explo-
sive devices to counter the technological detection and tactical avoidance capabilities
of the security forces). This acquired information and knowledge can then be
embedded into the system so that it can, in principle, autonomously implement
modification of known system vulnerabilities before they are exploited. The chal-
lenge, however, is twofold. Firstly, as has been discussed we cannot learn about
system vulnerabilities that have not yet been identified or exploited. And secondly,
we need to establish how this information could be collected and thereafter translated
into a format that on-going maintenance of the security system can take account of.
At the very least, in applying socio-technical appreciations, we will be compelled to think of how the concept of foresight can be applied to aviation security systems. Starting to think about it in this context is, we propose, a turn in the right direction.
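How might such active foresight be operationalised? The sketch below shows one possible shape of an intelligence-to-countermeasure loop, under stated assumptions: the data classes, the rule format and the derive_rules translation step are hypothetical constructs of ours, standing in for what would in practice be analyst-mediated intelligence processing rather than a deployed capability.

```python
# Hypothetical sketch of an 'active foresight' loop: observations of adversary
# adaptation in other operating environments are translated into candidate
# countermeasure rules and folded into the security system before the tactic
# reaches aviation. All names and structures here are illustrative.

from dataclasses import dataclass

@dataclass
class AdversaryObservation:
    theatre: str                 # e.g. an overseas operating environment
    tactic: str                  # observed modus operandi adaptation
    countermeasure_evaded: str   # the defence the adaptation defeats

@dataclass
class ScreeningRule:
    trigger: str
    response: str

def derive_rules(observations: list[AdversaryObservation]) -> list[ScreeningRule]:
    """Translate each observed adaptation into a candidate screening rule.
    In practice this step requires analyst judgement; here it is a placeholder."""
    return [
        ScreeningRule(
            trigger=f"indicators of tactic '{obs.tactic}'",
            response=f"secondary screening independent of '{obs.countermeasure_evaded}'",
        )
        for obs in observations
    ]

# Example: insurgent IED redesign observed abroad feeds a rule update at home.
intel = [AdversaryObservation("overseas-theatre", "non-metallic components",
                              "metal detection")]
for rule in derive_rules(intel):
    print(rule)
```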

An atypical approach for aviation security: towards the concept of a socio-technical eco-system

Aviation security systems are dynamic and ever-changing habitats which require the capacity to calibrate perceived models to actual environmental risks whilst the system is in operation. We have argued, using socio-technical appreciations, that the human, technical and social elements of aviation security systems need to be congruent and mutually supportive. And whilst there are benefits to aviation security being viewed as a complex socio-technical system, a further part of these early considerations is that this approach on its own will always be somewhat limited.
It is with this in mind that Hills (2012) points out that “biological analogies are a powerful source of innovative thinking”. The literature is rich in examples of applied systems ecology; however, very little is known of the benefits and potential of mitigating risk and vulnerability by thinking of aviation (or other forms of transportation) security systems as ‘socio-technical eco-systems’. The study of eco-systems has intrigued natural and social scientists from across a range of paradigmatic disciplines. There are studies which discuss, with varying degrees of specificity, digital, knowledge, human, cybernetic and cyber-systems as eco-systems. This thinking, however, has not been applied to aviation security systems. It is intriguing that the literature is missing this vital component, because aviation security is one of the largest complex systems, connecting countries and continents to provide the infrastructure for global functioning. These systems are indeed complex socio-technical [eco]-systems; it is just that, currently, they are not thought of this way. Therefore, this paper also sets out early directions on the potential benefits of this approach to reduce the risk to flight security.
To unpack this analogy, it is not too abstract or esoteric to think of aviation security socio-technical systems as an eco-system habitat of living and technological organisms which react to external stimuli and, at the same time, are capable of internal adjustment to maintain a homeostatic state. Systems theory explains that the elements of a system, through forms of (tangible and intangible) communication, transform inputs into outputs, which are the system objectives. During this process, however, the system may become unbalanced (vulnerable) by external influences and stimuli, and calibrating action will be required to return the system to a constant state. Natural biological eco-systems are very effective at regulating to achieve complex homeostatic states and, by analogy, this tentatively offers opportunities, because the notion of eco-system calibration, through feedback and regulating action, should be transferable and applicable to other dynamic systems. In short, if it works in one system then it should work in another, including aviation security systems.
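The homeostatic calibration described above corresponds to a classic negative-feedback loop. A minimal sketch follows; it is our illustration, and the scalar 'vulnerability level' and the proportional gain are expository assumptions: the system senses its deviation from an equilibrium set point and applies a correcting action each cycle.

```python
# Minimal negative-feedback (homeostasis) sketch: the system measures its
# deviation from an equilibrium set point and applies proportional
# calibrating action each cycle. The 'vulnerability level' variable is a
# hypothetical stand-in for whatever imbalance the system can sense.

SET_POINT = 0.0   # equilibrium: no unaddressed vulnerability
GAIN = 0.5        # strength of the calibrating response

def regulate(state: float, disturbance: float) -> float:
    state += disturbance          # external stimulus unbalances the system
    error = state - SET_POINT     # sensed deviation from equilibrium
    state -= GAIN * error         # calibrating action (negative feedback)
    return state

state = 0.0
for shock in [1.0, 0.0, 0.4, 0.0, 0.0]:  # intermittent external disturbances
    state = regulate(state, shock)
    print(f"vulnerability level: {state:.3f}")  # decays back toward the set point
```

Each disturbance is progressively damped rather than accumulating, which is the behaviour the eco-system analogy asks of a security habitat.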
The latent and concealed vulnerabilities that are exploited by terrorists can be thought of as erroneous energy sources which disturb the balance and flow between system components (Hudson 2007). The vulnerabilities transform the system from a state of equilibrium to an asymmetric condition, which (as discussed above) provides a competitive advantage to the adaptive terrorist. To understand this in the context of aviation security systems: the system becomes more vulnerable to terrorist exploitation because heuristic bias, consequent upon the unavoidable amplification of future terrorist risk, generates a one-dimensional, technology-dependent response which, as we have argued, is devoid of any form of system foresight. In other words, the heuristic judgements drawn upon by policy makers and system designers during the development and modification of aviation security systems increase the likelihood that the host’s decision making will cause the creation and concealment of further pathogenic system errors and vulnerabilities.
For example, after 9/11 policy makers and system designers implemented technological screening devices in aviation security systems to detect the metallic weapons and components that were used in the hijackings. Because of a lack of foresight about future threats, these devices did not have the capacity to anticipate or detect that the terrorist, to avoid detection, would simply modify their modus operandi to use non-metallic (ceramic) components. This is what happened in the case of Umar Farouk Abdulmutallab, who attempted to cause an explosion on a Northwest Airlines flight in 2009, having successfully overcome the enhanced technological countermeasures by simply modifying the components used in the explosive device.
In the abstract, our idea also contends that in this case terrorist exploitation morphs aviation security systems from defender to vulnerable prey. The modification of the host security system, through over-reliance upon technological countermeasures, unconsciously shifts the behaviour of the socio-technical habitat away from a constant state to one in which it has become more vulnerable to predation. This (as discussed above) creates and conceals further unknown vulnerabilities within the socio-technical system. And, in this context, it is very difficult, if not impossible, without counterbalancing the modified system state, for aviation security systems to maintain a constant operating condition that is free of aberrant latent vulnerabilities.
In the real world, this idea appears to have merit. Because of a disproportionate response to the fear of future attacks, the United States, as well as Britain, Canada and Australia, has significantly increased expenditure on homeland security (Mueller and Stewart 2011). This approach, according to Mueller and Stewart (2011, p.4), has manifested in the deployment of full-body scanners at airports at a cost of “$1.2 billion per year” without any form of cost-benefit analysis to warrant their implementation as an effective security measure. The risk reduction which can be attributed to full-body scanners is unknown; their implementation, on the other hand, has introduced wholesale change, more complexity and new opportunities to create and incubate vulnerabilities in aviation security systems. This technology, like many others in the system, does not incorporate the necessary foresight to counter future threats. Thus, if the psycho-social compulsion that derives from terrorist exploitation of transportation security systems were avoided through neutral decision making, the system would remain a defender: in equilibrium and free from exploitable errors and vulnerabilities.
So, having set out our early directions, what do eco-system appreciations tell us about socio-technical aviation security systems? First, to prevent further attacks, the host security system must be resilient to amplified risk modifying system behaviour. Achieving this is most difficult because the terrorists’ rate of tactical adaptation is quicker than the development of effective aviation security systems to deter and detect terrorist methodologies and attacks: in short, we cannot counter what we do not yet know about, particularly when we do not know what we do not know.
Second, whilst this remains the case, as in nature, for the security system to be effective it must re-balance itself to counter and castrate the effect of terrorist exploitation. Metaphorically, the analogy suggests that, should deterrence be unsuccessful, the host has the option to develop toxins to deter, to implement behavioural avoidance of certain activities, or to develop a complex immune system with antibodies which are resistant to terrorist attack. Whatever solution is developed, foresight will always be necessary for the countermeasures to be effective. This approach will, regardless of unavoidable errors, ensure the long-term survival of the security habitat, whilst the terrorist adversary continues to exist and present a threat. Obviously, the challenge for decision makers and designers is to operationalise the analogy through the development of practical interventions.

The future requires an intelligent and self-aware system

Conceptually, aviation socio-technical [eco]-systems must therefore be existential. Even though we do not yet know of all existing system vulnerabilities, and cannot see them, we do know, according to Reason (1995), that they will be present as latent system pathogens. The challenge is a matter of detecting them. To do this, the only option is for the system to use its own experience to dynamically recognise malevolent structures: cloaked latent pathogens, lying dormant until they are revealed by terrorist exploitation. In other words, our notion contends that the system must be ‘intelligent’ and ‘self-aware’. It must sense the stimuli, and the changes in energy levels, which unbalance the system habitat, because pathogens, to remain undetected, will have to draw upon the resources of that habitat.
The self-aware system, in contrast to the current post hoc adjusting system, must be proactive in self-learning, to determine what is hostile to its operating environment. Pathogens are hostile infiltrators. To counter the threat to the habitat, the system must be used against itself. An intelligent and self-aware system will persistently challenge its inhabitants to determine their authenticity. A hostile pathogen will be idle to system output and, although cloaked, will be visible to internal system inspectors and modifiers which are aware of their operating environment. Through introspection, the system will quarantine, neutralise and modify the space created by the latent pathogens.
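What might such an internal inspector look like? A hedged sketch follows, assuming each component exposes two measurable quantities, its resource draw and its contribution to system output; both metrics are hypothetical constructs of ours, since the paper leaves the instrumentation open. Components that consume resources while remaining idle to output are flagged as quarantine candidates.

```python
# Hypothetical 'internal inspector' sketch: a latent pathogen is modelled as a
# component that consumes habitat resources while remaining idle to system
# output. Both metrics are illustrative constructs, not real telemetry.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    resource_draw: float         # energy/resources consumed per cycle
    output_contribution: float   # measurable contribution to system output

def inspect(components: list[Component],
            draw_threshold: float = 0.1) -> list[Component]:
    """Flag components that draw resources but contribute nothing to output."""
    return [c for c in components
            if c.resource_draw > draw_threshold and c.output_contribution == 0.0]

habitat = [
    Component("x-ray queue manager", 0.8, 0.9),
    Component("unused legacy module", 0.4, 0.0),   # candidate latent pathogen
]
for suspect in inspect(habitat):
    print(f"quarantine candidate: {suspect.name}")
```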
To achieve this, trust is a system constant. Central to being self-aware, the system will simply question elements that it does not trust. The system inspectors will interrogate the elements which, although idle to system output, create erroneous changes in system behaviour and place demands on energy sources without fulfilling a useful function. The inspectors would interrogate the untrustworthy elements with questions to which the correct answer is known only to genuine, non-malevolent components. The answers could not previously be known or acquired; hence there will always be something that gives hostile elements away and makes them visible. Once they are visible and their structures identified, this knowledge can be applied to developing preventative interventions to exclude further pathogens from entering the system. This concept is something we do not yet fully understand, but we are working on it. This would be true system foresight, and researchers are invited to contribute and to develop systems which operationalise this embryonic idea.
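The interrogation described above maps naturally onto challenge-response authentication. The sketch below is one possible reading, under the strong assumption, which the paper leaves open, that genuine components share a secret with the inspector: a fresh random challenge is posed, and only holders of the secret can compute the correct answer, so a pathogen has nothing it could have previously learned or acquired.

```python
# Challenge-response sketch of the inspectors' 'questions': a fresh random
# challenge is answered with an HMAC that only components holding the shared
# secret can compute, so an infiltrated pathogen cannot reply correctly.
# The shared-key assumption is ours; the paper leaves the mechanism open.

import hashlib
import hmac
import secrets

SYSTEM_KEY = secrets.token_bytes(32)  # provisioned only to genuine components

def answer(key: bytes, challenge: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

def interrogate(component_key: bytes) -> bool:
    """The inspector poses a never-before-seen question; a pathogen cannot
    have learned or pre-computed the answer."""
    challenge = secrets.token_bytes(16)   # fresh, unpredictable question
    expected = answer(SYSTEM_KEY, challenge)
    return hmac.compare_digest(expected, answer(component_key, challenge))

print(interrogate(SYSTEM_KEY))               # genuine component -> True
print(interrogate(secrets.token_bytes(32)))  # hostile pathogen  -> False
```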

Conclusions

In this paper we set out to demonstrate that there are benefits in applying high-level appreciations of socio-technical and eco-systems to transportation security systems, with particular reference to aviation security. In doing so, we have advanced a series of propositions and have identified that this unconventional perspective can be used as a building block to overcome some of the current limitations of aviation security systems, so as to prevent terrorists exploiting man-made technological vulnerabilities. In particular, when aviation security systems are viewed as a socio-technical eco-system, it becomes clear how modes of failure and vulnerabilities are created and concealed within the system.
Whilst current thinking remains constrained to conventional axioms, retrospective adjustment remains the reality of aviation security systems’ current capability: although the concept of technological measures embedded with active foresight is attractive, in its current form it is only notional. For active foresight to be effective, it needs to predict what the system does not know, and requires knowledge of the unknown to implement anticipatory measures to counter and mitigate system vulnerabilities. Furthermore, if such foresight did exist, the system would be infallible to risk events; because of this, it can presently only remain abstract. Predicting the latent vulnerabilities that the terrorist adversary will exploit in the future remains, using conventional thinking, a utopian ideal. To overcome this problem, this paper has also introduced the notion of aviation security systems being intelligent and self-aware of their operating environment, and of the pathogenic vulnerabilities which become exploitable.
Finally, novelty, in the form of alternative and innovative thinking, is now a necessity to resolve the complex socio-technical relationships which create vulnerability in aviation security systems. The benefits of this perspective are particularly important to transportation systems, such as aviation security, which face the constant challenge of protecting the travelling public from future terrorist attacks. It is also hoped that our socio-technical and eco-system appreciations of aviation security can be applied to future policy and operations, and inspire further research and development by practitioners and scholars, because the terrorists only have to get lucky once.

References

Aini MS, Fakhrul-Razi A (2010) Development of socio-technical disaster model. Safety Sci 48:1286–1295
Badham R, Clegg C, Wall T (2000) Socio-technical theory. In: Karwowski W (ed) Handbook of ergonomics. John Wiley, New York, pp 23–32
Developing immunity to flight security risk: prospective benefits 233

Bainbridge L (1987) The ironies of automation. In: Rasmussen J, Duncan K, Leplat J (eds) New technology and human error. Wiley, London, pp 276–283
Baxter G, Sommerville I (2011) Socio-technical systems: from design methods to systems engineering.
Interact Comput 23:4–17
BBC (2012) Undercover agent in al-Qaeda bomb plot ‘was British’. BBC [online], 11 May. http://www.bbc.co.uk/news/world-us-canada-18029242. Accessed 9 December 2012
Bostrom RP, Heinen JS (1977) MIS problems and failures: a socio-technical perspective. MIS Quart 1(3):17–32
Bureau d’Enquetes et d’Analyses (2012) Final report on the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France flight AF 447 Rio de Janeiro – Paris. http://www.bea.aero/en/enquetes/flight.af.447/flight.af.447.php. Accessed 6 December 2012
Cherns A (1976) The principles of sociotechnical design. Hum Relat 29(8):783–792
Dekker SWA (2005) Ten questions about human error: a new view of human factors and system safety.
CRC Press, New York
Elwell R (2008) The threat to aviation by terrorist acts. The Log. British Airline Pilots’ Association
Emery FE, Trist EL (1960) Socio-Technical Systems. In: Churchman CW, Verhulst M (eds) Management
Science Models and Techniques, Vol 2. Oxford, Pergamon, pp 83–97
Emery FE, Trist EL (1981) Introduction to volume 1. In: Systems thinking: selected readings, vol 1. Penguin, Harmondsworth
Feakin T (2011) Insecure skies? challenges and options for change in aviation security. Royal United
Services Institute for Defence and Security Studies, London
Grabowski M, Roberts KH (1996) Human and organisational error in large scale systems. IEEE T Syst Man
Cy A 26(1):2–16
Hills M (2012) A new perspective on the achievements of psychological effects from cyber-warfare
payloads: the analogy of parasitic manipulation of host behaviour. J Law Cyb 1(1)
Hofer F, Wetter OE (2012) Operational and human factors issues of new airport security technology—two case studies. J Transp Secur 5(4):277–291
Hudson P (2007) Parasites, diversity and ecosystems. In: Renoud F, Guegan J (eds) Parasites and
ecosystems. Oxford University Press, Oxford
Ibrahim MS, Fakhrul-Razi A, Aini MS, Sharif S, Mustapha S (2002) Technological man-made disaster
precondition phase model for major accidents. Int J Disaster Prev Manag 11(5):380–388
Jenkins B (2012) Aviation security: after four decades, it’s time for a fundamental review. RAND Corporation, Santa Monica
Kaven CB, O’Hara MT, Patterson EC, Bostrom RP (1999) Excellence in client/server information system implementations: understanding the STS connection. Manage Decis 37(3):295–301
Kirschenbaum A, Mariani M (2012) Trusting technology: security decision making at airports. J Air Transp
Manag 25:57–60
Kirschenbaum A, Mariani M, Van Gulijk C, Lubasz S, Rapoport C, Andriessen H (2012a) Airport security:
an ethnographic study. J Air Transp Manag 18:68–73
Kirschenbaum AA, Rapaport C, Lubasz S, Mariani M, Van Gulijk C, Andriessen H (2012b) Security
profiling of airport employees: complying with the rules. J Air Transp Manag 6(4):373–380
LaTourrette T, Jenkins B (2012) The goal of efficient security. In: Jackson B, LaTourrette T, Chan E, Lundberg R, Morral A, Frelinger D (eds) Efficient aviation security: strengthening the analytic foundation for making air transportation security decisions. http://www.rand.org/content/dam/rand/pubs/monographs/2012/RAND_MG1220.pdf. Accessed 19 January 2013
MacAskill E, Black I (2012) Underwear bomb plot: British and US intelligence rattled over leaks. The Guardian [online] 11 May. http://www.guardian.co.uk/world/2012/may/11/underwear-bomb-plot-mi6-cia-leaks. Accessed 19 December 2012
Margerison C (1989) Introducing change: advisors we consult and methods they use. Manage Decis
22–26
Mueller J, Stewart MG (2011) Terror, security and money: balancing the risks, benefits and costs of
homeland security. Proceedings of the Annual Convention of the Midwest Political Science
Association, Terror and the Economy: which institutions help mitigate the damage? Held: 1 April
2011 Chicago, Illinois
Mumford E (2006) The story of socio-technical design: reflections on its successes, failures and potential. Inform Syst J 16:317–342
Peltu M, Eason K, Clegg C (2008) How a socio-technical approach can help NPfIT deliver better NHS patient care. http://www.bcs.org/upload/pdf/sociotechnical-approach-npfit.pdf. Accessed 28 June 2012
234 P. McFarlane, M. Hills

Perrow C (2007) The next catastrophe: reducing our vulnerabilities to natural, industrial and terrorist
disasters. Princeton University Press, Princeton
Pidgeon N, O’Leary M (2000) Man-made disasters: why technology and organisations (sometimes) fail.
Safety Sci 34:15–30
Reason J (1995) Understanding adverse events: human factors. Qual Health Care 4:80–89
Riley KJ (2011) Air travel security since 9/11. RAND Corporation, Santa Monica
Seper J (2012) FBI probes bomb plot leaks. The Washington Times, 16 May 2012. http://www.washingtontimes.com/news/2012/may/16/fbi-director-probe-under-way-leak-al-qaeda-plot. Accessed 21 December 2012
Shrivastava P, Mitroff II, Miller D, Miglani A (1988) Understanding industrial crisis. J Manage Stud
25(4):285–303
Toft B, Reynolds S (2005) Learning from disasters: a management approach. Palgrave Macmillan, New
York
Turner B (1978) Man-made disasters. Taylor & Francis Group, London
Wilkinson P, Jenkins BM (1999) Aviation terrorism and security. Frank Cass Publishers, London
Woods DD (2006) Essential characteristics of resilience. In: Hollnagel E, Woods DD, Leveson N (eds)
Resilience engineering: concepts and precepts. Ashgate Publishing, Aldershot, pp 21–23