Practical Wisdom in an Age of Computerisation
David Blockley
To cite this article: David Blockley (2020) Practical wisdom in an age of computerisation, Civil
Engineering and Environmental Systems, 37:4, 197-213, DOI: 10.1080/10286608.2020.1810675
Introduction
All professions, including civil engineering, are facing some formidable challenges.
We are:
experiencing a revolution integrating the physical, digital and biological, including robotics, artificial
intelligence, nanotechnology, quantum computing, biotechnology, the Internet of
Things (IoT), 3D printing and autonomous vehicles;
(7) replacing products that once were straightforward and understandable to the layper-
son with products that are opaque and understandable only by specialists.
(a) Will there be enough work in the future to maintain full employment, and if so what
will that work be?
(b) Which occupations will thrive, and which ones will wither?
(c) What are the potential implications for skills and wages as machines perform some or
all of the tasks that humans now do?
The purposes of this paper are fourfold. First, to identify some of the potential threats
and opportunities for professional civil engineering practice inherent in these challenges.
Second, to suggest that in order to manage them we need to understand better and re-
evaluate the overview of the services we provide. Third, to improve the ways in which
we deliver those services, through the processes we civil engineers undertake, by
effective use of the modelling tools and the AAI (assistive artificial intelligence) available
to us. Fourth, to understand and nurture the essence of practical intelligence and
wisdom that cannot easily be computerised (at least in the foreseeable future).
Susskind and Susskind (2015) assert that in the past people have effectively established a
’grand bargain’ with the professions. The professions provide certain types of services for
which the people entrust them to act as gatekeepers of their particular expertise. Doctors
look after medical practical expertise, lawyers look after legal practical expertise, engineers
look after engineering practice, and so on. But, they ask, as digital information technology
advances, is the grand bargain still fit for purpose? They conclude that it is not, because
we now have the means to share practical expertise much more widely. Their central
question is 'How do we share expertise in society?' In other words, they, and others,
are challenging our profession to justify itself as never before.
One possible consequence of an inadequate response to these challenges could be that
the big digital technology companies move into the construction industry. They may see
an opportunity to automate it and profit from it, as they have done in other industries
such as retailing, manufacturing and telecommunications.
But the picture is not all doom and gloom. The ICE State of the Nation report 2017 states
that an improved use of smart technology, data and analytics offers a way forward in
improving the performance of UK infrastructure and addressing the UK’s ongoing pro-
ductivity deficit.
New technologies are 'already transforming the built environment, but also the shape of our
industry itself. They will also change expectations and demands around the level of trust
required in technologies … autonomous vehicles, machine learning, the internet of things,
artificial intelligence … will change how our existing infrastructure is used … as well as the
skills required to design, build and maintain it.'
Shanks (2018) calls for structural engineers to look beyond their discipline to invest in
long-term relationships and innovate through better understanding of how industry
works.
As more engineering tasks are automated some people have begun to question what it is
that engineers contribute. In one discussion (Thornton Tomasetti 2019) one opinion was that
A lot of work that architects and engineers do is straight production: that’s going to go away
first. Then basic decision-making is going to go away—you structural engineers are probably
the first to go because you’ve been so rigorous about proceduralising all your knowledge and
your work.
overseas. We have further limited the space by developing standards and codes that attempt
to define the design parameters of upwards of 95% of the structures being built today.
What jobs are most at threat? The chief economist of the Bank of England has said that
administrative, clerical and production tasks were most vulnerable (Frey 2018). Frey and
Osborne (2013) have estimated the probability of computerisation (by which they mean
computers taking over a significant proportion of the work) for 702 detailed occupations.
They conclude that, at that time, about 47 per cent of total US employment was at risk,
with wages and educational attainment showing a strong negative relationship with an
occupation's probability of computerisation. Civil engineering ranks 84th (with a probability
of computerisation p = 0.019) and civil engineering technicians come in at number 413
(p = 0.75). If we contrast these figures
with those for sales engineers (who rank as 14th) we get some insight into the assump-
tions made in their analysis. The jobs that are the most unlikely to be computerised,
according to the authors, are those that help others, involve negotiation, are creative or
require cognitive skills. They quote examples such as therapists, surgeons, choreographers,
pre-school teachers, and clergy. Those most likely to be computerised are
those that involve assembling and manipulating objects.
Artificial neural networks (ANN) are the basis for many applications such as driverless
cars. A longer answer is that an ANN consists of an extremely large graph of nodes
connected by links, with numerical weightings attached to the links. The patterns of
weights form a transfer function between a set of inputs and a set of outputs. The
patterns are established using
algorithms that ’train’ the ANN. These training algorithms operate on examples of known
inputs that lead to known outputs to adjust the weightings of the ANN. Many thousands of
training examples may be needed. The algorithms are basically simple, but the massive
numbers of interactions used in the training make the overall effect quite complex and
totally opaque. The goal of the training is to find an effective transfer function that
allows new outputs (predictions) from new inputs.
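As a reading aid only, the training process just described can be caricatured in a few lines of Python. Everything here is a deliberately tiny toy (the network size, learning rate, number of repetitions and the XOR example are invented for illustration); real ANNs have vastly more weights and need many thousands of training examples.

```python
import numpy as np

# Toy ANN: one hidden layer, weightings on the links, 'trained' by
# repeatedly adjusting the weights from known input/output examples
# (here the XOR function). All sizes and rates are invented.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # known inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # known outputs

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # input-to-hidden weights
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # hidden-to-output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):                 # many training repetitions
    h = sigmoid(X @ W1 + b1)           # hidden layer activations
    out = sigmoid(h @ W2 + b2)         # current predictions
    # Training algorithm (backpropagation): nudge every weight so the
    # outputs move a little closer to the known outputs.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0)

print(out.ravel().round(2))  # trained predictions for the four inputs
```

The trained weight pattern is the transfer function: it maps new inputs to predictions, but nothing in it explains or justifies those predictions, which is exactly the opacity discussed below.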
The ANN transfer functions are surprisingly effective but totally opaque because they
are unable to explain or, more importantly, justify their conclusions, and the range of
applicability and numbers of parameters are rarely stated. Unless and until new research
provides ways in which this can be done, an ANN cannot in any way be compared to a
professional human decision maker. Without an ability to justify a decision to exercise a
professional duty of care for public safety under the law of tort, no ANN can replace a pro-
fessional decision maker. For a detailed overview of the various implementations of neural
nets see (Neural Nets 2019).
It is unsurprising, therefore, that many people inside and outside the profession see
the potential for improvement. But how do we guard against inappropriate change that
may promise much but deliver little? History tells us that people and organisations
are adaptable and that new types of work will emerge. The emphasis should therefore
be on how we can be resilient and prepare for the inevitable changes.
In order to assess the impact of computerisation on jobs we need to look much further
and deeper than Frey and Osborne's overview analysis, and we need to develop what UK-
SPEC and CEEBOK mean by being able to understand, apply and use engineering knowledge
to solve engineering problems. What are the human attributes, necessary for
good and safe engineering, that cannot be computerised?
The common-sense understanding of understanding is, at its simplest, 'being able to
restate a problem in one's own words'. A more nuanced definition might differentiate
between understanding conceptually and mathematically. Still further, as well as being
able to explain, we should add interpreting, applying, placing in context or perspective,
empathising and having self-knowledge. UK-SPEC and CEEBOK include examples of
how these levels of understanding might be demonstrated.
A different, but potentially more revealing, approach, taken by Blockley and Robertson
(1983), was to analyse the characteristics of a good civil engineer. There is not the space
here to compare this analysis with the O*NET database and UK-SPEC or CEEBOK, but even a
casual glance demonstrates that the qualities required of a civil engineer in practice are
much more subtle than a database (even of big data) can capture. In other words, we
can largely discount the conclusions of Frey and Osborne other than as an overview com-
parison. In the next sections of this paper I will simply focus on three high level essential
attributes, suggested by Blockley and Robertson (1983) that cannot easily be compu-
terised. These are technical soundness, professionalism and personal qualities. Specifically,
I will focus on an important single aspect of each of the three: understanding as a
necessary quality of technical soundness, practical wisdom as an under-recognised
attribute of professionalism, and ethics as a necessary personal quality. First, however, I will
examine how we need to get better at the way we formulate our problems and the
role of computers and AI in assisting workflow.
The reason is simple and complex at the same time. Artificial intelligence is presently a
very primitive and very specific form of intelligence – good at some things and poor at
others. AI is good at perceiving and learning but very poor at abstracting and reasoning.
AI systems can occasionally come up with bizarre answers that no human would counte-
nance – for example identifying a toothbrush as a baseball bat. The systems can also be
tricked into giving wrong answers by overlaying invisible distorting patterns designed
by a malevolent expert. Bias or distortion, intended or unintended, is built into the data
– for example gender, racial or other cultural assumptions. Any human being can recognise
the face of another human being after one glimpse, whereas an AI programme or
ANN requires many thousands of training examples – it is far from mimicking human
intelligence. As one acerbic commentator put it, if a computer ANN had to learn how not
to get killed by a car, it would have to get killed before knowing how not to get killed.
Of course, this is not to say that AI cannot be useful. It may perhaps be better to inter-
pret it as AAI or Assisted Artificial Intelligence (although the term AI is so embedded it will
probably continue to be used). In this way we use it just as we use software for finite
element analysis or as a BIM process. But the modern context has changed. The challenges
outlined at the start of this paper are complex with a high degree of uncertainty. To tackle
them with the help of AAI we need to improve the ways in which we structure our pro-
blems and capture the workflow needed to deliver our objectives. We need a much
firmer grasp of the nature of the resilience we will require to cope with unexpected and
unintended outcomes. We need to be robust in defending ourselves against inappropriate
automation and to make sure that AAI, data from the IoT and robotic interventions are
controlled through a human duty of care. In other words, we need a theory of
workflow, incorporating AI, which is suitable for an age of increasing computerisation.
We need not just better workflow models but also a theory behind the models
which captures the essence of practical wisdom and capitalises on AAI without
compromising the professional ethics of a duty of care.
Workflow models derive from the way we structure our problems. The latter has been
driven by software and systems analysts and merits much more attention by civil engin-
eers. For further reading see (Blockley and Godfrey 2017 Section 2.3). The simplest
workflow models are flowcharts (used widely for writing computer code) which are
simple, easily understood but very limited in scope and much too prescriptive. Currently
the most developed workflow models are those of IDEF (2019) but they are rarely used in
construction probably because whilst they are less prescriptive than simple flowcharts,
they rapidly become difficult to manage for large complex projects. Of course, the most
widely used, but perhaps not usually described as workflow models, are critical path
networks, typically translated into Gantt bar charts. Whilst critical path analyses are
undoubtedly useful, they are about just one aspect – time – and not 'rich' enough to capture all
necessary aspects of ‘joined-up’ workflow.
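To make the time-only nature of critical path analysis concrete, here is a minimal Python sketch (the activities, durations and dependencies are invented for illustration): a forward pass gives earliest start times, a backward pass gives latest start times, and the activities with zero float form the critical path.

```python
# Minimal critical-path calculation over an invented activity network.
# Activities are listed so every predecessor appears before its successors.
durations = {"design": 4, "procure": 3, "foundations": 5,
             "frame": 6, "cladding": 2}
preds = {"design": [], "procure": ["design"], "foundations": ["design"],
         "frame": ["procure", "foundations"], "cladding": ["frame"]}

# Forward pass: earliest start of each activity.
earliest = {}
for act in durations:
    earliest[act] = max((earliest[p] + durations[p] for p in preds[act]),
                        default=0)
finish = max(earliest[a] + durations[a] for a in durations)

# Backward pass: latest start that does not delay the finish date.
latest = {}
for act in reversed(list(durations)):
    succs = [s for s in durations if act in preds[s]]
    latest[act] = min((latest[s] for s in succs), default=finish) - durations[act]

critical = [a for a in durations if earliest[a] == latest[a]]
print(finish)    # 17
print(critical)  # ['design', 'foundations', 'frame', 'cladding']
```

Note how little the calculation knows: only durations and precedence. Everything else about the work (who, why, where, how dependable the estimates are) lies outside the model, which is the 'richness' gap the IOPM addresses.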
In order to introduce a new approach to problem structuring and modelling of
workflow based on sound theoretical and philosophical foundation, Blockley and
Godfrey (2000, 2017) introduced the interacting objects process model (IOPM). The meth-
odology was first introduced to tackle the challenges of the Egan report (DETR 1998) as
well as those mentioned earlier. The IOPM is based on 5 axioms, 7 principles and 17 Cor-
ollaries (Engineering Synergy 2019) and on a new and extended understanding of what
constitutes a systemic process. In all other workflow models a process is simply a
transformation of inputs to outputs. In the IOPM a process is a much richer idea. It is a 'sys-
temic’ process of a potential that drives a flow of change against an impedance. The
potential derives from answers to questions ‘why’. Examples of potential in hard physical
systems are changing velocity and voltage. In soft systems potential is purpose. The flow of
change is captured in answers to questions, ‘who, what, where and when’. Examples of
flow in hard systems are force and electrical current. Flow in soft systems is change –
changes in people (who), changes in states of affairs (what), in context (where) and in
time (when). Impedance is dissipation of energy (friction, damping, resistance), storage
of potential (capacitance) and storage of flow (inductance) – and all are captured by
models based on answers to questions 'how'.
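As a reading aid only, the why/who-what-where-when/how structure of a systemic process might be caricatured in code as follows. All names and field values here are invented for the sketch; this is not the authors' software, merely a picture of the idea that a process object carries its potential, flow and transformations together.

```python
from dataclasses import dataclass, field

# Caricature of an IOPM 'systemic process': a potential (why) drives a
# flow of change (who, what, where, when) through transformations (how).
# All values below are invented for illustration.
@dataclass
class SystemicProcess:
    why: str                                   # potential / purpose
    who: list = field(default_factory=list)    # people in the flow of change
    what: str = ""                             # changing state of affairs
    where: str = ""                            # context
    when: str = ""                             # timing
    how: list = field(default_factory=list)    # transformations / impedance

erect = SystemicProcess(
    why="provide a safe structural frame",
    who=["steelwork contractor", "process owner"],
    what="steel frame erected and cladding fixed",
    where="city-centre site",
    when="weeks 10-16",
    how=["lift and bolt steelwork", "inspect connections"],
)
print(erect.why)  # the answer to 'why' is the potential driving the process
```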
The IOPM approach:
(1) facilitates an integrated (joined-up) view of 'big-picture' high-level models together
with detailed models;
(2) is based on a strong theoretical and philosophical foundation;
(3) permits the user to navigate through layers of complexity;
(4) provides a common structure of concepts and relationships for all aspects or ‘views’ of
the system;
(5) models the drivers and flows of change in ‘active data’;
(6) is equally applicable to ‘hard’ physical systems as well as ‘soft’ human systems;
(7) can model all aspects of uncertainty including fuzziness and incompleteness;
(8) highlights the importance of good leadership.
Figures 1–6 show examples. Figure 1 illustrates how models can be built in layers of pro-
cesses so that the top layers allow a big picture overview whilst lower layers capture detail.
The layers are logically connected in that success in a group of processes at one level is
deemed to be jointly necessary and sufficient for success of the process in the layer above
to which the group is connected. The processes within each layer are connected in time so
that a critical path may be calculated and shown on a horizontal time axis. In the
diagram there is space to show only a few processes, but in software each layer could
fill a number of complete computer screens. Figure 2 illustrates the differing views
and information about a single interacting object capturing the potential as answers to
questions ‘why’, the flow as answers to questions (‘who, what, where and when’) and the
transformations as answers to the questions ‘how’. The diagram also shows the colours of
an ‘Italian Flag’ (Blockley and Godfrey 2017). The flag expresses the level of confidence that
the process owner has in the dependability of the available evidence that this single
process will be successful – green represents evidence for, red represents evidence
against and white represents the incompleteness of the evidence or the extent to
which we don't know. Note that the relationships between this single interacting object
and other objects are the same for all attributes, so that the structures of relationships
between people (who) and things (what) are identical. This is important because it
reduces the likelihood of confusion where structurally related but differing aspects (such
as job titles, organisational charts, functional diagrams and process/activity diagrams) are
often currently modelled in different ways. Figures 3 and 4 show a simple example of a
process of ‘making sandwiches’ with associated critical paths in Figure 3 to illustrate
some of the ‘when’ attributes. Of course, the detail of information about why, how,
who, what, where and when can be much fuller than shown in Figure 4 and held in sep-
arate linked files. Figure 5 shows a simple IOPM for ‘constructing a building’ with some
‘when’ (time) attributes for illustration.
It is worth noting that in this methodology every process must have a named process
owner. One of the important responsibilities of the process owner is to ensure that the
data is continually updated – including the Italian Flag. It is clear from Figure 5 that the
process owner for ‘erecting the structure’ has understood that there is a ‘cladding’
problem (a high degree of red). However, that is not being recognised by the process
owner for ‘constructing the building’ (high degree of green) so the model is telling
them they need to talk.
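The Italian Flag reading can be pictured numerically as a split of the unit interval into evidence for, evidence against, and incompleteness. The sketch below is illustrative only (the numbers are invented, echoing the cladding example above), not the interval formulation of Blockley and Godfrey (2017).

```python
# Sketch of an 'Italian Flag': green is dependable evidence for success,
# red is evidence against, and white is what the evidence leaves
# undecided (incompleteness). Numbers are invented for illustration.
def italian_flag(evidence_for, evidence_against):
    """Return (green, white, red) proportions summing to one."""
    assert 0 <= evidence_for and 0 <= evidence_against
    assert evidence_for + evidence_against <= 1
    white = round(1 - evidence_for - evidence_against, 10)  # what we don't know
    return evidence_for, white, evidence_against

# 'Erecting the structure' owner sees the cladding problem: mostly red.
print(italian_flag(0.1, 0.7))    # (0.1, 0.2, 0.7)
# 'Constructing the building' owner is unaware: mostly green.
print(italian_flag(0.8, 0.05))   # (0.8, 0.15, 0.05)
```

The mismatch between the two flags is precisely the signal that the two process owners need to talk.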
Finally Figure 6 shows an IOPM for a physical process – in this case a typical beam
element in a structure. The element has neighbouring IOPM beam elements either side
as well as a loading IOPM. The beam IOPM is receiving force and displacement ‘messages’
(values of shear force, bending moment and displacements – only forces are shown for
clarity) from adjacent elements. It then sends out messages based on familiar equilibrium,
compatibility and constitutive equations to its neighbours. The channels of communi-
cation are the degrees of freedom of the element. For more on the IOPM for modelling
physical hard systems see Blockley (1995).
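The message-passing view of an element can be caricatured for the simplest possible case, an axial bar (a deliberately simplified sketch with invented numbers, not the formulation in Blockley (1995)): the element turns the displacement 'messages' at its two degrees of freedom into force messages through its stiffness, with the two end forces in equilibrium.

```python
# Simplified 'message passing' between structural elements: an axial bar
# element converts displacement messages at its two degrees of freedom
# into force messages via its stiffness (constitutive relation), the two
# end forces being equal and opposite (equilibrium). Numbers invented.
def element_forces(EA, L, u_left, u_right):
    """Force messages an axial element sends to its two neighbours."""
    k = EA / L                           # axial stiffness
    f_left = k * (u_left - u_right)      # force message at the left node
    f_right = k * (u_right - u_left)     # equal and opposite at the right
    return f_left, f_right

# EA = 1000 kN, L = 2 m, left end held, right end displaced 4 mm.
print(element_forces(1000.0, 2.0, 0.0, 0.004))  # (-2.0, 2.0) kN
```

A beam element works the same way in principle, but its messages carry shear forces, bending moments and both translational and rotational displacements at each degree of freedom.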
wisdom – sometimes translated as prudence (phronesis). Ars or techne (from which we get
the words art and technical, technique and technology) was concerned with production
but not action. The Greeks did not distinguish the fine arts as the work of an inspired indi-
vidual – that came only after the Renaissance. So techne as the modern idea of mere tech-
nique or rule-following was only one part of what Aristotle was referring to. Episteme (from
which we get the word epistemology or knowledge) was, to the Greeks, of necessity and
eternal; it was knowledge that cannot come into being or cease to be; it was demonstrable
and teachable and depends on first principles. Practical wisdom or phronesis was an intel-
lectual virtue of perceiving and understanding in effective ways and acting benevolently
and beneficently. It was not an art but necessarily involved ethics, not static but always
changing, individual but also social and cultural. Aristotle thought of human activity in
three categories: praxis, poiesis (from which we get the word poetry), and theoria (contem-
plation – from which we get the word theory).
As I see it phronesis is a means towards an end arrived at through moral virtue. It is con-
cerned with the capacity for determining what is good for both the individual and the
community. It is a virtue and a competence, an ability to deliberate rightly about what
is good in general, about discerning and judging what is true and right but it excludes
specific competences (like deliberating about how to build a bridge or how to make a
person healthy). It is purposeful, contextual but not rule-following. It is not routine or
even well-trained behaviour, but rather intentional conduct based on tacit knowledge
and experience, using longer time horizons than usual, and considering more aspects,
more ways of knowing, more viewpoints, coupled with an ability to generalise beyond
narrow subject areas. Phronesis was not considered a science by Aristotle because it is vari-
able and context dependent. It was not an art because it is about action and generically
different from production.
Thomas Homer-Dixon (1995) wants to close the ‘ingenuity gap’ between those who
adapt well to complex changes and those that don’t. He wants to see us move from
authoritative top down command and control to a mature collaborative culture that
focuses on enabling others to be successful.
Of course, we humans have shown much ingenuity in the past but there is a danger of a
kind of ’technical triumphalism’ that claims we can meet our challenges simply by carrying
on as we have done. We need to understand that practical rigour is not the same as logical
rigour. Rigour is the strict enforcement of rules to an end. Mathematical logic is the ulti-
mate form of logical rigour: it has one value – truth. Theorems are deduced using
axioms (rules) which are true-by-definition. Physical science aims at precise truth but
truth as a correspondence to the facts, which have to be set in a context. Practical
rigour is much more complex. It is meeting a need by setting clear objectives involving
many values (some in conflict) and reaching those objectives in a demonstrably depend-
able and justifiable way. Practical rigour is about using dependable evidence to move a
process towards a specific goal. That is why the Italian Flag of evidence is so important
to the IOPM. Practical rigour implies practical intelligence, which in turn implies practical
experience. In other words, experience is necessary but not sufficient for practical intelli-
gence – a capacity to learn, reason and understand practical matters. And practical intelli-
gence is necessary but not sufficient for practical rigour. That is because practical
intelligence and rigour require reflective learning and development on that experience.
(1) making it work – creating practical solutions to meet explicit needs and delivering a
system valued in a variety of ways, not just cost;
(2) creating appropriate models – understanding – working with nature – making sensible
approximations that respect nature. Far from being the cause of loss of rigour, as our
academic accusers may hold, the approximations of our models are the sources of the
practical rigour required to create a solution that meets the needs. Practical rigour
requires diligence and a duty of care that leaves no stone unturned, with no sloppy or
slipshod thinking;
(3) considering the whole as well as the parts. The scientific approach is one where we
look at a problem, break it down into its separate components, take out the difficult
bits that we do not know how to solve and focus on what we can solve. It is a
process of selective inattention. Practical rigour does not have that luxury – it requires
a rigour that deals with the bits of the problem that we do not always understand too
well;
(4) making judgments. Professional opinions are not arbitrary; they are based on Karl
Popper's objective 'world 3' evidence (1979) of varying dependability. Opinion based
on experience may be less dependable than measurement or standard theory, but
it has to be testable against world 3 objective knowledge, ultimately, perhaps, in
the courts;
(5) exercising creative foresight. Practice requires the creativity to imagine what might
happen – how physical things will respond and how people might behave in future
situations or scenarios;
(6) developing and evaluating dependable evidence. The only clear way to judge the
dependability of evidence is to subject it to as many tests as seems appropriate;
(7) feedback and learning. One of the seven habits of highly effective people identified by
Covey (2005) is learning to improve, or self-renewal.
As Sellman (2012) states, a wise practitioner (phronimos) will recognise not only our
competencies, i.e. knowing what we know, but also our 'uncompetencies', i.e.
not knowing what we do not know. He writes:
The competent practitioner is not concerned with merely getting through the work, not even
with mere skills acquisition; rather, he/she aspires toward the Aristotelian ideal of doing the
right thing to the right person at the right time in the right way for the right reason.
The virtues required are integrity, openness and honesty with the humility necessary to
acknowledge that there are things we do not know we do not know and the willingness
to act so as to rectify any identified deficits that threaten our claim to competent practice.
Ethics, as the moral rules that determine conduct, are central to wise practice as phronesis.
Morals are concerned with what is good or bad, with what we ought to do rather than what
is. Ethics is about values. Values are the worth we give to something – not simply financial
but in all of the relevant ways. Phronesis is action after deliberation based on values.
Popper's point is that all observations contain assumptions which may or may not be
made clear. Whilst those assumptions can be appreciated by a wise practitioner, they
will be embedded and unknown in any AI system that relies purely on the data itself.
(1) All professions are facing some formidable challenges from climate change and
increasing computerisation.
(2) The potential threats include widespread predictions of job losses and the dangers of
inappropriate unsafe automation. All workers will need to adapt, as their jobs change
to incorporate increasingly capable machines. Some will need help to develop new
social and emotional skills that are essential but hard to automate.
(3) As seen from the outside, civil engineering jobs may seem to be vulnerable because
there is a widespread perception that engineers simply apply science that is 'true'
with little or no creativity and judgement.
(4) Some commentators go so far as to argue that we now have the means to share practical
expertise much more widely, and they want to know how that could be done.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Notes on contributor
David Blockley is an Emeritus Professor of Civil Engineering and one-time Head of the Department of
Civil Engineering and Dean of Engineering at the University of Bristol, UK. He graduated from the
University of Sheffield in 1964 and received his PhD in 1967. He worked for the British Constructional Steelwork
Association in London before moving to Bristol. He holds a DSc from the University of Bristol, is a
Fellow of the Royal Academy of Engineering, of the Institution of Civil Engineers, and of the Insti-
tution of Structural Engineers. He was President of the Institution of Structural Engineers 2001–02.
He was a Non-Executive Director of Bristol Water plc 2003–2009.
References
Arup, University of Bristol, ICE. 2017. How can Civil Engineering Thrive in a Smart City World? London,
UK: Arup.
ASCE, SEI. 2013. “A Vision for the Future of Structural Engineering and Structural Engineers: A case for
change.” American Society of Civil Engineers, Structural Engineering Institute, USA.
BBC. 2019. “Will a Robot Take Your Job?” BBC, UK. Accessed April, 2019. https://www.bbc.co.uk/news/
technology-3406694.
Blockley, D. I. 1995. “Computers in Engineering Risk and Hazard Management.” Archives of
Computational Methods in Engineering 2 (2): 67–94.
ICE. 2017. State of the Nation 2017: Digital Transformation. London: Institution of Civil Engineers.
Blockley, D. I. 2019. “Practical Wisdom and Why we Need to Value It.” Accessed May, 2019. https://
blog.oup.com/2014/07/practical-wisdom-vsi/.
Blockley, D. I. 2020. Creativity, Problem Solving and Aesthetics in Engineering. Switzerland: Springer
Nature.
Blockley, D. I. 2012. Engineering: A Very Short Introduction. Oxford: Oxford University Press.
Blockley, D. I., and P. S. G. Godfrey. 2000. Doing it Differently. London: Thomas Telford.
Blockley, D. I., and P. S. G. Godfrey. 2017. Doing it Differently. 2nd ed. London: ICE.
Blockley, D. I., and C. Robertson. 1983. “An Analysis of the Characteristics of a Good Civil Engineer.”
Proceedings of the Institution of Civil Engineers, Part 2 75: 77–94.
CEEBOK (Civil Engineering Body of Knowledge). 2019. Preparing the Future Civil Engineer. 3rd ed.
ASCE. Accessed May, 2019. https://ascelibrary.org/doi/pdf/10.1061/9780784415221.
Covey, S. 2005. The 7 Habits of Highly Effective People. USA: Simon & Schuster.
DETR. 1998. Rethinking Construction. London: Department of the Environment, Transport and the
Regions.
Engineering Council. 2014. "UK-SPEC: UK Standard for Professional Engineering Competence." 3rd ed.
Accessed May, 2019. https://www.engc.org.uk/engcdocuments/internet/Website/
UK%20SPEC%20third%20edition%20(1).pdf.
Engineering Synergy. 2019. Accessed May, 2019. http://myengineeringsystems.co.uk/5-axioms/.
Frey, C. B. 2018. "Attitudes Towards Automation: Past, Present and Future." Bank of England.
Accessed April, 2019. https://www.bankofengland.co.uk/-/media/boe/files/events/2018/may/
attitudes-towards-automation-past-present-and-future.
Frey, C. B., and M. A. Osborne. 2013. The Future of Employment: How Susceptible are
Jobs to Computerisation? Oxford: Oxford Martin School, University of Oxford.
Homer-Dixon, T. 1995. The Ingenuity Gap. Accessed May, 2019. https://homerdixon.com/wp-content/
uploads/2017/05/Homer-Dixon-The-Ingenuity-Gap-1995.pdf.
IDEF. 2019. Accessed May, 2019. http://www.idef.com/.
Kiersz, A. 2015. You’ve Got a Big Problem if you Lack ‘Interpersonal Skills’. USA: Business Insider.
McKinsey Global Institute. 2017. "Jobs Lost, Jobs Gained: Workforce Transitions in a
Time of Automation." December 2017. Accessed April, 2019. https://www.mckinsey.com/~/media/mckinsey/
featured%20insights/Future%20of%20Organizations/What%20the%20future%20of%20work%
20will%20mean%20for%20jobs%20skills%20and%20wages/MGI-Jobs-Lost-Jobs-Gained-Report-
December-6-2017.ashx.
MIT Technology Review. 2013. “How Technology is Destroying Jobs.” Accessed April, 2019. https://
www.technologyreview.com/s/515926/how-technology-is-destroying-jobs/.
Neural Nets. 2019. “A Basic Introduction To Neural Networks.” Accessed May, 2019. http://pages.cs.
wisc.edu/~bolo/shipyard/neural/local.html.
O*NET Content model. 2019. Accessed April, 2019. https://www.onetcenter.org/content.html.
Popper, K. 1979. Objective Knowledge: An Evolutionary Approach. Rev. ed. Oxford: Clarendon Press.
Sellman, D. 2012. “Reclaiming Competence for Professional Phronesis.” In Phronesis as Professional
Knowledge: Practical Wisdom in the Professions, edited by E. A. Kinsella and A. Pitman, 115–130.
Alberta, Canada: Sense Publishers. Chapter 9.
Shanks, J. 2018. “New Means of Construction: What Will be the Next Big Leap?” The Structural
Engineer. Accessed May, 2019. https://www.istructe.org/resources/blog/new-means-of-
construction-next-big-leap/.
Susskind, R., and D. Susskind. 2015. The Future of the Professions: How Technology Will Transform the
Work of Human Experts. Oxford: OUP.
Thornton Tomasetti. 2019. How Far Can Automation Go? Accessed April, 2019. https://www.
thorntontomasetti.com/how_far_can_automation_go/.