
Technology Analysis & Strategic Management

Vol. 19, No. 4, 471–489, July 2007

The Impact of Modelling and Simulation Technology on Engineering Problem Solving

MARK DODGSON∗, DAVID M. GANN∗∗ & AMMON SALTER∗∗



∗ University of Queensland Business School, Australia, ∗∗ Tanaka Business School, Imperial College London, UK

Correspondence Address: David M. Gann, Tanaka Business School, Imperial College London, South Kensington Campus, London SW7 2AZ, UK. E-mail: d.gann@imperial.ac.uk

ABSTRACT The paper focuses on the use of modelling and simulation technology in the design and
development of engineering projects. Using case studies, the authors examine the experiences of engi-
neers and designers working with these tools. The paper provides insights into how this technology is
reshaping the way engineers work and solve problems. Engineering design remains a highly uncertain
activity and the costs of failure can be high. It was found that the use of modelling and simulation helps
engineers to better understand physical properties and behaviour—quickly, cheaply and accurately—
before they construct artefacts and systems. Such tools can help engineers ‘learn-before-doing’ and
experiment with the integration of different technologies and components. Modelling and simulation
helps build the ‘design conversation’ between contributors to an engineering project, including cus-
tomers and regulators. It provides opportunities for feedback and learning and can promote open,
interdisciplinary and collaborative working styles. These findings are related to existing literature on
problem solving in engineering design and a future research agenda is proposed that examines the
opportunities for and limitations of these technologies.

Introduction
There is nothing so embarrassing as a public failure. Yet engineers often work on challenging
and difficult projects with high public visibility where the possibility of failure continually
exists and its consequences are considerable. Failures are often the subject of media attention
and contribute to diminishing the reputation of individual engineers, engineering firms and
the engineering profession in general. The mistakes of engineers can be disastrous; Herbert
Hoover said, reflecting on his career as an engineer before entering politics, the engineer
‘cannot bury his mistakes. . . if his works do not work, he is damned’.1 Failures range in scale
from collapsing bridges and buildings to products that misread markets, such as the Sinclair
C5 and the DeLorean car. To help avoid failure, engineers draw on previous knowledge on
what has succeeded and failed in the past and use a range of tools and analytical techniques to
reduce uncertainty and manage risk. Yet there is always a degree of uncertainty as to whether
a project or technology will work in practice as well as it does in theory. Sometimes success
and failure of the physical artefact will only be known long after it has been in use.2
In this paper, we examine the role of technology in helping engineers solve problems
and ameliorate some of the uncertainties involved in engineering projects. Our focus is the
role played by modelling and simulation technology in developing engineering solutions.
Since the 1960s, computer-based modelling and simulation has gradually been replacing
the traditional, laborious hand-calculations, drafting, physical testing and model build-
ing approach to many design tasks.3 Over recent years, this technology has significantly
expanded its capabilities, aided by more powerful processors, greater visualization capac-
ities, more sophisticated software allowing new functionality across multiple and more
widely diffused platforms, and better data on physical properties. Modelling and simula-
tion is conducted on a variety of technological platforms, including computer-aided design
(CAD), using a number of techniques such as computational fluid dynamics (CFD) and finite
element analysis (FEA). Advanced CAD systems, such as CATIA,4 are routinely used in designs ranging from complex new products, such as the Boeing 777 and Airbus A380, to relatively simple products, such as toothbrushes and shoes. Modelling and simulation is also conducted
using combinations of readily available computer software packages, using spreadsheets
and visualizations such as heatmaps.
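Techniques such as FEA make physical behaviour computable by reducing a continuous structure to a system of equations. The following is a purely illustrative sketch of the idea, a toy one-dimensional bar model, and is not drawn from any of the industrial tools named above:

```python
# Toy 1D finite element analysis: displacement of a cantilevered bar under a
# tip load. Illustrative only; real FEA packages are vastly more capable.
import numpy as np

def axial_bar_fea(n_elems=4, length=2.0, area=1e-4, youngs=210e9, tip_load=1e3):
    le = length / n_elems                      # element length (m)
    k = youngs * area / le                     # element axial stiffness (N/m)
    n_nodes = n_elems + 1
    K = np.zeros((n_nodes, n_nodes))           # global stiffness matrix
    for e in range(n_elems):                   # assemble 2x2 element matrices
        K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n_nodes)
    f[-1] = tip_load                           # point load at the free end (N)
    u = np.zeros(n_nodes)                      # node 0 is fixed (u = 0)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # solve the reduced system
    return u

tip = axial_bar_fea()[-1]
print(f"tip displacement: {tip:.3e} m")        # analytic PL/EA ≈ 9.524e-05 m
```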
Dodgson et al.5 include modelling and simulation among a range of digital technologies
that are being used in modern innovation processes and which they describe as ‘innovation
technology’ (IvT). IvT includes virtual reality, data mining, artificial intelligence and rapid
prototyping. They argue these technologies are becoming ubiquitous, with extensive use in
a wide range of sectors, from pharmaceuticals to the mining and construction industries and
across many different types and sizes of organization. The focus of this paper is modelling
and simulation technology, and for the sake of convenience, this will be described as IvT.
The use of these technologies has increased greatly in recent years among leading innova-
tive organizations.6 Their adoption is forcing many organizations to reconsider the way they
conduct and manage innovative activities. Most of the discussion in this area has focused
on the impact of new electronic tools on the nature of knowledge in a limited number of
technological and organizational environments.7 In particular, it has focused on the poten-
tial for the ‘codification’ of tacit knowledge by means of electronic tools (see the special
edition of Research Policy on the ‘Codification of Knowledge’, 30(9), 2001). Our approach
is different. Our goal is to explore how these new tools are being used by engineers to solve
problems and alleviate sources of failure confronting engineering projects. IvT has other
uses too, such as ensuring the integration of design with manufacturing and operations.
Based on our case studies, we focus on three particular uses of IvT:

1. Shaping and representing potential solutions by encouraging better understanding of contextual factors affecting engineering design, engaging with a range of contributors to projects, and helping determine successful outcomes
2. Interpreting the behaviour of physical properties and developing efficient technical
solutions to problems
3. Integrating different technologies and components.

We show that by operating in virtual design environments, and with the appropriate skills
and awareness of the limitations of the technology and the engagement of interested par-
ties, designers can have confidence in the appropriateness, accuracy and efficacy of their
solutions.

The paper is organized as follows. In Section 2, we describe the methods used in this
study. In Section 3, we briefly introduce our case studies with thumbnail sketches of the key
points relating to modelling and simulation. In Section 4, we review some of the features
of engineering problem solving as described in the literature, analysing how modelling and
simulation affects engineering problem solving in our case studies. Section 5 contains a
discussion and presents conclusions and a future research agenda.

Method
The research is based on an inductive approach using case studies. Following Eisenhardt’s8
inductive case-oriented research approach, we embarked on the case studies to understand
the way in which engineering practices are changing as a result of IvT. The cases form
part of a wider research study carried out by the authors into the use of IvT in firms and
in particular projects. They were chosen because they illustrate key emerging aspects in
engineering design and the use of IvT. From the assembled evidence, the authors generated detailed questions relating specifically to the use of IvT in problem solving and design in engineering projects. A semi-structured questionnaire was used in all
interviews. This was adjusted during the research process in order to ensure that questions
better reflected the experience of the interviewees. The interviews provided an opportunity
to explore different environments, highlighting the particular features of the organization’s
and individual’s history and the managerial approach to using technology in engineering
problem solving and innovation in general; as well as in the specific projects in question.
Details of the interviews conducted are included in an Appendix.

Case Studies
The Leaning Tower of Pisa
The Leaning Tower of Pisa is about 60 m tall, weighs 14,500 tonnes and has foundations
with a diameter of 19.6 m. The tower was constructed as a hollow cylindrical bell tower
for the Cathedral. Construction began in 1173, progressed to about a quarter of its final
size by 1178 and then stopped for about 100 years. Work recommenced in 1272 and the
tower reached the seventh cornice height in 1278. Work to add the bell chamber took place
between 1360 and 1370. Had the construction work taken place in a continuous fashion,
the tower would have collapsed due to unstable ground conditions. However, the two long
pauses in activity provided time for the ground to harden and resist the load of the structure.
In 1990, surveys showed the foundations inclined due south at about 5.5 degrees to the
horizontal and the seventh cornice overhung ground level by about 4.5 m. For most of the
20th century, the inclination of the Tower was increasing. In 1990, the rate of inclination
was twice that in 1930 and an instantaneous buckling failure of the masonry could have
occurred without warning. There were several alternative suggestions on how to prevent the tower from collapsing, each with its own proponents. The collapse of the Leaning
Tower of Pisa would have been calamitous. Quite apart from the implications for Pisa itself,
it would have involved the loss of one of the world’s building icons.
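A rough consistency check on these survey figures can be made with elementary trigonometry; the 47 m height assumed here for the seventh cornice is our illustration, not a value from the paper:

```python
import math

inclination_deg = 5.5    # foundation inclination reported in the 1990 surveys
cornice_height = 47.0    # assumed height (m) of the seventh cornice; not from the paper

overhang = cornice_height * math.tan(math.radians(inclination_deg))
print(f"estimated overhang: {overhang:.1f} m")   # ≈ 4.5 m, close to the reported value
```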
Computer modelling of the tower helped produce a solution that prevented the Tower
collapsing. Stabilization of the Tower was achieved by means of an innovative method
of soil extraction, which induced a small reduction in inclination—not visible to the
casual onlooker. Implementation of this remedial work required computer modelling and
large-scale development trials with an exceptional level of continuous monitoring and daily
communications to maintain control over the project.9
The leader of the project to prevent the collapse of the Tower of Pisa was Professor John
Burland. Professor Burland’s research interests are in the mechanics of tunnelling-induced
ground movements, their progressive effects on buildings and mitigation. His involvement
in work on the Leaning Tower of Pisa lasted for 12 years—from 1990 to the end of 2001.
He models extensively in his work; for example, in his research on the potential impact on
30 buildings of the construction of the extension to the Jubilee underground line in London:

The engineer is modelling all the time. You do not do anything without modelling,
whether this involves some simple conceptualization or a highly sophisticated model.
Before doing anything on Pisa, we knew it was on the point of falling over, very
delicate. We had to develop a good physical or numerical model in order to investigate
stabilization measures. It was unthinkable to go in and try anything without it being
well and truly underpinned by modelling.

This case study is illustrative of the use of IvT in understanding the behaviour of physical
properties and developing efficient technical solutions to problems.

Ricardo Engineering
Ricardo is an engineering technology company conducting research, design and develop-
ment for major automobile manufacturers. Originally based in the UK, it has expanded into
the US, Czech Republic and Germany. With 1,800 employees, it grew its sales from £91
million in 1999 to £160 million in 2005. It has highly specialized expertise in vehicle, engine
and transmission engineering, and works closely with product development departments in
leading automotive manufacturers. It also provides testing and calibration services and has
developed a range of simulation tools in its own software division.
Ricardo’s software engineering group grew in the late 1980s out of an in-house require-
ment for specialized software analysis tools. There were no third-party software products
available at the time and it was necessary for Ricardo to develop its own analysis software to
support engine design projects. The software evolved and enabled fundamental understand-
ing of physical processes in order to make informed design changes. Ricardo’s customers
demanded to use the software and the company repackaged it for them as a product. In
1995, the software group was established in its own right as Ricardo Software, designing,
developing, selling and supporting its own software. Its products are used by a wide range of
industries, including motor sport, automotive, motorcycle, truck, agriculture, locomotive,
marine and power generation. Most of the customers are large automobile manufacturers
and their suppliers. The company works with two main simulation software areas: fluid
systems and mechanical systems.10
One of the drivers of the need for simulations was the increased complexity of the product
to be designed. In the early 1980s, engines were largely mechanical and only a few vari-
ables controlled their performance. Throughout the 1980s and 1990s, however, interactions
between the engine and the rest of the vehicle increased significantly, particularly through
new electronic control systems. As the market for Ricardo software expanded, the firm
became more dependent on external users for feedback and ideas for future developments
to manage this new complexity.

In 2003, Ricardo Software employed 55 staff and had three technical centres in Chicago,
Shoreham-by-Sea and Prague.
Ricardo is developing products that integrate different software applications. Various sim-
ulation tools allow designers and analysts to see how different systems within their vehicle
interact with one another and to manage their complex integration, allowing designers to
see how decisions in one area affect other parts of a system. This is a considerable tech-
nological challenge as in Ricardo’s business different parts of the system may be designed
by organizations in various locations around the world; for example, the engine might be
designed in country X, the transmission in country Y and the vehicle in country Z. In
addition to the geographic barriers, each of the different partners involved in the design
process may be using different simulation software, adding to the problems of systems
integration and limiting opportunities for working collaboratively across organizational
boundaries.

As more automakers are collaborating on new engine projects, this adds to the challenges
of integration. Mistakes in transmission of data and information from one partner to another
are common in engineering projects. Each organization and country has its own engineering
approach and culture.11 Ideas about how much and what type of information should be shared
with partners often differ from partner to partner. The potential for miscommunication
rises considerably as the number of partners increases. The idea of this new generation
of integration software for collaborative engineering is to help to remove these types of
dangers.
This case study is illustrative of the use of IvT in integrating different technologies and
components.

The Millennium Bridge


The London Millennium Bridge, linking Tate Modern and St Paul’s Cathedral, was the
first footbridge across the river Thames to be built for more than 100 years. The bridge design
was awarded through a competition aimed at producing an innovative, modern approach.
The winning team involved collaboration between the sculptor, Sir Anthony Caro, Foster
and Partners architects, and Arup engineers: the team spanned at least three different com-
munities of practice. Working together, they created a minimalist design to give pedestrians
a good view of London, free of traffic and high above the river. The bridge spans 325 m
and is an innovative structure, only 4 m wide, with an aluminium deck flanked by stainless
steel balustrades, supported by cables to each side, designed not to infringe upon impor-
tant, protected views within the city. The bridge span-to-depth ratio is 63:1, around six
times shallower than a conventional suspension bridge. The environmental credentials of the
bridge include the way its slim design reduces embodied energy and pollution. The bridge
is lit from the side at night and gleams in the sunshine during the day. Its slim dimensions
have resulted in it being known as the ‘blade of light’.
The design team used IvT in order to achieve this. They drew upon lessons from other
bridges around the world and they built sophisticated computer simulations to test and
improve their designs.
The result was heralded as a beautiful architectural, engineering and sculptural contri-
bution to London. The bridge opened on 10 June 2000, when between 80,000 and 100,000
people walked across it. When large groups of people were crossing, however, greater than
expected lateral movements occurred and it quickly gained notoriety as the ‘wobbly bridge’.

A decision was taken to close the bridge only two days after its opening in order to fully
investigate the problem and take remedial action. This led to embarrassment for the design
team and incurred costs. Reputations were at stake and engineers were castigated in the
press and media for not checking for the ‘footfall amplification problem’, which occurs
when groups of people walk across bridges.
The engineers had tested for these potential problems and were puzzled as to why the
bridge had wobbled. Arup communicated the problem to their worldwide network of design
offices, using the internet. An international search was undertaken for a diagnosis and
solution to rectify the problems. Using a combination of virtual models and computer
simulation techniques, together with more traditional physical tests on models and mock-
ups with inputs from a range of diverse sources, the problem of the wobbly bridge was
solved.
This case study illustrates the use of IvT in shaping and representing potential solutions
more effectively amongst engineers.

The London Congestion Charge


By 2002, average road speeds in London had fallen to such an extent that travelling by car
was slower than making the same journey by horse-drawn carriage 200 years earlier. In
Britain as a whole, it is estimated that hundreds of millions of hours are wasted each year
as a result of congestion on the roads. This congestion was worsening and looked set to continue: since the early 1980s, there had been negligible growth in road capacity, whilst traffic grew at 3% a year, and it was estimated that between 2002 and 2010 traffic would increase by 25%.
Finding a solution to congestion is a pressing social, economic and environmental prob-
lem. It is also a political problem in the UK: apart from the resulting economic inefficiencies,
politicians are wary of the growing frustration of 30 million motorists and potential
voters.
The Centre for Transport Studies at Imperial College London was asked in 2001 by the
Independent Transport Commission to conduct a research project into national congestion
charging in England. The main aim of the project was to model and identify the road speed
and financial revenue impact of such charges. The project made the assumptions that charges
would vary according to usage levels and would reflect levels of traffic congestion, road
maintenance, accidents and environmental damage costs. As well as producing the model,
the project also undertook some simulations. For example, the team examined how buses
interact with other traffic. They found that they could understand the general properties of
the system by using visual simulations of the buses represented by coloured dots.
The technologies used in this project enabled its fast completion on a relatively small budget: the project lasted around 18 months and its total budget was £50,000. A major
follow-up government study into the issue was announced in 2003.
Informed by this project, in February 2003, London took the lead in attempting to address
the problem by introducing a congestion charge for automobile entry into central districts.
The scheme, which was widely criticized before its introduction, is considered successful,
with traffic being reduced by up to 30% at peak times. The Government and motoring
organizations have closely monitored the outcomes of the London scheme with a view to
widening its use to national charges to reduce congestion.
This case study is illustrative of the use of IvT in shaping and representing potential
solutions more effectively by engaging a range of contributors.

Design Engineering Problem Solving


We discuss the role of IvT in engineering problem-solving in three major areas: judgment
and validation, creating new combinations of technologies and components, and ‘design
conversations’.

Judgment and Validation


A body of research has characterized the main features of problem-solving in engineering
design.12 Among the central elements that these studies have highlighted is the difference
between scientific and technological problem-solving. Whereas scientific problem solving
primarily concerns the development of theories and predictions about the physical world,
technological problem solving normally starts out with functional requirements and seeks to
use scientific principles and technological components and systems to help achieve desired
goals. In doing so, engineers work differently from scientists. They work closely with a
variety of actors in the innovation process, such as the users of the eventual innovation,
to help understand the functional requirements and goals of the task they are undertaking.
They tend to be practical, pragmatic and goal-oriented.13
Engineering knowledge draws from scientific principles, but relies heavily on ‘rules
of thumb’, ‘informed guesses’, ‘routines’ and ‘norms’ of problem-solving that are built
up through engineering education and experience with real world problems. Engineering
knowledge grows incrementally and ‘traditional’ knowledge, such as the second law of thermodynamics, remains an essential element in problem-solving routines and methods.
Indeed, engineering knowledge relies on a set of ‘heuristics’ about problem-solving that
are framed by the technological trajectories the engineers are working within.14 Constant15 refers to the development of engineering knowledge as recursive practice: a slow and steady accretion of knowledge about how things work and how they fit
together. Engineers learn to use practical, iterative approaches in their education, which
often combines a problem-oriented approach with theory about physical properties. They
often experiment and improvise, reflecting on knowledge learnt in a range of previous
experiences.16 The combination of theory and practice, reflection and application, builds
engineering judgment over time.
The case study of the Leaning Tower of Pisa shows the extent to which the use of IvT
was based on sound engineering judgment and careful validation of results. Throughout
Burland’s time working on the Pisa project, he was preoccupied with validating his model. He
spent the first year learning about how the tower was built and what it was that the original
builders did when they built it (they were correcting the lean when they were building).
He felt that until he had a reasonable understanding of what the original builders had been
doing, the team could not develop a good model of construction.
A long time was spent developing numerical models of the tower of different degrees of
sophistication, checking to see if they correlated with their version of history. They built
the tower on a computer and checked this model against the historical research. They were
able to get a very close fit, and this provided confidence in the model.
Then later, the team did things to temporarily stabilize the tower. They further refined
the model based on this work. For example, in 1993, 600 tonnes of lead was placed on the
north side of the tower. This was undertaken very gently and progressively using precise
measurements. The tower responded extraordinarily closely to the predictions in the model.
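The validation loop described here, predicting a response and then comparing it with careful measurement, can be sketched as follows; all numbers below are invented for illustration and are not from the Pisa records:

```python
import numpy as np

# Hypothetical reductions in inclination (arcseconds) after successive
# counterweight increments; both arrays are invented for illustration.
predicted = np.array([10.0, 22.0, 33.0, 48.0])   # model forecasts
observed = np.array([9.5, 23.0, 32.0, 49.5])     # field measurements

residuals = observed - predicted
rms = np.sqrt(np.mean(residuals ** 2))
print(f"RMS error: {rms:.2f} arcsec "
      f"({rms / np.mean(np.abs(observed)):.1%} of the mean response)")
# A small relative error builds confidence in the model; large discrepancies
# would send the engineers back to its assumptions, as Burland emphasizes.
```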

At least six permanent solutions to the tower’s lean had been devised and these were
evaluated on the model. Each solution had its protagonists, many of whom were very
dogmatic about what could be achieved and how to do it. However, the model showed
that to implement any of them could have resulted in the tower collapsing. The model
was very useful in testing these ideas before the eventual solution of soil extraction was
decided upon. Modelling showed that in principle the technique of extracting soil would
work. According to Burland, the computer techniques have to be extraordinarily rigorous
to ensure that what you are doing is converging to correct answers. Physical models are
used extensively in the fields of soil mechanics and geotechnics where complex properties
need to be understood. For example, soil exhibits a very complex mechanical behaviour,
and physical and computational modelling is used in order to understand this.
An appreciation of the importance of analytical skills and judgment based on craft and
experience underpins Professor Burland’s concern about the dangers of ill-informed use
of models that can produce inaccurate results and lead to catastrophic failure in civil
engineering and construction projects. He pays great attention to validation:

Validation is extremely important. It’s all very well to have your all-singing-all-
dancing model. But how reliable is it? Do you have to calibrate it? A huge amount of
my work involves being sceptical about the particular programmes we are using. We
test the model against known cases. We test it against physics. It is staggering how
often really sophisticated models turn out to be garbage. Quite frightening. There are
terrifying differences between well-known numerical models that can give differences
in response in orders of magnitude.

Burland says that ‘… in your bones you begin to understand the ways things broadly behave’.
He continues by arguing that the model sometimes comes out with surprising answers and
these need to be questioned through expert interpretation of physical properties. An expert
engineer can judge whether the results coming from a model are rubbish, not least because
before modelling it is important to estimate what you think the model will show. The model
has to be placed into the context of understanding the whole problem.
A similar finding emerged from the study of Ricardo. Simulation tools are used across
all stages of engine design. There is no point, however, in conducting large-scale detailed
simulation of the system until the design begins to take shape. Instead, the designers start
by focusing on key features of the engine, such as the bore, stroke, number of cylinders
and their configuration. In doing so, they use a range of conceptual tools and draw on
past knowledge and experience. During this concept design phase, the configuration is
established with basic geometric information about the engine and how different parts
of the design relate to one another. At this point, the designers conduct ‘low cost’ and
‘fast concept’ analysis, exploring how the system might function and how its elements
might interact. Elements of the design become more and more established over time, and
more detailed simulation allows the design process to refine the concept design. Designers
in the early stages think about and explore different options and assess the performance
of each alternative compared to design targets, then in later stages seek to optimize the
design.
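The concept-phase parameters mentioned above, bore, stroke and cylinder count, already fix headline quantities such as swept volume. A minimal sketch, using a hypothetical configuration rather than any Ricardo design:

```python
import math

def swept_volume_litres(bore_mm, stroke_mm, n_cylinders):
    """Swept volume from the concept-phase parameters named in the text."""
    bore, stroke = bore_mm / 1000.0, stroke_mm / 1000.0   # convert to metres
    return math.pi / 4 * bore ** 2 * stroke * n_cylinders * 1000.0  # litres

# Hypothetical 'square' four-cylinder configuration, not a Ricardo design:
print(f"{swept_volume_litres(86, 86, 4):.2f} L")   # ≈ 2.00 L
```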
The use of IvT is also reducing the time and effort expended in physical prototyping and
testing. Once the design moves from digital to physical testing, the costs of changing the
design increase exponentially. Ricardo would like to have confidence that the design will
satisfy functional requirements before moving onto physical prototyping and testing. This
requires designers to have greater faith in the simulation tools and their ability to predict
the actual performance of the system.
Building physical prototypes and testing the design of a combustion system, such as the
use of different pistons or cylinder heads, is both time consuming and expensive. The use of
the simulation tools alters both the speed and cost of the design process for engine design.
Analysing such designs on a computer is relatively inexpensive, with two advantages accruing: tangible cost savings in product development and a reduction in time-to-market, which potentially provides a major commercial advantage. The number of physical models built at Ricardo is estimated to have declined dramatically in recent years. Regardless
of the power of simulation tools, however, the company appreciates that there will still be
a need to physically test model performance.

The increasing use of new and more powerful simulations in auto-engine design has major
implications for skills and roles of designers and analysts. Simulation work in engine design
was traditionally undertaken by analysts, while designers focused on conceptual design, detailed dimensioning and manufacturing. In this division of labour, designers are generalists with knowledge
about materials, engine features and manufacturing processes. Analysts, in contrast, are
highly specialized users of simulations, able to correctly set up inputs and interpret outputs.
Despite the increased complexity of the products, performing simulation has become easier
as user interfaces have advanced. Thus, designers can become increasingly involved in
analysis. Many designers, however, still lack the depth of knowledge in a particular area
to be able to conduct analysis competently. According to Richard Johns, Head of Ricardo Software:

It is now possible for designers to undertake quite sophisticated analysis and 9 times
out of 10 they will do this without a problem. However, more dangerously, 1 time
out of 10 they will analyse a design and get it wrong because they do not have
the specialist knowledge and experience to be able to recognize mistakes in either
input or output or limitations of the modelling that render the analysis plausible but
wrong.

There is still plenty of room for error in analysis and even experienced analysts can make mistakes from time to time, especially when the analysis is not routine. In theory,
designers could work from design templates embedded in the tools and using these design
templates, they could quickly create functioning engines. By using such templates, however,
the designers would limit the scope for innovation in the design of the engine. As Johns
says, ‘templates are straightjackets. They are fine if you want to do repetitive analysis but
not if you want innovation’. Once it is necessary to go beyond the template embedded in
the program then the skills and the experience of specialist analysts come to the fore. It is
only these skilled operators and interpreters of the inputs and outputs from the simulation
that can deal with new features and elements in the design.
In this way, the use of the new tools creates a greater need for skilled analysts and
designers. For Ricardo, today’s research becomes tomorrow’s commodity. Deep knowledge
of the simulations and systems, and judgment on their use is required to ensure that mistakes
are avoided and new opportunities for innovation are achieved.

New Combinations of Technology and Components


Not all engineering involves routine and simple application of problem-solving tools to
new problems. As Vincenti17 highlighted, engineering work varies from procedures in
‘normal’ incremental design to experimentation and the development of ‘radical’ design
solutions.
Simon18 argued that for engineers to solve complex problems, it is necessary for them
to break down problems into modules or small parts. By decomposing tasks, engineers are
able to focus their work on an area of manageable complexity, applying their techniques and
capabilities to a narrow range of problems while leaving other problems to other groups. In
doing so, engineers are able to divide the labour of the engineering task across many different
sub-teams and units, increasing both the efficiency and effectiveness of problem-solving.19
They also need deep knowledge of different technologies and components and how they
fit together. Innovation often involves new combinations of existing technologies applied to
a particular engineering challenge. Some engineers work as systems integrators, choosing
components and technologies, specifying the interfaces between different systems,
combining new components with different vintages of technology.20
This aspect of engineering problem solving is examined here through the example of
Ricardo Engineering. Software for collaborative engineering requires the adoption of open
systems. This allows organizations to ‘mix and match’ different software tools, including
those developed internally. In-house software tools remain vitally important for many orga-
nizations as they usually contain embedded methods, knowledge and experience that have
evolved over many years. New integration tools, therefore, must allow organizations to
exchange information effortlessly between these different software environments.
Many of Ricardo’s simulation tools have been built up slowly over several years, emerging out of internal research projects or the need to satisfy external consultancy requirements. Lead users have occasionally asked the firm to develop software for a par-
ticular task. In the case of most of the current simulation tools, their capabilities have been
developed well beyond the original intention. Richard Johns suggests that if the archi-
tecture of the software is well thought out then simulation tools can be highly flexible,
malleable and robust, allowing future developers to include a range of new features and
design elements. An example of robustness of design can be seen in the development of
WAVE, its gas dynamics simulation programme. Originally designed as a simulation for
engine performance, it has been extended into many areas such as the prediction of noise
and integration with control, vehicle and mechanical systems. Few of the initial designers
of these simulations could have predicted the eventual uses for their programmes when they
were initially written. However, given their flexibility and robustness, they have made such
a transformation possible.
The use of IvT has allowed designers and analysts to change the way they solve problems.
With the earliest generation of simulation tools, individual components and elements would
need to be analysed on a step-by-step basis, building up understanding of one system with
relatively simple, often inexact, connections to other systems with which they would interact.
Each model would run independently of the others. This was a slow and laborious process
involving considerable effort in translating the impact of outputs from one part of the system
into inputs for the next part of the system. According to Richard Johns, it made the process
of managing system interactions a time consuming, error-prone and approximate process
requiring skilled operators and great care. The new generation of simulation tools helps to
resolve this problem and reduce failure rates by allowing designers and analysts to connect
interacting processes and to co-simulate across different simulation tools.
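The shift from step-by-step, independent runs to co-simulation can be sketched as a loop in which two models exchange coupling variables at every timestep. The toy models and coefficients below are invented; commercial tools such as Ricardo’s couple far richer engine and vehicle models:

```python
# Toy co-simulation: an engine model and a vehicle model advance together,
# exchanging coupling variables each timestep rather than being run separately
# and reconciled by hand. All models and coefficients here are invented.

def engine_torque(throttle, vehicle_speed):
    """Toy engine map: torque (N·m) falls off slightly with vehicle speed."""
    return 200.0 * throttle - 0.5 * vehicle_speed

def vehicle_step(speed, torque, dt):
    """Toy vehicle dynamics: thrust from torque, quadratic drag."""
    accel = 0.01 * torque - 0.002 * speed ** 2
    return speed + accel * dt

speed, dt = 0.0, 0.1
for _ in range(600):                              # 60 s of coupled simulation
    torque = engine_torque(throttle=0.7, vehicle_speed=speed)  # engine sees vehicle state
    speed = vehicle_step(speed, torque, dt)                    # vehicle sees engine output
print(f"speed after 60 s: {speed:.1f} m/s")
```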
The relationship between designers and analysts in engine design has become more
intertwined as a result of the use of new simulation tools. Decisions by designers can
have implications for a wide range of interacting systems. IvT has created the potential for
new forms of integration across organizations and across countries. Analysts working with
designers can explore and interpret these interactions, seeing how one change may spill
over across the system. The tools force designers and analysts to see the engine as a system
rather than a number of components. They force them to collaborate with more people and to
open up to new connections across the design process.

Design Conversations

Engineering design and problem-solving involves more than engineers simply choosing
among a variety of technical options. Technical choices are shaped by the social and eco-
nomic context in which they are made such that engineering principles usually form only
part of the solution. These have to be integrated within a set of ideas from a range of diverse
disciplines including economics, management, sociology and political science. Conversely,
engineering has become so ubiquitous that engineers are involved in producing and main-
taining artefacts in almost every facet of society and the economy. On one hand, engineering
is becoming more specialized with many new sub-disciplines emerging (such as fire, acous-
tics and environmental engineering); on the other, engineering is becoming more generalized
with some techniques, such as CFD, being applied across a range of disciplines. In conse-
quence, the role of engineers is changing with a shift towards what Williams21 describes as
the ‘expansive disintegration of engineering knowledge’.
To solve the problems that confront them, engineers often rely on a social and informal
culture, networks and patterns of communication.22 Project work is sometimes carried out in
diverse, multidisciplinary teams that span a number of different organizations. Team mem-
bers share experiences, tell stories, sketch and play with artefacts from related engineering
work in order to develop solutions.23
Modelling and simulation has always played a central role in engineering problem-
solving.24 For engineers, models provide a mechanism for learning about artefacts before
and after they have been built.25 Models enable engineers to examine different options and
weigh the choices of structural elements, materials and components against one another.26
However, physical prototypes are expensive, time consuming to create and often unreliable, whereas digital tools are cheap and readily available.27 They provide the basis for digital
models that assist in abstracting physical phenomena, allowing engineers to experiment,
simulate and play with different options whilst engaging with a range of interested parties.
This is leading to a new culture of prototyping in which traditional practices of design are
being opened to more concurrent diagnostic enquiry.28
Added-value in design and development processes comes from creative and schematic
work where designers produce solutions that are beyond the calculations embedded in
routinized software programmes. They do this through ‘conversations’ and the use of
‘visual cues’, interacting with one another and their clients and suppliers.29 This schematic
work involves seeing the interfaces between components and obtaining engagement and
commitment from different specialists and stakeholders.

The way in which IvT can bring together communities of engineers to solve problems
is seen in the example of the Millennium Bridge. Arup faced a major problem and needed
the expertise to test, measure and analyse the problem and to find an answer as quickly as
possible.
Three research units were set up and worked together simultaneously on the problem.
The first was brought to London from offices around the world and was instructed to test
the bridge and discover problems with the original computer model. They found the model
worked well and design work continued to be based around it. The second group was
selected from external organizations judged to have the best expertise to assist in diagnosis
and problem solving in bridge design. Working in tandem with the Arup teams, the external
groups played an important part in finding the solution. University research groups from
Cambridge, Sheffield, Imperial College and Southampton were commissioned to run tests
on crowd behaviour patterns on bridges. The third team worked on the design of new
dampers to be installed to absorb shock and reduce lateral movement. Young engineers
were encouraged to think widely about the problem and the team came up with a potential
solution two months after their exploration began.
During this process, Arup’s engineers found a paper on the ‘lateral excitation of bridges’ in
an academic seismic engineering journal. The paper qualitatively described the phenomenon
but did not quantify it. Other searches found similar problems on a heavy roadbridge of a
very different structure in Auckland, New Zealand.
The answer to the problem involved designing dampers that were retrofitted beneath
the bridge’s structure, providing lateral stiffness without interfering with the overall aes-
thetic design of the bridge. In finding this solution, lessons were drawn from the use of
shock absorbers in the car and rail industries and these were applied to bridge design. The
performance of these was verified using IvT and physical models.
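The effect of such dampers can be illustrated with a toy single-degree-of-freedom model of lateral oscillation under resonant pedestrian forcing; the parameters below are invented and bear no relation to the bridge’s actual dynamics:

```python
import numpy as np

def peak_lateral_response(damping_ratio, duration=60.0, dt=0.001):
    """Peak displacement of a unit-mass oscillator under resonant forcing."""
    omega = 2 * np.pi * 1.0                # assumed 1 Hz lateral mode
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(duration / dt)):
        force = np.sin(omega * i * dt)     # pedestrian forcing at resonance
        a = force - 2 * damping_ratio * omega * v - omega ** 2 * x
        v += a * dt                        # semi-implicit Euler integration
        x += v * dt
        peak = max(peak, abs(x))
    return peak

for zeta in (0.005, 0.05, 0.2):            # roughly: before and after dampers
    print(f"damping ratio {zeta:.3f}: peak response {peak_lateral_response(zeta):.2f}")
# Raising the damping ratio sharply cuts the resonant build-up, which is what
# the retrofitted dampers achieved for the bridge's lateral modes.
```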
The simulation technology used in the design of the bridge, bringing together the engineers, architect and sculptor, was also instrumental in finding the solution. It was an important tool in
the extended coordination of the international teams and the university research groups.
One of the lessons for Arup from this experience was that existing engineering knowledge
from Japan had not been accessed by the original UK design team. Recognizing this, Arup
posted results of their investigation on its website, placing findings in the public domain for
others to use. Whilst the problem of the ‘wobbly bridge’ was a temporary embarrassment
for the companies involved, the process of finding a solution brought considerable benefits
to them in the longer term and for bridge design in general.
The way IvT can be used by engineers to engage multiple stakeholders in shaping and
representing understanding is seen in the case of the London Congestion Charge. The
project faced technical and representational difficulties. How do you generate the data
then make it comprehensible to decision-makers and opinion-formers? The model had to
be able to account for nine different types of transport (bus, car etc.) and 8,500 different
parameters (region, time of day etc.). To build this large interdependent matrix reflecting
such complexity would require a huge amount of processing power and a detailed set of
iterating underlying equations. The results of the project needed to be presented to a target
audience of government ministers, press and lay-people. The data generated by the 9 × 8,500 outputs had to be easily understandable.
Professor Stephen Glaister and his team at the Transport Studies Group decided to use the
readily available, and recently upgraded, Microsoft Excel XP programme on the project.

Excel was particularly valuable for its ability to iterate, the crucial element in producing convergence to a solution and in simulating random disturbance.
The team hypothesized mathematical relationships, describing how cells would change
as the parameters change. Various data were already available for use, but some needed to
be collected, for example on bus and train users.
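The iterate-to-convergence pattern the team exploited in Excel can be sketched as a damped fixed-point iteration between a speed model and a demand model; the functional forms and coefficients below are invented for illustration:

```python
# Sketch of iterating to a congestion equilibrium: traffic speed depends on
# flow, and flow responds to speed and the charge. All relationships invented.

def speed_given_flow(flow):
    return 50.0 / (1.0 + 0.04 * flow)            # km/h; speed drops as flow rises

def flow_given_speed(speed, charge):
    return 40.0 + 0.8 * speed - 2.0 * charge     # toy demand model

def equilibrium(charge, damping=0.5, tol=1e-6, max_iter=500):
    flow = 50.0                                  # initial guess
    for _ in range(max_iter):
        new_flow = flow_given_speed(speed_given_flow(flow), charge)
        if abs(new_flow - flow) < tol:
            break
        flow += damping * (new_flow - flow)      # damped update aids convergence
    return flow, speed_given_flow(flow)

for charge in (0.0, 5.0, 10.0):
    flow, speed = equilibrium(charge)
    print(f"charge {charge:4.1f}: flow {flow:5.1f}, speed {speed:4.1f} km/h")
```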
With the calculations and simulations completed, the project team decided to represent
the Excel XP output by using a software package called MapInfo, which produces a heat
map related to geographical information. Heat mapping is a common form of visualization
to represent the relationships between complex data. In this example, the colours on the
map represented the predicted increase in speed compared to existing circumstances if a
national congestion charge was implemented.
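A heat map of this kind is straightforward to reproduce in outline. The sketch below uses matplotlib over randomly generated data; the project itself used MapInfo over real geographic outputs:

```python
# Illustrative heat map of predicted speed change; the grid here is random,
# whereas the project mapped real model outputs onto geography with MapInfo.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
speed_change = rng.normal(loc=5.0, scale=3.0, size=(40, 40))  # % change, invented

fig, ax = plt.subplots()
im = ax.imshow(speed_change, cmap="RdYlGn")   # red = slower, green = faster
fig.colorbar(im, ax=ax, label="predicted change in road speed (%)")
ax.set_title("Illustrative congestion-charge heat map")
plt.show()
```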
Graphical representation of the data, for example using heatmaps to visualize results,
allowed patterns to be seen more easily. National congestion charging involves the transfer
of relatively large sums of money from road users. The project output showed the extent
to which finances would be transferred out of different regions. For example, government
ministers talked about a revenue neutral system in which regional ‘losers’ and ‘winners’
would balance out in fiscal terms. By looking at the map, it was obvious that the losers would
be concentrated in urban areas and winners in rural areas. Such redistribution, however,
would conflict with another element of government policy of moving resources into urban
regions for inner-city regeneration. The details of these flows were not obvious until the
maps were produced. The model therefore helps government make allocation decisions. In
a politically sensitive issue like a new tax, the ability to use the technology to represent the
data simply to sceptics and critics is an invaluable tool for communication and involvement.
The visual representation of quantitative information can enable a wide range of
stakeholders to understand results from modelling and simulation. It therefore provides
opportunities for more inclusive decision-making in design and problem solving and par-
ticipation in the review of different options. Tufte’s work30 explores how graphical practices
have emerged and explains the benefits as well as some of the pitfalls of making quantitative
data more readily available to non-experts through the use of new visual forms.
Whilst the technology used in the London Congestion Charge is relatively cheap, avail-
able and easy to use, as with our other cases, the capacity to effectively use IvT in this
example depended on a high level of skills in the Centre’s staff. As well as some high-level
programming skills, it required considerable professional judgment to frame the modelling
exercise, appreciate the use of the technical tools, and understand their great strength in
computing and representing the problem and its solution.

Discussion and Conclusions


This paper has considered the consequences of three uses of modelling and simulation tech-
nology for engineering design. Interpreting behaviour of physical properties and integrating
different technologies fall within the domain of normal, routine engineering decision-
making. Results show how modelling and simulation techniques are used in informing
decisions, making judgments about which option to take and validating these judgments
in virtual environments before carrying out physical tests and changes. Our case studies
also show how visual representation of results from models and simulations can assist in
‘design conversations’, enabling deeper levels of problem solving that engage with a wider
range of stakeholders. They suggest that the most important benefits may come from the
possibilities of engaging stakeholders in shaping potential solutions in relation to contextual
factors associated with specific projects.
Failing to deliver in innovative projects can be highly expensive, damaging and embarrass-
ing. This is particularly so in the case of some of the high-profile public projects described
here. Failure to rectify the wobble in the Millennium Bridge would have severely damaged
the reputation of the major companies involved. Miscalculation leading to the collapse of the
Leaning Tower of Pisa, one of the world’s iconic structures, would have been simply devas-
tating culturally and for the local economy. Inaccurate projections on the consequences of
traffic congestion charges would have had significant consequences for London’s authori-
ties, and may have adversely affected the whole economic and environmental agenda around
dealing with congestion.
All the examples reveal the ways that some uncertainties in innovative projects can be
ameliorated by the use of IvT. In the case of the Millennium Bridge, IvT provided the
basis around which a distributed, international team of engineers and designers solved a
particularly visible and expensive design problem. Numerous technical solutions had been
proffered to prevent the Leaning Tower of Pisa from falling down, and modelling showed
that if any of these had been applied the Tower would have collapsed. Very careful iteration
between historical architectural knowledge, computerized modelling and physical changes
led to a high degree of confidence in establishing a correct diagnosis and rectification of
the problem. Ricardo’s capacity to overcome the complex problem of systems integration
and integrate diverse designs and software has assisted its development of a new business.
A combination of software packages enabled planners of London’s Congestion Charge to
assess, in an easily comprehensible form, a very complicated project of considerable social,
economic and political consequence.
IvT assists with the ‘normal’ engineering activities of developing optimal technical solu-
tions to problems and, as seen in the case of Ricardo, it expands the range of engineering
possibilities. The use of IvT is improving the speed and reducing the cost of innovation.31
The cases have shown how it can produce reliable information (seen in the case of the
Leaning Tower of Pisa, where a mistake would have been disastrous) at relatively low cost
(seen in the case of the Congestion Charge) and comparatively rapidly (the Millennium
Bridge problem would not have been solved in anywhere near the time, had the model not
been in place and the teams coordinated around it). Many of these advantages derive from
the capacity of IvT to help engineers ‘learn before doing’.32 It also contributes valuably to those engineering activities which require solutions deeply bound in their social context and involve strong levels of integration with the range of players that contribute to projects.
Innovative projects are often complex, multidisciplinary and collaborative. IvT can help
engineers manage projects by providing the means by which various forms of integration
can occur. So, if as Simon33 suggests, engineers solve complex problems by breaking them
down into smaller components, IvT can assist in both analysis at the decomposed level and
in the re-integration of those components. It helps ensure that there is technical compatibility
in solutions, and provides cost-effective ways of manipulating and playing with potential
solutions informed by the requirements of systemic integration and interfaces.
A key facet of IvT is the manner in which it represents complexity in a relatively simple
way. Data in a 9 × 8,500 matrix is complex and results of analysis may be difficult to discuss with non-specialist decision-makers. Representation of results as visual heat maps
helps to simplify the results of analysis and aids their communication to interested parties.
Such representation facilitated by IvT assists communications and understanding across boundaries. Success often requires work being carried out in multidisciplinary teams. Very
often this involves collaboration with experts from outside the lead-engineering organization
and these collaborators may often come from non-engineering backgrounds. IvT provides
the opportunity for engineers to operate effectively with international counterparts; across
professions and disciplines; and with politicians and interest groups. Technology therefore
not only assists the conduct of ‘normal’ engineering, optimizing technical solutions, but
also facilitates the ‘design conversation’ between the diverse contributors to the success of
these complex projects, including customers, suppliers and regulators.
There are some obvious dangers with such technologies. Professor Burland, for example,
is concerned about the dangers of ill-informed use of models, which can produce inaccurate
results and lead to catastrophic failure in civil engineering and construction projects.

It is very easy to dress this work up using inappropriate terminology to describe the
concepts and tools we are using and this could give it a respectability that it doesn’t
deserve. Modelling says what you are doing. It doesn’t hide behind jargon. We have
hugely powerful numerical techniques available to us. Any undergraduate can get hold
of these. But unless they understand that these are used as a means of modelling reality
the results can convey a sense of magic that doesn’t exist. The value depends upon the
user understanding the reliability of data put in, the assumptions and simplifications
involved in the model and the results coming out.

Furthermore, existing knowledge and skills in the research team are essential to success.
The integration of the two packages used by the team simulating the congestion charge
required software skills and constructive imagination in realizing they could be combined.
Professor Glaister additionally refers to the way he relied heavily on past experience of this
kind of economic modelling to guide him on what the essential economic features were,
and to judge what compromises and simplification had to be made in order to get the model
to work in a way that would be helpful. Recognition of the limitations of the use of IvT by
its users is essential for its successful application.
The integrating potential of IvT remains novel and its consequences for the skills of engineers are still unfolding. It is possible to envisage a de-skilling trajectory, similar
to that predicted by the application of computers to advanced manufacturing technology
(AMT).34 Whilst some firms may embark on that road, the result is likely to be similar to
those firms that unsuccessfully tried to use AMT to de-skill. Productivity depends upon the
merger of the new technological possibilities with the appreciation and understanding of the
old principles, and ‘ways of doing things’: the established rules of thumb and norms. The
most effective users of IvT may well be those who would find solutions to their problems
without IvT, albeit much more slowly and painfully. What is clear in our studies is that the
technologies of modelling and simulation are being combined with more traditional forms
of engineering design and problem solving in novel ways. This is extending the range
of contributions to engineering problem solving. Sources of ideas and of solutions may
therefore be becoming more dispersed in a shift identified by Williams35 as the ‘expansive
disintegration’ of engineering. At the same time, some techniques for problem-solving are
being routinized in computer models. These are changing the ways in which problems can
be solved, but evidence from these case studies suggests that this is not leading to a simple
process of automation and deskilling. Instead, deep craft knowledge is often required and
this expertise can provide innovative solutions when aided by the new digital tools. The
study illustrates the continuing importance of craft knowledge and therefore the limits of attempts to routinize some aspects of engineering knowledge. Segmented approaches that attempt to
automate all aspects of problem-solving and design activities will not realize the advantages
to be achieved when IvT is combined with specific engineering knowledge.
These advantages are considerable. The famously innovative architect Frank Gehry has commented on the combination of new technologies and processes that enabled him to produce his innovative, malleable, plastic form of building design:
This technology provides a way for me to get closer to the craft. In the past, there
were many layers between my rough sketch and the final building, and the feeling
of the design could get lost before it reached the craftsman. It feels like I’ve been
speaking a foreign language and now, all of a sudden, the craftsman understands me.
In this case, the computer is not dehumanising, it's an interpreter.36
Further research is needed to explore in greater detail how the use of these techniques is changing engineering practice more broadly. Decisions about timescales and the management of risk, including project management, finance and costs, are being linked within these approaches, enabling decisions that span procurement, design, engineering and production disciplines. Our examples show how IvT has the potential to provide the basis for a more open, transparent approach to engineering design and decision making. It could also deliver faster results, with greater concurrency of engineering work on different parts of a problem, allowing near real-time decision making. Its capacity to assist interdisciplinary decision making, by enabling people from different backgrounds to understand potential solutions, opens a range of possibilities for a more informed and democratic engineering praxis (supporting what von Hippel37 calls 'democratic innovation').
Perhaps the greatest benefits will come from the ways in which these techniques help a wider range of participants to engage with projects, shaping and representing potential solutions. This may produce a better understanding of the contextual factors that affect engineering design and, by engaging the full range of potential contributors, help determine the eventual success of the solution. Modelling and simulation techniques may thus open up highly complex problems with which it has traditionally been difficult for interested parties of limited expertise to engage.

Acknowledgements
We are grateful to the EPSRC's Innovative Manufacturing Research Centre, which sponsored the research upon which this paper is based, and to the Innovation Strategy and Leadership programme at the University of Queensland Business School for its financial support. A version of this paper was presented at the Industrial Dynamics, Innovation and Development DRUID Summer Conference, Elsinore, June 2004; we thank Alice Lam, Michael Dahl and Bengt-Åke Lundvall for their comments as discussants of the paper at this conference. We also wish to thank Jeff Pinto and Dennis Towill for their comments. The contributions from two anonymous referees are gratefully acknowledged.
Interviews Conducted
Ten interviews were carried out on the Millennium Bridge case, in London and Tokyo in 2002. Key participants were interviewed at Arup, the consulting engineering firm responsible for the original engineering design and subsequent rectification. Additional information was collected from project notes published in engineering journals, through attendance at a Royal Academy of Engineering seminar in London, and from analysis published on the company's website (www.arup.com).38
A single, extensive interview was used for the Leaning Tower of Pisa case. It was conducted in 2004 with Professor John Burland of Imperial College London, who led the engineering project to save the Tower from collapse, working on it from 1990 to 2002. Substantial published evidence was also gathered and examined for this famous case.
Two interviews were carried out for the road transport congestion charging case study.
These were both with the lead project manager, Stephen Glaister, Professor of Transport
and Infrastructure at Imperial College London, in 2003 and 2004. Additional material was
supplied by Professor Glaister.
The authors have had a long research relationship with Ricardo Engineering, going back to the early 1990s and involving specific research projects and the engagement of research students. A lengthy interview with Richard Johns, supplemented by additional material, provided the basis for this study.
As many of the examples are high-profile public projects, there was substantial public interest in them, reflected in articles and commentaries in the press that provided further sources of information. All are completed projects, with sufficient time elapsed to confirm the success of their outcomes.
In most cases, the interviews were taped, transcribed, written up and returned to interviewees for verification. The condensed case studies were also sent to knowledgeable members of the project teams for checking.

Notes and References


1. H. Petroski, To Engineer is Human: The Role of Failure in Successful Design (New York, St Martin's Press, 1985).
2. Failure is, of course, widespread in many innovative activities; researchers such as Cooper, for example, reveal the high level of failure amongst new products. Success and failure also occasionally take time to assess: the De Havilland Comet, for example, was an initial success and was instrumental in creating the new commercial airline industry, but the product was an eventual failure, marred by a series of crashes caused by metal fatigue. See R. Cooper, Winning at New Products (Reading, MA, Addison-Wesley, 1993).
3. A model is a simplified or idealized description of an observed behaviour, system or process; it establishes key relationships to show how something works. Simulation involves imitating a behaviour, situation or process by means of a suitable analogy, manipulating the parameters articulated in a model to ask 'what if' questions about changes to those parameters, as the sketch below illustrates.
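The distinction can be shown in a few lines of Python; this is a minimal sketch, unconnected to any package discussed in the paper. The model below is an idealized description, the textbook relationship between a pendulum's length and its period; the simulation manipulates a parameter of that model to ask 'what if' the pendulum were lengthened.

    import math

    def pendulum_period_s(length_m, g=9.81):
        """Model: the idealized relationship T = 2*pi*sqrt(L/g)."""
        return 2 * math.pi * math.sqrt(length_m / g)

    # Simulation: a 'what if' sweep over the length parameter.
    for length_m in (0.25, 0.5, 1.0, 2.0):
        print(f"L = {length_m:4.2f} m -> T = {pendulum_period_s(length_m):.2f} s")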
4. CATIA is an integrated suite of computer-aided design (CAD), computer-aided engineering (CAE) and computer-aided manufacturing (CAM) applications for digital product definition and simulation. Developed by Dassault Systemes and marketed internationally by IBM, the software is commonly used for 3D product life cycle design, production and maintenance. CATIA is a premium product; cheaper tools with less functionality are available at prices affordable to even the smallest firms, and it is possible to model and simulate on a relatively simple personal computer. These technologies continue to evolve: present 3- and 4-dimensional CAD systems are, for example, considerably more advanced in functionality than their predecessors of even a few years ago.
488 M. Dodgson et al.

5. M. Dodgson, D. M. Gann & A. Salter, Think, Play, Do: Technology, Innovation and Organization (Oxford,
Oxford University Press, 2005).
6. S. Thomke, Experimentation Matters (Boston, MA, Harvard Business School Press, 2002); I. Tuomi,
Networks of Innovation: Change and Meaning in the Age of the Internet (New York, Oxford University
Press, 2002); M. Schrage, Serious Play: How the World’s Best Companies Simulate to Innovate (Boston,
MA, Harvard Business School Press, 2000).
7. E. Steinmueller, Will new information and communication technologies improve the 'codification' of knowledge?, Industrial and Corporate Change, 9, 2000, p. 361.
8. K. Eisenhardt, Building theories from case study research, Academy of Management Review, 14, 1989,
p. 532.
9. J. B. Burland, The Leaning Tower of Pisa Revisited, Proceedings, 5th International Conference on Case
Histories in Geotechnical Engineering, New York, 2004.
10. Fluid simulation packages include WAVE, a gas dynamics simulation programme enabling performance simulations to be carried out based on intake, combustion and exhaust system design; and VECTIS, a three-dimensional computational fluid dynamics programme for solving the flow equations governing conservation of mass, momentum and energy (a simplified sketch of this kind of calculation follows this note). Ricardo has also developed a range of mechanical systems simulations, including ENGDYN, a simulation environment for analyzing the dynamics of the engine and powertrain; VALDYN, for simulating the dynamics of the valvetrain; PISDYN and RINGPAK, simulation packages for the design and analysis of piston/ringpack assemblies; and ORBIT, for detailed analysis of bearings.
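As a hint of what 'solving flow equations governing conservation of mass' means in practice, the following heavily simplified Python sketch advances one-dimensional conservation of mass at a prescribed constant velocity using a first-order upwind scheme. It is an illustration under stated assumptions, not Ricardo's code: real packages such as VECTIS solve coupled mass, momentum and energy equations on three-dimensional meshes.

    def advect_density(rho, u, dx, dt):
        """One explicit upwind step of d(rho)/dt + u*d(rho)/dx = 0, for u > 0."""
        assert 0 < u * dt / dx <= 1.0, "CFL stability condition violated"
        new = rho[:]
        for i in range(1, len(rho)):
            new[i] = rho[i] - (u * dt / dx) * (rho[i] - rho[i - 1])
        return new

    # A density pulse in a duct is transported to the right over 20 steps.
    rho = [1.0] * 5 + [2.0] * 5 + [1.0] * 40
    for _ in range(20):
        rho = advect_density(rho, u=10.0, dx=0.01, dt=0.0005)
    print(" ".join(f"{x:.2f}" for x in rho[:20]))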
11. See the case study of Ricardo’s partnership with a Chinese company, DLW, in M. Dodgson, Technological
Collaboration in Industry (London, Routledge, 1993).
12. Petroski, op. cit., Ref. 1; E. Constant, The Origins of the Turbojet Revolution (Baltimore, Johns Hopkins University Press, 1980); W. G. Vincenti, What Engineers Know and How They Know It (Baltimore and London, Johns Hopkins University Press, 1990); H. Simon, The Sciences of the Artificial (Cambridge, MA, The MIT Press, 1996); L. Perlow, The time famine: toward a sociology of work time, Administrative Science Quarterly, 44, 1999, p. 57.
13. K. Pavitt, Technologies, Products and Organization in the Innovating Firm: What Adam Smith Tells Us
and Joseph Schumpeter Doesn’t, Industrial and Corporate Change, 7, 1998, p. 433; P. Nightingale, A
cognitive model of innovation, Research Policy, 27, 1998, p. 689.
14. G. Dosi, Sources, Procedures and Microeconomic Effects of Innovation, Journal of Economic Literature,
26, 1988, p. 1120.
15. Constant, op. cit., Ref. 12.
16. D. A. Schon, The Reflective Practitioner: How professionals think in action (New York, Basic Books,
2003).
17. Vincenti, op. cit., Ref. 12.
18. Simon, op. cit., Ref. 12.
19. A. Gawer & M. A. Cusumano, Platform Leadership (Boston, MA, Harvard Business School Publishing, 2002); S. Brusoni, The limits to specialization: problem solving and coordination in modular networks, Organization Studies, 26, 2005, p. 1885.
20. S. Brusoni, A. Prencipe & K. Pavitt, Knowledge specialisation, organisational coupling, and the boundaries of the firm: why do firms know more than they make? Administrative Science Quarterly, 46, 2001, p. 597; C. Baldwin & K. Clark, Design Rules: The Power of Modularity (Cambridge, MA, MIT Press, 2000); A. Prencipe, A. Davies & M. Hobday, The Business of Systems Integration (Cambridge, Cambridge University Press, 2003).
21. R. Williams, Retooling: a historian confronts technological change (Cambridge, MA, The MIT Press,
2002).
22. T. J. Allen, Managing the Flow of Technology (Cambridge, MA, The MIT Press, 1977).
23. A. Hargadon & R. Sutton, Technology Brokering and Innovation in a Product Development Firm,
Administrative Science Quarterly, 42, 1997, p. 716.
24. One of the anonymous referees for this paper refers to the way Isambard Kingdom Brunel relied on John
Scott Russell in design conversations and how modelling has always been used in engineering. We are
grateful for this insight.
25. Petroski, op. cit., Ref. 1.
26. Thomke, op. cit., Ref. 6.
27. Ibid.; Schrage, op. cit., Ref. 6.
28. Schrage, op. cit., Ref. 6.
29. Schrage, op. cit., Ref. 6; J. Whyte, Virtual reality in the built environment (London, The Architectural
Press, 2002).
30. E. R. Tufte, The Visual Display of Quantitative Information (Cheshire, CT, Graphics Press, 1983).
31. Dodgson et al., op. cit., Ref. 5.
32. G. Pisano, The Development Factory: Unlocking the Potential of Process Innovation (Boston MA, Harvard
Business School Press, 1997).
33. Simon, op. cit., Ref. 12.
34. H. Braverman, Labour and Monopoly Capital (New York, Monthly Review Press, 1974).
35. Williams, op. cit., Ref. 21.
36. Frank Gehry in S. Pacey, RIBA Journal, November 2003, p. 78.
37. E. von Hippel, Democratizing Innovation (Cambridge, MA, The MIT Press, 2005).
38. The relationship between the researchers and Arup has been a lengthy one of over a decade, involving a
number of research projects including the study of the use of IvT in other Arup projects apart from the
Millennium Bridge.