Design Performance

Francis J. O'Donnell and Alex H.B. Duffy

With 65 Figures
Francis J. O'Donnell, PhD
Scottish Enterprise E-Business Group, 150 Broomielaw, Atlantic Quay, Glasgow, UK

Alexander H.B. Duffy, PhD, CEng
CAD Centre, Department of Design Manufacture and Engineering Management, University of Strathclyde, 75 Montrose Street, Glasgow, UK

British Library Cataloguing in Publication Data
O'Donnell, Francis J.
Design Performance
1. Design, Industrial – Management. 2. Performance technology
I. Title II. Duffy, Alex H. B. (Alex Hynd Black), 1957–
658.5752
ISBN 1-85233-889-X

Library of Congress Cataloging-in-Publication Data
O'Donnell, Francis J.
Design performance / Francis J. O'Donnell and Alex H. B. Duffy
p. cm.
ISBN 1-85233-889-X (alk. paper)
1. Design, Industrial. I. Duffy, Alex H. B. (Alex Hynd Black), 1957– II. Title
TS171.O36 2004
658.5752 dc22  2004052219

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms of licences issued by the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent to the publishers.

ISBN 1-85233-889-X
Springer Science+Business Media
springeronline.com

© Springer-Verlag London Limited 2005
Printed in the United States of America

The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant laws and regulations and therefore free for general use.

The publisher makes no representation, express or implied, with regard to the accuracy of the information contained in this book and cannot accept any legal responsibility or liability for any errors or omissions that may be made.

Typeset by Florence Production Ltd, Stoodleigh, UK
Printed on acid-free paper SPIN 10969571
Preface

The continual effort to improve performance in business processes attracts increasing attention in research and industry alike. The impact of design development performance on the overall business positions this area as an important performance improvement opportunity. However, design development is characterised by novelty, uniqueness and non-repeatability, which provides particular challenges in defining, measuring and managing its performance to achieve improvement.

This book explores the support provided by both general research in business process performance and design research for supporting performance improvement in design development. The nature of design development in industrial practice is further revealed, and requirements for its modelling and analysis to achieve improvement are highlighted.

A methodology for the modelling and analysis of performance in design development that encapsulates a formalism of performance and an approach for its analysis is established. The formalism is composed of three models, which capture the nature of design development performance and support its measurement and management. The E2 model formalises and relates the key elements of performance, i.e. efficiency and effectiveness. The Design Activity Management (DAM) model distinguishes design and design management activities in terms of the knowledge processed, while the Performance Measurement and Management (PMM) model describes how these activities relate within a process of measuring and managing performance. The PERFORM approach is defined as a means to analyse an aspect of performance, effectiveness, and the influence of resources on that aspect. A computer-based tool that supports its implementation in industry and provides the capability to identify appropriate means to improve performance is presented.

The overall methodology is illustrated using worked examples and industrial practice to reveal different elements. This highlights how the methodology addresses requirements for modelling and analysing design development performance to achieve improvement. Strengths and weaknesses of the methodology are revealed and resulting insights are discussed.

Frank O'Donnell and Alex Duffy
January 2005
Contents

1 Introduction ... 1
   1.1 Scope of the Book ... 2
   1.2 Key Areas Covered ... 3
   1.3 Book Structure ... 3

Part I  The Need for Design Performance

2 The Nature of Performance in Design ... 7
   2.1 Business Process Performance ... 7
       2.1.1 What Is Performance? ... 7
       2.1.2 Why Assess and Analyse Performance? ... 8
       2.1.3 Achieving Improved Performance ... 8
       2.1.4 Coherence ... 11
   2.2 Performance in Design Development ... 11
       2.2.1 Analysing Performance in Design ... 12
       2.2.2 Characteristics of Performance in Design ... 13
       2.2.3 Factors Influencing Design Development Performance ... 14
       2.2.4 Coherence in Design Performance ... 15
   2.3 Summary ... 17

3 Design Performance Measurement ... 19
   3.1 Overview of Performance Research ... 19
       3.1.1 Trends in Performance Research ... 20
   3.2 Defining and Modelling Performance ... 21
   3.3 Design and Design Activity Performance ... 23
   3.4 Supporting Coherence ... 26
   3.5 Identification and Quantification of Influencing Factors ... 29
       3.5.1 Empirical Research ... 30
   3.6 Summary ... 35

4 Industrial Practice ... 37
   4.1 Current Practice ... 37
       4.1.1 Business Level Approaches ... 38
       4.1.2 Design Development Analysis ... 39
   4.2 Industrial Insight ... 41
       4.2.1 Background ... 41
       4.2.2 Overview of Approach ... 42
       4.2.3 The Process ... 42
       4.2.4 Trial Results ... 48
       4.2.5 Further Analysis ... 49
   4.3 Summary ... 50

Part II  A Methodology for Enhanced Design Performance

5 A Formalism for Design Performance Measurement and Management ... 55
   5.1 Activity Model ... 55
       5.1.1 A Knowledge-Based Model of Design ... 56
   5.2 Activity Management ... 61
       5.2.1 Design and Management ... 61
       5.2.2 A Model of Design Activity Management ... 63
   5.3 Managed Activity Relationships ... 65
       5.3.1 Decomposition Relationships ... 65
       5.3.2 Temporal Relationships ... 66
   5.4 A Design Performance Model – E2 ... 69
       5.4.1 Efficiency ... 70
       5.4.2 Effectiveness ... 73
       5.4.3 Relating Efficiency and Effectiveness ... 77
       5.4.4 Performance ... 78
   5.5 A Model of Performance Measurement and Management (PMM) ... 79
       5.5.1 Measuring and Managing Effectiveness ... 80
       5.5.2 Measuring Efficiency ... 82
       5.5.3 Achieving Coherence ... 83
   5.6 Summary and Review ... 83

6 The Impact of Resources on Effectiveness ... 89
   6.1 Influences on Effectiveness ... 90
       6.1.1 The Influence of Strategy ... 90
       6.1.2 Influence of Goals ... 91
       6.1.3 Influence of Resources ... 92
   6.2 Resource Knowledge (R) ... 93
   6.3 Relating Resource Use and Effectiveness ... 94
       6.3.1 Resource Impact ... 95
       6.3.2 The Resource Impact Model ... 96
       6.3.3 Nature of Impact Profile ... 96
       6.3.4 Assessing Relative Impact ... 99
       6.3.5 Resource Effectiveness ... 100
   6.4 Principles of Analysing Impact of Resources ... 102
   6.5 Summary ... 102

7 The PERFORM Approach ... 105
   7.1 Overview ... 105
   7.2 Specification ... 106
       7.2.1 Scope ... 108
1 Introduction
must be provided using the correct metrics in the correct manner in order to fully evaluate and/or validate the impact of research [6]. The ultimate success of these research efforts may then become apparent through systematic measurement of performance, e.g. on a before-and-after basis or as part of an internal/external benchmarking initiative. Yet there are many uncertainties surrounding the practice of performance measurement in this area.

Considerable work has already been carried out in measuring the performance of the manufacturing process [7], but there are inherent differences in the nature of design/development compared to manufacturing [8] which make performance measurement more difficult [9]. The measurement of knowledge-based activities in design, where outputs are less tangible than in processes such as manufacturing, poses particular problems. The impact of activities within the conceptual design stage may not be evident for a number of years, i.e. when the success of the product in the market has been established.
1.1 Scope of the Book

This book attempts to establish an underlying theory or formalism of performance as applied to the design process, i.e. from the identification of a need through to the description of a solution to meet that need. Activities and their inter-relationships are considered to be the key elements that make up the process of design. This process is presented in a variety of models adopting different viewpoints, for example models of design as a problem-solving process as presented in [10], phase models such as those presented in [11, 12], and knowledge-based models such as that defined in [13]. The knowledge-based viewpoint of design is adopted in this work as a basis for describing the process. The work does not focus on additional activities required for product development as a whole, such as marketing, manufacturing, etc., as described in [14–16].

Although existing process models of design help to characterise the focus of the book, they do not comprehensively describe it. These models focus primarily on the activities required to create a design solution without an analysis of the activities involved in managing the process by which that solution is developed, i.e. design management activities. To comprehensively analyse performance it is necessary to consider performance in relation to both the design (artefact) and the design process. The term design development is used throughout this work to indicate that both the design and its development are being considered.

The presented methodology is aimed at providing a new and clear understanding of performance in design development and utilising this understanding to determine a means by which performance may be analysed and improved. Although the understanding developed here provides a basis for determining metrics with which to measure performance, the work does not prescribe metrics for use in design development, as these are considered to be specific to each case.

Many factors may be considered when analysing performance within a process such as design development. This work does not involve the analysis of aspects such as the motivation of designers, loyalty, power relationships, individual reward schemes, etc. which influence performance. That is, the
4
1
2
3
4
5
6
7
8
9
1011
1 PART I
2
3
4
5
6
7
8
9
The Need for
2011
1
2
Design
3
4
5
6
Performance
7
8
9
3011
1
2
3
4
5
6
7
8
9
4011
1
2
3
4
5
6
7
8
9
50
51111
2 The Nature of Performance in Design
value, reflecting the result of performance across all business processes. The Production Manager may describe performance of the production process(es) in terms of the number of products produced over a given time. A Product Development Manager may discuss the performance of the product development process in terms of the percentage of turnover due to new products. These descriptions of performance provide different but related viewpoints of the phenomenon. That is, share value, number of products produced, etc. are used as indicators that reflect aspects of the phenomenon but do not describe its nature. A formal definition of performance is required to support common understanding as a basis for carrying out analysis in the area.
2.1.2 Why Assess and Analyse Performance?

Maintaining or improving performance is an aim of most businesses, and understanding the performance of the business processes plays a central part in the implementation of performance improvements, i.e. assessing and analysing the performance is a fundamental part of managing a business and is often difficult to distinguish from the general management task [18–22]. For example, Sinclair et al. [22], in their study of 115 companies, found that in most cases there was no separate performance management system and measurement was integrated within the overall management process.

Although the analysis of performance is seen as an integral part of managing a business, a number of authors present specific reasons or benefits for carrying out performance analysis [19, 23, 24]. These may be summarised as follows:

• To allow comparison against some standard, for example the comparison of performance of different organisations in the practice of benchmarking [25]. This allows a diagnosis of whether performance is increasing or decreasing.
• To support planning and control. The analysis of performance against a plan supports the decisions necessary to make more effective future plans and exercise control in order to increase the chances of achieving an existing plan.
• To provide insight into particular business processes, e.g. through identification of those factors influencing performance and the nature of their influence, and to support organisational learning.
• To motivate individuals and groups within an organisation to improve their performance.

The benefits of analysing performance outlined here may be related within an overall aim, i.e. the improvement of performance. That is, all of the benefits may be seen as elements that are necessary to achieve performance improvement1.
2.1.3 Achieving Improved Performance

Many of the functions of a performance measurement system may be categorised within the area of decision support, i.e. the results of performance analysis may be used to assist organisational actors in making decisions.
2.1.4 Coherence

Within an organisation the measurement of performance may take place in a dispersed and often unrelated manner, i.e. the performance improvement process illustrated in Figure 2.1 may be applied to different business processes in isolation. This can result in sub-optimisation, where a particular business process may be performing very well but its performance may have a negative influence on the performance of other processes and ultimately, the overall performance of the organisation.

Achieving coherence, i.e. aligning performance throughout the organisation, has been identified as a key requirement in the design of performance analysis approaches [7, 26, 33, 34]. de Haas [33] identifies the requirement for coherence as an attribute of a performance measurement (PM) system which causes performance (i.e. achieved scores on performance indicators) by the group acting upon that system to contribute to the performance of other interdependent groups and, thereby, to contribute to the performance of the organisational entity as a whole. Although the work reported by de Haas is focused on the contribution of individuals or groups within an organisation, the definition of coherence may be equally applied to business processes. That is, coherence causes the performance of a process to contribute to the performance of other interdependent processes and to the business as a whole.

These interdependent processes may be inter-related in a variety of complex ways, e.g. a process/sub-process relationship may exist or the processes may exist at the same hierarchical level. For example, design may be viewed as a sub-process of product development. The performance of the design process must be aligned with the overall performance of the product development process to ensure that design will contribute positively to the performance of the product development process. Similarly the design and manufacturing processes should be aligned so that high performance in one does not adversely affect the performance of the other and subsequently, overall product development performance. Achieving coherence is necessary to ensure performance improvements within any area of the business deliver an overall benefit to the business.
2.2 Performance in Design Development

The above discussion presents an overview of performance in relation to general business processes and illustrates some of the more generic elements. However, we focus here on performance in design, i.e. a specific business process. Design may be seen as a process of goal-directed reasoning where there are many possible (good) solutions and, although the process can be supported methodologically, it cannot be logically guaranteed [15]. The process is inherently uncertain in progressing from an initial need to arriving at a solution that meets that need. The literature reports various models of the design process that may be categorised as either descriptive or prescriptive in their approach [35]. Descriptive models, such as those found in [13, 36], describe how the design process is carried out, often created through the use of protocol studies [37, 38], while prescriptive models prescribe how the design process should be carried out, e.g. those presented in [12, 39, 40].
The basic design cycle proposed by Roozenburg [15] may be seen to encompass key activities of design, i.e. analysis, synthesis and evaluation, and presents a basis for describing performance analysis. The cycle begins with identification of a difference between the current state and some desired future state (e.g. the existence of a new artefact). The initial activity of analysis is aimed at clarifying the desired future state and defining it in terms of a statement of the problem including goals (i.e. function) to be achieved. Although the information available at the initial stages of the design cycle is often incomplete, inconsistent and ambiguous [13] and analysis may only produce initial goals, criteria (i.e. measures) should be established at this point, which form part of the value system referred to during evaluation. The synthesis activity draws on the creativity of the designer to produce a solution proposal(s) intended to meet the goals established in analysis. The evaluation activity involves the identification of how well the solution meets the goals and ultimately whether a satisfactory value has been achieved with a particular solution(s).

A review of design process models is presented in [41] that discloses characteristics of the design process. The intangible nature of design is revealed, which indicates the additional challenges in terms of defining, analysing and improving performance. The following sections outline specific issues in relation to performance in this area.

2.2.1 Analysing Performance in Design

The difficulties associated with defining performance in general are equally apparent in the area of design performance, and the existing design process models provide little insight into the phenomenon. When we refer to performance in design it can be related to different areas, e.g. the performance of the design solution (artefact) in terms of its properties, such as the top speed of a car, or the performance of the process by which the solution was created, in terms of duration or cost. These areas of performance are related in some way but the relationship is unclear.
2.2.1.1 Design (Artefact) Performance

The evaluation activity within the basic design cycle represents an area of performance measurement in design, i.e. the measurement of the solution (artefact, product, etc.) and assigning values to it, in order to guide decision making. That is, an aspect of the performance of the solution, how well it meets initial goals, is measured in order to establish how the design cycle should proceed. This area of performance measurement forms a critical part in the basic design cycle and is inherent in the design process.
2.2.1.2 Design Activity Performance

The evaluation of solution proposals (i.e. designs) within the basic design cycle does not provide a complete view of performance in design. In order to create and evaluate such solutions it is necessary to use resources such as designers, tools, etc. over a period of time, resulting in costs being incurred. In addition to the output discussed above, performance in design also encompasses the process by which that output was created. For example, the basic design cycle may be repeated many times within a phase model of the design process, e.g. that presented in [12, 40, 42], to arrive at a satisfactory solution. The overall performance in design will be determined both by the resulting design (artefact) and how well the activities required to produce that artefact are carried out (Figure 2.2). These two areas of performance are often referred to as product and process performance within the literature4.

Figure 2.2 Performance relationships in design (performance in design encompassing both design (artefact) and activity performance).

2.2.1.3 Relating the Key Elements of Performance

As the artefact is created using design activities there is a causal link between the performance of the artefact and the activity, e.g. the types of resources used in the activity will influence the performance of the artefact. Also, the complexity of the artefact will influence the activity performance, e.g. a more complex artefact may mean that the activity duration is longer [43]. As discussed above, both artefact and activity performance are components of overall design performance. These relationships are illustrated within a simplistic model in Figure 2.2. This model fails to fully define the relationships between the different performance elements. The nature of these relationships is complex and existing design activity/process models do not address this adequately.
2.2.2 Characteristics of Performance in Design

In comparison to areas such as manufacturing, the analysis of performance in design is more complex. For example, performance in manufacturing is concerned with the measurement of a more tangible output, i.e. physical goods, and the activities are more easily defined and understood. Cusumano [9] suggests that compared to manufacturing, product development presents unique difficulties for researchers, and that activities required to make new products require considerable conceptualisation, experimentation, problem solving, communication and coordination, rather than simply assembling components or doing other relatively routine tasks. Brookes [44] also highlights the differences between performance measurement in manufacturing and design (Table 2.1).
provide concrete evidence in support of such claims [45, 46]. However, there have been numerous attempts to determine the relationship between the approaches, methods and tools employed in the design process and the level of success, e.g. [47–49]; the results are often empirically observed correlations between factors and performance with limited theoretical understanding [50]. Through establishing the link between particular approaches, methods and/or tools, the aspects of performance they influence and the nature of that influence, more informed decision making regarding the selection and use of such approaches, methods and/or tools can take place.
2.2.4 Coherence in Design Performance

The area of coherence in performance was discussed in a general manner in Section 2.1.4. The issues raised regarding coherence apply equally to performance in design. The following section describes particular aspects of coherence as applied to design performance.

2.2.4.1 Scope of Analysis

Design generally takes place within a business context and therefore performance may be analysed at many different levels, from the market impact of the overall product development program to the performance of individual design projects and, ultimately, specific design activities. In addition, performance analysis can be applied over a particular range of activities, e.g. single/multiple phases in a project or a series of individual activities within a phase. Therefore the subject of analysis can be defined in relation to a particular range and level (Figure 2.3), which describes the scope of the investigation. It is necessary to define this scope clearly to ensure accurate interpretation of any results and to understand the relationship between these results and those obtained in other areas of the organisation.
Figure 2.3 Scope of performance analysis. The scope is defined by a level (product development program, project, phase or activity) and the range of activities covered at that level.
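The idea of a scope defined by level and range lends itself to a simple data model. The sketch below is illustrative only and not part of the book's formalism; the class, field and level names are assumptions chosen to mirror Figure 2.3.

```python
from dataclasses import dataclass

# Hypothetical encoding of the scope of a performance analysis: a level
# in the decomposition hierarchy of Figure 2.3 plus the range of units
# covered at that level.
LEVELS = ["program", "project", "phase", "activity"]

@dataclass
class AnalysisScope:
    level: str   # one of LEVELS
    units: list  # the range, e.g. ["Phase i", "Phase ii"]

    def is_finer_than(self, other: "AnalysisScope") -> bool:
        # Activity-level analysis is finer grained than phase-, project-
        # or program-level analysis.
        return LEVELS.index(self.level) > LEVELS.index(other.level)

# Example: an analysis covering two phases of Project A is finer grained
# than one covering the whole product development program.
phases = AnalysisScope("phase", ["Phase i", "Phase ii"])
program = AnalysisScope("program", ["Product Development Program"])
assert phases.is_finer_than(program)
```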
to identify the relationship between such methods, tools, etc. and improved performance when implemented in industry is currently lacking. There is a need to establish a means to identify this relationship and support decision making on the best way to improve performance.

The discussion in this chapter on the nature of performance in design allows some further points to be elaborated to reflect the scope of this book:

• Much of the research in this area focuses on the assessment and analysis of the performance of particular business processes. This book is focused primarily on the analysis of performance within design development, although many of the concepts discussed may be equally applicable to other processes.
• Activities are seen here as the basic elements of a process, i.e. business processes are composed of a number of activities. Therefore the analysis of performance at the level of individual activities is necessary to provide a fundamental understanding of performance in this area.
• The area of assessing performance is dominated by the definition of suitable metrics. We do not aim here at specifying metrics per se, but rather at providing a formalism that supports their definition in any specific situation.
• The empirical studies of performance form a substantial part of the literature in this area, i.e. studies aimed at observing design and analysing past performance in order to draw conclusions from trends, define descriptive models, etc. The developed formalism is based on a fundamental understanding of performance in design and, although developed with reference to design practice, is prescriptive in nature.
Notes

1 Performance improvement may involve the maintenance of existing levels of performance under increased constraints.
2 Assessment is used here where many authors use measurement. Measurement, in the authors' view, implies more accuracy and certainty than is often possible in this area, while assessment supports the practice of estimation.
3 Although the terms metric and measure may have different meanings within the literature, they are used synonymously within this book to mean either an area/type of measurement, such as time, or a specific measure, such as project lead time.
4 The term artefact is considered to be more generic than product and may be used in relation to different design disciplines such as industrial, engineering, architectural, etc. The term activity is used as opposed to process as processes are composed of activities and an activity may be considered to be a fundamental element of a process.
3 Design Performance Measurement
Figure 3.1 Areas and types of performance related research. The figure maps business processes (product development, design, manufacturing, etc.) against types of research: theoretical analysis, performance measurement approaches, empirical studies and research reviews.
Many of the articles and books reviewed in this chapter fall primarily within the general area of business process performance measurement and management. Some of the work is generic in terms of being applicable across all business processes while other work is aimed at more specific processes such as product development, design, manufacturing, etc. (Figure 3.1). Within such areas the type of research and focus may vary widely and include empirical studies aimed at determining relationships between performance in different processes, the design and implementation of approaches for measuring performance, the development of theoretical models of activities that relate their performance, etc. Design research, in particular the area of design activity/process modelling, is briefly reviewed in order to analyse how it meets the requirement to distinguish and relate the performance of the design and of the activities involved in creating it.
3.1.1 Trends in Performance Research

There has been considerable research published in the area of performance, e.g. Neely [18] identified that between 1994 and 1996 some 3,615 articles on performance measurement were published. He refers to the growth in membership of accountancy institutes and conferences on performance as indicators of the increased interest in this area. However, in comparison to areas such as manufacturing, measuring performance in product design is relatively undeveloped [44]. For example, at the PM2000 conference in the UK [53], no papers from a list of over 90 focused specifically on the analysis of design development performance. Many authors have recognised the particular difficulties in measuring performance in design/development activities [44, 54, 55] and in identifying the causal relations between design performance and business success [9, 56, 57]. These difficulties arise from the less tangible nature of outputs from design activities, i.e. knowledge, the often long duration and wide range of influences from design to market launch, the difficulty in defining/measuring design quality, etc.

The decline of management accounting as the only way to measure business performance [21, 29, 58, 59] is an indication of the move toward measuring less tangible aspects of performance, e.g. those related to knowledge-intensive activities in design. Johnson and Kaplan [59] suggest that traditional accounting methods are unsuited to organisations where the product life cycle is short and research and development assume increased importance.
Table 3.1. Performance and performance-related definitions1

Author/Source | Element Defined | Definition | Context
Cordero [65] | Performance | Effectiveness (i.e. measuring outputs to determine if they help accomplish objectives); efficiency (i.e. measuring resources to determine whether minimum amounts are used in the production of these outputs) | Research and Development / Organisation
Dwight [66] | Performance | The level to which a goal is attained | General
Neely et al. [67] | Performance | Efficiency and effectiveness of purposeful action | Business
Rolstadas [68] | Performance | A complex interrelationship between seven performance criteria: effectiveness, efficiency, quality, productivity, quality of work life, innovation, and profitability/budgetability | Organisational System
Clark & Fujimoto [69] | Dimensions of Performance | Total product quality, lead time and productivity (level of resources used) | Product Development
Doz [70] | Dimensions of Performance | Focus in development, speed of development and R&D efficiency | Product Development
Emmanuelides [71] | Dimensions of Performance | Development time, development productivity (use of resources) and total design quality | Product Development (project)
Moseng & Bredrup [72] | Dimensions of Performance | Efficiency, effectiveness and adaptability | Manufacturing
Neely et al. [7] | Dimensions of Performance | Time, cost, quality and flexibility | Manufacturing
van Drongelen [73] | Performance Measurement | The acquisition and analysis of information about the actual attainment of company objectives and plans, and about factors that may influence this attainment | General
Sinclair & Zairi [22] | Performance Measurement | The process of determining how successful organisations or individuals have been in attaining their objectives | Organisations / Individuals
Andreasen & Hein [14] | Efficiency | Ratio of increase in (clarification + risk reduction + detail + documentation) to (increase in costs) | Product Development
Griffin & Page [82] | Productivity | A measure of how well resources are combined and used to accomplish specific, desirable results | General
Duffy [80] | Design Productivity | Efficiency and effectiveness | Engineering Design
Goldschmidt [83] | Design Productivity | Efficiency and effectiveness | Engineering Design
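Some entries in Table 3.1 are themselves compact formulas. The Andreasen and Hein [14] definition of efficiency, for instance, can be written out as a ratio; the delta notation below is our shorthand for "increase in" and is not taken from the original source:

\[
\text{Efficiency} = \frac{\Delta(\text{clarification} + \text{risk reduction} + \text{detail} + \text{documentation})}{\Delta\,\text{costs}}
\]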
[Figure: phase model of the design process, proceeding from the task (set by market, company and economy) through planning and clarifying the task, conceptual design, and embodiment design (preliminary layout, defining the construction structure, evaluating weak spots).]
author provides a viewpoint, identifying the need for managing such a trade-off, and does not relate this within an activity/process model to further illustrate how it might be achieved. Similarly, Hales identifies the need to address aspects of the design and the process in managing the design project but does not illustrate the trade-off or how conflicts might be managed.

From the review of design activity and process models presented here, it is clear that they provide significant insight into the activities, stages, etc. in design. There is a reasonable consensus on the types of activities involved in design, their sequence, etc., and the evaluation of the output in relation to the design goals is a key component of the models discussed. The analysis of performance in relation to the activities carried out in design is restricted to literature addressing the management of design at the project level, e.g. [51]. However, management activities are carried out at every level in design, from corporate boards to individual activities, and therefore there is a requirement to analyse performance in relation to activities at all levels. This requirement is not supported in other work.
8
3.4 Supporting Coherence 91
2011
The variation in scope of analysis when carrying out performance measure- 1
ment highlights the need for coherence (alignment within and between goals), 2
as described in Chapter 3. The need for coherence has been identied widely 3
in research [7, 19, 29, 33, 34, 8890]. 4
The work of de Haas [33] is focused on behaviour of individuals within an 5
organisation and does not address coherence in relation to the technical 6
system as considered within this book. LockamyIII [89] investigates the align- 7
ment of goals within a manufacturing/ production environment to the organ- 8
isational goals. The work is based on the Theory of Constraints, which focuses 9
on throughput, inventory and operating expense within a manufacturing 3011
facility. The application is therefore in a more repetitive and predictable 1
environment than design development. Neely [7] provides a review of litera- 2
ture in the area and while identifying the need for coherence the paper is not 3
aimed at proposing solutions. The work described in [90] is based on the 4
Performance Measurement Questionnaire (PMQ) of Dixon [29], which is 5
reviewed in further detail below. Tsang [34] proposes the adoption of the 6
Balanced Scorecard to support the holistic measurement of maintenance per- 7
formance. The scorecard is also reviewed in more detail in the following 8
section, which reviews more generic approaches in terms of their support for 9
coherence in design development performance. 4011
The Balanced Scorecard [20] is becoming one of the most widely used per- 1
formance measurement and management frameworks in industry. The score- 2
card is aimed at achieving strategic alignment between the strategy of an 3
organisation and the behaviour of individuals at all levels working to achieve 4
the strategic goals. According to the authors, the balanced scorecard 5
6
translates vision into a clear set of objectives which are then further translated into a 7
system of performance measurements that effectively communicate a powerful, 8
forward-looking, strategic focus to the entire organisation.
9
The scorecard provides a framework incorporating four views of the organi- 50
sation where objectives and measures may be dened, i.e. 51111
quality and suggest that these would be driven by the qualifications of the designers. These output measures are highly significant in many industries; indeed the authors conclude that development productivity is a very important driver of business success.

• The authors define 28 process management variables, but within this set they confuse actual performance variables, e.g. meeting schedules and meeting budgets, with variables which influence performance, such as team size, concurrence, and use of value engineering. That is, the variable meeting schedules provides an indication of how well the project met the time goals which were specified, while team size provides an indication of the resources used but not the results achieved.
• The work does not provide enough detailed analysis at the level of product development activities to support performance improvement in this area. In particular the variables used do not include any analysis of the impact of particular tools or techniques, the focus of much interest in design research [3, 94].
• The characterisation of industries, in an attempt to develop industry-specific results, presents the results on process performance in two broad categories, i.e. the Measurement/Large Systems Industry and the Computer Industry. Within these categories many of the industries may be pursuing very different product development strategies/goals and therefore the causal relationships within the model in Figure 3.3 may vary significantly [95]. This work does not take such individual strategies into account within the results.
The work described in [9], [75] and [96] focuses on specific influencing factors, e.g. design tools, and evaluates their impact on performance. Griffin [75] looks at the impact of engineering design tools on efficiency and effectiveness using data from the Product Development and Management Association's (PDMA) 1995 Best Practices in Product Development survey. The results of the study are based on the data from 383 respondents to a postal survey using a detailed questionnaire. The practices surveyed included Computer-Aided Design (CAD), Computer-Aided Engineering (CAE), Design for Manufacture (DFM) and Concurrent Engineering (CE). However, these terms are not defined within the report, although it is suggested that CE is a tool/method for speeding the design process while DFM is a tool for reducing design defects, and it is not clear if detailed definitions were provided to respondents. Within this book CE is seen as an overall approach to carrying out activities concurrently. This may be facilitated by the use of methods such as Design for Manufacture, which in turn may be assisted using particular CAD tools. Therefore CE Tools could be seen to include some CAD applications, DFM, etc. The lack of clarity on what each practice means within this report limits the value of the results in terms of the conclusions that are drawn, particularly when presented as a guide to implementing different practices to support particular PD strategies.

Cooper [52] reports on an extensive survey of the factors driving the success of new products. This work identifies the product itself and its attributes as the most important factor and lists others including the use of cross-functional teams, thoroughness of activities in the early stages, etc. The work does not provide detail on the use of particular tools or methods to support the activities in product development and the results may be used as a high-level guide only.
Some studies in this area are more specifically focused on one aspect of performance, e.g. the work of Cooper [97] and Doz [70] investigates the key drivers of speed and timeliness in product development projects. Also, Zirger et al. [98] look at the impact of various acceleration techniques on development time. Griffin [75] investigates the influence of specific design tools on performance. In some cases the performance of product development has itself been investigated as an influencing factor on overall business performance, e.g. [57]. None of the studies offer conclusive evidence of a generic nature, and the lack of a framework, e.g. a generic performance definition and/or model, in which this work may be related limits the potential relevance to the research community. In addition, the context in which the influencing factor is analysed is often not clearly defined, which limits the conclusions that may be drawn. For example, the introduction of CAD may have a significant impact on performance where a substantial amount of routine design is carried out, but the influence on a design process focusing on innovation will be different.

A general critique of this type of research is provided within the meta-analysis reported in [64], which highlights many of the key weaknesses in empirical studies. The authors provide reference to the growing number of these studies and suggest that in spite of the use of increasingly sophisticated statistical techniques much of the work identifies rather than explains influencing factors. That is, the theoretical understanding of relationships is limited and the work is ". . . disjointed and lacking with respect to concise conclusions on which factors should command the most attention."

This review highlights the lack of consistency in this type of work in terms of the elements studied, leading to a lack of convergence in terms of the results. In addition, the validity of particular studies of success and failure may be questionable due to the natural tendency not to report failure and the biases that are inherent in much of this type of research [50].
3.5.1.1 Other Approaches

Hales [51] suggests that defining, understanding and working with the factors influencing the course of the (design) project are critical elements in managing engineering design. He identifies influences on design at different levels, such as project and individual, and provides checklists for analysing the nature of these influences and identifying actions required to improve the success of the project. The analysis attempts to establish the influences that different sets of factors have on the overall project, e.g. corporate factors, project profile, personnel factors, etc. Although the checklists encourage a systematic approach to controlling the design project, it is assumed that an individual, i.e. the design manager, can establish the link between a particular factor and the overall goal of the project. Such analysis would benefit from decomposition of the overall goal into particular goals and priorities, and analysis of the influencing factor in relation to each of these goals. That is, the complexity of the problem could be decreased through the use of a decomposition approach, contributing to greater reliability of the results. In addition, the analysis of influences at different levels by different individuals may result in a series of unrelated checklists and therefore a lack of alignment among the influencing factors in relation to the overall goal.
3.6 Summary

This chapter began by specifying the requirements for a methodology supporting performance modelling and analysis to achieve performance improvement. The overview covers a broad range of contributions from a wide range of areas and types. It has focused on key contributions in relation to each of the requirements, with a brief review of related work.

Some of the key issues that must be addressed for design performance are:

• The comparative difficulty in analysing performance in design development, i.e. its uniqueness in comparison to activities such as manufacturing, and the lack of suitability of financial analysis in describing performance in this area.
• The lack of research in design development performance within the wider area of performance measurement.
• The confusion and the need for a common language within this area to support common understanding.
• The need to ensure alignment/coherence of performance throughout all parts of an organisation.
• The incomplete view of performance implicitly assumed within much design process research.
• The need to analyse the factors influencing performance in order to understand how performance may be impacted and/or predicted in the presence/absence of these factors.

The chapter has presented an overview of existing work and identified particular weaknesses in relation to the different areas addressed. The following points summarise the findings:

• The research contains a very wide range of terms that reflect different aspects of performance and its related elements. There exists no common framework in which to relate these terms and understand how they relate to the phenomenon of performance. This contributes to difficulties in interpreting the results of research in this area and hinders progress in the research.
• Design research has substantially improved our understanding of the design process, and general agreement exists among many of the process models presented in research. Design and its management are both addressed in the research, but these areas are not jointly addressed in the process models and their relation defined. This limits our understanding of performance in design development.
• Coherence in performance is accepted as a key requirement within the broad literature on performance. A number of approaches are reviewed here that offer solutions but are not considered directly applicable in design development. The approaches provide more information on what to do than how it might be done.
• The identification and assessment of factors influencing performance is widely reported as empirical research. The limitations of this type of research in a non-uniform/repeatable environment such as design development have been described. Further approaches have been reviewed and their weaknesses discussed.
Based on the work reviewed in this chapter the following are key issues that a methodology for modelling and analysing performance in design and development should address:

• It should present a comprehensive formalism of design development performance that defines and relates the key elements, e.g. efficiency and effectiveness, and clarifies related elements such as influences on performance.
• The formalism should allow design and its management to be distinguished at the level of activities in design development, i.e. it should provide a more comprehensive view of performance than current models of the design process. Further, the relationship between these should be determined within a process model of design and its management.
• The modelling and analysis of performance should address the issue of coherence and avoid sub-optimisation or isolated assessment of performance within a particular area of design development.
• The analysis of performance should allow the identification of influencing factors and the nature of that influence in a manner that supports the implementation of future performance improvements.

Part 2 of this book specifies key components to support the requirements specified here and presents the elements within an overall methodology for performance modelling and analysis in design development.

Notes

1 The table is not intended to represent an exhaustive coverage of all terms and definitions but presents a sample to indicate the range of interpretations.
2 It should be noted here that defining the dimensions of performance does not constitute a generic definition of performance as the appropriate dimensions may vary while the definition should remain consistent.
3 Product Development Management Association
3 Product Development Management Association
1
2
3
4
5
6
7
8
9
4011
1
2
3
4
5
6
7
8
9
50
51111
36
4 Industrial Practice
There are implicit assumptions within the approach, relating to best practice, which are not widely agreed in research. For example, in the description provided above there is an assumption that carrying out activities in parallel is superior to a sequential process. A number of authors have raised issues around the adoption of Concurrent Engineering approaches [46, 109], illustrating that such a general assumption is inappropriate. Indeed it is suggested that activities should not simply be carried out in parallel but should be co-ordinated to give the right results, in the right place, at the right time, for the right reasons [110].
A concept for analysing the worth of a research concept, Design Coordination (DC), as an approach to Product Development has been developed and presented in DMMI '95. This concept was developed through a collaboration of industrial and academic partners and was based on the view that approaches such as Design Coordination should not be adopted by industry until an appreciation of its potential to support an organisation is obtained. Design Coordination is defined as [110]:

a high level concept of the planning, scheduling, representation, decision making and control of product development with respect to time, tasks, resource utilisation and design aspects.

The concept is proposed as a basis to support a number of aspects of product development, i.e.:

• Concurrent Engineering
• Decision Support
• Design Management
• Product Management
• Team Engineering

Within each of these aspects there are considered to be a number of DC means (methodologies/tools), as presented in Table 4.1.
The analysis approach presented by Duffy in [111] is based on determining how well the DC means support the enterprise. Recognising that the support provided was likely to vary across different organisations, an initial step of defining and prioritising the enterprise requirements was carried out. The various DC means, such as Design for X, Process Modelling and Variant Management, are subsequently analysed in terms of how much impact they have on meeting the requirements of the enterprise. This ensures that the results of such an analysis are more directly relevant to the organisation and are not based on generic assumptions on the value of different approaches. A matrix is proposed to support and provide structure for this analysis, based on the Quality Function Deployment (QFD) approach [112].
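The mechanics of such a matrix can be illustrated with a short computational sketch. This is not the tool used in the trial; the requirement names, weights and ratings below are invented values in the spirit of Figure 4.2, and only the weighting of prioritised enterprise requirements and the 1/3/9 rating convention are taken from the approach itself.

```python
# Illustrative QFD-style weighted-impact calculation. Each DC mean is
# rated against each enterprise requirement on a 1/3/9 scale (1 = weak,
# 3 = medium, 9 = strong impact); ratings are weighted by the priority
# the organisation assigned to each requirement.
weights = {"Achieve Spec": 3, "Coherence": 5, "Profit": 1}  # assumed priorities

impacts = {
    # DC mean -> {enterprise requirement: impact rating}
    "Effective communication": {"Achieve Spec": 9, "Coherence": 9, "Profit": 3},
    "Variant management":      {"Achieve Spec": 1, "Coherence": 9, "Profit": 3},
    "DFX":                     {"Achieve Spec": 9, "Profit": 9},
}

def contribution_scores(weights, impacts):
    """Weighted sum of impact ratings per DC mean, normalised by the
    total requirement weight so that scores remain comparable."""
    total = sum(weights.values())
    return {
        mean: sum(weights[req] * rating for req, rating in ratings.items()) / total
        for mean, ratings in impacts.items()
    }

for mean, score in sorted(contribution_scores(weights, impacts).items(),
                          key=lambda item: -item[1]):
    print(f"{mean:25s} {score:5.2f}")
```

Means that score highly against the prioritised requirements become the natural candidates for further exploitation, which is how the matrix results are read in the discussion that follows.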
The approach provides a framework to obtain greater insight into the impact of DC means on performance against particular goals. From such an analysis those means with greater impact on goals could be further exploited to increase performance. That is, the concept seemed to provide a basis for achieving performance improvement. Further, the expertise of one of the co-creators of the original approach was readily available to support implementation of the approach within an industrial trial. Such a trial was considered appropriate in establishing insight into the requirements for performance analysis in industry. This insight is captured here.
[Figure 4.2 Company A data input (ideal): a QFD-style matrix relating the PD Aspects and DC Means (grouped under Concurrent Engineering, Decision Support, Design Management, Product Management and Team Engineering) to the prioritised Enterprise Requirements (Achieve Spec, Integration, Reputation, Coherence, Meet programme, Customer, Rework, Profit, Ease and Risk), with the ideal impact ratings entered by the analysis team.]
[Figure 4.3 Results from the Company A trial: graphical outputs plotting the contribution of individual DC means against their ease of implementation.]
The results were presented to the analysis team a short time (approximately 30 minutes) after they had completed their data input in the matrices, to ensure continued engagement in the process. The results support discussion around areas where the organisation may focus efforts in improving performance. The automation of much of the analysis via computer support allows what-if scenarios to be developed during this discussion session. This served to confirm the results with the participants, i.e. many did not fully believe the results and were allowed to alter particular values to simulate the effect on the overall results. The presentation of the results on the same day contrasted with the previous experiences of the company, where results would be obtained a number of weeks later, when many had forgotten the rationale behind them.
4.2.3.7 Agreed Target Areas

In the process of agreeing the target areas for performance improvement, the analysis team reviewed the overall results as presented in Table 4.4 and the graphical outputs. In general, the team tended towards selecting those DC Means …
Table 4.4 Overall Company A results (ordered by ideal impact). Values are percentages of the total (ideal) contribution.

No.  Ease  DC Means                                        Current  Ideal  Difference
1    L     Multi-criteria management                       1.31     5.19   3.87
2    M     Inter/Intra integration                         0.97     4.86   3.99
3    L     Effective communication                         0.61     4.86   4.35
4    L     Distributed design authority                    0.48     4.64   4.17
5    M     Authority, responsibility and control           0.68     4.49   3.81
6    M     Conflict resolution                             0.97     4.37   3.40
7    M     Negotiation support/management                  0.43     4.15   3.72
8    M     XTD                                             0.63     4.03   3.40
9    L     Information integration                         0.72     3.90   3.17
10   H     Design re-use and standardisation               0.57     3.76   3.19
11   L     Integrity                                       3.32     3.74   3.42
12   L     Integration and coherence                       0.97     3.69   2.72
13   L     Probability and risk assessment                 0.52     3.60   3.08
14   M     Process modelling                               0.25     3.13   2.88
15   M     Knowledge management                            1.04     3.08   2.04
16   M     Consistency management                          0.77     3.01   2.24
17   M     Configuration management                        0.43     2.97   2.54
18   M     Formation                                       0.61     2.92   2.31
19   L     DFX                                             0.43     2.80   2.47
20   M     Evolution management                            0.36     2.58   2.22
21   L     Integration and Control                         0.66     2.56   1.90
22   L     Empowerment                                     0.00     2.51   2.61
23   H     Design experience re-use                        0.72     2.27   1.54
24   L     Right-first-time, re-work & iteration control   0.00     2.20   2.20
25   L     Planning scheduling and control                 0.32     2.13   1.81
26   L     Variant management                              0.50     2.11   1.61
27   M     Viewpoint management                            0.20     1.97   1.77
28   M     Assessment & enhancement                        0.18     1.68   1.50
29   H     Task management                                 0.18     1.56   1.38
30   L     Life cycle issues                               0.05     1.56   1.52
31   M     Resource management                             0.23     1.43   1.30
32   L     DFX Management                                  0.14     1.34   1.20
33   M     Providence                                      0.02     0.61   0.59
4.2.4.2 Weaknesses

From a general viewpoint there was confusion over many of the terms being used, and their exact meaning/nature was questioned. In particular, the following were unclear:

The participants had difficulty understanding the nature/meaning of impact and contribution. Other terms were also referred to in discussing this area, such as the effect and the effectiveness of a DC means.
The relationship between a PD Aspect and a DC Means.
The meaning of each of the different DC Means in relation to Company A's environment, e.g. integration and control.

Although the list of DC means covered all of the resources that Company A had identified within their organisation, there was a sense that they would have been more comfortable defining their own individual resources and groupings.
Although the analysis was automated as far as possible, some of the formatting and presentation of graphical results had to be done manually, which restricted flexibility in terms of creating what-if scenarios. That is, it was difficult to meet the needs of the analysis group in an appropriate time period. Greater flexibility in adding/removing resources/goals would enhance the support provided.
4.2.5 Further Analysis

Following this initial trial and a review of the results, a more fundamental understanding of performance was developed; the resulting formalism is presented in Chapter 5. As this was being progressed, a repeat analysis was carried out at Company A to assess the degree of improvement after a two and a half year period into Project P1. This provided an opportunity to further evaluate the approach and confirm the findings of the initial trial.
The analysis was conducted at Company A three years later and was based on the data originally obtained (Section 4.2.3). That is, the resources, goals/priorities, etc. were considered to be unchanged for the purpose of internally benchmarking performance over the three-year period. The ideal and current impact values from the original analysis served as the basis for comparing the updated performance. The analysis team was only asked to establish the updated impact of the resources on the goals. This was input to the software system, which provided graphical results of the comparison between the original and the updated analyses. The use of the data from both the original and updated analyses resulted in three sets of impact relationships, i.e.:

Ideal (established in the original analysis)
Original
Updated (3 years later)

A number of graphs were plotted to illustrate comparisons across the three data sets. An overall comparison is shown in Figure 4.4, which relates the contribution (as a percentage of total contribution) of resources [114]. The graph compares the current and ideal, as established in the original analysis, with the current situation as defined three years later.
[Figure 4.4 Comparison of results over a three-year period at Company A: a chart plotting each of the 33 DC means (with their ease ratings) against its percentage of the total (ideal) contribution, comparing the ideal values with the original and updated impact-based values.]
As all factors remained constant, with the exception of impact, the changes between the original values and the updated ones are due solely to changes in impact. It can be seen from Figure 4.4 that the ideal values were exceeded for a number of resources.
Following the original analysis, Company A implemented a number of initiatives to address the key target areas identified using the analysis. The updated review highlighted that they had achieved significant improvements in their use of resources to achieve their goals. However, the improvements gained raised further questions around their understanding of the impact of resources. In a review of this second analysis, the team commented that in the original analysis they had limited knowledge of the systems, techniques, etc. that could be employed by the organisation. The team felt their knowledge was such that they underestimated the ideal impact of different methods, tools, etc., as they were not as aware of these as the facilitators. This resulted in a situation where they exceeded the ideal impact in the updated analysis, and it raised the issue of defining what the ideal actually means.
4.3 Summary

The company considered the industrial trials successful, as they resulted in an increased insight into their design development performance and provided them with focus areas for improvement. Although there were issues of understanding, the company felt that the results of the second trial were largely accurate and that significant improvements had been made, i.e. comparing the original to the updated (3 years later) results. Some of the participants provided general comments on the approach:
Without the audit and analysis … many of the critical issues would have received only a token effort, in favour of the intuitive techniques. Progress to date has been so much more certain and confident, with the team armed with the knowledge that they are not just indulging in fanciful HR and formal techniques but are indeed doing more to guarantee the success of a very large and complex undertaking.
Technical Director

I was particularly impressed with the fact that while the [facilitating] team provided the method, it was our input and experience which generated the results which were appropriate for us. Through this approach, we have been able to provide ourselves with a solid foundation for our improvement initiatives and a realistic and pertinent set of objectives for our future development.
Technical Project Manager
The review and trial reported here provided first-hand knowledge of industrial issues and resulted in the identification of key areas of focus for the remainder of the research, i.e. to address the existing weaknesses. Part 2 of the book is aimed at providing a more fundamental understanding of performance and developing the underlying concepts that address these weaknesses.
Notes

1 This system formed a preliminary framework for the design and development of a final system solution suitable for industrial application and used in the evaluation of the research reported here.
2 The actual name of the company and some additional details are considered confidential and withheld in this book.
PART II

A Methodology for Enhanced Design Performance
5 A Formalism for Design Performance Measurement and Management
[Figure 5.1 Knowledge-processing activity: an activity (A) transforms an input (I) into an output (O), directed by a goal (G) and drawing on resources (R).]
… design activity while being generically applicable across multiple activities. As highlighted in Section 1.1, activities are the fundamental elements that transform input to output and are the basic components of processes, phases, projects, etc. An activity model is presented here (Figure 5.1), focusing on knowledge in design. This model is based on IDEF [115], one of the Integrated Computer Aided Manufacturing Definition (IDEF) techniques, which were specifically created to model activities, processes or functions.
5.1.1 A Knowledge-Based Model of Design

Design may be seen as the processing of knowledge [116], i.e. knowledge is continuously evolved as a result of specific activities, between the extremes of abstract versus concrete and general versus specific [36, 41, 117]. Figure 5.1 illustrates such an activity and the key categories of knowledge that relate to it. All inputs and outputs may be represented as forms of knowledge, e.g. a designer is represented in this model as a knowledge resource (R), the state of the design prior to the activity may be described as the knowledge input (I), etc. Four categories of knowledge are identified here:

Knowledge Input (I): the knowledge present prior to the activity.
Knowledge Output (O): the knowledge present as a result of the activity taking place.
Knowledge Goal (G): the knowledge that directs and constrains the activity.
Knowledge Resource (R): the knowledge that acts on the input to produce the output.

These categories are detailed further below, where it is shown that the category in which an element of knowledge resides is not fixed but derived from the context of the model, i.e. the activity to which it is related. For example, an output of one activity may act as a constraint on another. A minimal sketch of this model is given below.
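To make the activity model concrete, the following is a minimal illustrative sketch (not from the original text) of the four knowledge categories as a simple data structure; all names and example values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeActivity:
    """A knowledge-processing activity: transforms input knowledge (I)
    into output knowledge (O), directed by goals (G), using resources (R)."""
    name: str
    inputs: set[str] = field(default_factory=set)     # I: knowledge prior to the activity
    goals: set[str] = field(default_factory=set)      # G: knowledge directing/constraining it
    resources: set[str] = field(default_factory=set)  # R: knowledge acting on the input
    outputs: set[str] = field(default_factory=set)    # O: knowledge resulting from the activity

# The category an element belongs to depends on the activity (the context):
sketching = KnowledgeActivity(
    name="sketching",
    inputs={"statement of need"},
    goals={"satisfy customer need"},
    resources={"designer expertise"},
    outputs={"concept sketch"},
)
detailing = KnowledgeActivity(
    name="detailing",
    inputs={"concept sketch"},   # the output of one activity...
    goals={"concept sketch"},    # ...may also direct or constrain another
    resources={"designer expertise", "CAD system"},
)
```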
5.1.1.1 Input and Output

The input and output refer to the initial and final states of knowledge, in relation to both the design¹ and the design activity². Typically, the knowledge would include details of:

a) The design, i.e. the input could be a concept sketch, with an output being a more detailed description of the design.
b) The design activity, i.e. the input could be knowledge of current expenditure related to the resources used in activities prior to the start of the activity being modelled. The output will also have a value for expenditure, which will represent an increase on the input value, as further resources will have been used during the activity.

The distinction between design and design activity knowledge is further elaborated in Section 5.2.
5.1.1.2 Resources

Designers are utilised within activities as sources of knowledge, along with knowledge of their particular attributes such as cost, availability, etc. People are generally viewed as the core resource in design [118], but many other resources, such as computer tools, materials, techniques and information sources (e.g. standards), are also used. These resources incur a cost when used by an activity, e.g. the use of a method such as Quality Function Deployment (QFD) [112] may involve a training or licensing cost. Although resources may be categorised, e.g. Eynard [119] distinguishes between human, material and informational resources, for performance management purposes all resources may be represented in the form of knowledge that can be utilised within different activities. These resources will have various attributes such as cost, capability, availability, etc. [120], which are evaluated when the resources are being allocated.
5.1.1.3 Goals and Constraints

Goals (G) are specific elements of knowledge that direct the change in the state of the design from the initial input (I) to the final output (O) state. A goal refers to a future situation which is perceived by the goal originator to be more desirable than the current situation [15]. It is suggested that goals are almost ubiquitous in design, although they are often implicit and ill defined [121]. In design and development, goals are generally stated in the form of objectives which make up the design specification, e.g. a design goal to maximise the value of a particular property (p) in the design, Gj: p = Max. Design goals such as this allow the evaluation of the solution with respect to a datum in a way that allows a meaningful rank ordering of design proposals [15]. Figure 5.2(a) shows that two proposals, O(s1) and O(s2), are considered. A value for the property (p) may be established for each proposal and compared against the datum (p = 0). In this case, with the goal stated as Gj: p = Max, the comparison may establish the extent to which the proposals meet the goal. Although both may be considered solutions, as they meet the goal to some extent, O(s1) is more desirable than O(s2) due to its greater difference from the datum in the preferred direction³. A small illustrative sketch of this rank ordering is given below.
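As an illustration (not from the original text), the following sketch ranks hypothetical design proposals against a maximisation goal and validates them against a minimum-value constraint, in the spirit of Gj: p = Max and Gj: p ≥ pmin; all names and values are invented for the example.

```python
# Each proposal is a (name, value-of-property-p) pair; the datum is p = 0.
proposals = [("O(s1)", 8.5), ("O(s2)", 3.2)]

# Goal Gj: p = Max -- rank proposals by distance from the datum
# in the preferred direction (greater p is better).
ranked = sorted(proposals, key=lambda s: s[1], reverse=True)
print("Rank order under Gj: p = Max:", [name for name, _ in ranked])

# Constraint Gj: p >= p_min -- validates solutions rather than ranking them.
p_min = 4.0
valid = [(name, p) for name, p in proposals if p >= p_min]
print("Proposals satisfying p >=", p_min, ":", [name for name, _ in valid])
```

Here the goal supports a rank ordering (both proposals meet it to some extent), while the constraint gives a strict pass/fail validation.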
Design constraints place stricter limits on the desired future situation and provide a basis for the validation of solutions. For example, a constraint may be introduced which states the minimum value for a property (Gj: p ≥ pmin). In Figure 5.2(b) the value of p for both proposals, O(s1) and O(s2), is again …
[Figure 5.3 Goal decomposition: an overall goal (G) decomposes into design goals (DG, e.g. DGj … DGm) and design activity goals (DAG, e.g. DAGj … DAGm).]
… activity goals and altering aspects such as resource allocation to achieve optimal performance [126]. The design management activity requires additional knowledge of the goals, i.e. their relative priorities, to support decision making with respect to multiple criteria [127, 128]. Often, priorities for goals are defined somewhat informally, e.g. designers may rank goals in relation to how difficult to satisfy they perceive them to be [36]. The prioritisation of goals is further discussed in Section 7.2.3 of this book.
5.1.1.4 The Context of Knowledge Categories

A particular activity defines the input, output, goal and resource knowledge. That is, the category within which an element of knowledge exists is dependent on the context in which it is described. For example, Figure 5.4 illustrates the activity of defining the Product Design Specification (PDS) [129]. This activity begins with knowledge of the customer need as input (I). The goal (G) of the activity could be to create as comprehensive a specification as possible. The output knowledge (O) is represented in the PDS, which is created using a designer as the resource knowledge (R).

[Figure 5.4 Knowledge context: the activity Define PDS takes the customer need as input (I), is directed by the goal of a comprehensive specification (G), uses the designer as resource (R) and produces the PDS as output (O).]

However, the category of each element of knowledge described above is a result of the context represented in Figure 5.4. That is, the same knowledge elements could exist within a different category of knowledge in a different context. For example, at a higher level the customer need could be represented as goal knowledge, as one of the overriding goals in design is to satisfy customer need. Similarly, the output of the activity create design specification could include goals that are often used to guide subsequent design activities, i.e. goals exist within elements of the Product Design Specification such as cost, weight, etc. (Figure 5.5). The output could also contain additional resource knowledge, for example increased expertise of the designer, gained through learning in design [130]. This knowledge would be available as resource knowledge for subsequent design activities (Figure 5.6). A brief sketch of this re-categorisation is given after the figures.
[Figure 5.5 Output goal knowledge: the output (O1) of activity A1 provides goal knowledge (G2) for a subsequent activity A2.]

[Figure 5.6 Output resource knowledge: the output (O1) of activity A1 provides resource knowledge (R2) for a subsequent activity A2.]
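The following fragment (an invented continuation of the earlier sketch, with hypothetical names) shows how the same knowledge element can change category with context:

```python
# Reusing the hypothetical KnowledgeActivity class from the earlier sketch:
spec_activity = KnowledgeActivity(
    name="define PDS",
    inputs={"customer need"},
    goals={"comprehensive specification"},
    resources={"designer expertise"},
    outputs={"PDS", "cost goal", "weight goal", "increased designer expertise"},
)

# In a subsequent activity, elements of that output re-appear as goal
# knowledge (Figure 5.5) and as resource knowledge (Figure 5.6):
embodiment = KnowledgeActivity(
    name="embodiment design",
    inputs={"PDS"},
    goals={"cost goal", "weight goal"},           # output of A1 acting as G of A2
    resources={"increased designer expertise"},   # output of A1 acting as R of A2
)
```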
… the design activities [15] and is referred to here as Design Activity Resource (DAR) knowledge. Typically this knowledge will include meta-knowledge of the product domain, design activities and resources, and knowledge of process management methods such as Project Network Techniques [137].
5.2.2 A Model of Design Activity Management

When activities are discussed in the literature, there is often no distinction between design activities and design management activities at the fundamental level such as that discussed above. For example, when the activity of sketching is referred to, it is implied that a single activity addresses both the design (i.e. the sketch itself) and design activity (e.g. the time taken to complete the sketch) goals. Such an activity is referred to here as a managed activity (MA), which has sub-activities of design (Ad), aimed at achieving design goals (DG), and design management (Am), aimed at achieving design activity goals (DAG).
Given the basic design activity representation presented in Figure 5.1 and the distinction between design and design management presented above, a further new model is introduced in Figure 5.8 to describe design and its management. This Design Activity Management (DAM) model distinguishes between a design activity and a design management activity. It provides a framework in which performance may be measured and managed (see Section 5.5).
[Figure 5.8 Design Activity Management (DAM) model: within a managed activity, the overall input (I), output (O), goal (G) and resource (R) knowledge are split between a design activity (Ad), with design input (DI), design output (DO), design goals (DG) and design resources (DR), and a design management activity (Am), with design activity input (DAI), design activity output (DAO), design activity goals (DAG) and design activity resources (DAR).]
The categories of input (I), output (O), goal (G) and resource (R) knowledge, presented in Figure 5.1, are decomposed to reflect categories related to either design or design management activities as follows:

I → DI and DAI
O → DO and DAO
G → DG and DAG
R → DR and DAR

These knowledge categories are further formalised in relation to the DAM model. The category to which an element of knowledge belongs will again be dependent on the context, as highlighted in Section 5.1.1.
1. I, O, G and R: refer to the overall input, output, goal and resource knowledge as previously described.
2. Design Input (DI): this refers to the initial state of knowledge, or working knowledge [124], with respect to the design prior to the design activity. The state of the design may be described at any point from statement of need through to final description of the artefact. Therefore, design knowledge input may be represented as a design specification, a design concept, a full working drawing, etc., depending on the design activity to which it relates.
3. Design Output (DO): this refers to the final state of knowledge with respect to the design after the design activity. DO therefore represents the result of the evolution of DI through a specific design activity. The representation of this knowledge will vary in a similar way to DI and will depend on the activity to which it relates.
4. Design Activity Input (DAI): this refers to the initial state of the design activity knowledge with respect to the design management activity. This could contain the output from a previous management activity and would typically refer to the cost of activities prior to the activity, time consumed by prior activities, etc.
5. Design Activity Output (DAO): refers to the final state of the design activity knowledge after the execution of the design management activity. For example, this knowledge could contain the time elapsed, the cost incurred, etc. Therefore it represents the DAI with the addition of any resources used.
6. Design Goal (DG): the design goal(s) being addressed by the design activity. An example of a design goal could be to minimise the weight of the product. These goals have been discussed in Section 5.1.1, where design constraints are represented within DG.
7. Design Activity Goal (DAG): the design activity goal(s) being addressed by the design management activity. An example of such a goal could be the desired lead time for a design activity. In addition, the relative priorities between the design and design activity goals will be present within this knowledge, as derived from the higher level goal.
8. Design Resource (DR): the knowledge required to carry out design activities with respect to a particular product domain, as discussed in Section 5.2.1. This knowledge includes:
   Knowledge of particular disciplines, e.g. electronics, metallurgy, etc.
   Knowledge of the particular product domain, e.g. gearboxes.

A minimal sketch of these DAM knowledge categories is given below.
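As an illustrative sketch only (names invented, not the authors' notation), the DAM decomposition can be expressed as a structure pairing design and design management knowledge:

```python
from dataclasses import dataclass, field

@dataclass
class ManagedActivity:
    """DAM model sketch: a managed activity pairs a design activity (Ad)
    with a design management activity (Am), splitting I/O/G/R into
    design (D*) and design activity (DA*) knowledge categories."""
    DI: set[str] = field(default_factory=set)   # design input, e.g. a concept sketch
    DO: set[str] = field(default_factory=set)   # design output, e.g. a detailed drawing
    DG: set[str] = field(default_factory=set)   # design goals, e.g. minimise weight
    DR: set[str] = field(default_factory=set)   # design resources, e.g. domain expertise
    DAI: set[str] = field(default_factory=set)  # design activity input, e.g. cost to date
    DAO: set[str] = field(default_factory=set)  # design activity output, e.g. time elapsed
    DAG: set[str] = field(default_factory=set)  # design activity goals, e.g. lead time
    DAR: set[str] = field(default_factory=set)  # design activity resources (meta-knowledge)

    # The overall categories recompose from the design/design-management split:
    @property
    def I(self) -> set[str]:
        return self.DI | self.DAI

    @property
    def O(self) -> set[str]:
        return self.DO | self.DAO
```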
[Figure 5.9 Managed activities within a process: the design cycle of decomposition, synthesis, analysis and evaluation, with each design activity paired with a corresponding design management activity.]
[Figure 5.10 Activity-goal decomposition: design goals (DG) and design activity goals (DAG) decompose with the activities they direct, e.g. configuration (DGC, DAGCM) and selection (DGS, DAGSM), each design activity having its own management activity (e.g. selection management).]
Dependent activities exist where the output of one task is required as input to another, as represented in the general relationship between synthesis and evaluation. Figure 5.11 illustrates a dependent relationship within the modelling formalism used here. These activities therefore take place in series, i.e. in this case selection would be completed before the configuration activity begins.
Independent activities are those that may be carried out in parallel, or concurrently, as there is no interaction between the inputs or outputs (Figure 5.12). This type of activity relationship underpins much of the work in Concurrent Engineering, where the basic proposal is that activities should be carried out in parallel whenever possible, thus shortening lead times [45, 109, 143-145].
Interdependent activities more closely reflect the iterative nature of design [146]. For example, in the basic design cycle the evaluation of the design proposal will follow the synthesis step. The results of this evaluation may feed back to synthesis in an iterative manner aimed at improving the design proposal. Similarly, in the activities discussed here, it may be revealed during the configuration activity that it is not possible to meet the goal for that activity (DGC). This may require the activity of selection to be carried out again in order to bring new components into the potential solution. The iteration between activities such as selection and configuration may be repeated until an acceptable solution is found. Therefore, the outputs and inputs may be interrelated as shown in Figure 5.13. A minimal sketch of these three relationships is given after the figures.
In each of the relationships above it should be noted that when the design process moves from one design activity to another there is a corresponding transfer in the design management activity, i.e. when moving from selection to configuration there is a corresponding move from selection management to configuration management.
[Figure 5.11 Dependent (series) activities: the design output of selection (DOS) forms the design input of configuration (DIC).]
[Figure 5.12 Independent (parallel) activities: selection and configuration proceed concurrently, with no interaction between their inputs (DIS, DIC) or outputs (DOS, DOC).]
[Figure 5.13 Interdependent activities: the outputs and inputs of selection and configuration are interrelated, supporting iteration between the two activities.]
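The following sketch (invented for illustration; the activity functions and the meets_goal check are hypothetical stand-ins) contrasts the three relationships, including the selection/configuration iteration described above:

```python
def selection(candidates):
    # Hypothetical design activity: pick components from a candidate pool.
    return sorted(candidates)[:2]

def configuration(components):
    # Hypothetical design activity: arrange the selected components.
    return {"layout": components}

def meets_goal(proposal, goal_dgc):
    # Stand-in for evaluating the configuration goal DGC.
    return len(proposal["layout"]) >= goal_dgc

candidates = ["gear", "shaft", "bearing", "housing"]

# Dependent (series): configuration starts only after selection completes.
proposal = configuration(selection(candidates))

# Interdependent (iterative): failing DGC sends the process back to selection
# to bring new components into the potential solution.
goal_dgc, pool = 3, list(candidates)
while not meets_goal(proposal, goal_dgc) and pool:
    pool.pop(0)                                   # revise the candidate pool
    proposal = configuration(selection(pool) + proposal["layout"])

# Independent (parallel) activities share no inputs or outputs and could run
# concurrently (e.g. via concurrent.futures), as in Concurrent Engineering.
```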
5.4 A Design Performance Model: E²

The widespread use of efficiency and effectiveness to describe performance has been highlighted in Chapter 4, in addition to the variety of interpretations of these terms when applied in design and development. Efficiency (η) and effectiveness (ε) are presented in this book as fundamental elements of performance that may be used to fully describe the phenomenon. A new model, E², is presented here, based on the design activity representation in Figure 5.1, as a means to clearly determine the phenomenon of design performance and allow efficiency and effectiveness to be distinguished and related. Efficiency is related to input, output and resources, while effectiveness is determined by the relationship between output and goal(s). These elements are presented within the E² model, providing a fundamental representation of activity performance. A scenario is subsequently developed in Section 5.5 to show how E² may be applied to design and design management activities within the DAM model.
5.4.1 Efficiency

Efficiency: ratio of useful work performed to the total energy expended or heat taken in [147].

In general, the efficiency of an activity is seen as the relationship (often expressed as a ratio) between what has been materially gained and the level of resource (material) used. If we assume that an activity transforms an input to an output, using resources, under the direction of goals and constraints, the efficiency of that activity may be described as follows:

η(A) = M⁺ : RU, where M⁺ = O − I

Where:
η(A): Efficiency (η) of an Activity (A)
I: Input (Material)
O: Output (Material)
M⁺: Material Gain
RU: Resource (Material) Used
For example, if we assume that moving a car from point A to point B is a single activity, we can establish the efficiency of this activity by adopting the above formalism. In this case the input and output would be representative of the location of the car before (I) and after (O) travelling from A to B. The material gain (M⁺) is the distance between the two locations (O − I). The fuel is represented as an available resource (R), of which a certain amount is used (RU) during the activity of moving the car. Therefore the efficiency (η) is described by the ratio of distance travelled to fuel used (M⁺ : RU), often given in kilometres per litre.
Assuming design as a knowledge-processing activity (Ak) (Figure 5.14), the difference between the output (O) and the input (I) defines the knowledge gain from the activity (K⁺). The cost⁶ of the activity may be determined by measuring the amount of resource knowledge used (RU). Therefore, the efficiency of this activity may be depicted as in Figure 5.14 and formulated as a ratio:

η(Ak) = K⁺ : RU, where K⁺ = O − I

Where:
η(Ak): Efficiency (η) of an Activity (Ak)
I: Input (Knowledge)
O: Output (Knowledge)
K⁺: Knowledge Gain
RU: Resource (Knowledge) Used

This formalism assumes that a quantitative comparison of the input and output knowledge (I and O) can be carried out which results in a description of the level of knowledge gained in the activity (K⁺). Similarly, it is assumed that the level of knowledge used in the activity may be measured and that the relationship between both quantities may be expressed in a meaningful form. A small numerical sketch is given below.
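As a purely numerical illustration (the quantities are invented; in practice quantifying knowledge gain is the hard part), the efficiency ratio can be computed directly from the formalism above:

```python
def efficiency(output_level: float, input_level: float, resource_used: float) -> float:
    """η(Ak) = K⁺ : RU, with K⁺ = O − I, expressed here as a single number."""
    knowledge_gain = output_level - input_level   # K⁺
    return knowledge_gain / resource_used         # η

# Car analogy: 300 km gained for 20 litres of fuel -> 15 km/litre.
print(efficiency(output_level=300.0, input_level=0.0, resource_used=20.0))

# Knowledge analogy: the design state advances from level 2 to level 8
# (on some assumed scale) using 3 units of resource knowledge -> η = 2.0.
print(efficiency(output_level=8.0, input_level=2.0, resource_used=3.0))
```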
5.4.1.1 Efficiency Metrics

In practice a variety of metrics are used to determine efficiency, reflecting different aspects of the input, output or resource knowledge. For example, the cost of using a designer within an activity may be measured to reflect the amount of financial resource used in utilising this knowledge source. The following example describes the use of time and cost to establish levels of efficiency within two activities, A1 and A2 (Figure 5.15). The example illustrates that different types of efficiency may be measured and that an overall efficiency value should encompass these different types.
[Figure 5.14 Efficiency (η): the efficiency of a knowledge-processing activity (Ak) is determined by its input (I), output (O) and the resources (R) used, independently of the goal (G).]

[Figure 5.15 Activity comparison: two activities, A1 and A2, each with their own inputs (I1, I2), outputs (O1, O2), goals (G1, G2) and resources (R1, R2).]
Assume that the input and output knowledge is the same for both activities (A1 and A2) and therefore the knowledge gain is equal, i.e.:

(O1 − I1) = (O2 − I2)
K⁺1 = K⁺2

Using time (t) consumed as a metric to determine time-based efficiency, where t1 and t2 are the durations of activities A1 and A2 respectively:

η1(t) = K⁺1 : t1 and η2(t) = K⁺2 : t2
if t1 > t2, then η1(t) < η2(t)

Using cost (c) as a metric to determine cost-based efficiency, where c1 and c2 are the costs of activities A1 and A2 respectively:

η1(c) = K⁺1 : c1 and η2(c) = K⁺2 : c2
if c1 < c2, then η1(c) > η2(c)

In the example described above the efficiency of A1 is less than the efficiency of A2 with respect to time⁷, while it is greater than that of A2 with respect to cost. Thus, different types of efficiency may be determined for an activity based on the specific metrics used. However, establishing an overall efficiency value requires the use of an objective function which reflects the weighting placed on metrics such as time, cost, etc., as used in general optimisation problem solving [148]. This results in a value of efficiency that allows a direct comparison between activities A1 and A2. That is, in the case of the example given above:

Overall efficiency:
η1 = K⁺1 : R1 and η2 = K⁺2 : R2

Where:
R1: Total Resource Used in Activity A1
R2: Total Resource Used in Activity A2
This analysis of efficiency can be applied to both design and design management activities, although it may be difficult to distinguish between these efficiencies in some cases (see Section 5.5.2). Therefore:

η(Ad) = DK⁺ : DRU, where DK⁺ = DO − DI

Where:
η(Ad): Efficiency (η) of Design Activity (Ad)
DI: Design Input (Knowledge)
DO: Design Output (Knowledge)
DK⁺: Design Knowledge Gain
DRU: Design Resource (Knowledge) Used

And:

η(Am) = DAK⁺ : DARU, where DAK⁺ = DAO − DAI

Where:
η(Am): Efficiency (η) of Design Management Activity (Am)
DAI: Design Activity Input (Knowledge)
DAO: Design Activity Output (Knowledge)
DAK⁺: Design Activity Knowledge Gain
DARU: Design Activity Resource (Knowledge) Used
Given the scenario of using time and cost metrics described above, it is evident that the measured efficiency of a particular activity is shaped by the type of metrics used and how they are combined into an overall measure. It should be noted that the efficiency of an activity is considered here to exist irrespective of whether it is measured or not, i.e. it is an inherent property of the activity/resource relationship. The selection and application of metrics to determine efficiency allow particular views of efficiency to be created, e.g. cost-based efficiency. The metrics used must therefore not only closely reflect the type of efficiency phenomenon that is required to be measured (viewed) but also the actual types of efficiencies that are inherent in the activity. A sketch combining time- and cost-based efficiency into an overall value is given below.
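The following sketch (values and weights invented) works through the A1/A2 comparison above, including a simple weighted objective function for an overall efficiency value:

```python
K_GAIN = 6.0  # equal knowledge gain K⁺ assumed for both activities

# Hypothetical resource figures: A1 is slower but cheaper than A2.
activities = {
    "A1": {"time": 10.0, "cost": 200.0},
    "A2": {"time": 5.0,  "cost": 400.0},
}

for name, res in activities.items():
    eta_t = K_GAIN / res["time"]   # time-based efficiency η(t)
    eta_c = K_GAIN / res["cost"]   # cost-based efficiency η(c)
    print(f"{name}: η(t)={eta_t:.2f}, η(c)={eta_c:.4f}")
# A1 has lower η(t) than A2 (t1 > t2) but higher η(c) (c1 < c2).

# An overall value needs an objective function weighting the metrics,
# e.g. a simple weighted sum (weights chosen arbitrarily here):
w_time, w_cost = 0.7, 0.3
for name, res in activities.items():
    overall = w_time * (K_GAIN / res["time"]) + w_cost * (K_GAIN / res["cost"])
    print(f"{name}: overall η = {overall:.3f}")
```

The weighting makes explicit that an overall efficiency is only meaningful relative to a stated prioritisation of time, cost, etc.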
5.4.2 Effectiveness

Effective: having a definite or desired effect [147].

Activities are generally performed in order to achieve a goal, i.e. to have a desired effect. However, the result obtained from performing an activity may not always meet the goal. The degree to which the result (output) meets the goal may be described as the activity effectiveness. That is:

ε(A) = rC (MO, MG)

Where:
ε(A): Effectiveness (ε) of Activity (A)
rC: Relationship (Comparative)
MO: Material Output
MG: Material Goal

For example, a goal for a manufacturing activity might be to have ten or fewer defective parts produced during a week of production (MG). The number of defective parts produced (MO) may be measured and compared with the goal (using rC) to establish the manufacturing activity effectiveness. Clearly, in this case, a more effective activity is characterised by fewer defective parts.
Effectiveness of activities in design, where knowledge is processed (Ak), is focused on the nature and comparison of the knowledge expressed in goals and outputs [149, 150]. Therefore, activity effectiveness is depicted in Figure 5.16 and can be expressed as:

ε(Ak) = rC (O, G)

[Figure 5.16 Effectiveness (ε): the effectiveness of a knowledge-processing activity (Ak) is determined by comparing its output (O) against its goal (G).]

Where:
ε(Ak): Effectiveness (ε) of Activity (Ak)
rC: Relationship (Comparative)
O: Output (Knowledge)
G: Goal (Knowledge)

This formalism assumes that the output knowledge (O) and goal knowledge (G) may be described in a manner which allows a direct comparison between them, and a relationship to be determined which indicates how closely they match. A minimal sketch of such a comparative relationship is given below.
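As an illustration (the comparative function is an invented example, not the authors' definition), rC can be any function that maps an output/goal pair to a degree of goal achievement:

```python
def r_c_at_most(output_value: float, goal_limit: float) -> float:
    """Comparative relationship rC for an 'at most' goal, e.g. ten or fewer
    defective parts per week: 1.0 means fully met, falling towards 0."""
    if output_value <= goal_limit:
        return 1.0
    return max(0.0, 1.0 - (output_value - goal_limit) / goal_limit)

# MG: at most 10 defective parts; MO: 7 produced -> fully effective.
print(r_c_at_most(7, 10))   # 1.0
print(r_c_at_most(15, 10))  # 0.5: goal exceeded, reduced effectiveness
```

The point is that effectiveness, unlike efficiency, cannot be computed at all without explicit goal knowledge (G).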
5.4.2.1 Effectiveness Metrics

In design, multiple goals are likely to exist, with different priorities (as discussed in Section 5.1.1). Therefore the activity effectiveness with respect to a single goal will often represent only a partial view of effectiveness, measured using a particular metric. That is, effectiveness may be described in relation to a particular type of goal, e.g. cost goals, in isolation from other goals such as time, artefact aesthetics, etc.
For example, within an individual activity (A) (Figure 5.17) there could exist two goals, derived from a higher level goal of meeting the specification (G), which relate to the life in service (G1) and the dimensional accuracy (G2) of the design. Therefore, specific elements (O1 and O2) of the overall output knowledge (O) must be evaluated (using time and dimension based metrics, for example) and compared (rC) to determine the respective levels of effectiveness (ε1 and ε2). That is:

ε1 = rC (O1, G1)
and
ε2 = rC (O2, G2)

where it is assumed that a common comparative function, rC, may be used to establish both levels of effectiveness. In some instances the type of comparative function used may vary and is dependent on whether a constraint or a goal is represented within G1 and G2.
[Figure 5.17 Effectiveness types: within an activity (A), elements O1 and O2 of the output knowledge are compared against goals G1 and G2 to give the partial effectiveness values ε1 and ε2.]
In the case described here the effectiveness of a particular design proposal may be high with respect to life in service but low with respect to dimensional accuracy. The determination of overall effectiveness for the activity, ε(A), presents a multiple criteria decision making problem, similar to that described in Section 5.4.1. That is, to fully evaluate the design proposal some value of overall effectiveness must be established. There are a variety of techniques that may be adopted to support optimisation [148], although the nature of the design activity is such that obtaining values with which to carry out optimisation is difficult, e.g. determining values for aesthetic elements. The weighting method [54, 148] presents one of the simpler approaches to this type of problem. For example, if we consider the priority (or weight) of the life in service goal (G1) to be W1 and that of the dimensional accuracy goal (G2) to be W2, then we could say:

Overall Effectiveness: ε(A) = (ε1 × W1) + (ε2 × W2)⁹

It may be seen that the effectiveness metrics relate to the specific goals that are present, e.g. some time-based metric is required to establish the life in service of the design proposal. The use of a particular metric provides a view of effectiveness without providing the complete picture. That is, a view of effectiveness with respect to life in service is obtained in isolation from other views. Therefore a holistic approach to the determination of metrics, based on the activity goals and priorities, should be adopted to provide comprehensive analysis in this area. Such a comprehensive and fundamental approach is currently lacking in the field, where there is a lack of coherence and integration between activity modelling, goal specification and metric determination. This results in metrics being applied without a full description of the relation between the effectiveness obtained and the overall effectiveness, i.e. performance may be described in relation to all the specified goals, but not holistically. A small sketch of the weighting method is given below.
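A minimal sketch of the weighting method as applied above (weights and effectiveness values are invented):

```python
# Partial effectiveness values ε1, ε2 established via rC against G1, G2.
eps_life_in_service = 0.9       # ε1: high effectiveness w.r.t. life in service
eps_dimensional_accuracy = 0.4  # ε2: low effectiveness w.r.t. dimensional accuracy

# Goal priorities W1, W2 (here normalised to sum to 1).
w1, w2 = 0.6, 0.4

# Overall effectiveness: ε(A) = (ε1 × W1) + (ε2 × W2)
overall_effectiveness = eps_life_in_service * w1 + eps_dimensional_accuracy * w2
print(overall_effectiveness)  # 0.7
```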
The DAM model has shown that two types of goal exist within a MA, i.e. design goals (DG) and design activity goals (DAG), which relate to design …
… ε(Am) = rC (DAO, DAG)

Where:
ε(Am): Effectiveness (ε) of Design Management Activity (Am)
rC: Relationship (Comparative)
DAO: Design Activity Knowledge Output
DAG: Design Activity Knowledge Goal

Establishing the overall effectiveness of a MA will involve the analysis of both design effectiveness and design management effectiveness, and the use of some form of objective function [148], be it formal or informal, to take account of design and design activity goal priorities.
To ensure objectivity in measuring effectiveness it is necessary to express both output and goal knowledge at the same level of abstraction and using the same units of measurement. For instance, in the design of a gearbox shaft it is not appropriate to describe the design activity output (DO) as a small shaft when a shaft size of 20 ± 2 mm has been specified as a design constraint (DG). To determine the level of effectiveness in this case it is necessary to (a) express the design constraint as small, medium or large, (b) provide a specific size of the shaft in millimetres, or (c) transform both DO and DG to some intermediate common description.
In many cases it may be difficult to establish if maximum effectiveness has been achieved with respect to particular goals. For example, if the design goal is to maximise the life in service of the product, it is difficult to determine if/when maximum effectiveness has been achieved in the design activity. Such a goal expresses a desirable direction for the value of life in service, i.e. greater is better, but does not specify a target which is considered to be the maximum. Such a goal may be used as a basis for the comparison of different design proposals, i.e. to support the measurement of relative effectiveness and decision making among alternatives [15, 40].
5.4.3 Relating Efficiency and Effectiveness

Efficiency and effectiveness focus on related, yet contrasting, performance elements. Efficiency is inherent in the behaviour of a particular activity/resource combination. It may be measured without any knowledge of the activity goals, although the goals may influence the behaviour of resources used in the activity and consequently the level of efficiency resulting from their use (see Section 6.1.2).
Effectiveness, in contrast, cannot be measured without specific knowledge of the activity goals. As is the case in measuring efficiency, the measurement of effectiveness involves the analysis of the activity output (O). However, effectiveness is obtained through analysing a specific element of the output knowledge, i.e. that which relates to the goal(s) of the activity.
In certain cases there exists a direct relationship between effectiveness and efficiency. This relationship exists when the specific element of the output knowledge which is evaluated to establish effectiveness also describes an element of the resource used. For example, a design management activity may have a specific cost-related goal of minimising the activity cost, i.e. DAGj: C = Min. Therefore the element of the output knowledge (DAO) that must …
[Figure 5.19 Performance model E²: activity performance comprises efficiency (η), relating input (I), output (O) and resources (R), and effectiveness (ε), relating output (O) to goal (G).]
… of the activity and set at easily achievable levels. Efficiency describes the inherent behaviour of design and design management activities and, although it does not indicate goal achievement, it can provide insight into design management effectiveness. Therefore, to obtain a fully informed view of activity performance it is critical that both efficiency (η) and effectiveness (ε) are evaluated (Figure 5.19). Therefore:

Design Development Performance ≡ Efficiency (η) and Effectiveness (ε)

That is, the performance of a MA, incorporating design and design management activities, is completely described within the elements of efficiency and effectiveness as presented above.
3
4
5 5.5 A Model of Performance Measurement and
6 Management (PMM)
7
8 The concept of managed activities, having both design and design manage-
9 ment activities, has been presented in Section 5.2. These managed activities
4011 are the fundamental elements of the design process, i.e. the design process
1 consists of a number of managed activities with relationships such as those
2 discussed in Section 5.3. The managed activities that make up the design
3 process may be dened formally or informally as a result of planning activi-
4 ties. A formal denition might involve a specic phase of project planning
5 and result in the generation of an overall project plan with activities, goals,
6 resources, etc. Informally, planning activities can occur at any stage in the
7 design process, e.g. the statement of a short term strategy for the location of
8 specic information [36]. These planning activities are aimed at developing
9 strategies for how to proceed based on knowledge of available resources
50 (DAR), potential efciencies of activity/resource combinations, previous
51111 levels of performance, etc.
Having established the design and design activity goals, the focus subsequently moves to ensuring these goals are achieved [36], i.e. optimising overall effectiveness. In an informal sense, a designer will continually evaluate the effectiveness of their activities, e.g. checking their watch to assess time elapsed (design management effectiveness), evaluating the aesthetic strengths of a particular concept (design effectiveness), etc. More formally, effectiveness may be reviewed through simulating product behaviour and evaluating results at specific stages, as represented within many of the phase models of the design process discussed in Chapter 4.
5.5.1 Measuring and Managing Effectiveness

The definition of efficiency and effectiveness presented in the E² model may be applied to both design activities and design management activities. However, E² does not describe how these activities, or their performance, are related in order to ensure that the overall performance of the MA is optimised. The measurement of design and design management effectiveness is presented here as a critical part of controlling a MA. The following highlights how the E² model may be applied within the DAM model to realise a process model for Performance Measurement and Management (PMM) in design development. The description below focuses on a typical sequence of events in evolving the state of the design from DI to DO, highlighting the main decision points.

1. The design activity (Ad) takes DI as input and, directed by knowledge of the specific design goal (DG), produces an output (DO) aimed at meeting the goal. Based on the definition of effectiveness described earlier, this output will be compared against the goal to determine the level of design effectiveness, ε(Ad), achieved in the activity (Figure 5.20).
2. The resulting level of design effectiveness, ε(Ad), is used as an input of control knowledge into the design management activity (Figure 5.21). The description of design effectiveness may describe how well a design goal has been met or whether a constraint has been satisfied or not (see Section 5.1.1).
3. The design management activity analyses design management effectiveness, ε(Am), using knowledge (including meta-knowledge) of the resources being used in both the design and design management activities. This knowledge is primarily time and cost based, i.e. it refers to the time consumed or cost incurred during a particular activity/resource relationship.
[Figure 5.20 Design effectiveness: the design activity (Ad) evolves DI to DO under the design goal (DG); comparing DO with DG gives ε(Ad).]

[Figure 5.21 Effectiveness input: ε(Ad) is passed as control knowledge into the design management activity (Am).]

[Figure 5.22 Design management effectiveness: the design management activity (Am) evolves DAI to DAO under the design activity goal (DAG); comparing DAO with DAG gives ε(Am).]
This is compared against knowledge of the design activity goal (DAG), e.g. to achieve a design lead time of one month, to determine the level of design management effectiveness (Figure 5.22).
4. Utilising design activity resource (DAR) knowledge, the design management activity evaluates the relationship between design and design management effectiveness and decides on the controlling action, if any¹¹, which must be taken as an attempt to optimise overall effectiveness. This controlling action will typically involve changing the goals or resources in order to achieve a change in effectiveness.

Figure 5.23 is evolved from Figure 5.8 to illustrate the decision points and flow of control knowledge (shown as dashed lines) within a MA, and serves to summarise the steps described above. That is, the model describes the process of measuring and managing performance in relation to both design and design activity goals. The following outlines the types of controlling action that may result from the evaluation of design and design management effectiveness:
At decision point ci the decision options are to terminate the activity, having established satisfactory levels of design and design management effectiveness, or to continue with the activity.
At decision point cj the decision options are to redefine goals and/or alter resource allocation.
At decision point ck the decision options are to redefine design goals (DG) and/or design activity goals (DAG). For example, the outcome of the design management activity may be to set a new launch date for the project. In contrast, it may be more appropriate to reduce the targets specified in some design goals, e.g. life in service, while maintaining the original planned launch date.
At decision point cl the decision options are to alter design resources (DR) and/or the design activity resources (DAR). For example, the outcome from the management activity may be to allocate additional design resources to achieve increased design effectiveness, with a probable negative impact on design management effectiveness.
[Figure 5.23 Performance Measurement and Management (PMM) process model: the DAM model extended with the flow of control knowledge (dashed lines) and decision points ci (stop/continue), cj (redefine goals and/or alter resources), ck (redefine DG and/or DAG) and cl (alter DR and/or DAR).]
The focus of the steps described above has been to optimise overall effectiveness based on goal priorities, as discussed in Section 5.4.2. However, the controlling actions are based on design activity resource (DAR) knowledge, which includes the inherent efficiency of design activity/resource combinations based on previous activities. This knowledge of efficiency must be acquired as part of the performance measurement activity. A control-loop sketch of the PMM cycle is given below.
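The following sketch is an invented, heavily simplified reading of the PMM loop (all functions, quantities and thresholds are hypothetical), showing the four steps and decision point ci as a control loop:

```python
def pmm_loop(design_state, dg_target, dag_budget, max_iterations=10):
    """Sketch of the PMM cycle: design activity -> measure ε(Ad) ->
    design management activity -> measure ε(Am) -> controlling action."""
    resources, spent = 1.0, 0.0
    for _ in range(max_iterations):
        # 1. The design activity evolves DI towards DO under DG.
        design_state += resources          # toy measure of design progress
        spent += resources                 # DAI -> DAO: cost incurred so far

        # 2. Design effectiveness: compare DO against DG.
        eps_design = min(1.0, design_state / dg_target)

        # 3. Design management effectiveness: compare DAO against DAG.
        eps_mgmt = min(1.0, dag_budget / max(spent, 1e-9))

        # 4. Decision point ci: stop if both levels are satisfactory...
        if eps_design >= 1.0 and eps_mgmt >= 1.0:
            return "stop: goals met", eps_design, eps_mgmt

        # ...otherwise take a controlling action (cj/cl): here, alter
        # resource allocation when design effectiveness lags.
        if eps_design < eps_mgmt:
            resources *= 1.5   # add design resources (likely reducing ε(Am))
        else:
            resources *= 0.8   # rein in resources to protect the budget

    return "stop: iteration limit", eps_design, eps_mgmt

print(pmm_loop(design_state=0.0, dg_target=5.0, dag_budget=8.0))
```

Redefining goals (decision point ck) would correspond to changing dg_target or dag_budget mid-loop; this sketch only exercises the resource-allocation branch.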
5.5.2 Measuring Efficiency

The efficiency of an activity/resource combination in achieving a degree of effectiveness provides an indication of the resources (e.g. time, money, etc.) required in design development. This knowledge, which is often implicit, supports decision making, such as the allocation of resources, within the design management activity in optimising overall effectiveness. The efficiency may be measured for both the design and design management activities using the formalism provided in the E² model. However, in practice it may be difficult to clearly distinguish design and design management activities. The individual designer may be continually switching between design and design management activities, focusing on design and design activity goals respectively. Although protocol studies have shown that different types of activity can be …
… effectiveness. The E² and DAM models are integrated within a further model for Performance Measurement and Management (PMM) in design development, focused on controlling activities to achieve optimum effectiveness. The key elements developed in the chapter are summarised here:
• Design is modelled as a knowledge processing activity generating an output (O) from a given input (I), under the direction of goals and constraints (G), using varied resources (R).
• Two key types of activity take place at any level within design development: Design activities (Ad) aimed at satisfying design goals (DG) and Design Management activities (Am) aimed at meeting design activity goals (DAG). These activities are inextricably linked, although this link is often managed informally at the level of individual designer activities (e.g. sketching, calculating, etc.).
• Design Development Performance, relating to the design and the design management activities, is fully described in the concepts of Efficiency (η) and Effectiveness (Φ).
• The efficiency of an activity (design or design management) is a ratio between the quantity of knowledge gained (K+) and the level of resources used (RU), i.e. it is a measure of how economical (in terms of time, money, etc.) the activity is.
• The effectiveness of an activity (design or design management) is formalised as the comparative relationship (rC) between the output achieved (O) and the specific goal(s) (G) being addressed.
• Although the overall goal in design development may generally be to contribute positively to profit, two types of goal are present within different levels of activity, i.e. those that relate to the design (artefact) and those that relate to the activity by which the design is realised. Both design (DG) and design activity goals (DAG) must be considered for a complete analysis of design performance.
• The measurement of effectiveness may be carried out against design goals, for design activities, and design activity goals, for design management activities, at any level of activity in design development. The values for effectiveness are used to support the local control of the activities as described within the process model for Performance Measurement and Management (PMM).
• Measuring design and design management effectiveness, Φ(Ad) and Φ(Am), gives a comprehensive view of performance with respect to design and design activity goals. However, a complete description of performance must include the related measure(s) of efficiency (η), which gives further insight into effectiveness.
• The definition of efficiency provided in E² may be applied to individual design and design management activities. However, in practice it may be more feasible to measure the efficiency of an overall MA.
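To make these definitions concrete, the short Python sketch below expresses them computationally. It is a minimal illustration rather than the book's own implementation: η and Φ are used for the efficiency and effectiveness terms, the knowledge gain K+ and resource use RU are assumed to have been quantified already (e.g. via time or cost metrics), and the comparative relationship rC is reduced, for illustration only, to a simple ratio of output against goal.

```python
from dataclasses import dataclass

@dataclass
class ActivityRecord:
    """One design (Ad) or design management (Am) activity."""
    knowledge_gained: float   # K+  (assumed already quantified)
    resources_used: float     # RU  (e.g. person-hours or cost)
    output: float             # O   (achieved value on a goal metric)
    goal: float               # G   (target value on the same metric)

def efficiency(a: ActivityRecord) -> float:
    """η: ratio of knowledge gained to resources used."""
    return a.knowledge_gained / a.resources_used

def effectiveness(a: ActivityRecord) -> float:
    """Φ: comparative relationship rC between output and goal.
    A plain ratio is assumed here; the formalism leaves rC open."""
    return a.output / a.goal

# Hypothetical design and design management activities.
ad = ActivityRecord(knowledge_gained=8.0, resources_used=40.0, output=0.9, goal=1.0)
am = ActivityRecord(knowledge_gained=3.0, resources_used=10.0, output=1.0, goal=1.0)
print(f"η(Ad)={efficiency(ad):.2f}  Φ(Ad)={effectiveness(ad):.2f}")
print(f"η(Am)={efficiency(am):.2f}  Φ(Am)={effectiveness(am):.2f}")
```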
The overview of the work in design performance, provided in Chapter 3, highlighted some weaknesses in this field and identified areas that need to be addressed. The following briefly summarises the key features of the concepts presented in this chapter.
1. Formalism of Performance: The E² model (Figure 5.19) presents a generic and comprehensive description of performance in relation to a goal-directed activity such as design. The model details and relates the key elements of performance, efficiency (η) and effectiveness (Φ). Other elements such as inputs, outputs, goals and resources are described within this model and their relation to performance is determined.
2. Relating Design and its Management: Design and Design Management activities are distinguished within the Design Activity Management (DAM) model (Figure 5.8). Design activities address design goals while design management activities are aimed at design activity goals and the overall goal of the MA. The DAM model describes these activities and their related inputs and outputs. The E² and DAM models provide a basis for defining a process model for Performance Measurement and Management (PMM), presented in Figure 5.23. The PMM model relates design and design management activities through performance measurement and management. The measurement of effectiveness is seen to be critical in providing decision support for the local control of such activities. The model presented is again generic in nature and may be applied at any level of design to support measurement and management.
3. Requirement for Coherence: The models focus on the performance of activities in design development. The E² and PMM models may also be used to support performance analysis where the focus may be on a number of activities, for example within a design phase/stage, the overall design process, etc. The decomposition of activities and goals as discussed in Section 5.3.1 highlights that the E² performance model may be applied at any activity level within such a decomposition. The existence of a Goal Breakdown Structure, and maintenance of the relationships within it, allows the E² model to be applied in a manner that supports alignment within design or design activity goals. In addition, the DAM and PMM models support explicit definition of design and design activity goals and the alignment of performance across these goals, i.e. it allows trade-offs to be identified and managed in achieving optimum overall effectiveness. Therefore, given the correct conditions, such as the existence of a GBS, goal priorities, etc., the models developed here may be applied to give a degree of coherence in performance measurement.
4. Influences on Performance: The primary influences on performance are modelled as resources within the E² representation and are therefore clearly distinguished from output. Such an influence could be a CAD system, an expert designer, or a whole approach such as Concurrent Engineering (CE)12. The degree of influence is given by the change in efficiency and/or effectiveness resulting from the incorporation of a new tool, method, etc. Activities themselves may act as resources, i.e. the output knowledge of some activities may be used as resource knowledge within another activity.
Efficiency and effectiveness are the key elements of performance and it is proposed that, although effectiveness provides direct knowledge of goal achievement, efficiency provides further insight into performance and supports planning activities in design. The key characteristics of efficiency and effectiveness are summarised in Table 5.1. It can be seen from Table 5.1 that efficiency is an inherently complex property of a knowledge processing activity and, although the knowledge resource used in activities may be established using metrics based on time, cost, etc., evaluating the amount of knowledge gained is more difficult. It was …
10 In some of the literature this has been described as product and process effectiveness. However, process effectiveness does not focus on the effectiveness of an individual activity but on a collection of activities defined within a process.
11 It may be desirable to take no controlling action, i.e. to maintain all goals, resources, etc. as they currently are and allow the managed activity to continue.
12 Although CE may be seen simply as carrying out activities in parallel, the knowledge of how to achieve CE may be modelled as resource knowledge here, which might be used at a planning phase of design.
6 The Impact of Resources on Effectiveness
Figure 6.1 Planning and implementation of strategy (planning inputs such as level of innovativeness, market size and growth, technological maturity and spend on R&D feed implementation, with a feedback/control loop).
… in taking a product from concept to mass production [9]. Therefore, establishing a causal link between strategy and effectiveness has proven difficult in many of the studies reported in the literature [64].
6.1.2 Influence of Goals
Goals may both represent the starting point for the development of a strategy and be a component of the results. That is, a new product strategy may be created to achieve particular high-level goals such as profit, and subsequently further goals may be specified that are related to the delivery of elements of the strategy, such as levels of product innovativeness, R&D budget, etc. As highlighted in Chapter 5, the categories of knowledge (i.e. input, output, resources and goals) are defined by the activity to which they relate. For example, in the case of strategic planning activities (Figure 6.2) the input knowledge may represent the overall goals of the organisation. The goal of the strategic planning activity may be to define a coherent plan, which is based on achieving those goals. Therefore, the output could be activities and sub-goals defined within such a plan. That is, although goals influence the output of an activity, this output may itself be composed of activities and/or goals.

Figure 6.2 Strategic planning activity (organisational goal as input, planning resources, and a coherent plan of activities/goals as output).

The definition of goals and measurement of effectiveness with respect to these goals has an impact on the behaviour of resources, e.g. in the case of human resources goals may provide motivation to the individual. The observation of Johnson [155] that "What you measure is what you get" reflects the view that when people are aware of being measured they tend to respond through changes in their behaviour. However, it is proposed here that it is not measurement, per se, that directly influences behaviour but the goal that is linked with the measure(s) and the knowledge that performance against this goal will be assessed. For example, if the time taken for a particular activity is being measured there is often a goal (implicit or explicit) of minimising the duration of that activity. It is this goal, and the knowledge that performance against the goal may be visible to peers, superiors, etc., that often influences the behaviour of those involved in the activity. Therefore communication of goal knowledge (G) to the individual (R) involved in an activity (A) will influence the output obtained (O) and consequently the level of effectiveness (Φ), as illustrated in Figure 6.3.
6.1.3 Influence of Resources
Although goals (G) may have an indirect impact on the output (O), via the behaviour of the resources (R) as presented above, it is the resources themselves that have a direct influence, as presented in the E² model. Essentially, an activity cannot take place, and realise any level of effectiveness, in the absence of a resource. Considerable effort is spent on ensuring the right resources are used at the right time, to carry out the right activities for the right reasons, to give the right results [156]. Resources are seen as a critical influence on the effectiveness of design activities.

Figure 6.3 The influence of goals (goal knowledge G communicated to the resource R engaged in activity A influences the output O).

The work presented here is concerned with further analysing the influence of resource knowledge on activities in design. This knowledge (R) is actively engaged in transforming the state of the design/design activity knowledge. Given two similar activities (Ai and Aj) with the same input (I) and goals (G), it is the resource knowledge that directly affects the output (O) and consequently the level of effectiveness.
6.2 Resource Knowledge (R)
Within the E² model resource knowledge is explicitly represented as an element (i.e. a type of input) of the formalism, and the carrier of this knowledge may be in various forms. That is, the formalism does not take account of the characteristics of the resources themselves, only the knowledge represented in the resource. Resource knowledge could typically exist within people such as designers, organisational approaches such as Concurrent Engineering (CE), explicit methods such as Quality Function Deployment (QFD) and tools such as Computer Aided Design (CAD) software. For example, particular rules to support the parallel execution of activities in a CE-based environment [157] may be seen as resource knowledge that can be utilised within activities. According to Fadel [158]:

. . . being a resource is not an innate property of an object, but is a property that is derived from the role an object plays with respect to an activity.

That is, as discussed in Section 5.2.1, an element of knowledge may only be described as resource knowledge when related to a specific activity. With reference to the E² model, resources provide all the knowledge inputs required to support the execution of the activity, with the exception of the initial state of the design/design process (I) and the goals/constraints (G). Although the designer is a critical resource and acts as a source of both design (DR) and design activity knowledge (DAR), there are an increasing number of other resources available to support design development activities, including:

• Approaches: such as Concurrent Engineering [143] and Team Engineering [159]
• Methods: such as Quality Function Deployment [112] and Function Analysis [15]
• Tools: such as Computer Aided Design (CAD) systems [160] and morphological charts [12]
Resources in each of these categories may support the capture and representation of knowledge and/or contain existing knowledge, both design (DR) and design activity knowledge (DAR). For example, CAD systems allow the capture of the design knowledge (geometric, spatial, etc.), but also may contain rules for the automation of some design activities such as clash detection (DR). Scheduling and resource allocation rules (DAR) are represented in project planning systems such as Microsoft Project, which often allows the representation of design activity knowledge in graphical form (e.g. PERT chart). QFD also provides specific mechanisms for the capture and representation of design knowledge (DR), such as the House of Quality, but also contains an inherent process (DAR) for developing customer needs through to a product design. Approaches such as Concurrent Engineering (CE) are ill-defined in the literature but may be literally interpreted as advocating the parallel execution of activities and could therefore be interpreted as design activity knowledge (DAR). Particular methods and tools, such as those reported in [45] and [145], support the CE approach.

A significant amount of research has been carried out in the area of resources to support design, particularly in the classification of these resources [3, 161-164]. Araujo [78] and Cantamessa [161] distinguish between resources that directly address design goals (DG) and those that address design activity goals (DAG). Although these classifications contribute to the understanding of resources in design and development, there exists little consensus on an appropriate classification. Indeed, from the discussion above it may be seen that resources such as QFD often address both design and design activity goals. It is not the focus of this work to define a detailed resource classification but to provide a means for analysing the impact of resources on the effectiveness achieved with respect to a specific goal(s), within any classification scheme. It will be shown in Chapter 7 that the developed solution provides the flexibility to adapt to any classification scheme for resources, including that developed within a particular organisation.

In certain cases the output knowledge (O) from an activity may represent resource knowledge (R) in relation to another activity. Figure 6.4 describes the situation where the activity, Ar, acts as a provider of a resource to another activity, Ai. For example, during the design of a gearbox (Ai) an information search (Ar) may be undertaken to investigate the latest advances in gearbox design. The output of this search activity may not directly alter the state of the design and consequently the activity itself may be viewed as a resource activity in this context2, i.e. with respect to the gearbox design activity.

Figure 6.4 Activity output as resource (the output of activity Ar serves as resource knowledge for activity Ai).
6.3 Relating Resource Use and Effectiveness
Resources are used within an activity in order to deliver an output that meets the goals and/or satisfies the constraints of the activity, i.e. to achieve a certain level of effectiveness. However, the level of effectiveness that may be achieved by particular resources is not inherent in the resource but will depend on the activity/context in which the resource is used. For example, resources may be more/less effective in situations where the focus (i.e. goals) is on innovative as opposed to more routine development [76].
6.3.1 Resource Impact
The relationship between the use of a particular resource and the effectiveness achieved in relation to a specific goal is described here as the Impact (Im) of that resource. Given an activity (A) with an input (I) and a goal (G), the impact of a resource, Im(R), describes its capability to act upon the input (i.e. a function of the input) in a manner that achieves the desired output, i.e. that which best meets the goal. That is:

Im(R) = f(I) : O → G

Therefore an increase in the impact (Im) of a resource (R) will result in an increase in the effectiveness (Φ) achieved, i.e.:

Im(R) ∝ Φ(A)
Figure 6.5 illustrates a comparison of two similar design activities (A1 and A2), for example the activity of draughting, utilising different resources (DR1 and DR2). If we assume the same goal of maximising dimensional accuracy (DG) and the same input of a concept sketch (DI), then the use of CAD (DR1) in one case versus manual draughting techniques (DR2) in another may produce different dimensional accuracy in the output. If we assume that a CAD system produces greater dimensional accuracy3, then the output of the draughting activity using CAD (DO1) will meet the goal better than the output of the activity using manual techniques (DO2). Therefore, the impact on effectiveness of CAD, Im(DR1), is greater than the impact of manual techniques, Im(DR2), i.e.:

Im(DR1) > Im(DR2)

and therefore:

Φ1 > Φ2

Although no absolute measure of effectiveness is obtained here, establishing the relative impact allows a comparison of effectiveness impact across similar activities.
Figure 6.5 Comparing impact of resources (two draughting activities A1 and A2 with common input DI and goal DG, using resources DR1 and DR2 to produce outputs DO1 and DO2).
Figure 6.6 The Resource Impact (RI) model (effectiveness Φ plotted against exploitation Ex; the linear impact profile runs through the actual point (Exal, Φal) and the potential point (Expt, Φpt), with S the scope for improvement and Smax its maximum).
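The RI model reduces to a couple of arithmetic relationships, which the short Python sketch below illustrates. It is a minimal sketch under the model's linear-profile assumption; the slope and exploitation values are hypothetical, chosen only to show how the actual and potential points yield a scope figure.

```python
def effectiveness_at(exploitation: float, impact_slope: float) -> float:
    """Linear impact profile: effectiveness Φ grows in proportion
    to exploitation Ex (both on a 0..1 scale)."""
    return impact_slope * exploitation

# Hypothetical resource: exploited at 40% now, 90% achievable.
ex_actual, ex_potential, slope = 0.4, 0.9, 0.8
phi_actual = effectiveness_at(ex_actual, slope)        # Φal
phi_potential = effectiveness_at(ex_potential, slope)  # Φpt
scope = phi_potential - phi_actual                     # S, scope for improvement
print(f"Φal={phi_actual:.2f}  Φpt={phi_potential:.2f}  S={scope:.2f}")
```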
Figure 6.7 Effect of change in resource impact.
Figure 6.9 Impact profile and scope for improvement.
6.3.4 Assessing Relative Impact
Resources may have inherent attributes that indicate the effectiveness they can deliver in a particular activity. For example, within a simulation activity, the ability of a 3D CAD system to provide a sophisticated representation of an artefact may be assumed to negate the need for a physical prototype. This could be seen to address the overall design activity goal (DAG) of reducing cost. It is proposed here that, given a number of different resources and a similar level of exploitation, the relative impact on effectiveness may be estimated. Assuming that the relationship between exploitation and effectiveness is linear, the impact profile is therefore also established, e.g. the profiles for resources R1, R2 and R3 presented in Figure 6.10 as High (H), Medium (M) and Low (L). This provides a basis for an approach to analysing effectiveness as defined within Chapter 7.

The simplicity of the approach used here, where a linear relationship is assumed, allows a comparison of the impact of resources to be determined within a relatively short time in comparison to more detailed data collection. The time and resources consumed in performance measurement are considered to be a barrier to the implementation of performance measurement in industry [44, 165]. Having established the impact profiles for various resources against goals, and the actual and potential exploitation, comparisons may be made in terms of scope for improvement. This supports the identification of resources where further exploitation may deliver the greatest benefit.
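Under the linear assumption just described, relative impact ratings can be turned into comparable scope-for-improvement figures. The following Python sketch is illustrative only: the H/M/L slope values and the exploitation figures are hypothetical placeholders, not values from the book.

```python
# Hypothetical impact-profile slopes for the High/Medium/Low ratings.
PROFILE_SLOPE = {"H": 0.9, "M": 0.6, "L": 0.3}

def scope_for_improvement(rating: str, ex_actual: float, ex_potential: float) -> float:
    """Scope = potential effectiveness - actual effectiveness,
    assuming a linear impact profile Φ = slope · Ex."""
    slope = PROFILE_SLOPE[rating]
    return slope * (ex_potential - ex_actual)

# Three resources with assumed ratings and exploitation levels.
resources = {"R1": ("H", 0.5, 0.9), "R2": ("M", 0.7, 0.8), "R3": ("L", 0.2, 0.9)}
for name, (rating, ex_al, ex_pt) in resources.items():
    print(name, f"scope = {scope_for_improvement(rating, ex_al, ex_pt):.2f}")
```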
Figure 6.10 Relative impact of resources (impact profiles H, M and L on the effectiveness/exploitation axes).
6.3.5 Resource Effectiveness
Many resources in design will influence the achievement of more than one goal. For example, the use of 3D CAD may improve the effectiveness with respect to design goals (DG) while also reducing activity cost, which may be specified as a design activity goal (DAG). In addition to multiple goals, the priorities of the goals, or the goals themselves, may change over time. Therefore, assessment of the impact of resources on effectiveness must take account of the overall environment in which the resource is being used in order to increase the validity of the results [94]. For example, in Figure 6.11, a resource (R) may be used to achieve two different goals (G1 and G2) and will therefore impact the effectiveness achieved with respect to each of these goals (Φ1 and Φ2). The priorities (or weightings) of these goals may vary and therefore, to provide further insight into the impact on effectiveness in a situation where there is impact on more than one goal, these priorities should be taken into account. The weighted impact is defined here as the Resource Effectiveness (ΦR) and it allows the overall impact of a resource, within a situation where there are multiple goals/priorities, to be assessed. For example, the resource effectiveness of resource R in relation to goal G1 in Figure 6.11 is defined as:

Resource Effectiveness = ΦR(G1) = Im(G1)(R) · W1

The resource effectiveness in relation to the higher level goal (GO) will be a combination of the weighted impact on effectiveness achieved with respect to each of the goals (Im(G1) and Im(G2)). For example, if the priorities
(or weightings) of goals G1 and G2 in relation to the higher level goal (GO) are W1 and W2 respectively, then the overall resource effectiveness in relation to GO may be determined as follows4:

Overall Resource Effectiveness = ΦR(GO) = (Im(G1)(R) · W1) + (Im(G2)(R) · W2)

The use of a CAD system to carry out a draughting activity provides a typical example of the situation described above. In this case the overall goal (GO) may be the production of a drawing providing a 2-dimensional representation of the product. Within this overall goal, sub-goals related to dimensional accuracy (G1) and adherence to ISO5 standards (G2) may exist, each with particular priorities (W1 and W2 respectively). Within the overall output (O), i.e. the completed drawing, there will be outputs representing the dimensional accuracy (O1) and the adherence to ISO standards (O2). Therefore the impact of the CAD system with respect to each of these goals can be represented as:

Im(G1)(R) = f(I) : O1 → G1

and

Im(G2)(R) = f(I) : O2 → G2

And the overall Resource Effectiveness may be represented as:

ΦR(GO) = f(I) : O → G

or

ΦR(GO) = (Im(G1)(R) · W1) + (Im(G2)(R) · W2)

Figure 6.11 Impact on multiple goals (resource R used in activity A produces outputs O1 and O2 against goals G1 and G2, which contribute to the higher level goal GO).
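A small computational sketch of this weighted combination may help. It is illustrative only: the impact estimates and weightings are hypothetical numbers on a 0..1 scale, and the per-goal impacts Im(Gj)(R) are treated as already assessed.

```python
def resource_effectiveness(impacts: dict[str, float], weights: dict[str, float]) -> float:
    """Overall ΦR(GO): the sum over goals of impact times goal weight,
    i.e. ΦR(GO) = Σ_j Im(Gj)(R) · Wj."""
    return sum(impacts[g] * weights[g] for g in impacts)

# CAD example from the text: G1 dimensional accuracy, G2 ISO adherence.
impacts = {"G1": 0.9, "G2": 0.6}   # assumed impact estimates Im(Gj)(R)
weights = {"G1": 0.7, "G2": 0.3}   # assumed priorities W1, W2
print(f"ΦR(GO) = {resource_effectiveness(impacts, weights):.2f}")
```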
The use of resource effectiveness to take account of relative priorities of goals/sub-goals supports the concept of alignment with respect to design goals (DG) or design activity goals (DAG), as highlighted in Chapter 5. That is, the existence of a goal breakdown structure (GBS) can allow goal/sub-goal relationships to be defined, such as the relative weightings relationship illustrated above. Although the goal/sub-goal relationship may be more complex than simple weightings, its definition allows the resource effectiveness in relation …
7 The PERFORM Approach
Figure 7.1 Process model of the PERFORM approach. The phases and their outputs are:

• Specification: scope of analysis; goals and priorities; resources
• Assessment: resource exploitation; resource impact
• Analysis: resource effectiveness; scope for improvement
• Presentation: graphical results (multiple views)
• Review: revised data; what-if scenarios; consolidated results
[Figure 7.2: Specification phase detail, showing goal definition, selection of target areas and prioritised goals feeding the analysis charts.]
7.2.1 Scope

Before estimating and analysing any impact on effectiveness, it is necessary to fully define the overall scope of the analysis. The team involved in the analysis may have an implicit definition of the scope, but this must be discussed and agreed explicitly before proceeding. As discussed in Chapter 2, the analysis of performance may be carried out within different scopes, from a Product Development programme to an individual activity. The scope establishes both the range and level of activities and provides a boundary for the analysis. Where the results of the analysis are being communicated beyond the team actively involved in the process, the scope allows the results to be placed in context by others.
7.2.2 Goal Definition
The scope of the analysis provides a basis for defining a prioritised list of goals for the activities that exist within this scope. For example, where the scope is defined as a project, the goals of the project would be defined here. Although goals such as minimising cost and reducing time are often implicit within the activities in design, they must be explicitly defined here in order to ensure a common understanding exists within the analysis team and to support a detailed analysis. That is, the resources may be analysed for their impact against specific goals in order to define those that make the best impact. In some cases the goals may already be defined within the organisation and are fully understood and agreed by the analysis team. Where this is not the case, a process of brainstorming and consolidation may be carried out to provide a list of agreed goals as follows:

• Brainstorming: The definition of goals is initiated using brainstorming, where all the individuals in the team contribute and establish a large number of possible goals that may be applicable. No judgements are made by the group at this stage on the suitability of the goals.
• Consolidation: In this stage the team members describe and clarify the meaning of those goals they have defined (also known as "scrubbing the data") to ensure that a common understanding exists. In many cases individuals in the analysis team may define the same goals using slightly different terminology. A consolidation exercise allows the group to agree common definitions of particular goals and eliminate duplication.

This process results in a set of design and design activity goals that relate to the scope of analysis.
7.2.3 Goal Prioritisation
In using PERFORM it is assumed that the different goals that have been defined may not all be of equal priority (in relation to some higher level goal) and therefore these priorities must be defined. The determination of goal priorities may be carried out using a number of methods. Existing methods available to support the weighting of goals include simple ranking, pair comparison [102], prioritisation matrices [112] and the Analytical Hierarchy Process (AHP) [100]. These methods are aimed at the transfer of tacit knowledge (in relation to goal importance) of the team members into explicit knowledge and at arriving at consensus within a team. The methods range in complexity and, consequently, in speed and ease of use. All of the methods are, to some extent, subjective and as such there is no best method that may be unanimously adopted for every situation [15, 166]. It is not the intention here to analyse prioritisation methods or propose a method based on the results of this analysis, as any of the above methods may be used within PERFORM. Rather, the focus is on further formalising the principles developed in Chapters 5 and 6 in order to provide a means for their implementation. For the purposes of illustration, the prioritisation method adopted in PERFORM is a relatively simple one where participants are asked to select the importance ranking from a three-level scale of Low (L), Medium (M) and High (H). An example of goals and priorities for an actual design project carried out in practice is shown in Table 7.1. The topic area is used here as shorthand to allow abbreviated reference in presentation and discussion during the remaining phases of the approach.
7.2.4 Resource Definition
Following the definition of goals and priorities it is then necessary to establish the resources, i.e. approaches, methods and/or tools, which are currently being or might be used to achieve these goals. The definition of resources will be dependent on the initial scope of the analysis as defined earlier, e.g. if the focus is on the design phase of Product Development, various manufacturing resources may not be considered.

Some organisations may have a detailed list of the resources they use within Design and Development and may be able to directly relate this list to the scope in which the analysis is being carried out. However, such a list may not be available; also, through brainstorming and facilitation, new resources can be identified and incorporated within the analysis. The process of resource definition may therefore begin with a blank sheet or be facilitated through the introduction of a predefined list of resources, represented within areas, such as those shown in Table 4.1, defined in [111]. The use of such a list must be open to amendment to fit the needs of the particular analysis.

The definition of resources is carried out using a process similar to that used to define the goals, but incorporating the use of Affinity Diagrams [112] to allow resource groups to be created. The following are the key steps:

• Brainstorming: An initial list of resources is brainstormed. This list includes those resources currently used and those that have not been introduced in the organisation but are felt suitable. No judgements are made by the group at this stage on the suitability of the resources.
Table 7.1. Specification of goals and priorities

Goal  Topic Area   Description                                     Priority
G1    Risk         Minimise development work and technology risk   M
G2    Rework       Reduce rework                                   H
G3    Programme    Meet the programme                              H
...   ...          ...                                             ...
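Table 7.1 maps directly onto a small data structure. In the sketch below the L/M/H priorities are converted to the 1/3/5 numeric scale that Section 7.5 mentions is used for Priority in PERFORM; the structure itself is an illustrative assumption, not the tool's actual format.

```python
# Numeric weights for the three-level priority scale (1/3/5, per Section 7.5).
PRIORITY_WEIGHT = {"L": 1, "M": 3, "H": 5}

# Goals from Table 7.1: (topic area, description, priority).
goals = {
    "G1": ("Risk", "Minimise development work and technology risk", "M"),
    "G2": ("Rework", "Reduce rework", "H"),
    "G3": ("Programme", "Meet the programme", "H"),
}

weights = {g: PRIORITY_WEIGHT[p] for g, (_, _, p) in goals.items()}
print(weights)  # {'G1': 3, 'G2': 5, 'G3': 5}
```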
Figure 7.3 Impact assessment (linear impact profiles for ratings Im = H, M and L on the effectiveness/exploitation axes).
… relative impact of resources, and consequently their impact profiles, can be estimated based on knowledge of inherent resource attributes. The representation of impact is based on that normally used within Quality Function Deployment (QFD) to establish the relationships between customer needs and characteristics of the design concept. The impact may be described here as being Low (L), Medium (M) or High (H)4, resulting in typical impact profiles as represented in Figure 7.3.

As discussed in Chapter 6, a resource may have an impact on the achievement of more than one goal and this impact may be varied. Within the analysis discussed here a large number of resources and goals may be defined, and therefore the number of impact relationships may be large. The PERFORM matrix is introduced here in order to provide a framework in which these relationships may be defined. Within this matrix the resources are represented in rows while the goals are represented in the columns. Where an impact relationship exists between a resource and a goal it may be represented, using the notation defined above, at the intersection between the row and the column, as shown in Figure 7.4.
7.4 Analysis
Having specified the various elements such as resources and goals, and assessed the relationships, exploitation of the resources, etc., the next phase involves the analysis of this data in order to provide greater insight. This analysis provides different views of the data that allow various aspects to be considered, e.g. comparing the impact of different resources, establishing where the greatest potential for further exploitation exists, identifying goals that are not adequately supported, etc. The following section outlines the basic approach and representation used to describe the data before developing a number of measures that may be applied to give the different views required.

Figure 7.4 Modelling relationships within a matrix (resource Ri has a Medium impact on goal Gj, recorded at the row/column intersection).
7.4.1 Analysis Approach
Within a simple matrix, where there are a small number of cells, it may be possible to easily obtain an overview of the data and highlight areas of interest. For example, in Figure 7.5 it can be seen that resource R2 is having the greatest overall impact and that goal G1 is only being addressed by one resource. However, if the number of cells is large then obtaining such an overview is difficult. Further analysis of the data is required to easily identify resources with greatest impact, goals being addressed by only a small number of resources, etc.
Figure 7.5 Data overview (example PERFORM matrix: resources R1-R4 in rows against goals G1-G4 in columns, with impact ratings such as R2: L, M, H, H).
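The PERFORM matrix is naturally a sparse mapping from (resource, goal) pairs to impact ratings. The Python sketch below encodes an example matrix and reproduces the kind of overview reading described above; the cell values are illustrative, loosely following the R2 row of Figure 7.5, and the numeric reading of L/M/H is an assumption.

```python
IMPACT_VALUE = {"L": 1, "M": 3, "H": 5}  # assumed numeric reading of L/M/H

# Sparse PERFORM matrix: (resource, goal) -> impact rating.
matrix = {
    ("R1", "G3"): "L",
    ("R2", "G1"): "L", ("R2", "G2"): "M", ("R2", "G3"): "H", ("R2", "G4"): "H",
    ("R3", "G2"): "H", ("R3", "G3"): "L",
    ("R4", "G2"): "H", ("R4", "G4"): "L",
}

resources = {r for r, _ in matrix}
goals = {g for _, g in matrix}

# Overall impact per resource and coverage per goal.
impact = {r: sum(IMPACT_VALUE[v] for (ri, g), v in matrix.items() if ri == r)
          for r in resources}
coverage = {g: sum(1 for (_, gj) in matrix if gj == g) for g in goals}
print(max(impact, key=impact.get))      # greatest overall impact: R2
print(min(coverage, key=coverage.get))  # least-addressed goal: G1
```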
(In each of the measure boxes that follow, the quantities are read from the PERFORM matrix: each resource row Ri carries its ease of exploitation Ei, its actual and potential exploitation Exi(al) and Exi(pt), and its impacts Imij … Imim; each goal column Gj … Gm carries a weight Wj … Wm.)

P_Ri(GO) = Σ_{j=1}^{m} Wj · Im_ij

Measure 2 AM1
This measure can be applied in a similar manner to that of Measure 1 to allow:

• A comparison of the impact of particular resources (Ri) on the overall goal (GO).
• Those resources that have a high impact on GO to be distinguished from those with little or no impact, and therefore critical or redundant resources to be identified.

The specification phase of the PERFORM approach can support the creation of Resource Areas, e.g. Design Management, which is a number of resources grouped together. The Resource Effectiveness for a Resource Area may be determined through summing the individual resource effectiveness values. That is, summing a part of a column in the matrix provides an analysis of the effectiveness of a number of resources against a particular goal:
P_{Ri…Rn}(Gj) = Σ_{i=1}^{n} Wj · Im_ij
Potential P_Ri(Gj) = Wj · Im_ij · Ex_i(pt)

Measure 5 IM1
Actual P_Ri(Gj) = Wj · Im_ij · Ex_i(al)

Measure 6 IM2
Similarly, a resource will have potential and actual effectiveness with respect to a higher level goal (GO) that may be determined by:
Potential P_Ri(GO) = P_Ri(G_{j…m}) = Σ_{j=1}^{m} Wj · Im_ij · Ex_i(pt)

Measure 7 IM3
Actual P_Ri(GO) = P_Ri(G_{j…m}) = Σ_{j=1}^{m} Wj · Im_ij · Ex_i(al)

Measure 8 IM4
Potential P_{Ri…Rn}(Gj) = Σ_{i=1}^{n} Wj · Im_ij · Ex_i(pt)

Measure 9 IM5

Actual P_{Ri…Rn}(Gj) = Σ_{i=1}^{n} Wj · Im_ij · Ex_i(al)

Measure 10 IM6
An analysis may also be carried out relating a number of resources to a higher level goal in relation to potential and actual exploitation.
Potential P_{Ri…Rn}(GO) = P_{Ri…Rn}(G_{j…m}) = Σ_{i=1}^{n} Σ_{j=1}^{m} Wj · Im_ij · Ex_i(pt)

Measure 11 IM7
Actual P_{Ri…Rn}(GO) = P_{Ri…Rn}(G_{j…m}) = Σ_{i=1}^{n} Σ_{j=1}^{m} Wj · Im_ij · Ex_i(al)

Measure 12 IM8
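Measures 5-12 vary only along three axes: one resource or a group, one goal or the overall goal, and potential or actual exploitation. A single helper therefore covers the family. The sketch below is an illustrative Python reading of the formulas, assuming numeric impact values and 0..1 exploitation figures; all data are hypothetical.

```python
def P(matrix, weights, ex, resources, goals):
    """Generic PERFORM measure: Σ_i Σ_j Wj · Im_ij · Ex_i over the
    selected resources and goals (singleton lists give Measures 5-8)."""
    return sum(
        weights[g] * matrix.get((r, g), 0) * ex[r]
        for r in resources for g in goals
    )

# Hypothetical data: impacts, goal weights, exploitation per resource.
im = {("R1", "G1"): 5, ("R1", "G2"): 3, ("R2", "G2"): 5}
w = {"G1": 5, "G2": 3}
ex_al = {"R1": 0.4, "R2": 0.6}   # actual exploitation Ex_i(al)
ex_pt = {"R1": 0.9, "R2": 0.8}   # potential exploitation Ex_i(pt)

# Measure 7 (potential, one resource, overall goal) vs Measure 8 (actual):
print(P(im, w, ex_pt, ["R1"], ["G1", "G2"]))  # potential P_R1(GO) -> 30.6
print(P(im, w, ex_al, ["R1"], ["G1", "G2"]))  # actual P_R1(GO)    -> 13.6
```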
These measures may be applied in a similar manner to the first four measures. The results from the potential and actual situations may be compared to allow the identification of instances where gaps exist between them. The existence of such a gap indicates that the resource(s) may be exploited further to achieve higher levels of effectiveness with respect to a particular goal(s). The gap between potential and actual effectiveness is defined here as the Scope for Improvement, i.e.:

Scope for Improvement = Potential P - Actual P

Measure 13 IM9
Therefore the Scope for Improvement:

• of a resource in relation to a specific goal = Measure 5 - Measure 6
• of a resource in relation to a higher level goal = Measure 7 - Measure 8
• of a resource group in relation to a specific goal = Measure 9 - Measure 10
• of a resource group in relation to a higher level goal = Measure 11 - Measure 12
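Continuing the illustrative sketch above, each scope figure is simply the difference between the potential and actual readings of the same measure:

```python
# Scope for improvement of R1 against the overall goal (Measure 7 - Measure 8).
scope_R1 = (P(im, w, ex_pt, ["R1"], ["G1", "G2"])
            - P(im, w, ex_al, ["R1"], ["G1", "G2"]))
print(f"Scope for improvement of R1: {scope_R1:.1f}")
```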
Although the measures described above allow the scope for improvement to be defined, they do not allow any analysis of how easy (or difficult) it may be to improve the effectiveness. That is, they do not indicate the degree of difficulty in closing the gap between actual and potential effectiveness. The ease of further exploiting an existing or new resource is captured in the assessment phase of PERFORM, as discussed earlier (Section 7.3.1). It may be presented in the results through the use of specific diagrams, for example the use of a scatter diagram to relate the ease to the resource effectiveness (e.g. Figure 9.2(c)). However, it is also incorporated within the analysis here to assist in highlighting those resources where further exploitation may give the greatest return.

Return on Investment (RoI): The Return on Investment (RoI) measure is defined here to provide an indication of the return, in terms of increased effectiveness, from the investment of time, people, money, etc. to further exploit a resource. This investment could be in the form of training, installation of new equipment, etc. The measure is defined here based on the effectiveness of the resource in terms of the higher level goal (GO) as, in many cases, the resource will impact on more than one of the specified goals. It is defined as follows:
RoI_Ri(GO) = Σ_{j=1}^{m} Wj · Im_ij · Ei

Measure 14 RoI
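Read computationally, Measure 14 weights each goal-level impact by the goal priority and scales the total by the resource's ease of exploitation Ei. A hypothetical sketch, reusing the illustrative data above:

```python
def roi(matrix, weights, ease, r, goals):
    """Measure 14: RoI_Ri(GO) = Σ_j Wj · Im_ij · Ei."""
    return sum(weights[g] * matrix.get((r, g), 0) for g in goals) * ease[r]

ease = {"R1": 0.7, "R2": 0.3}   # assumed ease-of-exploitation scores Ei
print(roi(im, w, ease, "R1", ["G1", "G2"]))  # (5·5 + 3·3) · 0.7 = 23.8
```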
This measure may be applied across all the individual resources to allow the comparison of the RoI in each of them. The results provide an indication of areas where it may require more investment to gain increased effectiveness and those areas where less investment is required.

Normalising the Results: In order to be able to compare all the results against a common datum, the data resulting from the analysis is normalised. That is, the effectiveness of resources against goals is normalised using the Total Effectiveness of all the resources against the higher level goal (GO), calculated using the following, where n is the total number of resources in the analysis scope and m is the total number of goals in the analysis scope:
P_{Ri…Rn}(GO) = P_{Ri…Rn}(G_{j…m}) = Σ_{i=1}^{n} Σ_{j=1}^{m} Wj · Im_ij

Measure 15 Total Effectiveness
Therefore, as an example of this normalisation, the effectiveness of a particular resource against a higher level goal (GO) would be normalised as follows:

Normalised P_Ri(GO) = ( Σ_{j=1}^{m} Wj · Im_ij ) / ( Σ_{i=1}^{n} Σ_{j=1}^{m} Wj · Im_ij )    Equation 1
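As a final illustrative snippet, again reusing the hypothetical data and the generic P helper above, Equation 1 expresses each resource's effectiveness as a share of the Total Effectiveness:

```python
# Measure 15 with exploitation factored out (Ex set to 1 for every resource).
unit_ex = {"R1": 1, "R2": 1}
total = P(im, w, unit_ex, ["R1", "R2"], ["G1", "G2"])

norm_R1 = P(im, w, unit_ex, ["R1"], ["G1", "G2"]) / total
print(f"Normalised P_R1(GO) = {norm_R1:.0%}")  # 34/49 ≈ 69%
```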
The use of normalisation in this case aids comparison of values obtained from the application of the different measures. For example, the effectiveness of a particular area of resources could be directly compared with the effectiveness of an individual resource, where both are expressed as a percentage of the Total Effectiveness. This allows the contribution of that resource to the overall resource area to be assessed. Although various different sets of results may be derived from using the PERFORM approach, they may be expressed on a common scale, i.e. percentage of Total Effectiveness, through the use of normalisation.
7.5 Representation
The application of the measures to the data captured within the PERFORM matrix allows different views of the data to be generated and represented in various formats. The results can be represented within a table to allow detailed and accurate comparison of data. However, such an analysis will provide limited benefit given that the approach defined here is aimed more at ease of use and the provision of rapid feedback, e.g. using simple scales of 1, 3 and 5 for Priority, than at detailed assessment. The representation of the results in a graphical format provides a more easily interpreted comparison of values, allowing relative effectiveness to be more quickly assessed. The PERFORM approach relies primarily on the use of graphical representation in the form of standard graph types, such as bar charts, scatter diagrams and radar diagrams. The creation of such charts is supported by software tools such as Microsoft Excel 97, which has been integrated within the PERFORM system (see Section 7.8). The results can be presented to the analysis team in a variety of forms depending on requirements. For example, they could be presented at the time of the analysis (i.e. on the day that the team is brought together) using electronic presentation equipment. Alternatively, they could be presented in the form of a printed report at a later date.
7.6 Review
The presentation of the results provides an opportunity for the team to revisit the data defined in the assessment phase of the process. That is, the team is given the opportunity to review and discuss the results and identify any apparent anomalies. Data can be easily altered at this point and re-analysed to allow what-if scenarios to be performed and to support sensitivity analysis [127]. The development of the software system to support the approach greatly simplifies this phase and allows different scenarios to be investigated at the time of the analysis with the team present. That is, the changes can be input and results provided in a typical time of 5 minutes5.
7.7 Further Analysis
Where the initial scope of analysis is at a high level in terms of the definition of goals and resources, further analysis may be appropriate at a lower level. This analysis would follow the same process as described above, with the exception that the areas of focus that have been defined in Phase 1 would become the goals of Phase 2. This reflects the approach developed in Enhanced Quality Function Deployment (EQFD), where multiple prioritisation matrices are generated and linked to support in-depth analysis [112]. For example …
Figure 7.7 Supporting the PERFORM approach (E² representation of scope, relationship modelling and analysis/presentation, supported throughout by facilitation and computer support).
… facilitator(s) experienced in the implementation of the approach. The approach is enhanced where the facilitator provides detailed knowledge of the subject area, i.e. resources that may be used to support design, including those considered by industry to be leading edge. The facilitation of the team using an experienced facilitator(s) also supports the generation and resolution of conflict in a constructive manner to arrive at consensus. Achieving consensus in this manner provides an important element of ownership on the part of the team involved. That is, the team view the results of the analysis as their results, based on their input data and not generated by an external party.

A computer-based system has been developed to support the implementation of the approach, particularly in the analysis and presentation phases. In general the primary facilitator is supported by a co-facilitator, who is also responsible for the data input and operation of the PERFORM software system. The later phase, analysis and presentation, needs less facilitation as additional support is provided by the computer-based software tool (Figure 7.7). This tool allows results to be quickly generated and presented on the day of the analysis.
7.9 Discussion
When multiple alternatives and multiple criteria are present, decision makers often base decisions on the application of rules of thumb, and decision making may be improved through the use of explicit methods [15]. PERFORM provides a structured approach for the assessment of multiple resource/goal relationships (i.e. impacts). The use of a structured approach encourages decision making based on explicit criteria and provides a mechanism for recording decision rationale in an easily understood form [16].

The analysis approach defined here supports performance measurement, performance management and decision making for performance improvement. That is, this approach allows the user to measure the impact on effectiveness of a particular resource. It provides an indication of the level of effectiveness currently being achieved and that which might be achieved. Further, it supports …
8 A Methodology for Performance Modelling and Analysis in Design Development
… performance in any situation and deriving measures for its evaluation. Having established an initial model of performance, an analysis approach, PERFORM, provides a means to identify areas for performance improvement. The degree of improvement may be established through further application of the formalism to model and measure performance. The methodology therefore presents a cycle for achieving continuous performance improvement (Figure 8.1). The key elements of the methodology are:
1. Design Development Performance Formalism (Chapter 5): A formalism for modelling performance in design development to support its measurement and management is defined. This captures the fundamental nature of performance and its relation within the activities in design development. The models developed here provide a comprehensive representation of performance (defined as efficiency and effectiveness) and provide a basis for measuring these. This provides a basic understanding of design development performance that is necessary to conduct further analysis in this area. The models provide a context for the definition of an area of focus in the remainder of the book.
2. Resource Impact Model (Chapter 6): Having established a comprehensive view of design performance, an area of focus is defined and addressed. One aspect of performance, effectiveness, is selected. Further, an area of influence on performance is identified for analysis, i.e. the impact of resources. The nature of this influence is explored and modelled within the Resource Impact model. Further elements are derived as part of a framework to support an analysis of resource effectiveness, such as Potential/Actual Exploitation and Ease of Exploitation.
3. Analysis Approach (Chapter 7): The PERFORM approach brings together the elements of the framework identified in Chapter 6 within an overall analysis approach to be carried out in a facilitated workshop environment. This approach utilises a matrix as a core tool to allow the representation of relationships between resources and goals, i.e. impact. The results of the analysis allow the identification of resources that may be exploited further to achieve the greatest performance improvement.

Figure 8.1 A methodology for design performance modelling and analysis (measures and values of efficiency and effectiveness feed the PERFORM cycle of Specification, Assessment, Analysis, Presentation and Review across design (Ad) and design management (Am) activities, yielding areas for improvement in efficiency and effectiveness).

The methodology presented here provides the knowledge, principles and approach necessary to understand design development performance, to model an aspect of it for a particular organisation and to derive a means for its improvement.
8.2 Design Development Performance Formalism
Within this area of the methodology the nature of performance and its relation to design activities is analysed. The development of a fundamental understanding of design performance involves a number of key areas:

1. The development of a Design Activity Management (DAM) model, based on an analysis of the design activity from a knowledge-processing viewpoint and aimed at distinguishing and relating design and design management activities.
2. The development of a Design Performance model (E²), which identifies efficiency and effectiveness as the core elements of performance and relates them within the basic knowledge-processing model of the design activity.
3. The definition of a process model for Performance Measurement and Management (PMM) based on the elements developed in 1 and 2 above. The description of performance provided in E² is related to the DAM model in order to describe the process of measuring and managing performance as part of design and design management activities.
Design is viewed here as the processing of knowledge in order to transform an input to an output, under the direction of goals/constraints, using particular resources (Figure 8.2(a)). It is recognised within this description that the goals and/or constraints can relate to the design itself (i.e. the artefact) or the activity of designing. The DAM model distinguishes those activities addressing design goals, i.e. design activities, from those addressing goals related to the activity of designing, i.e. design management activities. These activities are modelled within an overall activity, defined here as a managed activity (Figure 8.2(b)).

The modelling of design activities from a knowledge-processing viewpoint also provides the basis for the definition of design performance. Design performance is defined here within the E² model as efficiency and effectiveness, where efficiency relates input, output and resource knowledge within the activity and effectiveness describes the relationship between the actual and intended output knowledge (i.e. the output and the goal) (Figure 8.2(c)).
Figure 8.2 Design development performance formalism: (a) the design activity as knowledge processing (input I, output O, goals G, resources R); (b) design and design management activities within a managed activity; (c) efficiency and effectiveness located on the activity model; (d) performance-based control of design/design management activities.
The definition of efficiency and effectiveness provides a basis for determining appropriate metrics to use in their evaluation.

From an understanding of design and design management activities and the key elements of performance, i.e. efficiency and effectiveness, the PMM model was developed. This model describes the use of efficiency and effectiveness to support decision making and control (represented as dashed lines) of design/design management activities under dynamic conditions (Figure 8.2(d)).
8.3 Area of Performance Analysis
The analysis of effectiveness provides an important element of controlling design and design management activities and relating them within the PMM model. Although there are a number of factors influencing effectiveness, such as strategy and goals, the resources used to carry out design/design management activities are considered to have the most direct influence. The area of performance analysis that is investigated is the relationship between resources, such as approaches, methods and tools, and the level of effectiveness achieved.
Figure 8.3 Resource impact model (effectiveness Φ against exploitation Ex, with the linear impact profile, the actual point (Exal, Φal), the potential point (Expt, Φpt) and the scope S).
The impact of a resource has been described here as the capability of the resource to act upon the input in a manner that achieves the desired output. Different resources used in design may be exploited to different degrees and therefore realise different levels of effectiveness. The relationship between the exploitation of a resource and the effectiveness achieved is modelled within the Resource Impact model as the impact profile. Although in many cases this relationship may be quite complex, it is assumed to be linear for the purpose of simplification (Figure 8.3).

The Resource Impact model provides a basis for analysing the impact of various resources. Having identified the actual and potential exploitation of a resource and its impact profile in relation to a particular goal, the scope for improvement in terms of effectiveness may be defined. The relative estimation of impact allows the comparison of scope for improvement across a number of goals. In addition, the resource effectiveness may be defined, which indicates the weighted impact. This supports the analysis of resource impact in an environment where multiple goals with different priorities are addressed.
8.4 Analysis Approach
The elements developed within Chapter 6 provide the key principles that underpin the implementation of the PERFORM approach. It assumes that the impact of a resource on a goal may be estimated using the judgement of individuals with appropriate knowledge. PERFORM provides a structured …
8.5 Summary
A methodology for design performance modelling and analysis has been presented in this chapter. The methodology is composed of a number of elements, i.e. a formalism of design development performance, models relating resource use and effectiveness, and an approach to analysis based on these models. The definition of these elements ensures that the PERFORM approach is implemented on a sound basis and provides a clear context for the approach within the broader area of performance. The results of the implementation of the PERFORM analysis approach support the ongoing management of performance by providing insight into how resources impact on effectiveness in design. That is, resources may be utilised in such a way as to maximise the impact on effectiveness and improve performance.

Part 3 of this book presents applications and summarises key features of the methodology. This is presented in a number of sections describing different types of application, each aimed at demonstrating a particular aspect of the methodology.
PART III

Application and Key Features
3011
1
2
3
4
5
6
7
8
9
4011
1
2
3
4
5
6
7
8
9
50
51111
9 Worked Example
HiAdventure Inc is a fairly large US firm (some 2000 employees) making backpacks and other hiking gear. They have been very successful over the last ten years, and are well known nationwide for making some of the best external-frame backpacks around. Their best-selling backpack, the midrange HiStar, is also sold in Europe. In the last one and a half years, this European activity has suffered some setbacks in the market; in Europe internal-frame backpacks are gaining a larger and larger market share.

As a response, HiAdventure has hired a marketing firm to look for new trends and opportunities in the European market. On the basis of this marketing report, HiAdventure has decided to develop an accessory for the HiStar: a special carrying/fastening device that would enable you to fasten and carry the backpack on mountain bikes.

The device would have to fit on most touring and mountain bikes and should fold down, or at any rate be stacked away easily. A quick survey has shown that there is nothing like this on the European market.

This idea is particularly interesting for HiAdventure, because the director, Mr. Christiansen, has a long-standing private association with one of the chief product managers at the Batavus bicycle company (one of the larger bicycle manufacturers in northern Europe, based in Holland). Mr. Christiansen sees this as a good opportunity to strike up a cooperation and to profit from the European marketing capabilities of Batavus.

The Batavus product manager, Mr. Lemmens, is very enthusiastic about putting a combination product on the market: a mountain bike and a backpack that can be fastened to it. The idea is to base the combination product on the Batavus Buster (a midrange mountain bike) and to sell it under the name Batavus HikeStar.

The design department at Batavus has made a preliminary design for the carrying/fastening device, but neither Mr. Christiansen nor Mr. Lemmens is very content with it. The user test performed on a prototype also showed some serious shortcomings.

That is why they have hired you as a consultant to make a different proposal. Tomorrow there is going to be a meeting between Mr. Christiansen and Mr. Lemmens, scheduled as the last one before presenting the idea to the board of Batavus. Before then, they need to have a clearer idea of the kind of product it is going to be, its feasibility and price.

You are hired by HiAdventure to make a concept design for the device, determining the layout of the product, concentrating on:

• Ease of use
• A sporty, appealing form
• Demonstrating the technical feasibility of the design
• Ensuring that the product stays within a reasonable price range

You are asked to produce annotated sketches explaining your concept design.

Good luck!
elapsed, etc. is not applicable as design activity input (DAI) at the beginning of the activity. However, as the activity progresses, knowledge of the time elapsed becomes a key input to the design management activity, e.g. in one case a schedule is prepared, which is subsequently monitored by the timekeeper.

Goals: The goals described within the brief refer only to the design (DG), i.e. ease of use, appearance/form, technical feasibility and price. In addition, a time limit of 2 hours is set, which may be categorised as a design activity goal (DAG).

Resources: One of the key distinctions between the protocol data reported is in relation to the resources used to carry out the activities. Case #1 considers the completion of the task by an individual designer, while in Case #2 the task is completed by a team of designers. Where an individual designer is used, that individual constitutes both a design (DR) and a design activity resource (DAR), as the individual will carry out the design activities to create the concept and the design management activities such as time management. In the data reported for the team case, a specific individual (the timekeeper) was allocated the task of time management, i.e. managing the design activity with respect to the time goal, in addition to contributing to the design. Therefore, two of the individuals constitute design resources (DR) only, while the remaining designer constitutes both a design and a design activity resource (DAR).
The above outlines key elements of a formalism that may be used to describe the performance of the activities and serve as a basis for analysing and improving performance. These elements are presented within Table 9.1. It should be noted that the elements described here are those that are mentioned explicitly in the text.
Table 9.1. Elements of performance in Delft study

Element    CASE #1 Individual                 CASE #2 Team
Inputs
  DI1      Design Brief                       Design Brief
  DAI1     No initial input                   No initial input
Outputs    Concept description including      Concept description including
           information on:                    information on:
  DO1      Ease of use                        Ease of use
  DO2      Appearance                         Appearance
  DO3      Form                               Form
  DO4      Technical feasibility              Technical feasibility
  DO5      Estimated price                    Estimated price
  DAO1     Time elapsed                       Time elapsed
Goals
  DG1      Ease of use                        Ease of use
  DG2      Sporty appearance                  Sporty appearance
  DG3      Appealing form                     Appealing form
  DG4      Technically feasible               Technically feasible
  DG5      Reasonable price                   Reasonable price
  DAG1     Complete within 2 hours            Complete within 2 hours
Resources
  DR1      Individual Designer                Team member 1
  DR2      -                                  Team member 2
  DR3      -                                  Team member 3 (part)
  DR4      Information resources via EXP1     Information resources via EXP1
  DAR1     Individual Designer                Team member 3 (timekeeper)
Additional elements are implicit within the case, e.g. the timekeeper may use a watch as a design activity resource (DAR), the completeness of the information provided by the annotated sketches may be seen as a design goal (DG), additional information elements will be represented in the output such as the approximate size (DO), etc.
9.3 Measuring Performance
Performance may be measured through developing efficiency and effectiveness measures based on the understanding gained from formalising performance in the manner described in Chapter 6. For example, the efficiency could be determined through some comparison of the quantity of the information provided in the annotated sketches in relation to the time consumed and/or the number of people used. That is, the knowledge gained (K+) in design development may be estimated through some assessment of the completeness of the annotated sketches, e.g. a comparison of the number of information elements could be carried out. Information on the resources used is not readily available, with the exception of the time and number of people in each case, and therefore efficiencies may be measured and compared as follows:
$$\eta_{time} = \frac{\text{no. of information elements}}{\text{time used}}$$

$$\eta_{people} = \frac{\text{no. of information elements}}{\text{no. of people}}$$
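As a purely hypothetical illustration (the element counts below are invented, not taken from the Delft data), suppose the individual produced 20 information elements in 2 hours, while the team produced 30 elements with 3 people in 2 hours:

$$\eta_{time}^{ind} = \frac{20}{2} = 10 \qquad \eta_{time}^{team} = \frac{30}{2} = 15 \qquad \eta_{people}^{ind} = \frac{20}{1} = 20 \qquad \eta_{people}^{team} = \frac{30}{3} = 10$$

On these numbers the team is more time-efficient but less people-efficient, which shows why the two views of efficiency must be kept distinct.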
From the information made available on the experiments it is not possible to distinguish the efficiency of the design activity from that of the design management activity. However, if the time used in checking and managing the schedule were recorded then the efficiency of this management activity could be analysed. An example of a measure would be to compare the number of scheduling-related instructions (DAO) and the time required to define them (DAR).
The design effectiveness may be determined from a comparison of the information provided in the sketches with the specific design goals. The design management effectiveness may be determined through relating the time elapsed to the time goal (DAG). The effectiveness of design development in each case may be determined in relation to the goals as follows:
$$\varepsilon_{ease\,of\,use} = r_C(DO_1, DG_1)$$
$$\varepsilon_{appearance} = r_C(DO_2, DG_2)$$
$$\varepsilon_{form} = r_C(DO_3, DG_3)$$
$$\varepsilon_{feasibility} = r_C(DO_4, DG_4)$$
$$\varepsilon_{price} = r_C(DO_5, DG_5)$$
$$\varepsilon_{time} = r_C(DAO_1, DAG_1)$$
The manner in which the design goals are specified makes such comparisons difficult, e.g. the appearance of the product is specified as "sporty" and "appealing". Assessing the degree to which these goals are met is likely to require subjective measures. Similarly, the specification of the price as within a reasonable price range is likely to require subjective assessment.
Figure 9.2 (b) presents the results of applying the Return on Investment (RoI) measure (Measure 14) to the data. This measure takes account of the ease of exploiting the resource, the impact on goals and the goal priorities to provide an indication of the increased effectiveness that may result from further exploitation. The results shown here are based on an analysis of the RoI of exploiting each resource in relation to the overall goal (GO).
Table 9.2. Assessment of resource exploitation and impact

                                                     Goals and priorities (W)
Resources (R)                  Ease  Ex(Pt)  Ex(Al)  Ease of  Sporty  Appeal  Feasi-    Price
                               (E)                   Use (M)  (H)     (H)     bility(M) (L)
4.1 Marketing Research         M     100%    20%     H        H       L       L         H
4.2 Details on use             H     100%    35%     H        H       H       L         H
4.3 Trials evaluation          H     100%    50%     M        M       H       H         M
4.4 Batavus design             M     100%    90%     H        M       H       M         H
4.5 Drawing of bike            L     100%    10%     H        H       H       H         H
4.6 Drawings of Buster         M     100%    80%     M        L       M       H         -
4.7 HiStar product information H     100%    34%     M        M       M       H         -
4.8 Drawing of HiStar          L     100%    35%     L        H       H       H         H
4.9 Comparable products        L     100%    65%     L        H       L       -         -

Ex(Pt) and Ex(Al) denote the potential and actual degree of exploitation; impact cells left blank in the source are shown as "-".
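Since the exact form of Measure 14 is not restated here, the sketch below uses an assumed formula that combines the same ingredients the text names: ease of exploitation, impact on goals, goal priorities and the exploitation headroom. The numeric L/M/H scale is also an assumption.

```python
# A sketch of an RoI-style measure consistent with the description above;
# the formula is an assumed stand-in for Measure 14, not the book's definition.

RATING = {"L": 1, "M": 2, "H": 3}   # assumed numeric scale for L/M/H ratings

def roi(ease, ex_potential, ex_actual, impacts, priorities):
    """RoI indicator for further exploiting one resource: priority-weighted
    impact times the exploitation headroom, scaled by ease of exploitation."""
    headroom = ex_potential - ex_actual
    weighted_impact = sum(RATING[p] * RATING[i]
                          for p, i in zip(priorities, impacts))
    return RATING[ease] * weighted_impact * headroom

# Resource 4.2 "Details on use" from Table 9.2: ease H, exploited 35% of 100%,
# impacts H H H L H against the goal priorities M H H M L.
print(roi("H", 1.00, 0.35, ["H", "H", "H", "L", "H"], ["M", "H", "H", "M", "L"]))
```

Ranking resources by such a score is one way to reproduce the kind of ordering shown in Figure 9.2 (b).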
[Figure 9.2 Presentation of analysis results: panels include (c) the ease of exploitation plotted against potential effectiveness (% of total) for each resource, and (d) the potential and actual effectiveness of each resource, ordered by potential effectiveness.]
Clearly, Resource 4.2 (details on the use of the product) provides the greatest return from increased use.
Figure 9.2 (c) also incorporates the ease of exploitation in a scatter diagram comparing the ease with the potential effectiveness of different resources against the overall goal (Measure 2). Those resources plotted in the top right quadrant provide the most desirable balance between ease and effectiveness, i.e. they are both relatively easy to exploit and the resulting effectiveness is high.

Figure 9.2 (d) illustrates the comparison of potential and actual effectiveness for individual resources against the overall goal (Measures 7 and 8), ordered by values of potential effectiveness. It provides an indication of the resources having the greatest overall effectiveness and where the greatest scope for improvement is possible.

Other views of the data may be represented here, depending on the needs of those involved in the analysis, and used to support discussion during review.

Review: the nature of activities carried out during the review of the data will depend on the individuals involved in the analysis. That is, a number of individuals may wish to review and discuss the results together, whereas in other cases the performance analysis may be carried out by an individual and this type of review may be inappropriate. The case presented here is unlikely to be carried on into further, more detailed analysis, i.e. looking at specific features of the resources used, and the results presented in Figure 9.2 may be used to directly support decision making on performance improvement.
From the results it may be seen that the greatest scope for improvement through further exploitation exists in DR4.5 (i.e. making greater use of the information in the drawing of the bike), as evident in Figure 9.2 (d). However, further exploiting DR4.2 (the details on use) is seen to be easier (see Figure 9.2 (c)) than doing so with DR4.5, while still having a significant scope for improvement. It is also seen to provide a greater return on investment (see Figure 9.2 (b)). The results provide support for decision making on improvements but do not automate the process, and final decision making is carried out by the individual or team.

Having established target areas for performance improvement, e.g. the further exploitation of DR4.2 and DR4.5, this decision may be implemented in future activities. The formal model of performance and the assessment measures described earlier may be reapplied using the results of future activities to assess the degree of improvement realised. The assessment of performance and subsequent analysis to identify areas for improvement constitutes a cycle of continuous improvement (Figure 8.1).
9.5 Summary

The example given here is based on a small experimental scenario and, although somewhat simplistic in nature, it serves to illustrate how the concepts presented in Part 2 may be applied. The key points resulting from the case are:

• The modelling formalisms presented in Chapter 6 allow a model to be developed that describes the key elements of performance in the Delft case and allows efficiency and effectiveness to be distinguished and measures to be established.

• It is possible to identify and distinguish design and design management activities within the experiments, i.e. the actual design of the artefact and timekeeping, supporting the principles described in the DAM and PMM models. However, from the information available on the Delft experiments it is not possible to readily identify measures that distinguish the efficiency of the design activity from that of the design management activity. This may not be significant here due to the somewhat artificial environment resulting in comparatively little time spent on design management, as the management task is a relatively small one (i.e. mainly timekeeping). However, this could be a critical area in larger projects where design management requires considerable resource.

• The efficiency of the overall managed activity could be considered to be the relationship between the completeness of the information provided in the concept and the time taken (assumed to be 2 hours). Similarly, the overall effectiveness could be assessed as the degree to which the overall goal of producing a design concept within 2 hours was met, i.e. it will be a combination of the design and design management effectiveness. This illustrates the support that the E2 and DAM models provide for performance measurement.

• The PERFORM approach was applied in a scenario investigating the effectiveness of the information resources and resulted in the identification of particular resources for increased exploitation. This supports the concepts described within the Resource Impact model and the overall approach as defined in Chapter 8.

• Continuous improvement may be implemented and measured using a combination of the performance formalisms and the PERFORM approach, as illustrated in Figure 8.1, which describes how all the elements may be related within a methodology for performance modelling and analysis.
10 Analysis and Critique of Metrics
10.1 Approach

Haffey [168] presents a review of metrics in design, listing a range of metrics that have been proposed in the literature relating to performance. These metrics were analysed in relation to the four views of efficiency and effectiveness listed above to establish which view(s) a metric described. The detailed results of this exercise are presented in Appendix A.

The true meaning behind the metrics quoted by different authors is often not readily apparent and a number of assumptions had to be made within the exercise. These assumptions are documented as part of the analysis results. In addition, the following points are intended to clarify some of the results described in the table presented in Appendix A:
• A metric such as customer satisfaction is considered to ascertain the degree to which the final product meets or exceeds the initial requirements of the customer. Within the scope of the models presented here this can be seen to (partially) measure design and/or design management effectiveness, where it is assumed that customer satisfaction is measured in relation to design (artefact) or design activity goals. For example, the metric may illustrate whether the product functionality satisfies the customer (design effectiveness) or how satisfied the customer is with the date on which the product was delivered (design management effectiveness). It is assumed here that the goals were accurately defined from the customer requirements and therefore meeting the goals also means meeting the requirements.

• Some of the metrics listed in the review are not directly attributable (NDA) to design development activity performance. For example, the values obtained using a metric such as market share achieved could be partly attributed to artefact performance and considered to reflect design effectiveness. However, the values could equally be attributed to the corporate image of the organisation promoting the product, the effectiveness of their supply chains, etc. The metric is therefore considered too vague to be directly attributed to design development performance.

• In some cases the exact meaning of a metric is unclear from the source and it is therefore not possible to categorise or analyse it fully in this review.

• A metric measures design development performance if it can be seen to relate to effectiveness (design or design management) and/or efficiency (design or design management). Where this is not the case an effort is made to define the metric in relation to the broader work presented here, e.g. it may be a measure of design activity output (DAO).
10.2 Results

The results of the analysis provided support for the concepts presented in this book, both in terms of the validity of the E2 and DAM models and increased insight into design development metrics.

10.2.1 Supporting E2 and DAM Models

From an analysis of the metrics a number of observations can be drawn:
11 Industry Appraisal
It was found during the appraisals that the issues raised were centred primarily on the understanding of and insight into design development provided by the modelling formalisms, i.e. the E2, DAM and PMM models. The issues raised by the appraisers were mostly addressed by the work presented and in some cases confirmed the need for further investigation. Additional comments helped clarify some key concepts and terminology.

The following illustrate the support provided for the modelling formalisms:

• Their comprehensiveness in terms of describing design development is illustrated in a number of points, e.g. points 1, 2, 6, 9 and 11.
• The generic nature of the formalisms is illustrated in points 4, 5, 7 and 10.
• The relationship to existing research is evident in point 3.
• The support provided for coherence, goal specification and management of resources is described in points 6, 8 and 12 respectively.
• Point 13 provides support for the relative assessment used within the PERFORM approach.
11.2 Industrial Practice

During the discussions with industrial practitioners the appraisers offered comments on how the concepts related to their work, i.e. industrial practice. The comments in Table 11.3 have been taken from the conversations.
Table 11.3. Relation to industrial practice

Subject(s) discussed: 1. Research Review
Comments:
". . . In Company X1 there has been considerable effort expended seeking a generic model/formalism for Product Development performance but without success, i.e. it was agreed that such a model did not exist . . ."
". . . the company gave up on the use of empirical/statistical analysis as the process changes so much, e.g. a whole new process was introduced in using 3D CAD and therefore previous performance was not applicable and trends were not possible to establish . . ."
". . . the proliferation of metrics is something which is prevalent within the industry as the company uses many metrics without an understanding of why they are using them. Some metrics are a legacy of many years past and have no relationship to the type of activities currently carried out . . ."

Subject(s) discussed: 2. E2
Comments:
". . . the model provides clarity on the distinction between efficiency and effectiveness and also illustrates how they relate . . . the fundamental model says it all to me"
". . . I feel the validity [of the E2 Model] is proven and it shows cohesiveness in relation to an area where there is no model existing . . ."

Subject(s) discussed: 3. Coherence
Comments:
". . . this is called a cause-effect relationship within the company and is the focus of much work aimed at determining the nature of this cause-effect. This maps directly to the definition of goal/sub-goal relationships and the need for alignment and coherence. The problem of sub-optimisation is a substantial one within the organisation . . ."
". . . the organisation set up a separate team to develop and manage a goal breakdown (cause-effect) structure . . ."
12 The PERFORM System
• Analyser: incorporates all the analysis, data filtering and ordering rules for the system and transforms the input data to output based on the measures and data ordering templates.
• Representer: takes output data from the analyser and applies graphical templates to produce graphical output to aid interpretation.
• Presenter: applies a presentation template to all the graphical output to produce a logically structured presentation.
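As a rough illustration of this three-stage pipeline, the sketch below composes the modules over an assumed list-of-dicts input; the actual system is implemented via Microsoft Excel, as described next, so the data structures, function names and the crude textual "graph" are all illustrative assumptions.

```python
# A minimal sketch of the analyser/representer/presenter pipeline described above.

def analyser(matrix, measure):
    """Apply a measure to each row of the PERFORM matrix and order the results."""
    results = [(row["resource"], measure(row)) for row in matrix]
    return sorted(results, key=lambda r: r[1], reverse=True)

def representer(results):
    """Turn analyser output into a crude textual 'graph' (one bar per resource)."""
    return ["{:<20}{}".format(name, "#" * int(value * 10)) for name, value in results]

def presenter(charts):
    """Assemble all graphical output into one logically ordered presentation."""
    print("\n".join(charts))

# Hypothetical matrix rows: impact and exploitation headroom on a 0-1 scale.
matrix = [{"resource": "Details on use", "impact": 0.9, "headroom": 0.65},
          {"resource": "Drawing of bike", "impact": 0.8, "headroom": 0.90}]
presenter(representer(analyser(matrix, lambda r: r["impact"] * r["headroom"])))
```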
An interface implemented via Microsoft Excel provides a means by which the user may enter the various elements of the PERFORM matrix. The entry of values other than those defined in Section 7.4.2 for Ease, Degree of Exploitation, Priority and Impact is prevented by the relationship modeller through the implementation of data input rules. The interface also provides a visual presentation of the data to the user. Figure 12.2 illustrates the interface, including the representation provided for elements such as Ease, Degree of Exploitation, Priorities and Impact. A number of command buttons are provided that initiate the program modules. The "Hide" and "UnHide" buttons allow the user to make rows and columns that are not in use invisible/visible. That is, when asked to hide, the system will automatically select and hide those rows and columns without data. The "PERFORM!" button triggers the analyser and representer modules to produce graphical output results from the data within a matter of seconds.
The use of an earlier version of the system in an industrial environment was reported in Chapter 4. This demonstrated that the general principles incorporated within the system supported industrial practice. The system described above has evolved from the earlier version based on the results of the industrial trial and the increased insight developed within Part 2 of this book, in particular:

• The system is based on the concept of variable resource exploitation as opposed to changes in the impact of resources. This required a change in
[Figure 12.2 PERFORM system interface.]
both internal and external customers. The participants were firstly provided with a presentation of the overall concepts and approach. The analysis data was then created through a facilitated workshop involving the Business Unit Manager and the Quality/Information Services Manager. The overall approach used is that defined in Chapter 7 and therefore only the results are reported here.
12.2.1.1 Specification

The scope was defined as the activities of a specific business unit in the organisation and focused on a particular product type that constituted the majority of the unit's work. Approximately 70 people, many with specialist technical product knowledge, are employed full-time within the unit, although particular projects may involve up to 200 staff on a temporary basis. The goals and their priorities are presented in Table 12.1.

The organisation carries out specialist design work and the resources specified for the analysis reflect this area of work. The list of resources is presented in Table 12.2. Similar resource names are used within the areas of Expertise (people) and Technical Tools; these are distinguished using "P" and "T" respectively in the analysis.
3
4 Table 12.1. Goals and priorities Company B
5 Goal Topic Area Description Priority
6 L/M/H
7 G1 Growth Growing the Business Year on Year L
8 G2 CN (People) Serve the Customer Need in terms of People M
9 G3 CN (Studies) Serve the Customer Need in terms of Studies M
3011 G4 External Clients Meet the Needs of External Clients M
G5 Inc. Expertise Incubate (grow) the Expertise within the Department H
1 G6 Customer Achieve Customer Excellence by Closer Relationships M
2 G7 Performance Meet/Exceed Performance Targets (Budgets) H
3 G8 Innovation Demonstrate Innovative Capability L
4 G9 Partnering Form Relationships (formal/informal) with other Companies L
5
Table 12.2. Resource groups and resources (Company B)

Communications:    Response, Team Talks, Learning Forums, Roadshows, Corporate Brochures
Expertise:         Naval Arch.-P, Signatures-P, Dynamics/Shock-P, Marine Eng.-P, CAD-P, Project Mgmt/Risk-P, Support-P
Management Tools:  SAP, CIP, CSRs, LCycle Mgmt, Quality System, Knowledge Mgmt
Technical Tools:   Naval Arch.-T, Signatures-T, Dynamics/Shock-T, Marine Eng.-T, CAD-T, Project Mgmt/Risk-T
IT Systems:        Hardware, Applications, Infrastructure
12.2.1.2 Assessment

The participants carried out the assessment using templates based on the PERFORM matrix to define:

• Ease of exploitation
• Potential and Actual Degree of Exploitation
• Impact of Resources on Goals

These results were transferred from paper-based templates to the PERFORM system for analysis; the data is shown in Figure 12.3.
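The sketch below shows how one row of this assessment data might be structured for entry into the system; the field names mirror the templates above, but the layout and the example values are illustrative assumptions (the actual data appears in Figure 12.3).

```python
# One hypothetical assessment row for the PERFORM matrix; the structure and
# values are illustrative assumptions, not data taken from Figure 12.3.

assessment = {
    "resource": "Naval Arch.-P",                          # expertise resource, Table 12.2
    "ease_of_exploitation": "H",                          # L/M/H rating
    "exploitation": {"potential": 1.00, "actual": 0.50},  # hypothetical degrees
    "impact_on_goals": {"G5": "H", "G7": "M"},            # goals from Table 12.1
}
```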
[Figure 12.3 Analysis data (system view), Company B.]
12.2.1.3 Analysis/Presentation

The system described in Section 12.2 provided an automated analysis of the results with direct transformation to graphical output. A sample of the results is provided in Figure 12.4, showing some of the key graphs. The graphs presented here illustrate:
(a) The RoI for Management Tools: the results indicate that investment in CIP would provide a greater return than an area such as Knowledge Management, based on the overall goal of the business unit.
(b) The Ease of Implementation versus Effectiveness for Expertise: further exploitation of the expertise in Naval Architecture and Marine Engineering would be relatively easy, yet provide a significant impact on effectiveness in relation to the overall goal.
(c) The comparison of Potential and Actual Effectiveness of all resources against each of the goals: the results suggest that the resources being used can make the greatest impact on Incubating Expertise and Meeting/Exceeding Performance Targets, and that there is significant scope for improvement in relation to these goals. That is, the difference between actual and potential effectiveness is substantial.
(d) The comparison of Potential and Actual Effectiveness of each individual resource against the overall goal: this graph gives overall results for specific resources and highlights areas where significant improvements may be made. Also, the ease of further exploiting a resource is included.
[Figure 12.4 Analysis results, Company B: panels (a) to (d) as described above.]
Table 12.3. Mapping between requirements, solution and evaluation approaches

Elements of Methodology
  Performance Formalism: E2 provides a formalism of a knowledge processing activity and defines its performance as efficiency and effectiveness, related within the formalism.
  Design & Management: DAM distinguishes design and design management activities and their relative elements (e.g. goals). PMM relates these activities and describes the process of measuring and managing performance.
  Coherence: PMM illustrates alignment across goals in managing performance. The use of the performance formalism within a GBS supports alignment within goals.
  Improvement: The PERFORM Approach, based on the performance formalism and the RI model, and implemented within the PERFORM system, provides a means to identify and quantify the influence of resources and to establish the best means for improvement.

Evaluation Approaches

Worked Example
  Performance Formalism: The formalism provides the basis for defining efficiency and effectiveness measures. The experimental data is used to determine performance measures based on the formalism.
  Design & Management: The DAM model is used to further distinguish the various inputs, outputs, etc., and determine the elements of design and design management effectiveness measures. The scenario illustrates how the process of performance measurement and management is modelled in the PMM model, i.e. the management of time while carrying out the design activities.
  Coherence: The example illustrates how design and design activity goals may be modelled, i.e. the design goals of form, price and the design activity goal of time. The trade-off in effectiveness for each of those goals demonstrates alignment across design and design activity goals.
  Improvement: The PERFORM approach is applied using elements described within the experiment and additional test data. The use of the approach to identify target areas for improvement is clearly illustrated.

Metrics Review/Analysis
  Performance Formalism: The distinction between efficiency and effectiveness remained valid in the review of metrics, i.e. the metrics that measured performance could be distinguished in these two areas.
  Design & Management: The distinction between design/design activity effectiveness is reflected in the metrics review. Distinguishing metrics for design and design activity efficiency measurement was not possible, although the principle was not violated. Further insight into metrics is gained through the performance formalism.

Industrial Appraisal
  Performance Formalism: The questions/feedback provided an evaluation of the comprehensiveness of the E2 formalism. The E2 principles were supported. The formalism was also seen to reflect and clarify industrial practice.
  Design & Management: The DAM and PMM models were shown to be comprehensive and generic in their ability to formalise design performance. The PMM model in particular was seen to closely reflect and clearly describe industrial practice.
  Coherence: The support provided in the PMM model for coherence was described in relation to the problem of trading off the beauty of a product with the launch date. This demonstrates alignment across goals.
  Improvement: The principles underlying the relationship between the exploitation of resources and effectiveness were described and supported by the appraisers. The use of relative assessment was considered feasible in industry.

System Development
  Coherence: Both design and design activity goals and their priorities are modelled within the system. The system provides a dynamic model of the trade-off through calculating the weighted effectiveness in relation to the overall goal (GO).
  Improvement: The PERFORM system provides a logical realisation of the PERFORM approach and illustrates how the relationships may be modelled and analysed to identify target areas for improvement. This realisation provides a degree of validation of the approach.

Industrial Implementation
  Performance Formalism: The E2 model was presented as a basis for implementing the PERFORM approach. It was seen to clarify performance and comprehensively describe it.
  Design & Management: The industrial participants supported the principles of the DAM and PMM models, although more time would be required for participants to fully understand the implications of PMM.
  Coherence: Both design and design activity goals were specified in the analysis data, although many were combined goals. All goals were seen to contribute to a higher-level organisational goal.
  Improvement: The implementation of the PERFORM approach provided significant insight into the design development process of the organisation and successfully created results identifying areas for improvement.
13 Methodology Review
the essence of design management, i.e. the continual trade-off between the achievement of design and design management goals. While design and design management have been the subject of research, their performance relationship at the activity level has not been formalised as described here.

The role of efficiency in supporting control decisions within the PMM model is not fully clear. From the industrial appraisal it was shown that efficiency may be used to support planning decisions but may not support the control decisions in the PMM model in the same way as effectiveness. Further, the worked example presented in Chapter 9 and the review of metrics highlight the difficulty in distinguishing the efficiency of design from that of design management using the information available. That is, due to the continued alternation (often mental) between design and design management it may be difficult, in practice, to distinguish the resources used in design from those used in design management.
Resources are modelled as sources of knowledge within the formalism presented in this work. The behaviour of these resources is not investigated as part of the work. That is, areas such as motivation, leadership, culture, etc., in relation to the use of human resources in design are not investigated, although these have an impact on performance.

The work provides an understanding of how performance measurement, in particular the measurement of effectiveness, supports the design development process as described in the PMM model. It does not directly prescribe metrics with which to measure performance. It is considered here to be impractical to provide a set of metrics that may be used within any situation, as each situation is to some degree unique in design development, and it is more appropriate to describe how those metrics may be determined. The work presents a basis for deriving metrics through developing an instance of the models in relation to a particular area of analysis. Having developed such models, metrics may be defined to analyse the relationship between outputs and goals (effectiveness) and between knowledge gained and resources used (efficiency).
13.1.3 Supporting Coherence

Achieving coherence requires alignment within and across goals, as discussed in Part 1. The DAM and PMM models distinguish and relate Design and Design Activity goals. The measurement and management of performance discussed in Chapter 5 provides a description of the decision making involved in the trade-off between design and design activity goals in an effort to ensure alignment. This reveals the nature of design management itself and describes how a degree of coherence across design and design activity goals may be achieved.

Achieving coherence within design or design activity goals requires the definition and maintenance of relationships between the goals. This may be realised within a goal breakdown structure, i.e. the definition of some form of network of relationships for a given situation. Therefore, the formalism itself does not achieve coherence within goals per se, only when it is applied within a situation where goal relationships are maintained. As identified above, the modelling formalism may be applied in relation to the complete business, but this raises the issue of alignment and coherence between design development and business performance.
of complexity and novelty does not reflect performance per se, i.e. it does not represent efficiency or effectiveness. Rather, it highlights influences on performance. It is reasonable to assume that a highly complex and/or novel design project will require more resources to complete than one that is more simple and familiar. The degree of novelty in a design activity may be reflected in the nature of the knowledge processed during that activity. That is, the knowledge required to carry out design development may not be present within existing resources and new knowledge must be obtained. This can be achieved through training existing designers, the introduction of new methods, etc. The complexity of the design development activities may be related to the quantity of knowledge processed (i.e. K+) during the activity. That is, a highly complex project may require more knowledge to be processed in comparison to a less complex one. The application of metrics within design development, and subsequent comparison or benchmarking with other projects, organisations, etc., should take account of novelty and complexity as key influencing factors.

Two key areas of research are evident: firstly, to establish a means for measuring complexity and novelty in design development and, secondly, to determine the relationship between these phenomena and aspects of design development performance. Achieving these tasks would greatly enhance our ability to plan in design development and to more accurately compare performance across projects/organisations, i.e. benchmarking.
13.3.2 Resource Impact Profiles

It has been identified in Section 6.3 that resource impact profiles are unlikely to be linear. The work presented here provides a framework for the further investigation of the nature of these graphs. The creation of such graphs may require detailed analysis of resource use over a considerable period of time to determine the true nature of the relationship.
In certain cases the impact profile may be such that the rate of increase in effectiveness decreases from a certain point in the exploitation of the resource (Figure 13.1). For example, the introduction and initial exploitation of a CAD system may have a significant impact on effectiveness in relation to a time goal. However, beyond a certain degree of exploitation of the CAD system functionality (A-B), additional exploitation may not provide corresponding increases in effectiveness of the same magnitude (B-C). This may be a reflection of the degree of ease in exploiting the resource, i.e. the amount of resources required to further exploit the CAD system from point B, in terms of training, customisation, etc., may be much greater than in the early stages.

Further research is required to determine the nature of impact profiles for different resource/goal combinations. This may involve the analysis of performance over a considerable period of time to establish sufficient data, although the weaknesses of such empirical analysis in design development have already been highlighted and should be addressed. It may be possible through such research to determine the actual relationships between resources and the effectiveness that may be achieved in using them. Such relationships would enhance the model developed within the PERFORM matrix, as discussed above.
[Figure 13.1 Relating ease to the impact profile: effectiveness (ε) against exploitation (Ex), rising steeply from point A to point B and flattening between B and C.]
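One plausible family of concave profiles consistent with this description (an illustrative assumption, not a form proposed in this work) is the saturating curve

$$\varepsilon(Ex) = \varepsilon_{max}\,(1 - e^{-k\,Ex})$$

whose marginal gain, $k\,\varepsilon_{max}\,e^{-k\,Ex}$, falls as exploitation increases, reproducing the steep A-B region and the flatter B-C region of Figure 13.1.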
13.3.3 Knowledge-Based Efficiency
The efficiency of a knowledge-based activity is considered here to exist irrespective of whether it is measured or not, i.e. it is an inherent property relating an activity and its resources. The selection and application of metrics to determine efficiency allow particular views of efficiency to be created, e.g. cost-based efficiency, time-based efficiency, etc. Further work is required to establish a means for assessing the quantity of knowledge gained in an activity (K+).
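Under the assumption that K+ could be quantified, these views might be instantiated as ratios of the following form (an illustrative sketch, not metrics prescribed by this work):

$$\eta_{time} = \frac{K^{+}}{\text{time used}} \qquad \eta_{cost} = \frac{K^{+}}{\text{cost incurred}}$$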
13.3.4 Use of Efficiency

The role of efficiency in supporting design and/or design management activities is not clear from this work. From the worked example and industrial appraisals it would seem that its primary role is in supporting planning-type decisions such as the allocation of resources. That is, estimating the level of resources required to achieve particular design activity goals may be supported by knowledge of the previous efficiency of the resources. Efficiency is an important component of performance in design development, providing insight into the effectiveness of design management activities. Establishing efficiency within the knowledge processing activities of design development requires the quantity of knowledge processed or used to be established. This presents an important area for further investigation.
14 Summary and Insights
[Summary diagram: the Methodology for Performance Modelling and Analysis is derived from the nature of design development performance, current practice/existing research and an industrial trial; it considers requirements for, and problems/deficiencies in, performance modelling and analysis; it is based on the E2 (Efficiency and Effectiveness), DAM (Design Activity Management), PMM (Performance Measurement and Management) and RI (Resource Impact) models; it is realised in the PERFORM System and evaluated through subject experts and a worked example, leading to strengths/weaknesses and future work.]
performance in any situation and these are distinguished and related within the model.

• The DAM model, which distinguishes design and design management activities within a managed activity. Design activities are aimed at achieving design goals while design management activities address design activity goals. From this model it is possible to distinguish effectiveness and efficiency further in terms of design and design management.

• The PMM model, which describes the process of measuring and managing performance within a managed activity. Design and design management effectiveness are measured and provide key information in support of controlling decisions such as the allocation of resources, the changing of goals or the stopping of the activity.

The models and principles that comprise the formalism may be applied at any level in design, from an overall project/programme level, where design and its management are formally defined, to that of fundamental activities, including those that are carried out mentally by an individual designer. Design and its management are distinguished at all levels.

The nature of design development performance, revealed through the development of the formalism, provided a basis upon which to establish an approach for its analysis. The Resource Impact model has been developed to illustrate the influence of resources on performance, and Principles of Analysing Resource Impact have been established. The PERFORM Approach provides a structured process for analysing design development performance and identifying the best means by which it may be improved.

The PERFORM System has been developed within this work to support and automate parts of the PERFORM Approach, e.g. to automatically calculate results of the analysis and generate graphical output. The system allows real-time analysis and feedback of results to participants in the PERFORM approach.
14.1.5 Industrial Relevance

The methodology provides a basis for the treatment of design development performance incorporating many elements. A number of approaches were adopted for checking the ideas, addressing different and sometimes overlapping elements:
• Worked Example: based on data collected during an experiment to support design protocol studies, a worked example was provided of the complete methodology in terms of its application to formalise and analyse performance in a particular situation. The example illustrated the application of the methodology and its potential to identify areas to be targeted for performance improvement. A full application of the methodology in an industrial environment could only be done over a considerable period of time, to allow the results of performance improvement efforts to be realised.

• Metrics Review and Analysis: the literature on performance was used to identify the range of metrics reflecting aspects of performance. These metrics were related to the E2 and DAM models to establish if they could be described in relation to the models and if the principles defined within the models were violated in any way by such metrics. These principles were supported by the review and an increased insight into the metrics was obtained. That is, metrics could be classified into areas such as design effectiveness, design management effectiveness, influences on performance, etc.

• Industrial Appraisal: experts in design development, performance management, risk management, etc. provided a basis for critical appraisal of the ideas. The review of these appraisals provides substantial support for the work, primarily focused on the modelling formalism, while also allowing the relationship of the work to industrial practice to be revealed and reported.

• System Development: the implementation of the PERFORM System provided a realisation of key elements of the PERFORM Approach, based on the Principles for Analysing Resource Impact, within a computer-based model. This provided a logical test of the concepts developed in the approach and allowed the utility of the system to be demonstrated within the industrial implementation of the PERFORM approach.

• Industrial Implementation: the PERFORM approach was implemented within an industrial case study and results were established using the approach that indicated areas of focus for performance improvement. The review of the case study provided support for the approach in practice, i.e. the participants commented that the approach provided significant insight into their design development performance and an important step towards realising performance improvement.

The various means of testing the ideas presented in this book allowed Strengths and Weaknesses of the Methodology to be defined and areas of Future Work to be proposed. The strengths of the work included its comprehensiveness and its ability to clearly distinguish and relate the performance of design and design activities. A weakness of the methodology was that the approach focuses on effectiveness and that further analysis of efficiency is required for a complete view of performance. The strengths/weaknesses and areas of future work are further defined in Chapter 13.
2
3
4 14.2 Axioms of Performance
5
6 From the work presented in this book the following Axioms1 of Performance
7 can be argued.
8
9 Axiom 1: Activity performance
4011
1 Activities are the fundamental means that create performance.
2
3 Activities are the fundamental elements that transform input to output and
4 are the basic components of processes, phases, projects and higher-level activ-
5 ities. Other aspects of performance inuence it, such as the type, denition
6 and behaviour but its actual creation is through an activity. That is, no other
7 aspect related to performance creates it.
8
9 Axiom 2: Efciency and effectiveness
50
51111 All performance can be described by efciency and/or effectiveness.
That is, no matter the metric(s) or aspect(s) under consideration, all measures of performance, however general or specific, will indicate either efficiency or effectiveness. Thus, while there may be multiple metrics for a particular activity or process (an amalgamation of activities), they can be categorised into three fundamental types: efficiency, effectiveness, or a combination of these. Of course, once any goal is defined, including an efficiency goal, the relevant performance is an indication of effectiveness.

Axiom 3: Activity and management

Activities and their management are inextricably linked.

Carrying out an activity will always involve an element of management. Thus, every activity, even at an individual cognitive level, will involve its management. Performance measurement must ensure that the correct metric is being used for the actual activity or its management. Conflicts and misdirected effort ensue if these are not clearly and correctly identified.

It is suggested that some of the fundamental Axioms of Performance are either neglected or misunderstood when considering performance management practice.
14.3 Implications for Design Development

Much of the work carried out in the area of design development is ultimately aimed at improving the performance of the process. That is, it is aimed at developing better understandings, approaches, tools, etc., which may ultimately have an impact on the performance of design development in industry. To predict or determine the nature of this impact requires an understanding of the phenomenon of performance. This book contributes to this understanding through presenting a methodology which incorporates a new and more comprehensive formalism for design performance modelling and an approach for analysing the influence of resources on a particular area of performance.
Note

1. Self-evident truths (The Concise Oxford Dictionary, 1995).
1
2
3
References
4
5
6
7
8
9
1011
1
2
3 1. Elzinga D.J., et al., Business Process Management: Survey and Methodology. IEEE
4 Transactions on Engineering Management, 1995. 42(2): p. 119128.
2. Leifer R. and W.J. Burke, Organisational Activity Analysis: A Methodology for Analysing and
5
Improving Technical Organisations. IEEE Transactions on Engineering Management, 1994.
6 41(3): p. 234244.
7 3. Araujo C.S. and A.H.B. Duffy. Assessment and Selection of Product Development Tools. in
8 International Conference on Engineering Design. 1997. Tampere Finland.
9 4. Design Council, L., Design Council, Annual Review 1997. 1997.
5. Wheelwright S.C. and K.B. Clark, Revolutionizing Product Development. 1991, New York:
2011 Free Press.
1 6. Duffy A.H.B. and F.J.O. Donnell. A Design Research Approach. in Workshop on Research
2 Methods in AI in Design. 1998. Lisbon, Portugal.
3 7. Neely A., M. Gregory, and K. Platts, Performance Measurement System Design: A Literature
4 Review and Research Agenda. International Journal of Production and Operations
Management, 1995. 15(4): p. 80116.
5 8. Gilbreath R.D., Working with Pulses Not Streams: Using Projects to Capture Opportunity, in
6 Project Management Handbook, D.I. Cleland and W.R. King, Editors. 1988, Van Nostrand
7 Reinhold: New York. p. 315.
8 9. Cusumano M.A. and K. Nobeoka, Strategy, Structure, and Performance in Product
9 Development: Observations from the Auto Industry, in Managing Product Development,
T. Nishiguchi, Editor. 1996, Oxford University Press, Inc. p. 75120.
3011 10. Roozenburg N.F.M. and N.G. Cross, Models of the Design Process: Integrating Across the
1 Disciplines. Design Studies, 1991. 12(4): p. 215220.
2 11. French M.J., Conceptual Design for Engineers. 3rd ed. 1998, London: Springer.
3 12. Pahl G. and W. Beitz, Engineering Design: A Systematic Approach. 2nd Revised Edition ed,
4 ed. K. Wallace. 1996, London: Springer-Verlag.
13. Smithers T., et al., Design as Intelligent Behaviour: An AI in Design Research Programme.
5 Articial Intelligence in Engineering, 1990. 5(2).
6 14. Andreasen M.M. and L. Hein, Integrated Product Development,. 1987: IFS (Publications) Ltd
7 and Springer-Verlag London UK.
8 15. Roozenburg N.F.M. and J. Eekels, Product Design: Fundamentals and Methods. A Wiley
Series in Product Development: Planning, Designing, Engineering. 1995, Chichester,
9 England: John Wiley and Sons. 408.
4011 16. Ulrich K.T. and S.D. Eppinger, Product Design and Development. 1995: MgGraw-Hill, Inc.
1 17. Angelis V.A., The Organisational Performance Re-considered: The Case of the Technical and
2 Social Subsystems Synergy. Business Process Management Journal, 1999. 5(3): p. 275288.
3 18. Neely, A., The Performance Measurement Revolution: Why Now and What Next?
International Journal of Operations and Production Management, 1999. 19(2): p. 205228.
4 19. Pritchard R.D., Measuring and Improving Organisational Productivity: A Practical Guide.
5 1990, New York: Praeger Publishers.
6 20. Kaplan R.S. and D.P. Norton, Using the Balanced Scorecard as a Strategic Management
7 System. Harvard Business Review, 1996. January-February.
8 21. Mehra S., Perpetual Analysis and Continuous Improvements: A Must for Organisational
Competitiveness. Managerial Finance, 1998. 24(1).
9 22. Sinclair D. and M. Zairi, Effective Process Management Through Performance Measurement:
50 Part 3 in integrated model of total quality-based performance measurement. Business
51111 Process Re-engineering and Management Journal, 1995. 1(3): p. 5065.
187
References
23. Drongelen C.K.-van, Systematic Design of R&D Performance Measurement Systems. 1999, University of Twente: Enschede.
24. Kald M. and F. Nilsson, Performance Measurement at Nordic Companies. European Management Journal, 2000. 18(1): p. 113–127.
25. Rolstadas A., ed. Benchmarking – Theory and Practice. 1995, Chapman and Hall.
26. Kaplan R.S. and D.P. Norton, The Balanced Scorecard: Translating Strategy into Action. 1996, Boston: Harvard Business School Press.
27. Duffy A.H.B. and F.J. O'Donnell. A Model of Product Development Performance. in Designers: Key to Successful Product Development. 1997: Springer.
28. Bititci U.S., Measuring Your Way to Profit. Management Decision, 1994. 32(6): p. 16–24.
29. Dixon J.R., A.J. Nanni, and T.E. Vollman, The New Performance Challenge: Measuring Operations for World-Class Competition. 1990, Homewood, IL: Dow Jones-Irwin.
30. Kaplan R.S. and D.P. Norton, The Balanced Scorecard – Measures That Drive Performance. Harvard Business Review, 1992. 70(1): p. 71–79.
31. Chiesa V., P. Coughlan, and C.A. Voss, Development of a Technical Innovation Audit. Journal of Product Innovation Management, 1996. 13(2): p. 105–136.
32. Meyer M.W. and V. Gupta, The Performance Paradox. Research in Organisational Behaviour, 1994. 16: p. 309–369.
33. de Haas M. and A. Kleingeld, Multilevel Design of Performance Measurement Systems: Enhancing Strategic Dialogue Throughout the Organisation. Management Accounting Research, 1999. 10: p. 233–261.
34. Tsang A., A. Jardine, and H. Kolodny, Measuring Maintenance Performance: A Holistic Approach. International Journal of Operations and Production Management, 1999. 19(7): p. 691–715.
35. Finger S. and J.R. Dixon, A Review of Research in Mechanical Engineering Design. Part 1: Descriptive, Prescriptive, and Computer-Based Models of Design Processes. Research in Engineering Design, 1989. 1: p. 51–67.
36. Ullman D.G., T.G. Dietterich, and L.A. Stauffer, A Model of the Mechanical Design Process Based on Empirical Data. AI EDAM, 1988. 2(1): p. 33–52.
37. Akin O. and C. Lin, Design Protocol Data and Novel Design Decisions. Design Studies, 1995. 16(2): p. 211–236.
38. Dorst K. and J. Dijkhuis, Comparing Paradigms for Describing Design Activity. Design Studies, 1995. 16(2): p. 261–274.
39. Hubka V., Principles of Engineering Design. WDK. 1980, Zurich: Springer-Verlag. 118.
40. Pugh S., Total Design: Integrated Methods for Successful Product Engineering. 1990: Addison-Wesley Publishers Ltd.
41. Zhang Y., Computer-based Modelling and Management for Current Working Knowledge Evolution Support, in Department of Design, Manufacture and Engineering Management. 1998, University of Strathclyde: Glasgow.
42. VDI-2221, Systematic Approach to the Design of Technical Systems and Products. 1987, Dusseldorf: Verein Deutscher Ingenieure.
43. Griffin A., The Effect of Project and Process Characteristics on Product Development Cycle Time. Journal of Marketing Research, 1997. 34(1): p. 24–35.
44. Brookes N.J. and C.J. Backhouse, Measuring the Performance of Product Introduction. Journal of Engineering Manufacture, 1998. 212(1): p. 1–11.
45. Lawson M. and H.M. Karandikar, A Survey of Concurrent Engineering. Concurrent Engineering: Research and Applications, 1994. 2: p. 1–6.
46. Tomiyama T., Concurrent Engineering: A Successful Example for Engineering Design Research, in The Design Productivity Debate, A.H.B. Duffy, Editor. 1998, Springer-Verlag Publications. p. 175–186.
47. Cantamessa M. Design Best Practices at Work: An Empirical Research upon the Effectiveness of Design Support Tools. in ICED97. 1997. Tampere.
48. Griffin A., PDMA Research on New Product Development Practices: Updating Trends and Benchmarking Best Practices. 1997, Institute for the Study of Business Markets, The Pennsylvania State University.
49. Schrijver R. and R.d. Graaf, Development of a Tool for Benchmarking for Concurrent Engineering. Concurrent Engineering: Research and Applications, 1996. 4(2): p. 159–170.
50. Brown S.L. and K.L. Eisenhardt, Product Development: Past Research, Present Findings, and Future Directions. Academy of Management Review, 1995. 20(2): p. 343–378.
51. Hales C., Managing Engineering Design. 1993: Longman Scientific and Technical.
52. Cooper R.G., New Products: The Factors that Drive Success. International Marketing Review, 1994. 11(1): p. 60–76.
53. PM 2000: Performance Measurement – Past, Present and Future. 2000. Cambridge.
54. Chang Z.Y. and K.C. Yong, Dimensions and Indices for Performance Evaluation of a Product Development Project. International Journal of Technology Management, 1991. 6(1/2): p. 155–167.
55. McGrath M.E., The R&D Effectiveness Index: Metric for Product Development Performance. Journal of Product Innovation Management, 1994. 11: p. 201–212.
56. Morbey G.K., R&D: Its Relationship to Company Performance. Journal of Product Innovation Management, 1988. 5: p. 180–190.
57. Terwiesch C., C. Loch, and M. Niederkofler, When Product Development Makes a Difference: A Statistical Analysis in the Electronics Industry. Journal of Product Innovation Management, 1998. 15(1): p. 3–15.
58. Eccles R.G., The Performance Measurement Manifesto. Harvard Business Review, 1991. 69(January–February).
59. Johnson H.T. and R.S. Kaplan, Relevance Lost: The Rise and Fall of Management Accounting. 1987, Boston, MA: Harvard Business School Press.
60. Nickell S., The Performance of Companies. 1995, Blackwell Publishers: Oxford. p. 120.
61. Bititci U.S., T. Turner, and C. Begemann, Dynamics of Performance Measurement Systems. International Journal of Operations and Production Management, 2000. 20(6).
62. Dixon J.R., Design Engineering: Inventiveness, Analysis, and Decision Making. 1966, USA: McGraw-Hill Book Co.
63. Lockamy III A., Quality-focused Performance Measurement Systems: A Normative Model. International Journal of Operations and Production Management, 1998. 18(8): p. 740–766.
64. Montoya-Weiss M.M. and R. Calantone, Determinants of New Product Performance: A Review and Meta-Analysis. Journal of Product Innovation Management, 1994. 11: p. 397–417.
65. Cordero R., The Measurement of Innovation Performance in the Firm: An Overview. Research Policy, 1989. 19: p. 185–192.
66. Dwight R., Searching for Real Maintenance Performance Measures. Journal of Quality in Maintenance Engineering, 1999. 5(3): p. 258–275.
67. Neely A., et al., Getting the Measure of Your Business. 1996, Horton Kirby: Findlay Publications.
68. Rolstadas A., Enterprise Performance Measurement. International Journal of Operations and Production Management, 1998. 18(9/10): p. 989–999.
69. Clark K.B. and T. Fujimoto, Product Development Performance: Strategy, Organisation and Management in the World Auto Industry. 1991: Harvard Business School Press.
70. Doz Y., New Product Development Effectiveness: A Triadic Comparison in the Information-Technology Industry, in Managing Product Development, T. Nishiguchi, Editor. 1996, Oxford University Press, Inc. p. 13–33.
71. Emmanuelides P.A., Towards an Integrative Framework of Performance in Product Development Projects. Journal of Engineering and Technology Management, 1993. 10: p. 363–392.
72. Moseng B. and H. Bredrup, A Methodology for Industrial Studies of Productivity Performance. Production Planning and Control, 1993. 4(3).
73. Drongelen C.K.-v. and A. Cook, Design Principles for the Development of Measurement Systems for Research and Development Processes. R&D Management, 1997. 27(4): p. 345–357.
74. Christiansen R.T., Modelling Efficiency and Effectiveness of Coordination in Engineering Design Teams. 1993, Stanford University.
75. Griffin A., The Impact of Engineering Design Tools on New Product Development Efficiency and Effectiveness. 1996, Institute for the Study of Business Markets, The Pennsylvania State University: University Pk., PA.
76. McDonough E.R. and A. Griffin, The Impact of Organisational Tools on New Product Development Efficiency and Effectiveness. 1996, Institute for the Study of Business Markets, The Pennsylvania State University: University Pk., PA.
77. Wakasugi R. and F. Koyata, R&D, Firm Size, and Innovation Outputs: Are Japanese Firms Efficient in Product Development? Journal of Product Innovation Management, 1997. 14: p. 383–392.
78. Araujo C.S. and A. Duffy. Product Development Tools. in 3rd International Congress of Project Engineering. 1996. Barcelona.
79. Beitz W., Design Science – The Need for a Scientific Basis for Engineering Design Methodology. Journal of Engineering Design, 1994. 5(2): p. 129–133.
80. Duffy A.H.B., Design Productivity, in The Design Productivity Debate, A.H.B. Duffy, Editor. 1998, Springer-Verlag Publications.
81. Griffin A. and A.L. Page, An Interim Report on Measuring Product Development Success and Failure. Journal of Product Innovation Management, 1993. 10: p. 291–308.
82. Bain D., The Productivity Prescription: The Manager's Guide to Improving Productivity and Profits. 1982, New York: McGraw-Hill.
83. Goldschmidt G., The Designer as a Team of One, in Analysing Design Activity, N. Cross, H. Christiaans, and K. Dorst, Editors. 1996, John Wiley and Sons.
84. Blessing L.T.M., A Process-Based Approach to Computer-Supported Engineering Design. 1994, University of Twente: Enschede, The Netherlands.
85. Pugh S., Design Activity Models: Worldwide Emergence and Convergence. Design Studies, 1986. 7(3): p. 167–173.
86. Radcliffe D.F., Concurrency of Actions, Ideas and Knowledge within a Design Team, in Analysing Design Activity, N. Cross, H. Christiaans, and K. Dorst, Editors. 1996, John Wiley and Sons.
87. Cross N., H. Christiaans, and K. Dorst, eds. Analysing Design Activity. 1996, John Wiley and Sons: Chichester, West Sussex. 463.
88. Bititci U., Modelling of Performance Measurement Systems in Manufacturing Enterprises. International Journal of Production Economics, 1995. 42: p. 137–147.
89. Lockamy III A. and M.S. Spencer, Performance Measurement in a Theory of Constraints Environment. International Journal of Production Research, 1998. 36(8): p. 2045–2060.
90. Martin R., Do We Practise Quality Principles in the Performance Measurement of Critical Success Factors? Total Quality Management, 1997. 8(6): p. 429–444.
91. APQC, Corporate Performance Measurement Benchmarking Study Report. 1996, American Productivity and Quality Centre: Houston.
92. Clark K.B., High Performance Product Development in the World Auto Industry. International Journal of Vehicle Design, 1991. 12(2): p. 105–131.
93. Loch C., L. Stein, and C. Terwiesch, Measuring Development Performance in the Electronics Industry. Journal of Product Innovation Management, 1996. 13: p. 3–20.
94. Cantamessa M., Design Best Practices, Capabilities and Performance. Journal of Engineering Design, 1999. 10(4): p. 305–328.
95. Cooper R.G., The Strategy-Performance Link in Product Innovation. R&D Management, 1984. 14(4): p. 247–259.
96. Griffin A., Evaluating QFD's Use in US Firms as a Process for Developing New Products. Journal of Product Innovation Management, 1992. 9: p. 171–187.
97. Cooper R.G. and E.J. Kleinschmidt, Determinants of Timeliness in Product Development. Journal of Product Innovation Management, 1994. 11(5): p. 381–396.
98. Zirger B.J. and J.L. Hartley, The Effect of Acceleration Techniques on Product Development Time. IEEE Transactions on Engineering Management, 1996. 43(2): p. 143–152.
99. Bititci U.S., A.S. Carrie, and P. Suwignjo. Quantitative Models for Aggregating and Prioritising of Performance Measures. in 10th International Working Seminar on Production Economics. 1998. Austria.
100. Saaty T.L. and L.G. Vargas, The Logic of Priorities, in International Series in Management Science/Operations Research. 1982, Kluwer Nijhoff Publishing.
101. Elmasri R. and S.B. Navathe, Fundamentals of Database Systems. 1989, California, USA: The Benjamin/Cummings Publishing Company Inc.
102. Globerson S., Issues in Developing a Performance Criteria System for an Organisation. International Journal of Production Research, 1985. 23(4): p. 639–646.
103. KPMG, Information for Strategic Management – A Survey of Leading Companies. 1990: London.
104. Maskell B., Performance Measures of World Class Manufacturing. Management Accounting, 1989(May).
105. NIST, 1999 Criteria for Performance Excellence. 1999.
106. Guide to the Business Excellence Model. 1998: British Quality Foundation.
107. Ritchie L. and B.G. Dale, An Analysis of Self-assessment Practices Using the Business Excellence Model. Proceedings of the Institution of Mechanical Engineers, 2000. 214(B): p. 593–602.
108. DTI, Innovation – Your Move. 1994, Department of Trade and Industry.
109. Duffy A.H.B., et al., Design Coordination for Concurrent Engineering. International Journal of Engineering Design, 1993. 4(4): p. 251–265.
110. Duffy A.H.B., M.M. Andreasen, and F.J. O'Donnell. Design Co-ordination. in International Conference on Engineering Design. 1999. Munich.
111. Duffy A.H.B. Ensuring Competitive Advantage with Design Coordination. in The Second International Conference on Design to Manufacture in Modern Industry (DMMI'95), Bled, Slovenia, 29–30 May 1995. 1995. Faculty of Mechanical Engineering, University of Maribor, Maribor, Slovenia.
112. Cohen L., Quality Function Deployment: How to Make QFD Work for You. Engineering Process Improvement Series. 1995: Addison-Wesley Publishers Ltd.
113. O'Donnell F.J. and A.H. Duffy, DC Performance Analysis in Company A. 1996, CAD Centre, University of Strathclyde: Glasgow.
114. O'Donnell F.J. and A.H. Duffy, DC Performance Analysis in Company A: 1996–1999 Review. 1996, CAD Centre, University of Strathclyde: Glasgow.
115. Colquhoun G.J., R.W. Baines, and R. Crossley, A State of the Art Review of IDEF0. International Journal of Computer Integrated Manufacturing, 1993. 6(4): p. 252–264.
116. Persidis A. and A. Duffy. Learning in Engineering Design. in Selected and Reviewed Papers and Reports from the IFIP TC/WG5.2 Third Intelligent Workshop on Computer Aided Design, Osaka, Japan, September 1989. 1991. Elsevier Science Publishers B.V. (North-Holland), Amsterdam, The Netherlands.
117. Andreasen M.M., Modelling – The Language of the Designer. Journal of Engineering Design, 1994. 5(2): p. 103–115.
118. Frankenberger E., P. Badke-Schaub, and H. Birkhofer, eds. Designers: Key to Successful Product Development. 1997, Springer. 319.
119. Eynard B., P. Girard, and G. Doumeingts. Control of Engineering Processes Through Integration of Design Activities and Product Knowledge. in Proceedings of the 1999 International CIRP Design Seminar. 1999. University of Twente, Enschede, The Netherlands.
120. Kosanke K., CIMOSA – Overview and Status. Computers in Industry, 1995. 27(2): p. 101–109.
121. Coyne R., Logic Models of Design. 1988, London: Pitman Publishing.
122. Koopman P.J., A Taxonomy of Decomposition Strategies Based on Structures, Behaviours and Goals, in DE-Vol. 83, 1995 Design Engineering and Technical Conferences. 1995, ASME.
123. Maher M.L., Engineering Design Synthesis: A Domain Independent Representation. AI EDAM, 1988.
124. Zhang Y., K.J. MacCallum, and A.H.B. Duffy, A Product Knowledge Modelling Scheme for Function Based Design, in AID'96 Workshop on Function Modelling in Design. 1996: Stanford, USA.
125. Baykan C., Design Strategies, in Analysing Design Activity, N. Cross, H. Christiaans, and K. Dorst, Editors. 1996, John Wiley and Sons.
126. Bayus B.L., Speed-to-Market and New Product Performance Trade-offs. Journal of Product Innovation Management, 1997. 14(6): p. 485–497.
127. Edwards W. and J.R. Newman, Multi-attribute Evaluation, ed. R.G. Niemi. 1982: Sage Publications.
128. Zeleny M., Multiple Criteria Decision Making. McGraw-Hill Series in Quantitative Methods for Management, ed. M.K. Starr. 1982, New York: McGraw-Hill Book Company.
129. Pugh S. Concept Selection – A Method that Works. in Proceedings of the International Conference on Engineering Design, Rome, Italy, 1981. 1981.
130. Sim S.K. and A.H.B. Duffy, A Foundation for Machine Learning in Design. Artificial Intelligence for Engineering Design, Analysis and Manufacturing, 1998. 12: p. 193–209.
131. Eppinger S.D., et al., A Model-based Method for Organising Tasks in Product Development. Research in Engineering Design, 1994. 6(1): p. 1–13.
132. Gruninger M. and M.S. Fox, An Activity Ontology for Enterprise Modelling. 1994.
133. Murmann P.A., Expected Development Time Reductions in the German Mechanical Engineering Industry. Journal of Product Innovation Management, 1994. 11(3): p. 236–252.
134. Osmond E. and D.F. Sheldon. Whole Life Cost Reduction by Design. in International Conference on Design for Competitive Advantage – Making the Most of Design, Coventry, 23–24 March 1994. 1994. Institute of Mechanical Engineers.
135. Smith P.G. and D.G. Reinertsen, Developing Products in Half the Time. 1991, New York: Van Nostrand Reinhold.
136. Cordero R., Managing for Speed to Avoid Product Obsolescence: A Survey of Techniques. Journal of Product Innovation Management, 1991.
137. Lockyer K., Critical Path Analysis and Other Project Network Techniques. 4th ed. 1988, London: Pitman Publishing.
138. Boehm B.W., Software Engineering Economics. 1981: Prentice Hall.
139. Kusiak A. and J. Wang, Decomposition of the Design Process. Journal of Mechanical Design, 1993. 115: p. 687–695.
140. Maher M.L., Structural Design by Hierarchical Decomposition. 1989, Engineering Design Research Centre, Carnegie Mellon University: Pittsburgh, PA.
141. Srinivas K., et al., MONET: A Multi-media System for Conferencing and Application Sharing in Distributed Systems. 1992, Concurrent Engineering Research Center, West Virginia University: Morgantown, W.Va., USA.
142. Kusiak A. and J. Wang. Qualitative Analysis of the Design Process. in DE-Vol. 66, Intelligent Concurrent Design: Fundamentals, Methodology, Modelling and Practice. 1993. New Orleans, Louisiana: ASME, New York.
143. Prasad B., Concurrent Engineering Fundamentals: Integrated Product and Process Organisation. Prentice Hall International Series in Industrial and Systems Engineering. Vol. One. 1996, New Jersey: Prentice Hall PTR. 478.
144. AitSahila F., E. Johnson, and P. Will, Is Concurrent Engineering Always a Sensible Proposition? IEEE Transactions on Engineering Management, 1995. 42(2): p. 166–170.
145. Molina A., et al., A Review of Computer-Aided Simultaneous Engineering Systems. Research in Engineering Design, 1995. 7: p. 3.
146. Smith R.P. and S.D. Eppinger. Characteristics and Models of Iteration in Engineering Design. in International Conference on Engineering Design, ICED 93. 1993. The Hague.
147. The Concise Oxford Dictionary. 7th ed., ed. J.B. Sykes. 1995: Oxford University Press.
148. Balachandran M., Knowledge Based Optimum Design, ed. C.A. Brebbia and J.J. Connor. 1993: Computational Mechanics Publications.
149. Stauffer L.A. and D.G. Ullman, Fundamental Processes of Mechanical Designers Based on Empirical Data. Journal of Engineering Design, 1991. 2(3).
150. Gaither N., Production and Operations Management: A Problem-solving and Decision Making Approach. 1980, Hinsdale, Illinois: The Dryden Press.
151. Nonaka I. and H. Takeuchi, The Knowledge Creating Company. 1995, Oxford: Oxford University Press.
152. Buzzell R.D. and B.T. Gale, The PIMS Principles. 1987: The Free Press.
153. Henderson R. and W. Mitchell, The Interactions of Organisational and Competitive Influences on Strategy and Performance. Strategic Management Journal, 1997. 18(Summer Special Issue): p. 5–14.
154. Mintzberg H., Crafting Strategy, in Strategy: Seeking and Securing Competitive Advantage, C.A. Montgomery and M.E. Porter, Editors. 1991, Harvard Business School. p. 403–420.
155. Johnson H., Relevance Regained: From Top-Down Control to Bottom-Up Empowerment. 1992, New York: Free Press.
156. Andreasen M.M., et al. The Design Co-ordination Framework: Key Elements for Effective Product Development. in The Design Productivity Debate. 1998: Springer-Verlag Publications.
157. Kusiak A. and K. Park, Concurrent Engineering: Decomposition and Scheduling of Design Activities. International Journal of Production Research, 1990. 28(10): p. 1883–1900.
158. Fadel F.G., M.S. Fox, and M. Gruninger. A Generic Enterprise Resource Ontology. in Third IEEE Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE 94). 1994. Morgantown, West Virginia.
159. Evans S., Implementation Framework for Integrated Design Teams. Journal of Engineering Design, 1990. 1(4): p. 355–363.
160. Shepherd R. and T. McCarthy, AutoCAD2000 Productivity Study, in Engineering Designer. 1999. p. 4–7.
161. Cantamessa M., Investigating Productivity in Engineering Design: A Theoretical and Empirical Perspective, in The Design Productivity Debate, A.H.B. Duffy, Editor. 1998, Springer-Verlag Publications.
162. Cross N., Engineering Design Methods. 1989: John Wiley and Sons.
163. Hubka V., Design Tactics = Methods + Working Principles for Design Engineers. Design Studies, 1983. 4(3).
164. Jones J.C., Design Methods. 1992, New York: Van Nostrand Reinhold.
165. Neely A. and J. Mills, Manufacturing in the UK – Report on a Survey of Performance Measurement and Strategy Issues in UK Manufacturing Companies. 1993, Manufacturing Engineering Group: London.
166. Green G., Towards Integrated Design Evaluation: Validation of Models. Journal of Engineering Design, 2000. 11(2): p. 121–132.
167. Kaye M., Continuous Improvement: The Ten Essential Criteria. International Journal of Quality & Reliability Management, 1998. 16(5): p. 485–506.
168. Haffey M., Approaches to the Definition of Performance Metrics in Design Development. 2000, CAD Centre, University of Strathclyde: Glasgow.
169. Jacobson R., Microsoft Excel 97/Visual Basic Step by Step. 1997.
170. Brown M.G. and R.A. Svenson, Measuring R&D Productivity. Research-Technology Management, 1988. July–August: p. 11–15.
Appendix: A Review of Metrics in Design Development
The following presents the detailed results of the review of metrics in design development referred to in Chapter 10. The metrics are selected from a report by Haffey [168], which reviews approaches to the definition of metrics in this area. The results of the review are presented in a table, with comments inserted to illustrate the particular assumptions that were made in relation to each metric. A tick is used to denote the area that each metric relates to, i.e. one of the following (a short sketch after this list illustrates one way the categorisation could be recorded):

1. Design Efficiency: relating the knowledge gained in the design activity to the knowledge used.
2. Design Management Efficiency: relating the knowledge gained in the design management activity to the knowledge used.
3. Design Effectiveness: comparing the output and goal knowledge of the design activity.
4. Design Management Effectiveness: comparing the output and goal knowledge of the design management activity.
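The sketch below shows one way the four areas, together with the "not directly attributable" (NDA) and "influence" labels used in Table A-1, could be encoded when tabulating metrics. It is an illustrative aid only, not part of the original review; the names Area, Metric and rows are hypothetical, and the area assignments in the example rows are inferred from the table's comments.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Area(Enum):
    """The four performance areas used to categorise metrics in Table A-1."""
    DESIGN_EFFICIENCY = auto()          # knowledge gained vs. knowledge used in design
    DESIGN_MGMT_EFFICIENCY = auto()     # knowledge gained vs. used in design management
    DESIGN_EFFECTIVENESS = auto()       # output vs. goal knowledge of the design activity
    DESIGN_MGMT_EFFECTIVENESS = auto()  # output vs. goal knowledge of design management

@dataclass
class Metric:
    name: str
    source: str              # citation key from the Sources of Metrics list, e.g. "[14]"
    areas: tuple[Area, ...]  # empty when the metric is NDA or only an influence
    comment: str = ""

# Two illustrative rows mirroring Table A-1 (the area assignments are assumptions):
rows = [
    Metric("Level of customer satisfaction", "[14]",
           (Area.DESIGN_EFFECTIVENESS, Area.DESIGN_MGMT_EFFECTIVENESS),
           "satisfaction in relation to both design and design activity goals"),
    Metric("Market share/growth", "[9]", (),
           "NDA: could equally reflect corporate image or supply-chain effectiveness"),
]

for m in rows:
    tags = ", ".join(a.name for a in m.areas) or "NDA/influence"
    print(f"{m.name} {m.source}: {tags}")
```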
The following points should also be considered in interpreting the results:

1. A metric such as customer satisfaction is considered to ascertain the degree to which the final product meets or exceeds the initial requirements of the customer. Within the scope of the models presented here this can be seen to (partially) measure design and/or design management effectiveness, where it is assumed that customer satisfaction is measured in relation to design (artefact) or design activity goals. For example, the metric may illustrate whether the product functionality satisfies the customer (design effectiveness) or how satisfied the customer is with the date on which the product was delivered (design management effectiveness). It is assumed here that the goals were accurately defined from the customer requirements, and that meeting the goals therefore also means meeting the requirements.

2. Some of the metrics listed in the review are not directly attributable (NDA) to design development activity performance. For example, the values obtained using a metric such as market share achieved could be partly attributed to artefact performance and considered to reflect design effectiveness. However, the values could equally be attributed to the corporate image of the organisation promoting the product, the effectiveness of its supply chains, etc. The metric is therefore considered too vague to be directly attributed to design development performance.

3. In some cases the exact meaning of the metric is unclear from the source, and it is therefore not possible to categorise or analyse it fully in this review.
Table A-1. Review of metrics in design development. The column headings in the table are: Effectiveness (Design; Design Mgmt.), Efficiency (Design; Design Mgmt.) and Assumptions/Comments. Each metric is listed below with its assumptions/comments; NDA = not directly attributable.

Customer

Success measures for new to world strategies [16]
  Degree to which products are accepted by customer and satisfy them after use – acceptance and satisfaction in relation to design and design activity goals
  Number of customers – NDA
Level of customer satisfaction [14] – satisfaction in relation to both design and design activity goals
Customer satisfaction [3] – customer satisfaction survey
Product
  Function – functions expressed as design goals
  Usability – usability expressed as design goals
  Reliability – reliability expressed as design goals
  Performance – performance expressed as design goals
  Serviceability – serviceability expressed as design goals
  Ergonomics – ergonomics expressed as design goals
  Aesthetics – aesthetics expressed as design goals
Timeliness of availability of product – on-time means meeting time goal
Perceived quality – quality refers to product, not delivery time
Product return rates – where returns are due to failure to meet a design goal
Customer complaint levels – complaints reflect failure to meet design goals
Customer return factor – NDA
Capture and loss of new customers – NDA
Contracts and markets – NDA
Customer loyalty – NDA

Customer acceptance [14]
  *Desired revenue goals – NDA
  Revenue growth – NDA
  Market share goals met – NDA
  *Customer satisfaction – satisfaction in relation to both design and design activity goals
  *No. of customers – NDA
  *Price/value of customers – NDA
Customer acceptance – acceptance in relation to both design and design activity goals
Customer satisfaction – satisfaction in relation to both design and design activity goals
No. of customers adopting the product – NDA
Test market trial rate – NDA

Value added labour – resource cost
Wasted labour – resource cost
Overhead costs – resource cost
Distribution costs – NDA
Life costs
  Maintenance cost – maintenance cost expressed as design goals
  Warranty costs – warranty cost expressed as design goals
  Replacement cost – replacement cost expressed as design goals
Delivery, installation and service – NDA
Product availability
  Anticipation of market – NDA
  Time to market – where reaching market means an assumed output
  Ease of installation – ease of installation expressed as design goals
  Ease of service – ease of service expressed as design goals

Firm/Organisation

Time to market – where reaching market means an assumed output

Program

% tech specs met or exceeded, averaged across completions – where tech spec is a list of design goals
% completion dates met or exceeded – where dates are expressed as design activity goals
People management
  % fully satisf. or above personnel resigning/year – influence: resource attribute
Planning
  No. of eng. change orders due to spec changes before product release – influence: where spec changes are requested by an external "customer"
New tech. study and development
  No. of patents/total no. of R&D employees – where patenting is a design goal
Outputs
  No. of complaints/product year, averaged across project, with 3-month rolling average – where complaints are for not meeting design or design activity goals
Division results/outcomes
  % sales from products released in last 3 years – NDA
  Score on annual R&D scorecard survey – NDA
  Annual sales/total budget – NDA

Project/Product

Cycle time [15 & 17]
  Product complexity – influence: level of knowledge processed (K+)
  Management complexity
    Team size [2] – influence: attribute of a resource
    Communication pattern [2] – influence: attribute of a resource
    Degree of change – influence
  Degree of change in product from generation (newness in product or process) – influence: level of knowledge processed (K+)
Quality [6]
  Relating quality and rework [2]
    Rate of production of drawings – time-based efficiency
    No. of engineering change requests – assuming changes are due to design errors
    Average time needed to resolve – efficiency of the "change resolving" activity
Time to market for normalised project – where reaching market means an assumed output
Total product quality
  Conformance quality – conformance means meeting design goals
  Product integrity – where integrity is expressed as design goal(s)
  A high level measure of product performance – where product performance is expressed in design goals
  Consistency with an overall company identity – unclear from source
Cusumano & Nobeoka [9]
  No. of products introduced – where this is a measure of output over a time period
  Design manufacturability – where manufacturability is expressed as a design goal
  Market share/growth – NDA
  Return on R&D – NDA
Launch time schedule met – where schedule is expressed as design activity goals
Speed to market – where reaching market means an assumed output
Product performance level – where product performance is expressed in design goals
Quality guidelines met – where quality guidelines are expressed in design goals
*Innovativeness – where innovativeness is expressed as a design goal
*Quality guidelines met – where quality guidelines are expressed as a design goal
Speed to market – where reaching market means an assumed output
Completed in budget – where budget is expressed as design activity goals
Subjectively successful – unclear from source
Technically successful – where technical success is defined as a design goal
Got to market on time

Process

Development process [15]
  Type of process used: formal/informal (stage gate) – influence: use of an approach
  Delivery of customer needs – customer needs expressed as design/design activity goals
  % concurrency (CE) – influence: attribute of an approach
  No. of requested changes + (no. of specific changes) – unclear
  Time lost to other projects – unclear
  No. of on-time milestones – influence: attribute of an approach (design management)
Sales forecast – not design
Variation in prediction
  Variation between programs – NDA
  Variation between quarters – NDA
Cross-functional integration [26]
  External sources of ideas – influence: use of external resource knowledge
  Early market involvement – influence: attribute of an approach
  Reverse engineering – influence: attribute of an approach
  Preferred parts list – influence: attribute of an approach
  DFM information – influence: attribute of an approach
  Value engineering – influence: use (or not) of a method
  Early manufacturing involvement – influence: use (or not) of an approach
  Design complexity – influence: attribute of resource knowledge
  Early purchasing involvement – influence: use (or not) of an approach
  Early supplier involvement – influence: use (or not) of an approach
  Joint supplier designs – influence: use (or not) of an approach
  Cooperation with basic research – influence: use (or not) of an approach
People management and learning [26]
  Team rewards (0/1 for rewards based on team, not individual, performance) – influence: use (or not) of an approach
  Job rotation (no. of engineers involved in rotation at business level, %) – influence: attribute of an approach
  Training per employee (active training per employee) – influence: level of knowledge of resources
  Cross-functional training (proportion of engineers involved in cross-functional training) – influence: attribute of an approach
Sources of Metrics

The following references are cited throughout Table A-1 as the sources for the particular metrics listed:
1. Bahill A.T. and W.L. Chapman. Case Studies in System Design. in International Symposium and Workshop on Systems Engineering at Computer Based Systems. 1995.
2. Bashir H.A. and V. Thomson, Metrics for Design Projects – A Review. Design Studies, 1999. 20: p. 263–277.
3. Beaumont L.R., The PDMA Handbook of New Product Development, M.D. Rosenau Jr., A. Griffin, G.A. Castellion and N.D. Anschuetz, Editors. 1997, John Wiley and Sons, Inc. p. 463–485.
4. Brown W.B. and D. Gobeli, Observations on the Measurement of R&D Productivity: A Case Study. IEEE Transactions on Engineering Management, 1992. 39(4): p. 325–331.
5. Clark K.B. and T. Fujimoto, Product Development Performance. 1991: Harvard Business School Press.
6. Cooke J.A., C.A. McMahon, and M.R. North, Metrics in the Engineering Design Process. Proceedings of the Institution of Mechanical Engineers, 1999. 213(Part B): p. 523–526.
7. Cooper R.G., Debunking the Myths of New Product Development. Research Technology Management, 1994. 37: p. 40–50.
8. Cooper R.G. and E.J. Kleinschmidt, Benchmarking the Firm's Critical Success Factors in New Product Development. Journal of Product Innovation Management, 1995. 12: p. 374–391.
9. Cusumano M.A. and K. Nobeoka, Strategy, Structure, and Performance in Product Development: Observations from the Auto Industry, in Managing Product Development, T. Nishiguchi, Editor. 1996, Oxford University Press. p. 75–120.
10. Fenton N.E., Software Metrics: A Rigorous Approach. 1991: Chapman and Hall.
11. Foster R.N., et al., Improving the Return on R&D – Part I. Research Management, 1985. 28: p. 12–17.
12. Foster R.N., et al., Improving the Return on R&D – Part II. Research Management, 1985. 28: p. 13–22.
13. Goldense B.L., Rapid Product Development Metrics. World Class Design to Manufacture, 1994. 1(1): p. 21–28.
14. Griffin A. and A.L. Page, An Interim Report on Measuring Product Development Success and Failure. Journal of Product Innovation Management, 1993. 10: p. 291–308.
15. Griffin A., Metrics for Measuring Product Development Cycle Time. Journal of Product Innovation Management, 1993. 10: p. 112–125.
16. Griffin A. and A.L. Page, PDMA Success Measurement Project: Recommended Measures for Product Development Success and Failure. Journal of Product Innovation Management, 1996. 13: p. 478–496.
17. Griffin A., The Effect of Project and Process Characteristics on Product Cycle Time. Journal of Marketing Research, 1997. 34(1): p. 24–35.
18. Hauser J.R. and F. Zettelmeyer, Metrics to Evaluate R, D&E. Research Technology Management, 1997. July–August 1997: p. 32–38.
19. Hubka E., Theory of Technical Systems. 1988: Springer-Verlag.
20. Hultink E.J., et al., New Consumer Product Launch: Strategies and Performance. 1999, University of Strathclyde: Glasgow.
21. Jacome M.F. and V. Lapinskii, NREC: Risk Assessment and Planning of Complex Designs. IEEE Design & Test of Computers, 1997. January–March 1997: p. 42–49.
22. Kan S.H., Metrics and Models in Software Quality Engineering. 1995: Addison-Wesley.
23. Kaplan R.S. and D.P. Norton, The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review, 1992. Jan–Feb 1992: p. 71–79.
24. Kaplan R.S. and D.P. Norton, Putting the Balanced Scorecard to Work. Harvard Business Review, 1993. Sep–Oct 1993: p. 134–147.
25. Kerssens-van Drongelen I.C. and J. Bilderbeek, R&D Performance Measurement: More Than Choosing a Set of Metrics. R&D Management, 1999. 29(1): p. 35–46.
26. Loch C., L. Stein, and C. Terwiesch, Measuring Development Performance in the Electronics Industry. Journal of Product Innovation Management, 1996. 13: p. 3–20.
27. McGrath M.E. and M.N. Romeri, The R&D Effectiveness Index: A Metric for Product Development Performance. Journal of Product Innovation Management, 1994. 11: p. 213–220.
28. Moser M.R., Measuring Performance in R&D Settings. Research Technology Management, 1985. 28(5).
29. Pahl G. and W. Beitz, Engineering Design: A Systematic Approach. 1988: The Design Council.
30. Smailagic A. Benchmarking an Interdisciplinary Concurrent Design Methodology for Electrical/Mechanical Systems. in 32nd Design Automation Conference. 1995.
31. Walrad C. and E. Moses, Measurement: The Key to Application Development Quality. IBM Systems Journal, 1993. 32(3): p. 445–460.
32. Zirger B.J. and M.A. Maidique, A Model of New Product Development: An Empirical Test. Management Science, 1990. 36: p. 867–883.
Index
Action 9–10
Activity management 61–65
Actual effectiveness 116
Actual exploitation 110
Analyser 161
Analysis 9
  in PERFORM approach 111–121
  of performance 105
Analysis approach in PERFORM approach 112–113
Analysis measures 115–116
  in PERFORM approach 114–121
Assessment 9
  in PERFORM approach 110–111
Attainment of objectives 21
Axioms of performance 185–186
Balanced scorecard 26–29
Basic design cycle 12
Brainstorming 108
Business level approaches 38
Coherence 11
  achieving 83
  in design performance 15–17
  goal alignment for 16–17
  supporting 26–29
Computer-aided design 32
Consolidation 108
Constraints 57–58
Cost 150
  quality versus 24
Cost reduction 42
Customer need 60
Customer satisfaction 148
DAM model, see Design Activity Management model
Data input, ideal 45
Data-ordering templates 160
Data representation in PERFORM approach 113–114
Datum method 113
Decomposition relationships 65–66
Delft study
  Design Activity Management model of 135, 139
  elements of performance in 140
  overview of 137–139
Dependent activities 68
Descriptive models 11
Design
  characteristics of performance in 13–14
  knowledge-based model of 56–60
  management and 61–63
Design Activity Management (DAM) model 135
  complexity and novelty 176–177
  comprehensiveness 172–173
  of Delft case 139
  generality 172
  subjects discussed 154, 157–158
  supporting 148–149
  supporting coherence 173
Design activity modelling 23–26
Design activity performance 12–13
Design cycle, basic 12
Design development
  nature of performance in 181
  performance in 11–17
  term 2
Design development analysis 39–41
Design development metrics, insights into 149–151
Design development performance
  analysis for PERFORM approach 174–176
  formalism 126, 127–128
Design effectiveness 76, 147
Design efficiency 147
Design management effectiveness 76, 147
Design management efficiency 147
Design performance 12
  application and key features 135–186