
Design Performance
Francis J. O'Donnell and Alex H.B. Duffy

Design Performance

With 65 Figures
Francis J. O'Donnell, PhD
Scottish Enterprise E-Business Group, 150 Broomielaw, Atlantic Quay, Glasgow, UK

Alexander H.B. Duffy, PhD, CEng
CAD Centre, Department of Design Manufacture and Engineering Management, University of Strathclyde, 75 Montrose Street, Glasgow, UK

British Library Cataloguing in Publication Data
O'Donnell, Francis J.
  Design Performance
  1. Design, Industrial - Management. 2. Performance technology
  I. Title  II. Duffy, Alex H. B. (Alex Hynd Black), 1957-
  658.5'752
ISBN 1-85233-889-X

Library of Congress Cataloging-in-Publication Data
O'Donnell, Francis J.
  Design performance / Francis J. O'Donnell and Alex H. B. Duffy
    p. cm.
  ISBN 1-85233-889-X (alk. paper)
  1. Design, Industrial.  I. Duffy, Alex H. B. (Alex Hynd Black), 1957-  II. Title
  TS171.O36 2004
  658.5'752-dc22                                        2004052219

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms of licences issued by the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent to the publishers.

ISBN 1-85233-889-X
Springer Science+Business Media
springeronline.com

© Springer-Verlag London Limited 2005
Printed in the United States of America

The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant laws and regulations and therefore free for general use.

The publisher makes no representation, express or implied, with regard to the accuracy of the information contained in this book and cannot accept any legal responsibility or liability for any errors or omissions that may be made.

Typeset by Florence Production Ltd, Stoodleigh, UK
Printed on acid-free paper  SPIN 10969571
Preface

The continual effort to improve performance in business processes attracts increasing attention in research and industry alike. The impact of design development performance on the overall business positions this area as an important performance improvement opportunity. However, design development is characterised by novelty, uniqueness and non-repeatability, which provides particular challenges in defining, measuring and managing its performance to achieve improvement.

This book explores the support provided by both general research in business process performance and design research for supporting performance improvement in design development. The nature of design development in industrial practice is further revealed, and requirements for its modelling and analysis to achieve improvement are highlighted.

A methodology for the modelling and analysis of performance in design development that encapsulates a formalism of performance and an approach for its analysis is established. The formalism is composed of three models, which capture the nature of design development performance and support its measurement and management. The E2 model formalises and relates the key elements of performance, i.e., efficiency and effectiveness. The Design Activity Management (DAM) model distinguishes design and design management activities in terms of the knowledge processed, while the Performance Measurement and Management (PMM) model describes how these activities relate within a process of measuring and managing performance. The PERFORM approach is defined as a means to analyse an aspect of performance, effectiveness, and the influence of resources on that aspect. A computer-based tool that supports its implementation in industry and provides the capability to identify appropriate means to improve performance is presented.

The overall methodology is illustrated using worked examples and industrial practice to reveal different elements. This highlights how the methodology addresses requirements for modelling and analysing design development performance to achieve improvement. Strengths and weaknesses of the methodology are revealed and resulting insights are discussed.

Frank O'Donnell and Alex Duffy
January 2005
Contents

1 Introduction . . . . . . . . 1
   1.1 Scope of the Book . . . . . . . . 2
   1.2 Key Areas Covered . . . . . . . . 3
   1.3 Book Structure . . . . . . . . 3

Part I  The Need for Design Performance

2 The Nature of Performance in Design . . . . . . . . 7
   2.1 Business Process Performance . . . . . . . . 7
      2.1.1 What Is Performance? . . . . . . . . 7
      2.1.2 Why Assess and Analyse Performance? . . . . . . . . 8
      2.1.3 Achieving Improved Performance . . . . . . . . 8
      2.1.4 Coherence . . . . . . . . 11
   2.2 Performance in Design Development . . . . . . . . 11
      2.2.1 Analysing Performance in Design . . . . . . . . 12
      2.2.2 Characteristics of Performance in Design . . . . . . . . 13
      2.2.3 Factors Influencing Design Development Performance . . . . . . . . 14
      2.2.4 Coherence in Design Performance . . . . . . . . 15
   2.3 Summary . . . . . . . . 17

3 Design Performance Measurement . . . . . . . . 19
   3.1 Overview of Performance Research . . . . . . . . 19
      3.1.1 Trends in Performance Research . . . . . . . . 20
   3.2 Defining and Modelling Performance . . . . . . . . 21
   3.3 Design and Design Activity Performance . . . . . . . . 23
   3.4 Supporting Coherence . . . . . . . . 26
   3.5 Identification and Quantification of Influencing Factors . . . . . . . . 29
      3.5.1 Empirical Research . . . . . . . . 30
   3.6 Summary . . . . . . . . 35

4 Industrial Practice . . . . . . . . 37
   4.1 Current Practice . . . . . . . . 37
      4.1.1 Business Level Approaches . . . . . . . . 38
      4.1.2 Design Development Analysis . . . . . . . . 39
   4.2 Industrial Insight . . . . . . . . 41
      4.2.1 Background . . . . . . . . 41
      4.2.2 Overview of Approach . . . . . . . . 42
      4.2.3 The Process . . . . . . . . 42
      4.2.4 Trial Results . . . . . . . . 48
      4.2.5 Further Analysis . . . . . . . . 49
   4.3 Summary . . . . . . . . 50

Part II  A Methodology for Enhanced Design Performance

5 A Formalism for Design Performance Measurement and Management . . . . . . . . 55
   5.1 Activity Model . . . . . . . . 55
      5.1.1 A Knowledge-Based Model of Design . . . . . . . . 56
   5.2 Activity Management . . . . . . . . 61
      5.2.1 Design and Management . . . . . . . . 61
      5.2.2 A Model of Design Activity Management . . . . . . . . 63
   5.3 Managed Activity Relationships . . . . . . . . 65
      5.3.1 Decomposition Relationships . . . . . . . . 65
      5.3.2 Temporal Relationships . . . . . . . . 66
   5.4 A Design Performance Model - E2 . . . . . . . . 69
      5.4.1 Efficiency . . . . . . . . 70
      5.4.2 Effectiveness . . . . . . . . 73
      5.4.3 Relating Efficiency and Effectiveness . . . . . . . . 77
      5.4.4 Performance . . . . . . . . 78
   5.5 A Model of Performance Measurement and Management (PMM) . . . . . . . . 79
      5.5.1 Measuring and Managing Effectiveness . . . . . . . . 80
      5.5.2 Measuring Efficiency . . . . . . . . 82
      5.5.3 Achieving Coherence . . . . . . . . 83
   5.6 Summary and Review . . . . . . . . 83

6 The Impact of Resources on Effectiveness . . . . . . . . 89
   6.1 Influences on Effectiveness . . . . . . . . 90
      6.1.1 The Influence of Strategy . . . . . . . . 90
      6.1.2 Influence of Goals . . . . . . . . 91
      6.1.3 Influence of Resources . . . . . . . . 92
   6.2 Resource Knowledge (R) . . . . . . . . 93
   6.3 Relating Resource Use and Effectiveness . . . . . . . . 94
      6.3.1 Resource Impact . . . . . . . . 95
      6.3.2 The Resource Impact Model . . . . . . . . 96
      6.3.3 Nature of Impact Profile . . . . . . . . 96
      6.3.4 Assessing Relative Impact . . . . . . . . 99
      6.3.5 Resource Effectiveness . . . . . . . . 100
   6.4 Principles of Analysing Impact of Resources . . . . . . . . 102
   6.5 Summary . . . . . . . . 102

7 The PERFORM Approach . . . . . . . . 105
   7.1 Overview . . . . . . . . 105
   7.2 Specification . . . . . . . . 106
      7.2.1 Scope . . . . . . . . 108
      7.2.2 Goal Definition . . . . . . . . 108
      7.2.3 Goal Prioritisation . . . . . . . . 108
      7.2.4 Resource Definition . . . . . . . . 109
   7.3 Assessment . . . . . . . . 110
      7.3.1 Resource Exploitation . . . . . . . . 110
      7.3.2 Resource Impact . . . . . . . . 110
   7.4 Analysis . . . . . . . . 111
      7.4.1 Analysis Approach . . . . . . . . 112
      7.4.2 Data Representation . . . . . . . . 113
      7.4.3 Analysis Measures . . . . . . . . 114
   7.5 Representation . . . . . . . . 121
   7.6 Review . . . . . . . . 121
   7.7 Further Analysis . . . . . . . . 121
   7.8 Supporting the Approach . . . . . . . . 122
   7.9 Discussion . . . . . . . . 123

8 A Methodology for Performance Modelling and Analysis in Design Development . . . . . . . . 125
   8.1 Overview of Methodology . . . . . . . . 125
   8.2 Design Development Performance Formalism . . . . . . . . 127
   8.3 Area of Performance Analysis . . . . . . . . 128
   8.4 Analysis Approach . . . . . . . . 129
   8.5 Summary . . . . . . . . 131

Part III  Application and Key Features

Part 3 Prelude . . . . . . . . 135

9 Worked Example . . . . . . . . 137
   9.1 Overview of the Delft Assignment . . . . . . . . 137
   9.2 Formalising Performance . . . . . . . . 139
   9.3 Measuring Performance . . . . . . . . 141
   9.4 Applying PERFORM to Improve Performance . . . . . . . . 142
   9.5 Summary . . . . . . . . 145

10 Analysis and Critique of Metrics . . . . . . . . 147
   10.1 Approach . . . . . . . . 148
   10.2 Results . . . . . . . . 148
      10.2.1 Supporting E2 and DAM Models . . . . . . . . 148
      10.2.2 Insights into Design Development Metrics . . . . . . . . 149
   10.3 Summary . . . . . . . . 151

11 Industrial Appraisal . . . . . . . . 153
   11.1 Issues Raised . . . . . . . . 153
   11.2 Industrial Practice . . . . . . . . 156

12 The PERFORM System . . . . . . . . 159
   12.1 System Overview . . . . . . . . 160
   12.2 Application of PERFORM in Industry . . . . . . . . 162
      12.2.1 Approach . . . . . . . . 162
      12.2.2 Industrial Results . . . . . . . . 166
   12.3 Validation of Results . . . . . . . . 167

13 Methodology Review . . . . . . . . 171
   13.1 Design Development Performance Formalism . . . . . . . . 171
      13.1.1 Generality . . . . . . . . 172
      13.1.2 Comprehensiveness . . . . . . . . 172
      13.1.3 Supporting Coherence . . . . . . . . 173
   13.2 Design Development Performance Analysis . . . . . . . . 174
      13.2.1 Generality . . . . . . . . 174
      13.2.2 Comprehensiveness . . . . . . . . 174
      13.2.3 Supporting Coherence . . . . . . . . 174
      13.2.4 Sensitivity . . . . . . . . 175
      13.2.5 Subjectivity . . . . . . . . 175
      13.2.6 General Characteristics . . . . . . . . 175
   13.3 Further Investigation . . . . . . . . 176
      13.3.1 Complexity and Novelty . . . . . . . . 176
      13.3.2 Resource Impact Profiles . . . . . . . . 177
      13.3.3 Knowledge-Based Efficiency . . . . . . . . 178
      13.3.4 Use of Efficiency . . . . . . . . 178
      13.3.5 Establishing Performance Trends . . . . . . . . 179
      13.3.6 The PERFORM Approach and System . . . . . . . . 179

14 Summary and Insights . . . . . . . . 181
   14.1 Key Elements . . . . . . . . 181
      14.1.1 Nature of Performance in Design Development . . . . . . . . 181
      14.1.2 Requirements for Performance Modelling and Analysis . . . . . . . . 183
      14.1.3 Problems/Deficiencies in Performance Modelling and Analysis . . . . . . . . 183
      14.1.4 Methodology for Performance Modelling and Analysis . . . . . . . . 183
      14.1.5 Industrial Relevance . . . . . . . . 184
   14.2 Axioms of Performance . . . . . . . . 185
   14.3 Implications for Design Development . . . . . . . . 186

References . . . . . . . . 187

Appendix: A Review of Metrics in Design Development . . . . . . . . 193

Index . . . . . . . . 213
1 Introduction

Organisations are continually striving to find better ways of improving their business situation in terms of improved market share and/or size and, ultimately, increased profits. To achieve this they show a willingness to embrace a plethora of business improvement initiatives. These initiatives range from redefining their key business processes [1, 2] to the selection and implementation of specific tools in support of these processes [3]. This trend has been evident in all organisational activities such as manufacturing, marketing, distribution, etc. However, the impact of the design process on overall organisational performance has been recognised as being increasingly significant. For example, in the Design Council Annual Review of 1997, 92% of British businesses agreed that design helps to produce a competitive advantage [4]. Wheelwright and Clark [5] emphasise the importance of Product Development in the fast changing environment that organisations operate in:

   In a turbulent environment, doing product and process development well has become a requirement for being a player in the competitive game; doing development extraordinarily well has become a competitive advantage.

Therefore, the analysis of performance in this area, i.e. understanding how well product development is being done, has become an increasing focus of attention as it forms a key element in implementing continuous improvements. Current research efforts in areas such as Concurrent Engineering, Team Engineering and Design Management are aligned with this focus on continuous improvement. The eventual aim of this work is to have the output adopted by industry and result in improved performance. But evidence of increased performance must be provided using the correct metrics in the correct manner in order to fully evaluate and/or validate the impact of research [6]. The ultimate success of these research efforts may then become apparent through systematic measurement of performance, e.g. on a before-and-after basis or as part of an internal/external benchmarking initiative. Yet there are many uncertainties surrounding the practice of performance measurement in this area.

Considerable work has already been carried out in measuring the performance of the manufacturing process [7], but there are inherent differences in the nature of design/development compared to manufacturing [8] which make performance measurement more difficult [9]. The measurement of knowledge based activities in design, where outputs are less tangible than in processes such as manufacturing, poses particular problems. The impact of activities within the conceptual design stage may not be evident for a number of years, i.e. when the success of the product in the market has been established.
1.1 Scope of the Book¹

This book attempts to establish an underlying theory or formalism of performance as applied to the design process, i.e. from the identification of a need through to the description of a solution to meet that need. Activities and their inter-relationships are considered to be the key elements that make up the process of design. This process is presented in a variety of models adopting different viewpoints, for example models of design as a problem solving process as presented in [10], phase models such as those presented in [11, 12], and knowledge based models such as that defined in [13]. The knowledge-based viewpoint of design is adopted in this work as a basis for describing the process. The work does not focus on additional activities required for product development as a whole, such as marketing, manufacturing, etc., as described in [14-16].

Although existing process models of design help to characterise the focus of the book they do not comprehensively describe it. These models focus primarily on the activities required to create a design solution without an analysis of the activities involved in managing the process by which that solution is developed, i.e. design management activities. To comprehensively analyse performance it is necessary to consider performance in relation to both the design (artefact) and the design process. The term design development is used throughout this work to indicate that both the design and its development are being considered.

The presented methodology is aimed at providing a new and clear understanding of performance in design development and utilising this understanding to determine a means by which performance may be analysed and improved. Although the understanding developed here provides a basis for determining metrics with which to measure performance, the work does not prescribe metrics for use in design development as these are considered to be specific to each case.

Many factors may be considered when analysing performance within a process such as design development. This work does not involve the analysis of aspects such as the motivation of designers, loyalty, power relationships, individual reward schemes, etc., which influence performance. That is, the work focuses on the technical, as opposed to social, subsystem within design development [17], and all resources are considered to be sources of knowledge with no further analysis of their behaviour.
1.2 Key Areas Covered

The book presents a methodology for performance modelling and analysis that supports performance improvement in design development. In particular it:

• Identifies requirements for performance modelling and analysis as a means to improve performance.
• Establishes the degree of support that is provided in research and practice for these requirements so that shortcomings of approaches may be identified.
• Presents the needs of industry in relation to performance analysis.
• Presents a formalism of performance in design development in order to understand the nature of the phenomenon and provide a basis for its analysis.
• Identifies an approach for analysing performance.
• Outlines a computer-based system to support design performance enhancement.
• Discusses the strengths and weaknesses of the methodology to identify areas for future work.
1.3 Book Structure

The remainder of this book is structured in three parts:

Part One: The Need for Design Performance (Chapters 2, 3 and 4)

• Chapter 2 presents an overview of performance analysis in a business context with specific focus on design development and reveals the nature of the problem of managing design performance. Requirements for formalising and analysing design development performance are established.
• Chapter 3 presents an overview of approaches and identifies their contribution to design development performance. The key issues are also defined.
• Chapter 4 identifies and reviews some of the more commonly used performance analysis approaches in industry and presents an industrial trial to test a proposed analysis approach. The results of the trial provided valuable input to the understanding and development of the methodology presented in Part 2.

Part Two: A Methodology for Enhanced Design Performance (Chapters 5, 6, 7 and 8)

• Chapter 5 introduces a formalism for performance measurement and management in design. This provides a new and comprehensive insight into design development performance and relates this within a process model distinguishing and relating design and design management.
• Chapter 6 is focused on the analysis of aspects that influence effectiveness in design development. The nature of the influence of resources is revealed and modelled as a basis for identifying areas of performance improvement.
• Chapter 7 presents an approach for the analysis of performance (effectiveness) in design development in a manner that supports the identification of key areas where performance may be improved.
• Chapter 8 describes the overall methodology for performance modelling and analysis based on that presented in the previous three chapters, showing how the elements relate.

Part Three: Application and Key Features (Chapters 9, 10, 11, 12, 13 and 14)

• Chapters 9, 10, 11 and 12 describe a number of approaches to assessing the application of the presented methodology and its results. Chapter 9 presents a worked example based upon a workshop held at Delft University on Research in Design Thinking II - Analysing Design Activity, which took place in 1994.
• Chapter 10 presents a review of metrics that have been previously established and checks them against the developed ideas of the book.
• Critical appraisals from industrial practitioners and a leading academic researcher are presented in Chapter 11.
• The practicality and utility of the analysis of aspects that influence performance is illustrated in Chapter 12 through the implementation of a computational system.
• Chapter 13 provides a review of the methodology in terms of its strengths and weaknesses, and areas for further investigation.
• Chapter 14 concludes the book with a summary and a description of the key insights in design performance.

Note

1. This book is adapted from the PhD thesis of one of the authors: A methodology for performance modelling and analysis in design development, F. J. O'Donnell, PhD thesis, University of Strathclyde, Glasgow, UK, December 2000.
PART I

The Need for Design Performance
2 The Nature of Performance in Design

The following chapter provides an insight into the nature of performance in design and identifies the issues that are key to achieving enhanced design performance.

The chapter begins with a general discussion on performance analysis in a business context, highlighting some of the key drivers behind this activity and the process of performance improvement. The specific area of design performance is then discussed, with reference to particular characteristics of analysing its performance. The chapter concludes with a summary that highlights the key requirements for performance modelling and analysis in design to achieve improvement.

2.1 Business Process Performance

The topic of performance has been of interest within business environments for a number of years and many of the methods used to manage business performance today have been available since the early twentieth century [18]. The following provides a general overview of performance in business processes and discusses the main elements involved in achieving improvement.

2.1.1 What Is Performance?

The definition of performance used in a particular context may be varied and is often dependent on the discipline of the individual or group discussing the phenomenon [19]. The Chief Executive of a public limited company, when asked how the company is performing, may define performance in terms of the current share value, reflecting the result of performance across all business processes. The Production Manager may describe performance of the production process(es) in terms of the number of products produced over a given time. A Product Development Manager may discuss the performance of the product development process in terms of the percentage of turnover due to new products. These descriptions of performance provide different but related viewpoints of the phenomenon. That is, share value, number of products produced, etc. are used as indicators that reflect aspects of the phenomenon but do not describe its nature. A formal definition of performance is required to support common understanding as a basis for carrying out analysis in the area.
2.1.2 Why Assess and Analyse Performance?

Maintaining or improving performance is an aim of most businesses, and understanding the performance of the business processes plays a central part in the implementation of performance improvements, i.e. assessing and analysing the performance is a fundamental part of managing a business and is often difficult to distinguish from the general management task [18-22]. For example, Sinclair et al. [22], in their study of 115 companies, found that in most cases there was no separate performance management system and measurement was integrated within the overall management process.

Although the analysis of performance is seen as an integral part of managing a business, a number of authors present specific reasons or benefits for carrying out performance analysis [19, 23, 24]. These may be summarised as follows:

• To allow comparison against some standard, for example the comparison of performance of different organisations in the practice of benchmarking [25]. This allows a diagnosis of whether performance is increasing or decreasing.
• To support planning and control. The analysis of performance against a plan supports the decisions necessary to make more effective future plans and to exercise control in order to increase the chances of achieving an existing plan.
• To provide insight into particular business processes, e.g. through identification of those factors influencing performance and the nature of their influence, and to support organisational learning.
• To motivate individuals and groups within an organisation to improve their performance.

The benefits of analysing performance outlined here may be related within an overall aim, i.e. the improvement of performance. That is, all of the benefits may be seen as elements that are necessary to achieve performance improvement¹.
2.1.3 Achieving Improved Performance

Many of the functions of a performance measurement system may be categorised within the area of decision support, i.e. the results of performance analysis may be used to assist organisational actors in making decisions. These decisions may result in actions such as the allocation of resources, definition of goal priorities, implementation of controls, etc. That is, performance assessment or analysis, per se, does not achieve performance improvement but is a component of a process by which improvement may be realised.

Figure 2.1 presents a model of performance improvement, which outlines three key elements of the performance improvement process. These elements are assessment, analysis and action, and are incorporated within a process that is cyclic in nature, aimed at continuous performance improvement of any business process such as design, manufacturing, etc. These business processes may be defined in terms of their inputs, outputs, goals and resources. That is, a business process involves the transformation of inputs to outputs, using resources, to achieve particular goals. The following describes the stages in the model:
• Assessment²: The assessment of performance is aimed at establishing values for different aspects of performance through the application of particular performance metrics/measures³. The assessment focuses on key elements of the particular business process(es), i.e. the inputs, outputs, goals and resources. A more detailed discussion of these elements is presented in Chapter 5. The output of the assessment may be values such as the duration of a business process, the cost incurred, the deviation from planned performance, etc. This provides a description of current performance within a business process but does not identify causes of performance or provide more detailed analysis.

• Analysis: Performance analysis is aimed at providing information on the causes of high or low performance to support decision making for improvement. For example, the analysis may be aimed at identifying the strengths and weaknesses in the process, analysing the influence of resources such as particular tools, identifying trends, or defining the scope for improvement. The output of performance analysis is a more comprehensive understanding of the performance in the process, in particular the areas for improvement and the factors that may achieve the improvement.

• Action: The output from analysis provides the necessary information to support decision making in relation to the necessary actions to achieve improved performance. These actions could include the introduction of new resources, the alteration of the organisational structure, reallocation of existing resources, etc. Continued application of the performance improvement process results in the type of learning necessary to provide the ability to build a predictive model of performance.

Figure 2.1 A model of the performance improvement process. (The figure shows assessment, analysis and action linked in a cycle through metrics and decisions, applied to business processes such as design, manufacture and distribution, each defined by its inputs, outputs, goals and resources.)
In the model described here the analysis of performance forms a key element in performance improvement. This analysis allows the factors that influence performance to be identified and the nature of their influence to be defined. This understanding allows future performance to be predicted/simulated and is therefore the key to supporting decisions regarding the implementation of changes to increase performance. The assessment of performance, per se, does not ensure improvement as it is based on historical data and merely provides a baseline from which to improve. The work reported in this book is based on the viewpoint that performance improvement is the overall aim of research in this area and that analysis is the most complex activity within the improvement process.
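The structure just described can be summarised in a short sketch. The following Python fragment is purely illustrative and is not drawn from the book's formalism; the class, function and metric names are assumptions introduced for the example. It shows a business process represented by its inputs, outputs, goals and resources, and a single pass of the assessment-analysis-action cycle of Figure 2.1.

from dataclasses import dataclass

@dataclass
class BusinessProcess:
    """A business process defined by its inputs, outputs, goals and resources."""
    name: str
    inputs: list
    outputs: list
    goals: dict       # aspect -> target value (here, lower is better, e.g. duration, rework)
    resources: list

def assess(process, metrics):
    """Assessment: apply metrics to establish current values for aspects of performance."""
    return {aspect: metric(process) for aspect, metric in metrics.items()}

def analyse(assessment, goals):
    """Analysis: relate assessed values to goals; a positive gap means the target is missed."""
    return {aspect: assessment[aspect] - target for aspect, target in goals.items()}

def act(process, gaps):
    """Action: decide on a change, e.g. allocate an extra resource to the weakest aspect."""
    weakest = max(gaps, key=gaps.get)
    process.resources.append("additional resource aimed at " + weakest)
    return process

design = BusinessProcess(
    name="design",
    inputs=["statement of need"],
    outputs=["design description"],
    goals={"duration_weeks": 12, "rework_hours": 40},
    resources=["designers", "CAD tools"],
)
measured = {"duration_weeks": lambda p: 14.0,   # placeholder assessed values
            "rework_hours": lambda p: 35.0}
gaps = analyse(assess(design, measured), design.goals)
design = act(design, gaps)                      # repeated application closes the cycle

The separation of concerns is the point of the sketch: metrics establish values, analysis relates those values to goals, and action feeds a change back into the process definition, after which the cycle repeats.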
2.1.3.1 Performance Metrics

The model presented in Figure 2.1 illustrates that performance metrics are a specific element used in the performance improvement process. Performance metrics are briefly discussed here due to the wide attention given to them in the literature.

The performance of organisations has been a topic of research throughout the twentieth century, but the observation by a number of authors [18, 26] that the area is currently going through a revolution indicates that many issues remain unresolved in this field. One of the key issues around which there is much discussion is the selection and use of appropriate metrics with which to measure performance [27]. There is general agreement in the literature that the metrics used in performance assessment are widely varied and generally dominated by those which assess financial inputs and outputs [28-30]. Although there have been attempts to develop metrics for application across a range of processes and industries, studies suggest that it may not be feasible to determine a generic set of metrics which would be broadly applicable [31]. A number of authors suggest that metrics must continually change as the context in which they are used is changing, and that the development of fixed metrics is inappropriate [29, 32]. For example, Dixon [29] states that:

   Each organisation, each unit within each organisation, and each unit within each organisation at each point within its strategic, market, and technological evolution will require its own unique set of performance measures.

Metrics should be derived from a model of the activity/process under investigation, created from a viewpoint of performance and reflecting aspects that are specific to the scope of investigation. That is, the means by which the most appropriate metrics may be defined for any particular situation is considered to be of most importance, rather than the actual metrics themselves. We do not attempt here to define a generic set of metrics to support the measurement of performance in design, but provide a formalism that supports the modelling of design performance within any situation. Such a model, when developed for a particular area of performance analysis, will provide a basis upon which the most appropriate metrics may then be defined.
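As a small, hedged illustration of this point (again not part of the book's formalism; the structure and values below are assumed for the example), metrics can be declared against elements of a model of the process under investigation, so that each area of analysis carries its own context-specific set rather than a fixed, generic list.

process_model = {
    "inputs":    ["statement of need"],
    "outputs":   ["concept description"],
    "goals":     {"duration_weeks": 12},
    "resources": ["designers", "CAD tools"],
}

def metric(element, aspect, evaluate):
    """A named measure bound to one element of the process model."""
    return {"element": element, "aspect": aspect, "evaluate": evaluate}

concept_design_metrics = [
    metric("goal: duration_weeks", "deviation from plan",
           lambda m: 14.0 - m["goals"]["duration_weeks"]),   # assumed measured value
    metric("output: concept description", "reviewer rating (1 to 5)",
           lambda m: 4),                                      # placeholder
    metric("resource: designers", "utilisation",
           lambda m: 0.8),                                    # placeholder
]

for item in concept_design_metrics:
    print(item["element"], "/", item["aspect"], "=", item["evaluate"](process_model))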

2.1.4 Coherence

Within an organisation the measurement of performance may take place in a dispersed and often unrelated manner, i.e. the performance improvement process illustrated in Figure 2.1 may be applied to different business processes in isolation. This can result in sub-optimisation, where a particular business process may be performing very well but its performance may have a negative influence on the performance of other processes and, ultimately, the overall performance of the organisation.

Achieving coherence, i.e. aligning performance throughout the organisation, has been identified as a key requirement in the design of performance analysis approaches [7, 26, 33, 34]. de Haas [33] identifies the requirement for coherence as an attribute of a performance measurement (PM) system which "causes performance (i.e. achieved scores on performance indicators) by the group acting upon that system to contribute to the performance of other interdependent groups and, thereby, to contribute to the performance of the organisational entity as a whole". Although the work reported by de Haas is focused on the contribution of individuals or groups within an organisation, the definition of coherence may be equally applied to business processes. That is, coherence causes the performance of a process to contribute to the performance of other interdependent processes and to the business as a whole.

These interdependent processes may be inter-related in a variety of complex ways, e.g. a process/sub-process relationship may exist or the processes may exist at the same hierarchical level. For example, design may be viewed as a sub-process of product development. The performance of the design process must be aligned with the overall performance of the product development process to ensure that design will contribute positively to the performance of the product development process. Similarly, the design and manufacturing processes should be aligned so that high performance in one does not adversely affect the performance of the other and, subsequently, overall product development performance. Achieving coherence is necessary to ensure that performance improvements within any area of the business deliver an overall benefit to the business.
2.2 Performance in Design Development

The above discussion presents an overview of performance in relation to general business processes and illustrates some of the more generic elements. However, we focus here on performance in design, i.e. a specific business process. Design may be seen as a process of goal-directed reasoning where there are many possible (good) solutions and, although the process can be supported methodologically, it cannot be logically guaranteed [15]. The process is inherently uncertain in progressing from an initial need to arriving at a solution that meets that need. The literature reports various models of the design process that may be categorised as either descriptive or prescriptive in their approach [35]. Descriptive models, such as those found in [13, 36], describe how the design process is carried out, often created through the use of protocol studies [37, 38], while prescriptive models prescribe how the design process should be carried out, e.g. those presented in [12, 39, 40].

The basic design cycle proposed by Roozenburg [15] may be seen to encompass the key activities of design, i.e. analysis, synthesis and evaluation, and presents a basis for describing performance analysis. The cycle begins with identification of a difference between the current state and some desired future state (e.g. the existence of a new artefact). The initial activity of analysis is aimed at clarifying the desired future state and defining it in terms of a statement of the problem, including goals (i.e. function) to be achieved. Although the information available at the initial stages of the design cycle is often incomplete, inconsistent and ambiguous [13] and analysis may only produce initial goals, criteria (i.e. measures) should be established at this point, which form part of the value system referred to during evaluation. The synthesis activity draws on the creativity of the designer to produce a solution proposal(s) intended to meet the goals established in analysis. The evaluation activity involves the identification of how well the solution meets the goals and, ultimately, whether a satisfactory value has been achieved with a particular solution(s).
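For illustration only, the basic design cycle described above can be rendered as a simple loop in Python; the function names and the trivial satisfaction test below are assumptions made for this sketch rather than Roozenburg's formulation.

def analyse_need(need):
    """Analysis: clarify the desired future state as a problem statement with goals
    and the criteria (measures) that form part of the value system used in evaluation."""
    goals = {"function": need}
    criteria = {"function": lambda proposal: need in proposal["functions"]}
    return goals, criteria

def synthesise(goals, attempt):
    """Synthesis: produce a solution proposal intended to meet the goals."""
    functions = [goals["function"]] if attempt > 0 else []   # first proposal falls short
    return {"functions": functions, "attempt": attempt}

def evaluate(proposal, criteria):
    """Evaluation: identify how well the proposal meets the goals."""
    return all(check(proposal) for check in criteria.values())

def basic_design_cycle(need, max_iterations=5):
    goals, criteria = analyse_need(need)
    for attempt in range(max_iterations):
        proposal = synthesise(goals, attempt)
        if evaluate(proposal, criteria):
            return proposal          # a satisfactory value has been achieved
    return None                      # no satisfactory proposal within the allowed attempts

print(basic_design_cycle("remote locking"))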
A review of design process models is presented in [41] that discloses characteristics of the design process. The intangible nature of design is revealed, which indicates the additional challenges in terms of defining, analysing and improving performance. The following sections outline specific issues in relation to performance in this area.
2.2.1 Analysing Performance in Design

The difficulties associated with defining performance in general are equally apparent in the area of design performance, and the existing design process models provide little insight into the phenomenon. When we refer to performance in design it can relate to different areas, e.g. the performance of the design solution (artefact) in terms of its properties, such as the top speed of a car, or the performance of the process by which the solution was created, in terms of duration or cost. These areas of performance are related in some way but the relationship is unclear.

2.2.1.1 Design (Artefact) Performance

The evaluation activity within the basic design cycle represents an area of performance measurement in design, i.e. the measurement of the solution (artefact, product, etc.) and the assignment of values to it, in order to guide decision making. That is, an aspect of the performance of the solution, how well it meets the initial goals, is measured in order to establish how the design cycle should proceed. This area of performance measurement forms a critical part of the basic design cycle and is inherent in the design process.
2.2.1.2 Design Activity Performance

The evaluation of solution proposals (i.e. designs) within the basic design cycle does not provide a complete view of performance in design. In order to create and evaluate such solutions it is necessary to use resources such as designers, tools, etc. over a period of time, resulting in costs being incurred. In addition to the output discussed above, performance in design also encompasses the process by which that output was created. For example, the basic design cycle may be repeated many times within a phase model of the design process, e.g. that presented in [12, 40, 42], to arrive at a satisfactory solution. The overall performance in design will be determined both by the resulting design (artefact) and by how well the activities required to produce that artefact are carried out (Figure 2.2). These two areas of performance are often referred to as product and process performance within the literature⁴.

Figure 2.2 Performance relationships in design. (The figure relates overall performance in design to its two elements: the design (artefact) and the activity.)
2.2.1.3 Relating the Key Elements of Performance

As the artefact is created using design activities there is a causal link between the performance of the artefact and that of the activity, e.g. the types of resources used in the activity will influence the performance of the artefact. Also, the complexity of the artefact will influence the activity performance, e.g. a more complex artefact may mean that the activity duration is longer [43]. As discussed above, both artefact and activity performance are components of overall design performance. These relationships are illustrated within a simplistic model in Figure 2.2. This model fails to fully define the relationships between the different performance elements. The nature of these relationships is complex and existing design activity/process models do not address this adequately.
2.2.2 Characteristics of Performance in Design

In comparison to areas such as manufacturing, the analysis of performance in design is more complex. For example, performance in manufacturing is concerned with the measurement of a more tangible output, i.e. physical goods, and the activities are more easily defined and understood. Cusumano [9] suggests that, compared to manufacturing, "product development presents unique difficulties for researchers . . ." and that activities required to make new products "require considerable conceptualisation, experimentation, problem solving, communication and coordination . . . rather than simply assembling components or doing other relatively routine tasks". Brookes [44] also highlights the differences between performance measurement in manufacturing and design (Table 2.1).
Table 2.1. Comparison of manufacturing and product introduction [44]

                                       Manufacturing Operations    Product Introduction
Similarity between processes           Very similar                Dissimilar
Number of times process performed      Many times                  Once
Time-lag for measuring output          Weeks                       Years
Ability to measure output              Directly measurable         Indirectly measurable
The following characteristics of design highlight the difficulties in analysing performance in this area:

• The activities and their associated elements such as inputs, outputs, etc. are primarily knowledge based. For example, the output of a design activity might be a concept or an idea. Measuring the inputs and outputs of a knowledge based activity such as synthesis provides particular difficulties. For example, it is not clear how the value of an idea may be defined or evaluated. This further complicates any comparison between inputs and outputs.
• Although the activity types may be the same in each project, the actual activities will differ. That is, the activities in design, in terms of the inputs, outputs, goals, etc., are by nature novel and not repeated. Therefore, establishing targets or norms in terms of activity duration, output, etc. may draw on the experiences of previous projects as a guide, but the uniqueness of the activities makes it difficult to achieve any accuracy in prediction.
• Defining contribution to business performance: there is a general consensus that design contributes significantly to business success. However, the means of defining this contribution has not achieved the same degree of consensus in the research. There are a wide number of processes and factors influencing the success of a business [14] and isolating the influence of design is therefore difficult.
• The time between the delivery of a final design output and the measurement of its performance in the marketplace can often be a number of years. Therefore, the analysis of the performance of a product in the market may provide suitable learning for future projects but it is unsuitable in terms of supporting decisions in relation to the design process as it is being carried out. Other, more timely information is required to support the management of design development.
2.2.3 Factors Influencing Design Development Performance

Within the model of performance improvement presented in Figure 2.1 the analysis of factors influencing performance was identified as a key element supporting decisions on performance improvement. Research in the area of design development aims to produce approaches, methods or tools which, when implemented in industry, result in increased performance of the process. The areas of work range from redefining the key business processes [1, 2] to the selection and implementation of specific tools in support of these processes [3]. There are many claims regarding how research results improve performance, such as the claimed reduction in cycle time through the adoption of Concurrent Engineering (CE), yet it has proven difficult to provide concrete evidence in support of such claims [45, 46]. There have been numerous attempts to determine the relationship between the approaches, methods and tools employed in the design process and the level of success, e.g. [47-49]; however, the results are often empirically observed correlations between factors and performance, with limited theoretical understanding [50]. Through establishing the link between particular approaches, methods and/or tools, the aspects of performance they influence and the nature of that influence, more informed decision making regarding the selection and use of such approaches, methods and/or tools can take place.
2.2.4 Coherence in Design Performance

The area of coherence in performance was discussed in a general manner in section 2.1.4. The issues raised regarding coherence apply equally to performance in design. The following section describes particular aspects of coherence as applied to design performance.

2.2.4.1 Scope of Analysis

Design generally takes place within a business context and therefore its performance may be analysed at many different levels, from the market impact of the overall product development program to the performance of individual design projects and, ultimately, specific design activities. In addition, performance analysis can be applied over a particular range of activities, e.g. single or multiple phases in a project, or a series of individual activities within a phase. Therefore the subject of analysis can be defined in relation to a particular range and level (Figure 2.3), which together describe the scope of the investigation. It is necessary to define this scope clearly to ensure accurate interpretation of any results and to understand the relationship between these results and those obtained in other areas of the organisation.
Figure 2.3 Scope of performance analysis. (The figure shows the scope of an analysis defined by a level, from the product development program through projects and phases down to individual activities, together with a range of activities at that level.)
Hales identifies a framework with five levels of resolution, which allows the context of a design project to be defined [51]. At the company level the issue of performance is generally related to the totality of the product development program, encompassing a number of projects at different stages and their combined impact on the business. Often this level of analysis is centred on the success of the product [52], i.e. it investigates how well the product meets the high-level program goals relating to issues such as market share, corporate image and technical success. Analysis of performance at lower levels, such as individual projects, includes evaluation in relation to activity goals such as lead time and cost.
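The notion of scope can be made concrete with a small sketch. The following Python fragment is illustrative only; the node structure and names are assumptions and do not form part of the methodology. It defines a decomposition from program to activities and selects a subject of analysis by level and range.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    level: str                               # "program", "project", "phase" or "activity"
    children: list = field(default_factory=list)

program = Node("Product Development Program", "program", [
    Node("Project A", "project", [
        Node("Phase 1", "phase", [Node("Concept generation", "activity"),
                                  Node("Concept evaluation", "activity")]),
        Node("Phase 2", "phase", [Node("Embodiment", "activity")]),
    ]),
])

def scope(node, level, names):
    """Select the subject of analysis: nodes at the given level whose names lie in the range."""
    selected = [node] if node.level == level and node.name in names else []
    for child in node.children:
        selected.extend(scope(child, level, names))
    return selected

# e.g. analyse a range of two activities within Phase 1 of Project A
subject = scope(program, "activity", {"Concept generation", "Concept evaluation"})
print([n.name for n in subject])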
2.2.4.2 Goal Alignment for Coherence

Within the performance analysis of design there are two sets of goals against which performance may be analysed, i.e. those relating to the design (artefact) itself and those relating to the activities involved in creating it. In order to achieve coherence in the performance analysis of design, alignment within and across these goals is required.

For example, the strategic goal for the company may be to increase its margin on sales to a particular level for a specific product. This may be achieved through the reduction of activity cost, which could in turn be achieved through a reduction in rework. Therefore there is a need for alignment within these three activity-related goals, in terms of sales, cost and rework, which must be maintained in a goal breakdown structure (Figure 2.4(a)). The definition and maintenance of this relationship ensures that a high level of performance with respect to rework contributes to the overall performance of the company.

Similarly, alignment within goals relating to the design is required. In Figure 2.4(b) a goal, providing specific product functionality, contributes to achieving a higher level goal, meeting customer requirements, and eventually the overall goal, developing a successful product. For example, in the car industry, the provision of remote locking functionality may contribute to a customer requirement for ease of use and, ultimately, the success of the product.
Figure 2.4 Goal alignment of activity and design goals. ((a) Activity-related goals: Increased Margin on Sales, Reduced Activity Cost, Reduced Rework. (b) Design-related goals: Develop Successful Product, Satisfy Customer Requirements, Provide Specific Functions. Alignment is required within each goal structure and across the two structures.)
Typically, alignment across goals is required in order to manage conflicting goals at the same level of analysis. For example, the goal of minimising activity cost may be achieved to the detriment of product quality, i.e. how well the product meets customer requirements. Any effort to improve the performance of the activity in terms of cost reduction should be aligned with the need to ensure that customer requirements are met and, ultimately, that overall performance is improved.
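To illustrate what maintaining such a goal breakdown structure might involve, the sketch below represents the activity and design goal chains of Figure 2.4 together with one cross-goal link whose trade-off must be managed. It is a minimal illustration only; the class and function names are assumptions rather than part of any published formalism.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A node in a goal breakdown structure (activity or design goal)."""
    name: str
    kind: str                                      # "activity" or "design"
    children: list = field(default_factory=list)   # lower-level goals aligned to this one

    def add(self, child: "Goal") -> "Goal":
        self.children.append(child)
        return child

# Alignment within goals: the two chains of Figure 2.4.
goal_a = Goal("Increased margin on sales", "activity")
goal_a1 = goal_a.add(Goal("Reduced activity cost", "activity"))
goal_a1.add(Goal("Reduced rework", "activity"))

goal_d = Goal("Develop successful product", "design")
goal_d1 = goal_d.add(Goal("Satisfy customer requirements", "design"))
goal_d1.add(Goal("Provide specific functions", "design"))

# Alignment across goals: pairs whose trade-off must be managed together,
# e.g. reducing activity cost versus meeting customer requirements.
alignment_across = [(goal_a1, goal_d1)]

def trace(goal: Goal, depth: int = 0) -> None:
    """Print a goal hierarchy, showing how low-level goals support higher ones."""
    print("  " * depth + f"{goal.kind}: {goal.name}")
    for child in goal.children:
        trace(child, depth + 1)

trace(goal_a)
trace(goal_d)
```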
Alignment of goals is an important element in achieving coherence in performance analysis: where goals are fully aligned within an organisation, the analysis of performance against those goals is similarly aligned and coherence may be achieved.
2.3 Summary
This chapter provided an overview of performance related to general business processes before focusing on particular aspects of performance in design. Throughout the chapter issues were raised which relate to the overall area of improving performance in design development. A brief summary of the key points discussed is provided here, highlighting requirements for performance modelling and analysis in design development to achieve improvement:
• Research in performance aims to contribute to achieving performance improvement in different situations. The assessment of performance through the use of particular metrics provides a baseline for performance improvement but is insufficient to ensure improvement. The analysis of performance, and in particular of those factors influencing performance, provides the opportunity to gain a greater understanding of performance and to inform decision making to achieve improvement.
• Both from a general viewpoint and in the particular case of the design process, a comprehensive and generic definition of performance is lacking. The characteristics of the design process offer additional challenges in defining performance in comparison to more tangible processes such as manufacturing. There is a need to formalise performance in design development, as a prerequisite to further analysis, so that a common and unambiguous understanding exists.
• Within design development performance, two key areas of interest are apparent: the performance of the design itself and that of the activities involved in creating it. Existing design process models describe the activities and phases in design but focus primarily on the performance of the design (artefact). They do not clarify the relationship between the performance of the artefact, the performance of the activities and the overall performance in design development.
• Performance must be viewed holistically within a business environment to avoid sub-optimal improvements. The concept of coherence is seen as a key requirement in this respect. Within the design process, the alignment of goals may provide a basis for achieving coherent performance assessment, analysis and improvement.
• Research in design development aims to provide results, e.g. methods, tools and approaches, which support improvement in performance. The ability to identify the relationship between such methods, tools, etc. and improved performance when implemented in industry is currently lacking. There is a need to establish a means of identifying this relationship and of supporting decision making on the best way to improve performance.
The discussion in this chapter on the nature of performance in design allows some further points to be elaborated to reflect the scope of this book:
• Much of the research in this area focuses on the assessment and analysis of the performance of particular business processes. This book is focused primarily on the analysis of performance within design development, although many of the concepts discussed may be equally applicable to other processes.
• Activities are seen here as the basic elements of a process, i.e. business processes are composed of a number of activities. Therefore the analysis of performance at the level of individual activities is necessary to provide a fundamental understanding of performance in this area.
• The area of assessing performance is dominated by the definition of suitable metrics. We do not aim here at specifying metrics per se, but rather at providing a formalism that supports their definition in any specific situation.
• Empirical studies of performance form a substantial part of the literature in this area, i.e. studies aimed at observing design and analysing past performance in order to draw conclusions from trends, define descriptive models, etc. The formalism developed in this book is based on a fundamental understanding of performance in design and, although developed with reference to design practice, is prescriptive in nature.
Notes
1 Performance improvement may involve the maintenance of existing levels of performance under increased constraints.
2 'Assessment' is used here where many authors use 'measurement'. Measurement, in the authors' view, implies more accuracy and certainty than is often possible in this area, while assessment supports the practice of estimation.
3 Although the terms 'metric' and 'measure' may have different meanings within the literature, they are used synonymously within this book to mean either an area/type of measurement, such as time, or a specific measure, such as project lead time.
4 The term 'artefact' is considered to be more generic than 'product' and may be used in relation to different design disciplines such as industrial, engineering, architectural, etc. The term 'activity' is used as opposed to 'process' as processes are composed of activities and an activity may be considered to be a fundamental element of a process.
3 Design Performance Measurement
Some of the key issues involved in the measurement of performance in design activities have been highlighted in Chapter 2. These issues range from generic areas aimed at performance measurement in any business process, such as the varied interpretations of the nature of performance, to aspects that are more specific to measuring performance in the design process.

This chapter incorporates an analysis of existing research in terms of how it addresses the following requirements:

1. A clear definition or model of design development performance which relates to the nature of the activities carried out in design, i.e. knowledge based, uncertain, etc.
2. Distinguishing performance in relation to the design (artefact) from that of the activities involved in creating it, yet relating these within a model of the design development activity/process.
3. The need for coherence, i.e. alignment within and across goals, related to the activities in design development.
4. Support for the identification of factors influencing design development and the quantification of their influence in a manner which supports decision making, i.e. providing an approach to selecting the best means to improve performance.
3.1 Overview of Performance Research

The research reviewed here forms part of the overall research in the area of the performance of organisations.
[Figure 3.1 Areas and types of performance-related research: areas (business processes, product development, design, manufacturing, ...) set against types of research (theoretical analysis, performance measurement approaches, empirical studies, research reviews, ...).]
Many of the articles and books reviewed in this chapter fall primarily within the general area of business process performance measurement and management. Some of the work is generic in terms of being applicable across all business processes, while other work is aimed at more specific processes such as product development, design, manufacturing, etc. (Figure 3.1). Within such areas the type of research and its focus may vary widely, and include empirical studies aimed at determining relationships between performance in different processes, the design and implementation of approaches for measuring performance, the development of theoretical models of activities that relate their performance, etc. Design research, in particular the area of design activity/process modelling, is briefly reviewed in order to analyse how it meets the requirement to distinguish and relate the performance of the design and of the activities involved in creating it.
3.1.1 Trends in Performance Research
There has been considerable research published in the area of performance; for example, Neely [18] identified that between 1994 and 1996 some 3,615 articles on performance measurement were published. He refers to the growth in membership of accountancy institutes and conferences on performance as indicators of the increased interest in this area. However, in comparison to areas such as manufacturing, measuring performance in product design is relatively undeveloped [44]. For example, at the PM2000 conference in the UK [53] no papers, from a list of over 90, focused specifically on the analysis of design development performance. Many authors have recognised the particular difficulties in measuring performance in design/development activities [44, 54, 55] and in identifying the causal relations between design performance and business success [9, 56, 57]. These difficulties arise from the less tangible nature of outputs from design activities, i.e. knowledge, the often long duration and wide range of influences from design to market launch, the difficulty in defining/measuring design quality, etc.

The decline of management accounting as the only way to measure business performance [21, 29, 58, 59] is an indication of the move toward measuring less tangible aspects of performance, e.g. those related to knowledge-intensive activities in design. Johnson and Kaplan [59] suggest that traditional accounting methods are unsuited to organisations where the product life cycle is short and research and development assume increased importance.
In addition, the area of intellectual capital provides particular difficulties for measurement using traditional accounting methods [60].

The following sections present an overview of a broad range of literature on performance, based on the scope defined above.
3.2 Defining and Modelling Performance
The literature on performance is characterised by a lack of, and inconsistency in, the definition of terms. Numerous works have been published which directly address the area of performance but do not explicitly define performance itself, e.g. [33, 58, 61-64]. Neely [7], in his review of the performance literature, suggests that 'performance measurement is a topic which is often discussed but rarely defined'. Meyer [32] suggests that there is 'massive disagreement' as to what performance is and that the proliferation of performance measures has led to the 'paradox of performance', i.e. that organisational control is maintained by not knowing exactly what performance is. That is, the lack of a comprehensive understanding of performance can often lead to the uncritical acceptance of particular approaches, metrics, etc. proposed by senior management in an organisation.

A number of authors offer explicit definitions of performance, notably in [65-68], and others refer to dimensions of performance [69-72]. Table 3.1 presents a list of performance, and performance-related, definitions highlighting the context in which the definitions were presented.¹ Some definitions of performance may be inferred from the information given, e.g. in [73] and [22] performance may be inferred to mean the attainment of objectives.
The table serves to illustrate the lack of consistency in defining performance within the literature, although authors may often be discussing the same phenomenon. However, some key terms emerge: efficiency and effectiveness are often cited in performance articles, although not used specifically to define performance [19, 54, 70, 74-77]. In general, effectiveness is related to the attainment of objectives/goals, and efficiency is seen to relate to the use of resources. Although efficiency and effectiveness are used to describe performance, the relationship between these elements has not been defined. The table also reveals the use of terms such as productivity, incorporating definitions that relate closely to those provided for performance, indicating inconsistency in terminology/definition. It has been observed that such terms may have different meanings in different disciplines [19], and the more general problem of terminology within the area of product development has been highlighted previously in [78, 79]. The definition of efficiency provided by Duffy [80] is similar to the definitions of performance in the table, while Andreasen's definition of efficiency [14] explicitly addresses the knowledge-based characteristics of the design activity.
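For reference, the Andreasen and Hein definition of efficiency quoted in Table 3.1 can be written as a ratio (the Δ terms simply denote the increases named in the table):

\[
\text{Efficiency} = \frac{\Delta(\text{clarification} + \text{risk reduction} + \text{detail} + \text{documentation})}{\Delta\,\text{costs}}
\]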
The authors defining performance in terms of its key dimensions (metrics) primarily describe those dimensions within the areas of Time, Cost and Quality.² The dimensions of time and cost can be relatively easily understood and applied within performance measurement, but the concept of quality is somewhat similar to performance itself in terms of its varied interpretation and the alternative measures used in its evaluation. In some of the literature, dimensions such as Focus in Development, Adaptability and Flexibility have been used. These metrics do not measure performance itself, but rather act as influences on it.
Table 3.1. Performance and performance-related definitions

Author/Source | Element defined | Definition | Context
Cordero [65] | Performance | Effectiveness (i.e. measuring outputs to determine if they help accomplish objectives); efficiency (i.e. measuring resources to determine whether minimum amounts are used in the production of these outputs). | Research and development/organisation
Dwight [66] | Performance | The level to which a goal is attained. | General
Neely et al. [67] | Performance | Efficiency and effectiveness of purposeful action. | Business
Rolstadas [68] | Performance | A complex interrelationship between seven performance criteria: effectiveness, efficiency, quality, productivity, quality of work life, innovation and profitability/budgetability. | Organisational system
Clark & Fujimoto [69] | Dimensions of performance | Total Product Quality, Lead Time and Productivity (level of resources used). | Product development
Doz [70] | Dimensions of performance | Focus in Development, Speed of Development and R&D Efficiency. | Product development
Emmanuelides [71] | Dimensions of performance | Development time, Development Productivity (use of resources) and Total Design Quality. | Product development (project)
Moseng & Bredrup [72] | Dimensions of performance | Efficiency, Effectiveness and Adaptability. | Manufacturing
Neely et al. [7] | Dimensions of performance | Time, Cost, Quality and Flexibility. | Manufacturing
van Drongelen [73] | Performance measurement | The acquisition and analysis of information about the actual attainment of company objectives and plans, and about factors that may influence this attainment. | General
Sinclair & Zairi [22] | Performance measurement | The process of determining how successful organisations or individuals have been in attaining their objectives. | Organisations/individuals
Andreasen & Hein [14] | Efficiency | Ratio of increase in (clarification + risk reduction + detail + documentation) to increase in costs. | Product development
Griffin & Page [82] | Productivity | A measure of how well resources are combined and used to accomplish specific, desirable results. | General
Duffy [80] | Design productivity | Efficiency and effectiveness. | Engineering design
Goldschmidt [83] | Design productivity | Efficiency and effectiveness. | Engineering design
For example, flexibility is only appropriate within an environment where changes are required, and the capability of a process, such as design or manufacture, to change rapidly may add unnecessary cost/overhead in a stable environment. Flexibility will influence the performance, i.e. the efficiency and/or effectiveness, of an activity/process, but does not constitute a dimension of performance itself.

The variety of measures used to evaluate quality is indicative of the greater problem regarding performance measures in general. That is, the lack of consistency in defining performance is reflected in the multiplicity of measures [32] which appear in the literature. Many empirical studies of product development performance use a number of unique, although related, measures, increasing the difficulty of comparing results across these studies. One researcher at the 1990 PDMA³ International Conference observed that 'each presentation on new product development success used different measures for delineating success' [81]. This has resulted in work aimed specifically at conducting meta-analysis of existing studies [64]. This inconsistency in measures also greatly inhibits benchmarking, i.e. comparison between firms.
In summary, the research in performance has been hindered by a lack of clarity on its meaning. In particular:

• The key elements of performance have not been consistently defined or agreed.
• Those defining performance as efficiency and effectiveness have not distinguished them clearly and/or related them within a formalism of performance.
• Many of the measures used in the research relate to influences on performance and not to performance itself.
• The comparability of empirical research in performance is hindered by the lack of clarity described here.
3.3 Design and Design Activity Performance
Design activity modelling has received significant attention in research over the last 30 years, aimed at the development of both descriptive and prescriptive models [35, 84]. This has resulted in models offering different viewpoints of the design process, such as the prescription of the activities/stages in design and their logical sequence [85], others focused on the cognitive nature of design [13], and those relating design within an overall model of product development [14]. These models are aimed at increasing our understanding of design (descriptive), and/or providing a framework (procedures, guidelines, etc.) in which to carry out design (prescriptive), so that the level of performance may be maintained and improved. However, performance in design requires attention to both the design (artefact) and the activities involved in producing that design. That is, both design and design activity goals exist within design development, and performance in relation to these goals must be distinguished yet related to overall performance. Design goals relate to aspects of the design (artefact) such as its dimensions, its form, etc., while design activity goals relate to the activities in design development and consider aspects such as the time taken and the cost of resources. The following section provides a review of work in the area of design activity modelling in order to establish the support it provides for distinguishing and relating these areas of performance.
The distinction between design and design activity performance is recognised at a project/programme level in much of the product development literature, e.g. [16, 51]. However, this is not reflected in research at the design activity level, where the primary focus is on goals related to the design. Radcliffe [86] highlights, in his protocol analysis, the importance that designers place on design management activities within the design process, but the analysis does not identify how design and design management activities are linked/related. Indeed, throughout the collection of papers from the Delft workshop, 'Analysing Design Activity' [87], there is no explicit reference to goals such as time and cost in relation to activities; the analysis is restricted to the achievement of design (artefact) goals.
Smithers [13] presents a model of the design process that treats design as a knowledge-based exploration task. The model illustrates the role of knowledge in design, beginning with the construction of an initial requirement description. This requirements description is focused on the artefact, in terms of what kind of behaviour is required. The description does not relate any of the requirements to the design activity itself, such as the time to realise a design specification. Further, the knowledge base in the proposed model includes domain and design knowledge but does not distinguish knowledge of managing the design activity, i.e. scheduling, control, etc. Strategy planning is presented as one of a list of activity types that together constitute the design process within the exploration-based model. This activity involves the use of decomposition knowledge to determine intermediate goals, sub-problems, etc., focused on achieving the design goals. However, all of the activity types are focused on processing knowledge of the design, and design management activities, such as scheduling and control, are not included within the work.
Numerous models of the design process have tended to focus on the stages of design and the outputs from each stage, and therefore provide a framework to support the management of design development. The model of Pahl and Beitz [12] (Figure 3.2) provides a step-by-step method to be followed in design, which supports scheduling of the activity within discrete phases. However, the tasks outlined in this model are focused on design goals; there is no reference to activity goals or to the need to manage the design process in relation to both design and activity goals, e.g. the trade-off between the cost of design development and the quality of the design.
Similarly, French [11] suggests that evaluation is carried out continually within the design process, but this evaluation is focused on the initial need in terms of the artefact. Again there is no reference to activity goals or to the need to trade off between the goals related to the artefact and those related to the activity. In an extensive review of prescriptive and descriptive models of the design process, Blessing [84] identifies evaluation as a key activity in these models. However, in her discussion of this activity the focus is on the assessment of solution proposals, and she considers the management of design to be outside the scope of her research.
Authors such as Andreasen [14] and Hales [51] provide more insight into the (business) context in which design is carried out. Andreasen identifies the need for greater efficiency in product development while also ensuring better results in terms of the artefacts produced.
[Figure 3.2 A phase model of the design process (from [12]): task (market, company, economy); plan and clarify the task; requirements list (design specification); conceptual design, developing the principle solution; concept (principle solution); embodiment design, developing and defining the construction structure; preliminary and definitive layouts; detail design, preparing production and operating documents; product documentation; solution.]
The Integrated Product Development model supports the integration of procedures, aims, methods and attitudes to provide a more coherent approach to product development than that which exists in many functionally oriented organisations with heavily schematised procedures. The concept of efficiency as defined by Andreasen identifies the trade-off between what is being achieved in product development and the costs (and, implicitly, time) incurred. However, the author provides a viewpoint, identifying the need to manage such a trade-off, but does not relate this within an activity/process model to illustrate further how it might be achieved. Similarly, Hales identifies the need to address aspects of both the design and the process in managing the design project, but does not illustrate the trade-off or how conflicts might be managed.
From the review of design activity and process models presented here, it is clear that they provide significant insight into the activities, stages, etc. in design. There is reasonable consensus on the types of activities involved in design, their sequence, etc., and the evaluation of the output in relation to the design goals is a key component of the models discussed. The analysis of performance in relation to the activities carried out in design is restricted to literature addressing the management of design at the project level, e.g. [51]. However, management activities are carried out at every level in design, from corporate boards to individual activities, and therefore there is a requirement to analyse performance in relation to activities at all levels. This requirement is not supported in other work.
3.4 Supporting Coherence
The variation in the scope of analysis when carrying out performance measurement highlights the need for coherence (alignment within and between goals), as described in Chapter 2. The need for coherence has been identified widely in research [7, 19, 29, 33, 34, 88-90].
The work of de Haas [33] is focused on the behaviour of individuals within an organisation and does not address coherence in relation to the technical system as considered within this book. Lockamy III [89] investigates the alignment of goals within a manufacturing/production environment with the organisational goals. The work is based on the Theory of Constraints, which focuses on throughput, inventory and operating expense within a manufacturing facility. The application is therefore in a more repetitive and predictable environment than design development. Neely [7] provides a review of literature in the area and, while identifying the need for coherence, the paper is not aimed at proposing solutions. The work described in [90] is based on the Performance Measurement Questionnaire (PMQ) of Dixon [29], which is reviewed in further detail below. Tsang [34] proposes the adoption of the Balanced Scorecard to support the holistic measurement of maintenance performance. The scorecard is examined in more detail in the following section, which reviews more generic approaches in terms of their support for coherence in design development performance.
The Balanced Scorecard [20] is becoming one of the most widely used performance measurement and management frameworks in industry. The scorecard is aimed at achieving strategic alignment between the strategy of an organisation and the behaviour of individuals at all levels working to achieve the strategic goals. According to the authors, the balanced scorecard

    translates vision into a clear set of objectives which are then further translated into a system of performance measurements that effectively communicate a powerful, forward-looking, strategic focus to the entire organisation.

The scorecard provides a framework incorporating four views of the organisation in which objectives and measures may be defined, i.e.:
• The financial perspective.
• The customer perspective.
• The internal business perspective.
• The learning and growth perspective.
The scorecard supports alignment in a number of ways. Firstly, it promotes the use of a balanced set of measures, which illustrates attention to both product and process goals. The proposal to use a balanced set of measures is a response to the wide criticism of using purely financial or operational measures [30, 58, 91]. The authors outline measures illustrating balance between:

• Analysing the achievement of short- and long-term objectives.
• Financial and non-financial measures.
• Lagging and leading indicators.
• External and internal performance perspectives.
Secondly, it suggests means to link measures to strategy in order to achieve strategic alignment. Three principles are proposed to link the measures defined within a scorecard to the strategy of the organisation. These are defined as:
1. Cause-and-effect relationships (also used in [88]): these are represented as a set of hypotheses, e.g. suggesting a causal relation between increased training, improved product knowledge of employees and, ultimately, improved sales due to this knowledge.
2. Performance drivers: in a similar way to cause-and-effect, a scorecard will have a combination of lagging measures (outcomes) and leading measures (drivers). That is, the lagging measures will evaluate the result of performing in relation to the leading measures, e.g. the profit of an organisation may be seen as a lagging measure, with the amount of rework created in the manufacturing process providing a leading measure (i.e. an indicator) in this respect. In order to relate lagging and leading measures, a cause-and-effect relationship such as that suggested in 1 must be established.
3. Linkage to financials: the authors suggest that the causal paths from all measures on a scorecard should be linked to financial objectives. This again promotes a cause-and-effect diagnostic, i.e. it is suggested that the effect caused by a re-engineering programme should be visible in the financial results/outcomes. (A minimal sketch of such measure linkages is given after this list.)
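The following sketch illustrates the structure these principles imply: measures held in a small directed graph of hypothesised cause-and-effect links, with a check that each measure has a causal path to a financial objective. It is a minimal illustration only; the measure names and the graph representation are assumptions, not taken from the Balanced Scorecard literature.

```python
# Cause-and-effect links between measures: each entry maps a leading
# measure to the measures it is hypothesised to drive.
links = {
    "training hours": ["product knowledge"],
    "product knowledge": ["sales"],
    "rework level": ["profit"],
    "sales": ["profit"],
}

financial_objectives = {"sales", "profit"}

def reaches_financial(measure: str, seen=None) -> bool:
    """Check whether a measure has a causal path to a financial objective."""
    seen = set() if seen is None else seen
    if measure in financial_objectives:
        return True
    if measure in seen:
        return False
    seen.add(measure)
    return any(reaches_financial(m, seen) for m in links.get(measure, []))

for measure in ["training hours", "rework level", "product knowledge"]:
    status = "linked to financials" if reaches_financial(measure) else "unlinked"
    print(measure, "->", status)
```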
The balanced scorecard provides a comprehensive framework to support business-level approaches to performance measurement and management. Although a strength of the work is that it encourages logical systems thinking and highlights the need for coherence in performance measurement, it fails to provide the necessary support to ensure that such thinking occurs and that coherence is realised. The particular weaknesses of the balanced scorecard approach are:
• The three principles defined, aimed at linking measures, depend on the ability to perform a cause-and-effect analysis across different areas of performance. The work provides no structured approach to defining cause and effect. This is particularly difficult in a design environment, where uniqueness and uncertainty prevent reliable cause-and-effect analysis in an a priori manner.
• The work offers a simplistic view of cause and effect, e.g. it does not quantify the relationship, and where a number of causes have the same effect it does not support the prioritisation of the causes in terms of their contribution to the outcome.
• The efforts to achieve strategic alignment in the approach focus on organisational behaviour and on aligning individuals to a shared vision, e.g. translating strategic objectives into individual goals. The mechanism for performing such a translation/alignment is not apparent, i.e. the examples given provide the result of such a translation but do not illustrate how it is achieved.
• The case studies/examples of scorecard use do not illustrate its use at the level of design/product development activities, and no evidence is given of its applicability at this business area/level. For example, there is no evidence of how the scorecard approach could be applied to the decomposition and alignment of product goals within the area of design/development.
• The suggestion that correlation analysis will determine and validate the linkages between measures at different levels is based on the assumption that the environment, i.e. the goals, measures, etc., is a stable one. In many cases the goals (and measures) are constantly changing, which limits the ability to perform such a correlation analysis and, where it can be performed, means its results are applicable only for as long as there is no change in the goals/measures.
• The approach suggests that causal paths for all the measures on a scorecard should be linked to financial objectives. Generating such causal paths in business environments where intangible assets constitute an increasing part of a company's value is difficult. If financial goals assume overall priority then innovation and creativity may be stifled, i.e. the focus may be on efficiency as the more tangible aspect of performance, e.g. reducing rework, lead time, etc.
In general, the scorecard provides a framework that promotes balance in the selection and use of performance measures from an overall business viewpoint. However, such balance does not in itself ensure coherence within performance measurement, and although the approach recognises the need for such coherence, and encourages logical thought in relation to this need, no further support is evident.
Dixon [29] presents an approach, the Performance Measurement Questionnaire (PMQ), aimed at aligning strategy, actions and measures. The questionnaire contains a list of improvement areas, such as new product introduction and customer satisfaction, and performance factors, such as unit labour costs and meeting project milestones. Respondents are asked to rate the importance of an improvement area to the long-term health of the company and to compare this with the support being provided by current measures. The performance factors are analysed by comparing their perceived importance to the company with the current emphasis the company places on measuring each factor. The data gathered using the questionnaire may be analysed to determine the following (a minimal sketch of this importance-versus-emphasis comparison is given after the list):
• Alignment: through comparing the importance of improvement areas and the emphasis on particular measures with the strategy of the company.
• Congruence: through comparing the importance of improvement areas with the support provided by measures, and the importance of performance factors with the emphasis on measuring them.
• Consensus: by comparing results across different functions/levels in an organisation.
• Confusion: through more detailed analysis of consensus within a particular group.
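As a concrete reading of the comparisons above, the sketch below computes simple importance-minus-emphasis gaps for a few performance factors. The factor names, rating scale and gap calculation are illustrative assumptions, not the published scoring scheme of the PMQ.

```python
# Hypothetical questionnaire responses on a 1-7 scale: how important each
# performance factor is to the company, and how much emphasis current
# measures place on it.
responses = {
    "unit labour costs":          {"importance": 4, "emphasis": 6},
    "meeting project milestones": {"importance": 6, "emphasis": 5},
    "new product introduction":   {"importance": 7, "emphasis": 3},
}

def gaps(data):
    """Return factors ordered by the gap between importance and emphasis.
    A large positive gap suggests an important but under-measured factor."""
    return sorted(
        ((factor, r["importance"] - r["emphasis"]) for factor, r in data.items()),
        key=lambda item: item[1],
        reverse=True,
    )

for factor, gap in gaps(responses):
    print(f"{factor}: gap {gap:+d}")
```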
The degree of alignment between the improvement areas/measures and the strategy is not clearly evident from the results of the questionnaire, and the alignment is assessed in a subjective manner. That is, the approach does not allow the improvement areas/measures to be rated in terms of how well they support the strategy as a whole, or particular aspects of it. This is a fundamental element required to ensure that the remaining analysis is valid, i.e. the analysis of congruence, consensus and confusion is of limited value if the improvement areas/measures do not support the strategy. The approach could be greatly enhanced by clearly defining a strategy and its associated strategic objectives, and by offering a means to analyse (and compare) the degree of alignment between actions/measures and these objectives. Further, it is not clear how the item list, i.e. the improvement areas and performance factors, would be defined for each particular situation, such as in the example given for Northern Telecom Incorporated [29], which describes an implementation of the questionnaire.
Bititci [88] suggests that alignment may be achieved by adopting a modelling approach which involves the creation of three views representing the structure, process and dynamics of a performance measurement system. A cause-and-effect analysis is proposed as a means to create the structure view relating different measures. This provides a starting point for the alignment of measures at strategic, tactical and operational levels. However, the structure provides very limited information on the relationships between measures, e.g. it does not determine whether the relationships are positive or negative, or their magnitude. The author proposes the use of the Quality Function Deployment (QFD) approach to define the relationships further, but does not elaborate on its application, limiting the ability to evaluate such an approach.
3.5 Identification and Quantification of Influencing Factors
The ultimate aim of most research in performance is to achieve performance improvement in certain areas. The identification and quantification of influencing factors is a key element, seen by some as the 'holy grail' in the field of performance measurement [18]. That is, although measuring performance provides important information, improving performance requires an understanding of the influencing factors. The factors may then be manipulated to achieve a positive effect.
Research on influencing factors can be considered in two main areas for the purposes of this review: empirical studies involving an analysis of data from previous projects, processes, etc. in product development, with a view to identifying cause-and-effect relationships using statistical techniques such as regression analysis; and work aimed at developing approaches, other than those used in the empirical studies, to analysing influences in a particular situation, e.g. the development of questionnaires, checklists, etc. The following section begins with a review of some of the empirical studies before drawing general conclusions on this type of research. A number of other approaches to analysing influences are then reviewed before the section is summarised.
3.5.1 Empirical Research
Performance research is dominated by empirical work aiming to study past performance in many organisations in order to identify trends and generic relationships. The historical data on performance results and the factors present in each case are analysed statistically in an attempt to estimate causality. A number of these studies are selected for review in this section based on their relevance to the area of product development.

One of the most often cited studies in this area is the work of Clark and Fujimoto [69], which compared the product development performance of the automobile industry in Europe, Japan and the United States. The research is based on the past results of 29 development projects in 20 car manufacturers, with particular interest in the managerial practices that influence:
• Lead Time: the time elapsed from the beginning of concept development to market introduction.
• Productivity: the level of resources used over the lead time, such as hours worked, materials used, etc.
• Effectiveness (Quality): measured as Total Product Quality, which includes objective and subjective attributes of the product itself and its manufacturability.
Although the work is aimed at a specific industry sector, the results provide some useful indicators of success that have support in other studies and are applicable to product development from the viewpoint of strategy, organisational structures and the management of product development projects. For example, it provides support for the use of management approaches such as concurrent engineering to reduce project lead time. Four key themes emerged from the work when looking at the policies and practices behind outstanding performance:
• Project Strategy.
• Manufacturing Capability.
• Integrated Problem Solving.
• Organisation and Leadership.
The overall finding is that a balance, in terms of overall consistency across these four elements, is key to achieving high levels of product development performance.

The work provides general indications of relationships between the methods and approaches used in PD and particular performance dimensions, e.g. the impact of rapid prototyping on lead times. However, the study is based on historical data from the 1980s and therefore its relevance to the current product development environment is unclear, i.e. new approaches, methods and tools are available to support PD and the work cannot provide an indication of their influence on performance. The research identified a set of measures but did not design these into an ongoing measurement system. It has not produced prescriptive models that could be implemented in other industries, or a methodology for identifying factors influencing particular areas of performance in the absence of statistical data. This type of data, as pointed out by the authors, is rarely readily available, and carrying out research on this scale is time consuming (the study lasted approximately five years) and costly.
Loch et al. [93] present an analysis of performance in the electronics industry (95 companies in Europe, Japan and the U.S.), drawing on data from the 'Excellence in Electronics' project jointly undertaken by Stanford University, the University of Augsburg and McKinsey and Company. The authors aim to combine firm- and project-level views of performance and distinguish between performance in the development process, performance of the output of the process (i.e. the design/artefact) and eventual business success. Further, the authors suggest that process performance influences output performance through the operational management of development projects. The aim was to develop causal relationships between development process performance, development output performance and overall business success, while recognising that additional areas such as the performance of manufacturing and marketing/sales also influence business success (Figure 3.3). Variables are defined for process management, development output and success, and the data is analysed in relation to these variables. The authors perform two separate regression analyses on the data to develop the links.
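To make the two-stage structure of such an analysis concrete, the sketch below fits two ordinary least-squares regressions in sequence (process variables explaining an output measure, and output measures explaining a success measure). The data are synthetic and the variable groupings invented; this only illustrates the kind of analysis described, and does not reproduce the study's variables or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 95  # number of firms (illustrative)

# Stage 1: process-management variables -> a development output measure.
process_vars = rng.normal(size=(n, 3))  # e.g. three hypothetical process variables
output = process_vars @ np.array([0.5, 0.2, -0.3]) + rng.normal(scale=0.5, size=n)

X1 = np.column_stack([np.ones(n), process_vars])  # add intercept column
coef_stage1, *_ = np.linalg.lstsq(X1, output, rcond=None)

# Stage 2: development output measures -> a business success measure.
output_vars = np.column_stack([output, rng.normal(size=n)])
success = output_vars @ np.array([0.8, 0.1]) + rng.normal(scale=0.5, size=n)

X2 = np.column_stack([np.ones(n), output_vars])
coef_stage2, *_ = np.linalg.lstsq(X2, success, rcond=None)

print("stage 1 coefficients:", np.round(coef_stage1, 2))
print("stage 2 coefficients:", np.round(coef_stage2, 2))
```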
Although the analysis provides some indication of relationships which may be statistically significant, and presents a framework (Figure 3.3) for analysing the causal links from product development to business success, the work is limited in terms of providing insight into the product development process. Some particular weaknesses of the work are presented here:
[Figure 3.3 Framework of development performance [93]: development process performance (28 process management variables) feeds development output performance (12 development output variables) which, together with manufacturing performance and marketing/sales performance, feeds business success (3 business success variables).]

• The authors found no process variables (factors) with a significant relationship to the output measures of new product productivity and design quality, and suggest that these would be driven by the qualifications of the designers. These output measures are highly significant in many industries; indeed, the authors conclude that development productivity is a very important driver of business success.
• The authors define 28 process management variables, but within this set they confuse actual performance variables, e.g. meeting schedules and meeting budgets, with variables which influence performance, such as team size, concurrence and the use of value engineering. That is, the variable 'meeting schedules' provides an indication of how well the project met the time goals which were specified, while 'team size' provides an indication of the resources used but not of the results achieved.
• The work does not provide enough detailed analysis at the level of product development activities to support performance improvement in this area. In particular, the variables used do not include any analysis of the impact of particular tools or techniques, the focus of much interest in design research [3, 94].
• The characterisation of industries, in an attempt to develop industry-specific results, presents the results on process performance in two broad categories, i.e. the Measurement/Large Systems industry and the Computer industry. Within these categories many of the industries may be pursuing very different product development strategies/goals, and therefore the causal relationships within the model in Figure 3.3 may vary significantly [95]. The work does not take such individual strategies into account within the results.
The work described in [9], [75] and [96] focuses on specific influencing factors, e.g. design tools, and evaluates their impact on performance. Griffin [75] looks at the impact of engineering design tools on efficiency and effectiveness using data from the Product Development and Management Association's (PDMA) 1995 'Best Practices in Product Development' survey. The results of the study are based on data from 383 respondents to a postal survey using a detailed questionnaire. The practices surveyed included Computer-Aided Design (CAD), Computer-Aided Engineering (CAE), Design for Manufacture (DFM) and Concurrent Engineering (CE). However, these terms are not defined within the report, although it is suggested that CE is a tool/method for speeding the design process while DFM is a tool for reducing design defects, and it is not clear if detailed definitions were provided to respondents. Within this book CE is seen as an overall approach to carrying out activities concurrently. This may be facilitated by the use of methods such as Design for Manufacture, which in turn may be assisted by particular CAD tools. Therefore 'CE tools' could be seen to include some CAD applications, DFM, etc. The lack of clarity on what each practice means within the report limits the value of the results in terms of the conclusions that are drawn, particularly when presented as a guide to implementing different practices to support particular PD strategies.

Cooper [52] reports on an extensive survey of the factors driving the success of new products. This work identifies the product itself and its attributes as the most important factor, and lists others including the use of cross-functional teams, thoroughness of activities in the early stages, etc. The work does not provide detail on the use of particular tools or methods to support the activities in product development, and the results may be used as a high-level guide only.
Some studies in this area are more specifically focused on one aspect of performance, e.g. the work of Cooper [97] and Doz [70] investigates the key drivers of speed and timeliness in product development projects. Also, Zirger et al. [98] look at the impact of various acceleration techniques on development time, and Griffin [75] investigates the influence of specific design tools on performance. In some cases the performance of product development has itself been investigated as an influencing factor on overall business performance, e.g. [57]. None of the studies offers conclusive evidence of a generic nature, and the lack of a framework, e.g. a generic performance definition and/or model, in which this work may be related limits its potential relevance to the research community. In addition, the context in which the influencing factor is analysed is often not clearly defined, which limits the conclusions that may be drawn. For example, the introduction of CAD may have a significant impact on performance where a substantial amount of routine design is carried out, but its influence on a design process focused on innovation will be different.

A general critique of this type of research is provided within the meta-analysis reported in [64], which highlights many of the key weaknesses in empirical studies. The authors refer to the growing number of these studies and suggest that, in spite of the use of increasingly sophisticated statistical techniques, much of the work identifies rather than explains influencing factors. That is, the theoretical understanding of relationships is limited and the work is '... disjointed and lacking with respect to concise conclusions on which factors should command the most attention'.

This review highlights the lack of consistency in this type of work in terms of the elements studied, leading to a lack of convergence in terms of the results. In addition, the validity of particular studies of success and failure may be questionable due to the natural tendency not to report failure and the biases that are inherent in much of this type of research [50].
3.5.1.1 Other Approaches
Hales [51] suggests that defining, understanding and working with the factors influencing the course of the (design) project are critical elements in managing engineering design. He identifies influences on design at different levels, such as the project and the individual, and provides checklists for analysing the nature of these influences and identifying the actions required to improve the success of the project. The analysis attempts to establish the influences that different sets of factors have on the overall project, e.g. corporate factors, project profile, personnel factors, etc. Although the checklists encourage a systematic approach to controlling the design project, they assume that an individual, i.e. the design manager, can establish the link between a particular factor and the overall goal of the project. Such analysis would benefit from decomposition of the overall goal into particular goals and priorities, and analysis of the influencing factor in relation to each of these goals. That is, the complexity of the problem could be decreased through the use of a decomposition approach, contributing to greater reliability of the results. In addition, the analysis of influences at different levels by different individuals may result in a series of unrelated checklists and therefore a lack of alignment among the influencing factors in relation to the overall goal.
Bititci [99] presents an approach based on the use of cognitive maps to establish initial relationships. Having established an initial hierarchy based on cause and effect, the Analytical Hierarchy Process [100] is then used to establish the relative influence of each factor, at a particular level, on the factor at the next highest level. The influence (effect) of one factor on another is defined as a combination of the inherent effect of the factor and the effect resulting from interaction with itself and through other factors at the same level. The initial example given, of money accumulating interest in a bank, represents a relatively simple mathematical relationship where cause and effect can be easily established. A further example is presented where the approach is used to select an appropriate manufacturing strategy. In this case the relationships are not as simple, and a number of fundamental assumptions must be made in order to define a hierarchical relationship network. The use of a detailed and rigorous mathematical analysis, using information that contains a high degree of subjectivity, would seem unjustified in such cases, i.e. inaccuracies in the initial assumptions would not be improved through mathematical rigour. The complexity of design and development activities offers greater challenges in terms of defining relationships such as those defined using this approach, and its application within such an environment would be difficult and time consuming. The approach does not offer an appropriate balance between the strength of its theoretical foundation and its usefulness in an application to support decision making [15].
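For readers unfamiliar with the Analytical Hierarchy Process step referred to above, the sketch below derives factor weights from a pairwise comparison matrix using the common row geometric mean approximation. The comparison values are invented for illustration; this is a generic AHP fragment, not Bititci's model.

```python
import numpy as np

# Pairwise comparisons of three influencing factors on a 1-9 scale:
# entry [i, j] says how much more factor i influences the parent than factor j.
comparisons = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Row geometric means, normalised to sum to 1, approximate the AHP priority vector.
geo_means = comparisons.prod(axis=1) ** (1.0 / comparisons.shape[1])
weights = geo_means / geo_means.sum()

for name, w in zip(["factor A", "factor B", "factor C"], weights):
    print(f"{name}: weight {w:.2f}")
```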
A number of authors present high-level models of the factors affecting product development success. Emmanuelides [71] presents a model that recognises the importance of the context in which product development takes place, in terms of the scope of the project and the dynamics of the marketplace. The model defined by Brown [50] indicates where relationships are strongly supported by the research, e.g. the link between senior management support and the resulting lead time and productivity of the process. Both models focus primarily on organisational issues (the social subsystem) such as the authority/power of the project leader, group dynamics and the skills/experience of those involved. The models do not incorporate factors influencing performance at an activity level, such as the use of specific methods or tools. Therefore, the work does not provide sufficient insight, from a technical system viewpoint, into the detail of design/development activities to allow their performance to be analysed and improved.
3.5.1.2 Conclusion
There are a number of approaches aimed at identifying and quantifying the factors influencing performance, predominantly based on an empirical approach studying past cases. The weaknesses of this type of research, in terms of how it might support identification and quantification in future cases, have been identified. In general, it has failed to reliably identify factors influencing performance in a manner that supports performance improvement in other organisations. Additional approaches to this area have also been reviewed and critically appraised. Although they demonstrate appropriate principles, such as the aim of continuous improvement and the recognition of relationships between factors, their use in practice could produce inappropriate results.
3.6 Summary

This chapter began by specifying the requirements for a methodology supporting performance modelling and analysis to achieve performance improvement. The overview covered a broad range of contributions from a wide range of areas and types, focusing on key contributions in relation to each of the requirements with a brief review of related work.
Some of the key issues that must be addressed for design performance are:

• The comparative difficulty of analysing performance in design development, i.e. its uniqueness in comparison with activities such as manufacturing, and the unsuitability of financial analysis for describing performance in this area.
• The lack of research in design development performance within the wider area of performance measurement.
• The confusion within this area and the need for a common language to support common understanding.
• The need to ensure alignment/coherence of performance throughout all parts of an organisation.
• The incomplete view of performance implicitly assumed within much design process research.
• The need to analyse the factors influencing performance in order to understand how performance may be impacted and/or predicted in the presence/absence of these factors.
The chapter has presented an overview of existing work and identified particular weaknesses in relation to the different areas addressed. The following points summarise the findings:
• The research contains a very wide range of terms that reflect different aspects of performance and its related elements. There exists no common framework in which to relate these terms and understand how they relate to the phenomenon of performance. This contributes to difficulties in interpreting the results of research in this area and hinders progress in the research.
• Design research has substantially improved our understanding of the design process, and general agreement exists among many of the process models presented in research. Design and its management are both addressed in the research, but they are not jointly addressed in the process models, nor is their relationship defined. This limits our understanding of performance in design development.
• Coherence in performance is accepted as a key requirement within the broad literature on performance. A number of approaches are reviewed here that offer solutions but are not considered directly applicable in design development. The approaches provide more information on what to do than on how it might be done.
• The identification and assessment of factors influencing performance is widely reported as empirical research. The limitations of this type of research in a non-uniform/repeatable environment such as design development have been described. Further approaches have been reviewed and their weaknesses discussed.
Based on the work reviewed in this chapter, the following are key issues that a methodology for modelling and analysing performance in design and development should address:

• It should present a comprehensive formalism of design development performance that defines and relates the key elements, e.g. efficiency and effectiveness, and clarifies related elements such as influences on performance.
• The formalism should allow design and its management to be distinguished at the level of activities in design development, i.e. it should provide a more comprehensive view of performance than current models of the design process. Further, the relationship between these should be determined within a process model of design and its management.
• The modelling and analysis of performance should address the issue of coherence and avoid sub-optimisation or isolated assessment of performance within a particular area of design development.
• The analysis of performance should allow the identification of influencing factors and the nature of that influence in a manner that supports the implementation of future performance improvements.

Part 2 of this book specifies key components to support the requirements specified here and presents the elements within an overall methodology for performance modelling and analysis in design development.
Notes

1 The table is not intended to represent an exhaustive coverage of all terms and definitions but presents a sample to indicate the range of interpretations.
2 It should be noted here that defining the dimensions of performance does not constitute a generic definition of performance, as the appropriate dimensions may vary while the definition should remain consistent.
3 Product Development Management Association
4 Industrial Practice

Chapter 3 presented an overview of the work on performance and referred to some of the approaches to performance analysis that are described in various articles. The following section briefly reviews practices more commonly used in industry aimed at analysing and improving performance, both from an overall business and a design development viewpoint. Further insight into industrial practice is gained through developing and implementing an approach to performance analysis based on existing research. The implementation of the approach is described and key conclusions are discussed.
4.1 Current Practice

Within current industrial practice, Management Information Systems (MIS) in general support a degree of performance management. There is a wide range of solutions aimed at capturing data and relying on fundamental database principles [101] to interrogate, analyse and report on that data in a meaningful way. The data required for these systems is generated through the application of suitable metrics to derive performance values. The focus of this book is on providing the means to determine such metrics rather than on their application to measurement. Therefore the general area of data management systems as applied to performance reporting and analysis is not considered here.

General principles for the design of performance measurement systems have been proposed by a number of authors [7, 44, 102–104]. The main themes to emerge include:
1. Measures should relate to the strategy/objectives of the organisation.
2. The approach should be simple and easy to use.
3. Measurement should allow comparisons and focus on improvement, not just monitoring.
4. Users of the results must have an influence on the criteria.
5. A variety of measures and types of measures should be used.
6. Rapid feedback of the results should be possible.
4.1.1 Business Level Approaches

Performance analysis may be carried out in industry either on a self-assessment basis or with the support of external consultants. At the overall business level two frameworks tend to dominate the area of self-assessment, i.e. the Malcolm Baldrige National Quality Award [105] in the United States and the European Foundation for Quality Management's (EFQM) Business Excellence Model (BEM) [106] in Europe. Both of these frameworks provide a basis for analysing design development at a relatively high level of abstraction and are focused on the achievement of an award and recognition. That is, they view design development as a "black box", offering little insight into the detail of its performance. For example, in analysing the design processes for the Baldrige award, applicants are asked to describe the following:

Design Processes
How new, modified, and customised products and services, and production/delivery processes are designed and implemented. Including:
(1) How changing customer and market requirements and technology are incorporated into product and service designs.
(2) How production/delivery processes are designed to meet customer, quality, and operational performance requirements.
(3) How design and production/delivery processes are coordinated and tested to ensure trouble-free and timely introduction and delivery of products and services.
(4) How design processes are evaluated and improved to achieve better performance, including improvements to products and services, transfer of learning to other company units and projects, and reduced cycle time.
Source: National Institute of Standards and Technology [105]

This description is then evaluated by examiners, using a general scoring structure, which is applied in relation to the different areas described by the applicants. The framework offers a high-level analysis of design development but does not support measurement of performance in any detail. Similarly, the EFQM Business Excellence Model offers little insight into design development performance and is concerned with business-level descriptions/evaluations. Indeed, the meaning of business excellence is not widely agreed in industry and a wide variety of approaches have been adopted in self-assessment [107].
4.1.2 Design Development Analysis

Within the UK the Department of Trade and Industry (DTI) have produced a self-assessment guide, "Innovation Your Move" [108], which may be used with or without external support. The guide provides support for an initial self-assessment of design development before focusing on areas that require more in-depth assessment. The initial phase involves completing a set of scorecards provided in a workbook. The scorecards contain a series of statements in relation to a particular topic, with a number (score) related to each statement. The statement that most closely resembles the practice within the organisation is chosen and the relevant score is recorded (fractional scores may be used, e.g. 2.5). An example statement on the product development process is:

The product development process
1. No product development procedures.
2. Simple procedures applied to all projects but activities carried out in sequence, not in parallel.
3. Product development on major projects planned in phases with reviews.
4. Established procedures and objectives with flexibility to allow small projects to move through quickly. Parallel and integrated activities.
Source: "Innovation Your Move" [108]

The statement corresponding to the maximum score (4) is considered to be best practice. The overall scores are recorded and a profile of the organisation is generated (e.g. Figure 4.1 presents a profile for Product Development). The difference between the scores recorded and the maximum score is considered a gap to be closed, and therefore a potential area for focus in more in-depth analysis.
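To make the mechanics of the self-assessment concrete, the following is a minimal sketch of the gap calculation described above. The topic names are those shown in Figure 4.1, but the recorded scores are invented for illustration and are not taken from the DTI workbook.

```python
# Minimal sketch of the scorecard "gap" calculation: each topic's recorded score
# is compared with the maximum (best practice) score, and the gaps indicate
# candidate areas for more in-depth analysis. Scores here are illustrative only.
MAX_SCORE = 4

recorded = {
    "The product development process": 2.5,  # fractional scores are permitted
    "Teamwork and organisation": 3.0,
    "Transfer to manufacturing": 2.0,
    "Industrial design": 1.5,
}

gaps = {topic: MAX_SCORE - score for topic, score in recorded.items()}

# Largest gap first: the candidate focus areas for deeper assessment.
for topic, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{topic}: score {recorded[topic]}, gap {gap}")
```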
The DTI approach offers more detailed analysis of the design development process than the wider frameworks of Baldrige and EFQM. However, it has a number of inherent drawbacks:

• The approach does not take account of the specific goals of the organisation, and therefore their weightings/priorities in relation to design development, i.e. a maximum score may not be appropriate for some organisations.
[Figure 4.1 Organisation profile for product development (source: DTI): recorded scores (0–4) for the topics 'The product development process', 'Teamwork and organisation', 'Transfer to manufacturing' and 'Industrial design'.]
• There are implicit assumptions within the approach, relating to best practice, which are not widely agreed in research. For example, in the description provided above there is an assumption that carrying out activities in parallel is superior to a sequential process. A number of authors have raised issues around the adoption of Concurrent Engineering approaches [46, 109], illustrating that such a general assumption is inappropriate. Indeed, it is suggested that activities should not simply be carried out in parallel but should be co-ordinated to give "the right results, in the right place, at the right time, for the right reasons" [110].

A concept for analysing the worth of Design Coordination (DC), a research approach to Product Development, has been developed and presented in DMMI '95. This concept was developed through a collaboration of industrial and academic partners and was based on the view that approaches such as Design Coordination should not be adopted by industry until an appreciation of their potential to support an organisation is obtained. Design Coordination is defined as [110]:

a high level concept of the planning, scheduling, representation, decision making and control of product development with respect to time, tasks, resource utilisation and design aspects.

The concept is proposed as a basis to support a number of aspects of product development, i.e.:

• Concurrent Engineering
• Decision Support
• Design Management
• Product Management
• Team Engineering

Within each of these aspects there are considered to be a number of DC means (methodologies/tools), as presented in Table 4.1.

The analysis approach presented by Duffy in [111] is based on determining how well the DC means support the enterprise. Recognising that the support provided was likely to vary across different organisations, an initial step of defining and prioritising the enterprise requirements was carried out. The various DC means, such as Design for X, Process Modelling and Variant Management, are subsequently analysed in terms of how much impact they have on meeting the requirements of the enterprise. This ensures that the results of such an analysis are more directly relevant to the organisation and are not based on generic assumptions about the value of different approaches. A matrix is proposed to support and provide structure for this analysis, based on the Quality Function Deployment (QFD) approach [112].

The approach provides a framework to obtain greater insight into the impact of DC Means on performance against particular goals. From such an analysis, those means with greater impact on goals could be further exploited to increase performance. That is, the concept seemed to provide a basis for achieving performance improvement. Further, the expertise of one of the co-creators of the original approach was readily available to support implementation of the approach within an industrial trial. Such a trial was considered appropriate in establishing insight into the requirements for performance analysis in industry. This insight is captured here.
Table 4.1. Product development aspects and DC means

Concurrent Engineering: DFX; DFX Management; Life Cycle Issues; Providence; XTD

Team Engineering: Inter/Intra Integration; Formation; Assessment and enhancement; Negotiation support/management

Decision Support: Authority, responsibility and control; Conflict resolution; Consistency management; Effective communication; Information integration; Integration and coherence; Knowledge management; Multi-criteria management; Probability and risk assessment

Product Management: Configuration management; Design re-use and standardisation; Evolution management; Integration and control; Integrity; Variant management; Viewpoint management

Design Management: Design experience re-use; Distributed design authority; Planning, scheduling and control; Process modelling; Resource management; Right-first-time, re-work and iteration control; Task management
4.2 Industrial Insight

In order to gain first-hand knowledge of industry needs and views it was decided to adapt the approach of Duffy and implement it within an industrial environment. A computer-based system was developed to automate much of the analysis and provide graphical representations of the results during the trial¹. The following provides some background to the trial, an overview of the approach used, the results achieved and the key lessons which emerged to inform the work reported in this book.

4.2.1 Background

The industrial trial was held in Company A² in 1996 over a two-day period. The company is involved in the design and manufacture of large-scale engineering products and was just embarking on a specific project (P1), with the first products due for delivery in 2002. The company was briefed on the approach and felt it provided the opportunity to use the knowledge and experience gained in previous work to support the analysis and identify key target areas for performance improvement in Project P1. Their overall aim for the exercise was therefore to improve performance in this project over previous projects.
4.2.2 Overview of Approach

The approach followed was based on the concept of analysing the relationship between the various DC means and the product development goals of the organisation, e.g. establishing how well variant management contributed to the achievement of cost reduction. The key steps involved:

• Definition of the scope of the analysis.
• Definition and prioritisation of PD goals.
• Definition of DC means.
• Analysis of the impact of DC means on these goals.
• Presentation and review of results.
• Achieving agreement on key areas of focus for performance improvement.

The following provides a brief description of the process implemented at Company A, including examples of the outputs generated. A full presentation of the results was provided to the company in the form of a report [113].
4.2.3 The Process

The analysis of Company A was carried out with a team of senior managers who were involved in Project P1 on either a full- or part-time basis. The authors facilitated the process.

4.2.3.1 Scope of Analysis

Although Company A was aiming to improve performance in the overall project, they identified design development as having the greatest influence on the project outcome and selected this area as the focus of the analysis. The participants in the analysis were selected from a broad range of disciplines to ensure that the influence of design development on other areas of the project was taken into account, e.g.:

• Computer Aided Design
• Manufacturing
• Technical Design
• Testing

4.2.3.2 Goals and Priorities

The goals were defined and prioritised based on the agreed scope of analysis. This was achieved using a brainstorming session followed by a consolidation period where various points were discussed and a final listing was agreed. Individuals from different disciplines had opposing views on the priority of some goals, such as the reduction of rework, while goals such as profit had wider agreement. A final agreement on the goals and priorities was reached through facilitated discussion and the results are shown in Table 4.2. The priorities were defined as Important (I), Very Important (VI) and Critical (C).
Table 4.2. Company A's product development goals

No.  Title         Description                                                   Priority
1    Risk          Minimise development work and technology risk                 VI
2    Rework        Reduce rework                                                 VI
3    Meet prog     Meet the programme                                            C
4    Coherence     Maximise customer/Company A's coherence with specification    VI
5    Achieve spec  Achieve the specification                                     C
6    Profit        Achieve or improve planned profit margin                      VI
7    Customer      Provide a quality product that delights the customer          I
8    Reputation    Enhance the company's reputation in the marketplace           I
9    Integration   Maximise the integration of operations                        VI
4.2.3.3 PD Aspects and DC Means

Before analysing the impact of resources on goals, a list of resources as defined in [111] was presented to the participants. This list provides broad categories, or PD Aspects, such as Concurrent Engineering, Team Engineering, etc. Within each PD Aspect a number of DC Means are defined, such as conflict resolution, (product) integrity, etc. These means were fully described to the analysis team, in accordance with the information presented in [111]. Further to this, the participants were asked to brainstorm any additional resources that were specific to the organisation and scope of analysis. It was found that these resources could be categorised within the original list presented, and this list was subsequently agreed as the high-level list for analysis (see Table 4.1).
4.2.3.4 Assessment

A matrix is used to relate the means to the goals, i.e. to allow the impact of DC means on goals to be defined. As a development of the original approach of Duffy, two additional elements were introduced to enhance the overall approach:

1. The concept of Ease is introduced to indicate the perceived ease/difficulty for Company A in further utilising/exploiting a particular means, based on the expertise, intellect, resources, etc. available. For example, the analysis team may believe that it would be much more difficult to further exploit configuration management than process modelling. The ease refers both to the further exploitation of an existing resource and to the introduction and exploitation of a new resource. Ease may be indicated on a relative scale as Low (L), Medium (M) or High (H). A Low ease assigned to a resource indicates that the organisation is less capable of further exploiting that resource than one with a Medium or High ease.

2. The assessment was carried out on the basis of Current and Ideal scenarios. That is, the impact of a resource on a goal was assessed both on the basis of ideal conditions in the organisation and on the basis of the conditions that currently exist. The ideal conditions would be considered to include all the necessary people, training, computer systems, etc. to support a particular area. The definition of impact on a current and ideal basis allowed the scope for improvement to be identified.
A summary of the scales used in the analysis matrix, based on those adopted in QFD, is presented in Table 4.3.

Table 4.3. Analysis scales

Data Element   Notation   Numerical Value   Meaning
Priority       I/VI/C     1, 3, 5           Priority of Goal
Ease           L/M/H      1, 3, 5           Ease of Exploitation of Resource
Impact         L/M/H      1, 3, 9           Impact of Resource on Goal
The analysis team was divided into a number of sub-groups to conduct the analysis of ease and impact for the matrix. Each sub-group analysed a different set of resources and inserted values into a matrix template (printed on an overhead transparency). Following the individual sub-group analysis, a consolidation session took place where each of the sub-groups presented their results to the rest of the team. This allowed particular issues to be raised and some changes to be made, and resulted in a greater degree of consensus on the results.

Figure 4.2 presents the results of the assessment based on ideal conditions. The completed matrix provides a template where the goals/priorities are presented, the resources and their groupings are listed and the ease of exploitation of resources is defined. Further, it presents an overall model of the relationships between each goal and each resource, i.e. in Figure 4.2 there are 288 relationships evident. This constitutes a comprehensive and detailed analysis of Project P1.
4.2.3.5 Analysis

Having agreed the final data entries in the matrix templates, these entries were transferred to the software system to allow automated analysis and the production of graphical results. The analysis of the data is based on the following measures (a worked sketch follows the list):

• Contribution of DC Means: the contribution of a DC means is defined as the product of the impact and the goal priority. This is used as a basis to calculate the contribution of individual means and of the overall DC Aspects.
• Return on Investment (RoI): the RoI is based on the product of the contribution and the ease for each DC means.
• Scope for Improvement: this is determined by subtracting the values of current contribution from the ideal values to indicate where the greatest differences exist.
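As an illustration of how these measures combine, the following is a minimal sketch of the calculation for a single DC means, using the numerical scales of Table 4.3. The goals, priorities, ease and impact values are invented for the example and are not Company A's data.

```python
# Minimal sketch of the matrix calculations for one DC means, using the
# numerical scales of Table 4.3. All input values below are illustrative only.
PRIORITY = {"I": 1, "VI": 3, "C": 5}   # priority of a goal
EASE = {"L": 1, "M": 3, "H": 5}        # ease of exploiting a resource
IMPACT = {"L": 1, "M": 3, "H": 9}      # impact of a resource on a goal

goal_priority = {"Meet prog": "C", "Rework": "VI", "Customer": "I"}
ideal_impact = {"Meet prog": "H", "Rework": "H", "Customer": "M"}
current_impact = {"Meet prog": "L", "Rework": "M", "Customer": "L"}
ease = "M"

def contribution(impact_by_goal):
    """Contribution of a DC means: sum over goals of impact x goal priority."""
    return sum(IMPACT[impact_by_goal[g]] * PRIORITY[p]
               for g, p in goal_priority.items())

ideal = contribution(ideal_impact)
current = contribution(current_impact)
roi = ideal * EASE[ease]      # return on investment: contribution x ease
scope = ideal - current       # scope for improvement: ideal minus current

print(f"ideal={ideal}, current={current}, RoI={roi}, scope for improvement={scope}")
```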
4.2.3.6 Presentation

The results of the analysis are transformed to graphical representation to aid interpretation and transferred automatically to a standard presentation provided via a data projector. The following graphs were used within the standard presentation:
[Figure 4.2 Company A data input (ideal): the completed analysis matrix relating the enterprise requirements (the nine goals of Table 4.2 and their priorities) to the PD Aspects and DC Means of Table 4.1, recording the ease of exploitation of each means and its impact on each goal under ideal conditions.]
A. Pie Charts comparing the contribution of:
   1. All DC aspects on each goal
   2. Each DC aspect on all goals (e.g. Figure 4.3 (a))
   3. Each DC aspect on individual goals
   4. Each sub-aspect of a DC means on all goals.

B. Bar Charts comparing the ideal and current contribution of:
   1. All DC aspects on each goal
   2. Each DC aspect on all goals
   3. Each DC aspect on individual goals (e.g. Figure 4.3 (b) shows the contribution on Meeting the Programme)
   4. Each DC means on all goals.

C. Radar Diagrams comparing the return on investment (RoI) for each sub-aspect on all goals (e.g. Figure 4.3 (c): the RoI for Team Engineering sub-aspects).

D. Scatter Diagrams plotting ease versus contribution of each sub-aspect on all goals (e.g. Figure 4.3 (d): the ease versus contribution of Product Management sub-aspects).
[Figure 4.3 Results from the Company A trial: (a) pie chart of the ideal contribution of each DC aspect across all goals; (b) bar chart comparing ideal and current contribution of each DC aspect to the goal 'Meet the programme'; (c) radar diagram of the return on investment for the Team Engineering sub-aspects; (d) scatter diagram of ease versus contribution for the Product Management sub-aspects.]
The results were presented to the analysis team a short time (approximately 30 minutes) after they had completed their data input in the matrices, to ensure continued engagement in the process. The results supported discussion around areas where the organisation might focus efforts in improving performance. The automation of much of the analysis via computer support allowed what-if scenarios to be developed during this discussion session. This served to confirm the results with the participants, i.e. many did not fully believe the results and were allowed to alter particular values to simulate the effect on the overall results. The presentation of the results on the same day contrasted with the previous experiences of the company, where the results would be obtained a number of weeks later, when many had forgotten the rationale behind them.

4.2.3.7 Agreed Target Areas

In the process of agreeing the target areas for performance improvement, the analysis team reviewed the overall results as presented in Table 4.4 and the graphical outputs.
Table 4.4. Overall Company A results (ordered by ideal impact)

                                                            % of Total (Ideal)
No.  Ease  DC Means                                         Current  Ideal  Difference
1    L     Multi-criteria management                        1.31     5.19   3.87
2    M     Inter/Intra integration                          0.97     4.86   3.99
3    L     Effective communication                          0.61     4.86   4.35
4    L     Distributed design authority                     0.48     4.64   4.17
5    M     Authority, responsibility and control            0.68     4.49   3.81
6    M     Conflict resolution                              0.97     4.37   3.40
7    M     Negotiation support/management                   0.43     4.15   3.72
8    M     XTD                                              0.63     4.03   3.40
9    L     Information integration                          0.72     3.90   3.17
10   H     Design re-use and standardisation                0.57     3.76   3.19
11   L     Integrity                                        3.32     3.74   3.42
12   L     Integration and coherence                        0.97     3.69   2.72
13   L     Probability and risk assessment                  0.52     3.60   3.08
14   M     Process modelling                                0.25     3.13   2.88
15   M     Knowledge management                             1.04     3.08   2.04
16   M     Consistency management                           0.77     3.01   2.24
17   M     Configuration management                         0.43     2.97   2.54
18   M     Formation                                        0.61     2.92   2.31
19   L     DFX                                              0.43     2.80   2.47
20   M     Evolution management                             0.36     2.58   2.22
21   L     Integration and Control                          0.66     2.56   1.90
22   L     Empowerment                                      0.00     2.51   2.61
23   H     Design experience re-use                         0.72     2.27   1.54
24   L     Right-first-time, re-work & iteration control    0.00     2.20   2.20
25   L     Planning, scheduling and control                 0.32     2.13   1.81
26   L     Variant management                               0.50     2.11   1.61
27   M     Viewpoint management                             0.20     1.97   1.77
28   M     Assessment & enhancement                         0.18     1.68   1.50
29   H     Task management                                  0.18     1.56   1.38
30   L     Life cycle issues                                0.05     1.56   1.52
31   M     Resource management                              0.23     1.43   1.30
32   L     DFX Management                                   0.14     1.34   1.20
33   M     Providence                                       0.02     0.61   0.59
In general, the team tended towards selecting those DC Means appearing in the top ten in terms of contribution. However, it was recognised that Design for X (DFX) and X to Design (XTD) were inextricably linked and that these areas should be treated as one in terms of focusing improvement efforts. That is, Design for X can be seen to incorporate the ability of the design department to design a product which supports existing manufacturing capability, while X to Design may reflect the ability of the manufacturing department to manufacture in accordance with the specifications. Therefore, although DFX was outside the top ten in terms of ideal contribution, it was grouped with XTD to become one of seven target areas for performance improvement (shaded in Table 4.4).
4.2.4 Trial Results

The trial was conducted in order to gain insight into the needs of industry for support in performance analysis and improvement. Therefore, during the trial all points raised by the participants, questions asked and general comments were recorded. In addition, a discussion took place at the end of the trial where the facilitators requested more general feedback on the worth of the approach.

The following provides a summary of the key points raised, which act as a guide to the development of the work reported in Part 2. The summary is presented as strengths and weaknesses of the approach.
4.2.4.1 Strengths

• The participants commented favourably on the overall logical structure provided by the approach, which allowed a detailed analysis of their design development performance in a relatively short time period. They felt that the matrix approach provided a structure that allowed them to engage in self-analysis in a methodical way.
• This methodical approach also supported a very detailed analysis, i.e. 9 goals and their associated priorities were specified, resulting in 288 impact relationships being defined.
• The results, when presented within about 30 minutes of finalising the data, created significant discussion and many questions were raised as to the validity of these results. What-if scenarios were requested by the participants, which were simulated using the software system. This supported consensus building within the team and ensured all team members were given the opportunity to raise concerns and have them addressed.
• The approach allowed Company A to assess performance against their specific enterprise requirements. This compared favourably with more general approaches the company had used previously, which were not seen to be as relevant to their specific needs.
• The ability of the approach to produce results on the day of the analysis was welcomed by the analysis team as it allowed a review of the results while the rationale was still evident. Delivering results after the event was considered less favourable and unlikely to produce the same level of consensus.
4.2.4.2 Weaknesses

• From a general viewpoint there was confusion over many of the terms being used, and their exact meaning/nature was questioned. In particular the following were unclear:
  – The participants had difficulty understanding the nature/meaning of impact and contribution. Other terms were also referred to in discussing this area, such as the effect and the effectiveness of a DC means.
  – The relationship between a PD Aspect and a DC Means.
  – The meaning of each of the different DC Means in relation to Company A's environment, e.g. integration and control.
• Although the list of DC means covered all of the resources that Company A had identified within their organisation, there was a sense that they would have been more comfortable defining their own individual resources and groupings.
• Although the analysis was automated as far as possible, some of the formatting and presentation of graphical results had to be done manually, which restricted flexibility in terms of creating what-if scenarios. That is, it was difficult to meet the needs of the analysis group in an appropriate time period. Greater flexibility in adding/removing resources/goals would enhance the support provided.
4.2.5 Further Analysis

Following this initial trial and a review of the results, a more fundamental understanding of performance and its formalism is presented in Chapter 5. As this was being progressed, a repeat analysis was carried out at Company A to assess the degree of improvement two and a half years into Project P1. This provided an opportunity to further evaluate the approach and confirm the findings of the initial trial.

The analysis was conducted at Company A three years later and was based on the data originally obtained (Section 4.2.3). That is, the resources, goals/priorities, etc. were considered to be unchanged for the purpose of internally benchmarking performance over the three-year period. The ideal and current impact values from the original analysis served as the basis for comparing the updated performance. The analysis team was only asked to establish the updated impact of the resources on the goals. This was input to the software system, which provided graphical results of the comparison between the original and the updated analyses. The use of the data from both the original and updated analyses resulted in three sets of impact relationships, i.e.:

• Ideal (established in the original analysis)
• Original
• Updated (3 years later)

A number of graphs were plotted to illustrate comparisons across the three data sets. An overall comparison is shown in Figure 4.4, which relates the contribution (as a percentage of total contribution) of resources [114].
[Figure 4.4 Comparison of results over a three year period at Company A: bar chart of the contribution (as a percentage of the ideal total) of each DC means, listed with its ease as in Table 4.4, comparing the ideal and original values with the updated values obtained three years later.]
The graph compares the current and ideal, as established in the original analysis, with the current situation as defined three years later. As all factors remained constant, with the exception of impact, the changes between the original values and the updated ones are due solely to changes in impact. It can be seen from Figure 4.4 that the ideal values were exceeded for a number of resources.

Following the original analysis, Company A implemented a number of initiatives to address the key target areas identified using the analysis. The updated review highlighted that they had achieved significant improvements in their use of resources to achieve their goals. However, the improvements gained raised further questions around their understanding of the impact of resources. In a review of this second analysis, the team commented that in the original analysis they had limited knowledge of the systems, techniques, etc. that could be employed by the organisation. The team felt their knowledge was such that they underestimated the ideal impact of different methods, tools, etc., as they were not as aware of these as the facilitators. This resulted in a situation where they exceeded the ideal impact in the updated analysis and raised the issue of defining what the ideal actually means.
1
2
4.3 Summary 3
4
The company considered the industrial trials successful as they resulted in 5
an increased insight into their design development performance and pro- 6
vided them with focus areas for improvement. Although there were issues of 7
understanding the company felt that the results of the second trial were largely 8
accurate and that signicant improvements had been made, i.e. comparing 9
the original to the updated (3 years later) results. Some of the participants 50
provided general comments on the approach: 51111
"Without the audit and analysis . . . many of the critical issues would have received only a token effort, in favour of the intuitive techniques. Progress to date has been so much more certain and confident, with the team armed with the knowledge that they are not just indulging in fanciful HR and formal techniques but are indeed doing more to guarantee the success of a very large and complex undertaking."
Technical Director

"I was particularly impressed with the fact that while the [facilitating] team provided the method, it was our input and experience which generated the results which were appropriate for us. Through this approach, we have been able to provide ourselves with a solid foundation for our improvement initiatives and a realistic and pertinent set of objectives for our future development."
Technical Project Manager

The review and trial reported here provided first-hand knowledge of industrial issues and resulted in the identification of key areas of focus for the remainder of the research, i.e. to address the existing weaknesses. Part 2 of the book is aimed at providing a more fundamental understanding of performance and developing the underlying concepts that address these weaknesses.
Notes

1 This system formed a preliminary framework for the design and development of a final system solution suitable for industrial application and used in the evaluation of the research reported here.
2 The actual name of the company and some additional details are considered confidential and withheld in this book.
PART II

A Methodology for Enhanced Design Performance
5 A Formalism for Design Performance Measurement and Management
Part 1 of this book highlights a number of important weaknesses in our understanding of performance in design development. A fundamental model of design performance is introduced here, based on a well-proven modelling formalism (IDEF), which relates specific elements of performance and highlights its support for performance measurement and management.

The chapter begins by presenting a knowledge-processing model of activities in design development, detailing the key elements of knowledge. This model is enhanced to provide a distinction between design and its management within what is termed here as the Design Activity Management (DAM) model, describing managed activities. Typical relationships that may exist between managed activities are highlighted to illustrate how the model could describe the design process. A fundamental model of design performance is then introduced based on the knowledge-processing model, clearly distinguishing and relating efficiency and effectiveness. The application of the performance formalisms provided in this model is presented within a scenario of analysis and decision making in the Performance Measurement and Management (PMM) model. Finally, the work described in this chapter is related to the findings from Part 1.
5.1 Activity Model

In analysing performance in design, the subject of analysis may vary considerably, i.e. the object of analysis may be a complete design project, a functional department, a particular process, etc. The work described in this chapter is focused on the performance of an individual design activity while being generically applicable across multiple activities. As highlighted in Section 1.1, activities are the fundamental elements that transform input to output and are the basic components of processes, phases, projects, etc. An activity model is presented here (Figure 5.1), focusing on knowledge in design. This model is based on IDEF [115], one of the Integrated Computer Aided Manufacturing Definition (IDEF) techniques, which was specifically created to model activities, processes or functions.

[Figure 5.1 Knowledge-processing activity: an activity (A) with knowledge input (I) and output (O), directed by goal knowledge (G) and acted on by resource knowledge (R).]
5.1.1 A Knowledge-Based Model of Design

Design may be seen as the processing of knowledge [116], i.e. knowledge is continuously evolved as a result of specific activities, between the extremes of abstract versus concrete and general versus specific [36, 41, 117]. Figure 5.1 illustrates such an activity and the key categories of knowledge that relate to it. All inputs and outputs may be represented as forms of knowledge, e.g. a designer is represented in this model as a knowledge resource (R), the state of the design prior to the activity may be described as the knowledge input (I), etc. Four categories of knowledge are identified here:

• Knowledge Input (I): the knowledge present prior to the activity.
• Knowledge Output (O): the knowledge present as a result of the activity taking place.
• Knowledge Goal (G): the knowledge that directs and constrains the activity.
• Knowledge Resource (R): the knowledge that acts on the input to produce the output.

These categories are detailed further below, and it is shown that the category in which an element of knowledge resides is not fixed but is derived from the context of the model, i.e. the activity to which it is related. For example, an output of one activity may act as a constraint on another.
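As an informal illustration of these categories, the sketch below represents an activity as a simple record of its I, G, R and O knowledge. The class and the example strings are ours, introduced for illustration only; they are not part of the formalism itself.

```python
# Minimal sketch of a knowledge-processing activity with the four knowledge
# categories: input (I), goal (G), resource (R) and output (O).
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    inputs: set = field(default_factory=set)     # I: knowledge present before the activity
    goals: set = field(default_factory=set)      # G: knowledge directing/constraining it
    resources: set = field(default_factory=set)  # R: knowledge acting on the input
    outputs: set = field(default_factory=set)    # O: knowledge produced by the activity

define_pds = Activity(
    name="Define PDS",
    inputs={"customer need"},
    goals={"as comprehensive a specification as possible"},
    resources={"designer"},
    outputs={"PDS"},
)

# The category of a knowledge element is contextual: the PDS produced above may
# act as goal knowledge for a later activity (cf. Section 5.1.1.4).
embodiment = Activity(
    name="Embodiment design",
    inputs={"concept sketch"},
    goals=set(define_pds.outputs),  # the same element, now in the goal category
    resources={"designer"},
)
```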
5.1.1.1 Input and Output

The input and output refer to the initial and final states of knowledge, in relation to both the design¹ and the design activity². Typically, the knowledge would include details of:

a) The design, i.e. the input could be a concept sketch, with an output being a more detailed description of the design.
b) The design activity, i.e. the input could be knowledge of current expenditure related to the resources used in activities prior to the start of the activity being modelled. The output will also have a value for expenditure, which will represent an increase on the input value, as further resources will have been used during the activity.

The distinction between design and design activity knowledge is further elaborated in Section 5.2.
5.1.1.2 Resources

Designers are utilised within activities as sources of knowledge, along with knowledge of their particular attributes such as cost, availability, etc. People are generally viewed as the core resource in design [118], but many other resources, such as computer tools, materials, techniques and information sources (e.g. standards), are also used. These resources incur a cost when used by an activity, e.g. the use of a method such as Quality Function Deployment (QFD) [112] may involve a training or licensing cost. Although resources may be categorised, e.g. Eynard [119] distinguishes between human, material and informational resources, for performance management purposes all resources may be represented in the form of knowledge that can be utilised within different activities. These resources will have various attributes such as cost, capability, availability, etc. [120], which are evaluated when being allocated.
5.1.1.3 Goals and Constraints

Goals (G) are specific elements of knowledge that direct the change in the state of the design from initial input (I) to final output (O) states. A goal refers to a future situation which is perceived by the goal originator to be more desirable than the current situation [15]. It is suggested that goals are almost ubiquitous in design, although they are often implicit and ill defined [121]. In design and development, goals are generally stated in the form of objectives which make up the design specification, e.g. a design goal to maximise the value of a particular property (p) in the design, Gj: p = Max. Design goals such as this allow the evaluation of the solution with respect to a datum in a way that allows a meaningful rank ordering of design proposals [15]. Figure 5.2 (a) shows that two proposals, O(s1) and O(s2), are considered. A value for the property (p) may be established for each proposal and compared against the datum (p = 0). In this case, with the goal stated as Gj: p = Max, the comparison may establish the extent to which the proposals meet the goal. Although both may be considered solutions as they meet the goal to some extent, O(s1) is more desirable than O(s2) due to its greater difference to the datum in the preferred direction³.

[Figure 5.2 Goals and constraints: (a) evaluation of proposals O(s1) and O(s2) against the goal Gj: p = Max; (b) validation against the constraint Gj: p ≥ pmin; (c) the combined constraint and goal Gj: p ≥ pmin ∧ p = Max.]

Design constraints place stricter limits on the desired future situation and provide a basis for the validation of solutions. For example, a constraint may be introduced which states the minimum value for a property (Gj: p ≥ pmin). In Figure 5.2 (b) the value of p for both proposals, O(s1) and O(s2), is again established and compared against the new datum (pmin). However, the comparison in this case is carried out to establish which side of the datum the proposal lies, not its distance from the datum. By specifying this constraint, only proposals having a value of p ≥ pmin may be validated as solutions. Therefore the design output O(s2) is not considered a valid solution in this case.

When using constraints there is less focus on the extent to which a constraint has been met, and more on whether it has been met or not. Therefore, in the example shown in Figure 5.2 (c) there is no distinction, in terms of being above the minimum value (pmin), between the design proposals O(s1) and O(s2), as both meet the constraint. However, the addition of a goal to maximise p allows a distinction to be drawn between O(s1) and O(s2) in terms of how well they meet this goal. When the property p for proposals O(s1) and O(s2) is assessed, it will become evident that O(s1) has a higher value for p than O(s2) and therefore meets the goal better.

Both goals and constraints play a role in the selection of design proposals. In both cases the value of a property in the design proposal, to which the goal and/or constraint relates, is established. The comparison of the value to a goal is more complex than its comparison to a constraint and allows evaluation as opposed to validation. However, the similarity in the use of goals and constraints when assessing design proposals allows both to be treated similarly within the models described here, as only the nature of the comparison changes. Consequently, G is used to represent the knowledge of both goals and constraints within this work.
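The distinction between evaluation against a goal and validation against a constraint can be illustrated with a small sketch; the property values and the helper names below are invented for the example and follow the notation of Figure 5.2.

```python
# Minimal sketch: a constraint validates a proposal (which side of the datum it
# lies), while a goal evaluates it (how far it lies from the datum in the
# preferred direction). Values are illustrative only.
p_min = 4.0                               # constraint datum: Gj requires p >= p_min
proposals = {"O(s1)": 9.0, "O(s2)": 5.5}  # value of property p for each proposal

def is_valid(p):
    """Validation against the constraint Gj: p >= p_min."""
    return p >= p_min

def merit(p):
    """Evaluation against the goal Gj: p = Max (distance above the datum p = 0)."""
    return p - 0.0

valid = {name: p for name, p in proposals.items() if is_valid(p)}
ranked = sorted(valid, key=lambda name: merit(valid[name]), reverse=True)
print("valid solutions, best first:", ranked)  # O(s1) ranks above O(s2)
```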
Although the overriding goal (G) in design may be to contribute to the financial performance of the organisation, this goal must be broken down through a number of levels to provide meaningful direction to departments, teams, individuals, etc. Goal decomposition, often via a hierarchical structure, is used in design to reduce the complexity of problem solving and provide a means by which sub-solutions may be evaluated throughout the design process [12, 15, 122–124]. Within a goal breakdown structure, sub-goals may exist both for the design (DG), e.g. reliability, aesthetics, etc., and for the design activity (DAG) for creating that design, e.g. time consumed, labour costs, etc. (Figure 5.3).

[Figure 5.3 Goal decomposition: the overall goal G decomposes into design goals (DG) and design activity goals (DAG), each of which decomposes further into sub-goals (DGj ... DGm and DAGj ... DAGm).]

During the design activity these goals are constantly changing: new goals are added and others are removed [125]. Design management involves continually assessing performance with respect to both design and design activity goals and altering aspects such as resource allocation to achieve optimal performance [126]. The design management activity requires additional knowledge of the goals, i.e. their relative priorities, to support decision making with respect to multiple criteria [127, 128]. Often priorities for goals are defined somewhat informally, e.g. designers may rank goals in relation to how difficult to satisfy they perceive them to be [36]. The prioritisation of goals is further discussed in Section 7.2.3 of this book.
5.1.1.4 The Context of Knowledge Categories

A particular activity defines the input, output, goal and resource knowledge. That is, the category within which an element of knowledge exists is dependent on the context in which it is described. For example, in Figure 5.4 the activity of defining the Product Design Specification (PDS) [129] is illustrated. This activity begins with knowledge of the customer need as input (I). The goal (G) of the activity could be to create as comprehensive a specification as possible. The output knowledge (O) is represented in the PDS, which is created using a designer as the resource knowledge (R).

[Figure 5.4 Knowledge context: the activity 'Define PDS' with the customer need as input (I), a comprehensive specification as goal (G), the designer as resource (R) and the PDS as output (O).]

However, the category of each element of knowledge described above is a result of the context represented in Figure 5.4. That is, the same knowledge elements could exist within a different category of knowledge in a different context. For example, at a higher level the customer need could be represented as goal knowledge, as one of the overriding goals in design is to satisfy customer need. Similarly, the output of the activity 'create design specification' could include goals that are often used to guide subsequent design activities, i.e. goals exist within elements of the Product Design Specification such as cost, weight, etc. (Figure 5.5). The output could also contain additional resource knowledge, for example increased expertise of the designer, gained through learning in design [130]. This knowledge would be available as resource knowledge for subsequent design activities (Figure 5.6).
[Figure 5.5 Output goal knowledge: the output (O1) of one activity provides goal knowledge (G2) for a subsequent activity.]

[Figure 5.6 Output resource knowledge: the output (O1) of one activity provides resource knowledge (R2) for a subsequent activity (A2).]
5.2 Activity Management

As outlined in Chapter 3, many models of the design process have been published in the literature, aimed at developing a better understanding of the process and ultimately improving the process performance. However, these models have been primarily focused on improvements from a design (i.e. artefact) viewpoint, i.e. the aim is to encourage systematic approaches, use of design methods, etc. to improve the performance with respect to design goals. For example, Pugh [40] concentrates on the product component of total design.

The previous section highlighted the existence of both design and design activity goals and the need to measure performance with respect to both types of goal as part of managing design. A new model of design is presented here, based on the activity representation in Figure 5.1, which distinguishes the activities addressing these goals and further elaborates upon the elements of knowledge involved.

5.2.1 Design and Management

At a fundamental design activity level the design and design activity goals may be managed intuitively by the designer, in what has been presented in Figure 5.1 as one activity. However, it is proposed that there are two types of activity taking place: design activities (Ad) and design management activities (Am). Design activities are focused on the design goals (DG), while design management activities are concerned with design activity goals (DAG) and with managing the trade-off between achieving design and design activity goals to ensure the best overall performance. At a Product Development project level these activities are often defined separately and are generally carried out by different people, e.g. the designer/design team and the design manager [51]. However, the distinction between these activity types exists even at the level of individual design activities. A designer may punctuate a particular design activity (Ad), considering design goals (DG), with a design management activity (Am), considering design activity goals (DAG) (Figure 5.7). For example, during sketching a designer may glance at their watch to evaluate the time elapsed in relation to an implicit or explicit time goal (DAG) before proceeding.
[Figure 5.7 Design/management activities: a design activity directed by design goals (DG) alongside a design management activity directed by design activity goals (DAG).]
This represents a change of focus and activity, i.e. from a design activity, focused on producing a sketch in accordance with a design goal, to a design management activity focused on ensuring a design management goal is achieved, e.g. the sketch is completed on time. The following further defines the fundamental nature of these activities and the relationships between them.
5.2.1.1 Design Activity

Design activities have received significant attention in research and a variety of models have been proposed to describe the activities and their relationships within the design process [2, 39, 85, 87, 131–133]. Many design process models focus on distinguishing key phases such as conceptual design and detailed design, with each phase resulting in different stages of the development of a product [11, 12, 39, 40]. However, within each of these phases it is recognised that general problem solving takes place, and a number of activities are common within design problem solving. These problem solving activities include analysis, synthesis and evaluation and are followed by a decision point, where results are accepted or rejected. This may be referred to as the basic design cycle [15].

In the context of a specific design problem, the activities of analysis, synthesis and evaluation require knowledge of the particular product domain, e.g. electronic systems, gearbox theory, etc. In addition, carrying out the activities requires know-how, e.g. the knowledge of how to carry out synthesis through the use of sub-activities such as selection and configuration. Both types of knowledge are described here as Design Resource (DR) knowledge used to support design activities, i.e. all of the knowledge required to carry out a design activity with respect to a particular design goal(s) (DG).
5.2.1.2 Design Management Activity
When a designer evaluates some attribute of the process, e.g. time (t), they are not designing but managing the design activity. The design management activity may involve the designer in assessing an activity attribute, such as time, comparing it against the design activity goal and deciding on further action. This action could be to continue the design activity, conclude the design activity, or alter goals and/or resources. The alteration of goals or resources may involve a request to a higher level authority.

For any design activity, such as analysis, where both design and design activity goals are present, there will be a corresponding management activity, i.e. analysis management. This management activity is aimed at achieving design activity goals, such as those related to time and cost, while ensuring that an acceptable level of effectiveness against design goals is achieved by the design activity [43, 98, 126, 134-136]. In many cases this management activity may be informal, such as glancing at a watch, having implicit goals and few guidelines or procedures. In other cases management activities may be formal, such as the scheduling of activities, and have explicit guidelines, goals, etc.
Design management activities require different resource knowledge to that of design activities. This knowledge is concerned with the decisions that direct the design activities [15] and is referred to here as Design Activity Resource (DAR) knowledge. Typically this knowledge will include meta-knowledge of the product domain, design activities and resources, and knowledge of process management methods such as Project Network Techniques [137].
5.2.2 A Model of Design Activity Management
When activities are discussed in the literature, there is often no distinction between design activities and design management activities at the fundamental level discussed above. For example, when the activity of sketching is referred to, it is implied that a single activity addresses both the design goals (i.e. the sketch itself) and the design activity goals (e.g. the time taken to complete the sketch). Such an activity is referred to here as a managed activity (A), which has sub-activities of design (Ad), aimed at achieving design goals (DG), and design management (Am), aimed at achieving design activity goals (DAG).

Given the basic design activity representation presented in Figure 5.1 and the distinction between design and design management presented above, a further new model is introduced in Figure 5.8 to describe design and its management. This Design Activity Management (DAM) model distinguishes between a design activity and a design management activity. It provides a framework in which performance may be measured and managed (see Section 5.5).
Figure 5.8 Design Activity Management (DAM) model.
The categories of input (I), output (O), goal (G) and resource (R) knowledge, presented in Figure 5.1, are decomposed to reflect categories related to either design or design management activities as follows:

I → DI and DAI
O → DO and DAO
G → DG and DAG
R → DR and DAR

These knowledge categories are further formalised in relation to the DAM model. The category to which an element of knowledge belongs will again be dependent on the context, as highlighted in Section 5.1.1.
1. I, O, G and R: Refer to the overall input, output, goal and resource knowledge as previously described.

2. Design Input (DI): This refers to the initial state of knowledge, or working knowledge [124], with respect to the design prior to the design activity. The state of the design may be described at any point from statement of need through to final description of the artefact. Therefore, design knowledge input may be represented as a design specification, a design concept, a full working drawing, etc., depending on the design activity to which it relates.

3. Design Output (DO): This refers to the final state of knowledge with respect to the design after the design activity. DO therefore represents the result of the evolution of DI through a specific design activity. The representation of this knowledge will vary in a similar way to DI and will depend on the activity to which it relates.

4. Design Activity Input (DAI): This refers to the initial state of the design activity knowledge with respect to the design management activity. This could contain the output from a previous management activity and would typically refer to the cost of activities prior to the activity, time consumed by prior activities, etc.

5. Design Activity Output (DAO): Refers to the final state of the design activity knowledge after the execution of the design management activity. For example, this knowledge could contain the time elapsed, incurred cost, etc. Therefore it represents the DAI with the addition of any resources used.

6. Design Goal (DG): The design goal(s) being addressed by the design activity. An example of a design goal could be to minimise the weight of the product. These goals have been discussed in Section 5.1.1, where design constraints are represented within DG.

7. Design Activity Goal (DAG): The design activity goal(s) being addressed by the design management activity. An example of such a goal could be the desired lead time for a design activity. In addition, the relative priorities between the design and design activity goals will be present within this knowledge, as derived from the higher level goal.

8. Design Resource (DR): The knowledge required to carry out design activities with respect to a particular product domain, as discussed in Section 5.2.1. This knowledge includes:
• Knowledge of particular disciplines, e.g. electronics, metallurgy, etc.
• Knowledge of the particular product domain, e.g. gearboxes.
• Knowledge (i.e. know-how) of how to carry out key design activities such as analysis, synthesis, etc., including the use of particular support tools such as CAD, QFD, etc.

9. Design Activity Resource (DAR): The knowledge necessary to carry out design management activities. This type of activity is primarily focused on analysing existing goals and resources, evaluating and comparing design and design activity output against goals, and altering goals and resources to achieve optimum performance. Thus, DAR includes:
• Meta-knowledge of design activities and resources, e.g. the resources required to carry out synthesis, their average cost per hour, the current cost based on allocated resources, resource availability, etc. This may also include the inherent efficiency of design activity/resource combinations based on analysis of past projects.
• Knowledge (i.e. know-how) of how to carry out design management activities such as cost analysis, resource evaluation, etc., including the use of particular support tools such as PERT and GANTT charts [137, 138].
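To make these knowledge categories concrete, the following Python sketch (purely illustrative; the class and field names are assumptions made for this example, not part of the formalism) records a managed activity as a design sub-activity and a design management sub-activity, each with its own inputs, outputs, goals and resources.

```python
from dataclasses import dataclass, field


@dataclass
class DesignActivity:
    """Design activity (Ad): evolves the design state DI -> DO against design goals DG."""
    DI: dict = field(default_factory=dict)   # initial design knowledge, e.g. a specification
    DO: dict = field(default_factory=dict)   # final design knowledge, e.g. a concept or drawing
    DG: dict = field(default_factory=dict)   # design goals/constraints, e.g. {"mass_kg": "minimise"}
    DR: dict = field(default_factory=dict)   # design resource knowledge: domain know-how, CAD, etc.


@dataclass
class DesignManagementActivity:
    """Design management activity (Am): tracks DAI -> DAO against design activity goals DAG."""
    DAI: dict = field(default_factory=dict)  # prior time consumed, cost incurred, etc.
    DAO: dict = field(default_factory=dict)  # elapsed time and cost after the activity
    DAG: dict = field(default_factory=dict)  # activity goals, e.g. {"lead_time_weeks": 4}
    DAR: dict = field(default_factory=dict)  # meta-knowledge of activities/resources, PERT, etc.


@dataclass
class ManagedActivity:
    """Managed activity (A): a design sub-activity plus its design management sub-activity."""
    name: str
    design: DesignActivity
    management: DesignManagementActivity
```

For instance, a "sketching" managed activity would carry the sketch brief in DI, the finished sketch in DO, and an implicit time goal in DAG.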
5.3 Managed Activity Relationships
The model presented in Figure 5.8 illustrates the distinction between design and design management activities as sub-activities of an overall managed activity (MA). Such MAs take place within the design process, relating to other managed activities in various ways. The basic design cycle proposed by Roozenburg is considered to be present within all phases of design and represents one of the most fundamental models of designing [15]. It is used here as a basis for defining typical relationships that can exist between two or more managed activities within the context of such a cycle. This highlights the applicability of the DAM model throughout the design process to distinguish and relate design activities and their management.
5.3.1 Decomposition Relationships
Roozenburg's definition of the basic design cycle is adapted here to illustrate three activity types, namely analysis, synthesis and evaluation, that may be considered as a decomposition of the overall design (problem solving) activity (Figure 5.9). The model illustrates that each MA may have both design and design management activities at each level of decomposition. Similarly, the managed activities of analysis, synthesis and evaluation may be decomposed into managed sub-activities, each having design and design management activity components4.

The following outlines how these managed activities may be decomposed and analysed at a lower sub-activity level. Take, for example, the MA of synthesis. Design synthesis involves the combination of separate elements, ideas, etc. into a new description that defines relationships among the previously separate items. Let us assume that synthesis is composed of two activities, selection and configuration, and that the output of selection is an input to configuration5. The selection activity is aimed at selecting the appropriate elements that make up the synthesis solution.
Configuration is aimed at defining relationships (geometrical, spatial, etc.) between these elements to arrive at a fully defined solution for the overall synthesis activity.

The overall goal of the higher level MA can be decomposed into individual goals for each managed sub-activity. However, achieving each of the sub-goals does not necessarily lead to achievement of the higher level goal(s) for managed synthesis, as the managed synthesis activity itself will contribute to achieving the goal(s). The goal/sub-goal relationships are maintained within a Goal Breakdown Structure (GBS), e.g. [12]. In order to maintain coherence (see Chapter 4) in performance measurement, the relationships between activities and sub-activities, goals and sub-goals, etc. must be known. The definition of these relationships and their management under dynamic conditions is the focus of other research, e.g. [12, 122, 124, 139, 140].

In decomposing managed synthesis into the two managed sub-activities of selection and configuration, the design goal for synthesis (DG) is decomposed into individual goals for the selection activity (DGS) and the configuration activity (DGC), as shown in Figure 5.10. The design activity goal (DAG) is also decomposed into individual goals for the activities of selection management and configuration management (DAGSM and DAGCM).

Similarly, inputs, outputs and resources are defined specifically for each MA within the breakdown and represent the application of a decomposition function (fd) to those existing at a higher level. Therefore:

fd[DI] → [DIS, DIC]
fd[DAI] → [DAISM, DAICM]
fd[DO] → [DOS, DOC]
fd[DAO] → [DAOSM, DAOCM]
fd[DR] → [DRS, DRC]
fd[DAR] → [DARSM, DARCM]
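A minimal sketch of such a decomposition function in Python is given below; the key-prefix split and the goal names are assumptions made for illustration, since in practice the Goal Breakdown Structure would define which elements belong to which sub-activity.

```python
def fd(knowledge, sub_activities):
    """Decompose higher-level knowledge (goals, inputs, outputs or resources)
    into per-sub-activity portions, keeping the parent/child link explicit.
    Here the split is a simple filter by key prefix."""
    return {
        sub: {k: v for k, v in knowledge.items() if k.startswith(sub)}
        for sub in sub_activities
    }


# Decomposing the synthesis design goal DG into DGS (selection) and DGC (configuration):
DG = {"selection.component_count": "minimise", "configuration.envelope_mm": 250}
sub_goals = fd(DG, ["selection", "configuration"])
# sub_goals["selection"]     -> {"selection.component_count": "minimise"}
# sub_goals["configuration"] -> {"configuration.envelope_mm": 250}
```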
5.3.2 Temporal Relationships
The activities within the design process relate to each other in ways other than through decomposition. That is, the activities have temporal relationships, which support an overall process of design. For example, the output of the synthesis activity could be seen generally as the input to the evaluation activity. This reflects the general sequence of the design process, but often there is much iteration between the activities and a simplistic, sequential model does not fully represent all the relationships that may exist. Links will exist between each sub-activity of the managed activities. These relationships may be based on the information dependencies between activities and be described as dependent, independent or interdependent [131]. The following is a brief description of how such relationships would be represented within the formalism introduced here. These relationships are not considered exhaustive [142] or fully detailed [131], e.g. the output of an activity may be represented as a goal for another activity, the strength of the dependency between activities may vary, etc. However, the descriptions provided here illustrate that such relationships may be described using the DAM model.
Figure 5.9 Managed activities within a process.
Figure 5.10 Activity-goal decomposition.
Dependent activities exist where the output of one task is required as input to another, as represented in the general relationship between synthesis and evaluation. Figure 5.11 illustrates a dependent relationship within the modelling formalism used here. These activities therefore take place in series, i.e. in this case selection would be completed before the configuration activity begins.

Independent activities are those that may be carried out in parallel, or concurrently, as there is no interaction between their inputs or outputs (Figure 5.12). This type of activity relationship underpins much of the work in Concurrent Engineering, where the basic proposal is that activities should be carried out in parallel whenever possible, thus shortening lead times [45, 109, 143-145].

Interdependent activities more closely reflect the iterative nature of design [146]. For example, in the basic design cycle the evaluation of the design proposal will follow the synthesis step. The results of this evaluation may feed back to synthesis in an iterative manner aimed at improving the design proposal. Similarly, in the activities discussed here, it may be revealed during the configuration activity that it is not possible to meet the goal for that activity (DGC). This may require the activity of selection to be carried out again in order to bring new components into the potential solution. The iteration between activities such as selection and configuration may be repeated until an acceptable solution is found. Therefore, the outputs and inputs may be interrelated as shown in Figure 5.13.

In each of the relationships above it should be noted that when the design process moves from one design activity to another there is a corresponding transfer in the design management activity, i.e. when moving from selection to configuration there is a corresponding move from selection management to configuration management. An illustrative sketch of how these dependency types might be recorded is given after Figure 5.13.
Figure 5.11 Dependent (series) activities.
Figure 5.12 Independent (parallel) activities.
Figure 5.13 Interdependent activities.
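The Python fragment below is a rough illustration (the function name and the knowledge labels are invented for the example) of how the three dependency types could be classified from the input/output knowledge of two managed activities.

```python
def classify_dependency(a_out: set, b_in: set, b_out: set, a_in: set) -> str:
    """Classify the temporal relationship between activities A and B from
    their input/output knowledge sets: dependent, independent or interdependent."""
    a_feeds_b = bool(a_out & b_in)   # A's output is needed by B
    b_feeds_a = bool(b_out & a_in)   # B's output is needed by A (iteration)
    if a_feeds_b and b_feeds_a:
        return "interdependent"      # iterate, e.g. selection <-> configuration
    if a_feeds_b or b_feeds_a:
        return "dependent"           # series, e.g. selection before configuration
    return "independent"             # may run in parallel (Concurrent Engineering)


# Selection output feeds configuration input; configuration may push revised
# requirements back to selection, making the pair interdependent.
print(classify_dependency({"selected_parts"}, {"selected_parts"},
                          {"revised_part_requirements"}, {"revised_part_requirements"}))
```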
5.4 A Design Performance Model E2
The widespread use of efficiency and effectiveness to describe performance has been highlighted in Chapter 4, in addition to the variety of interpretations of these terms when applied in design and development. Efficiency (η) and effectiveness (Γ) are presented in this book as fundamental elements of performance that may be used to fully describe the phenomenon.
A new model, E2, is presented here, based on the design activity representation in Figure 5.1, as a means to clearly determine the phenomenon of design performance and to allow efficiency and effectiveness to be distinguished and related. Efficiency is related to input, output and resources, while effectiveness is determined by the relationship between output and goal(s). These elements are presented within the E2 model, providing a fundamental representation of activity performance. A scenario is subsequently developed in Section 5.5 to show how E2 may be applied to design and design management activities within the DAM model.
5.4.1 Efficiency
Efficiency: ratio of useful work performed to the total energy expended or heat taken in [147].

In general, the efficiency of an activity is seen as the relationship (often expressed as a ratio) between what has been materially gained and the level of resource (material) used. If we assume that an activity transforms an input to an output, using resources, under the direction of goals and constraints, the efficiency of that activity may be described as follows:

η(A) = M+ : RU, where M+ = O − I

Where:
η(A): Efficiency (η) of an Activity (A)
I: Input (Material)
O: Output (Material)
M+: Material gain
RU: Resource (Material) used

For example, if we assume that moving a car from a point A to a point B is a single activity, we can establish the efficiency of this activity by adopting the above formalism. In this case the input and output are the locations of the car before (I) and after (O) travelling from A to B. The material gain (M+) is the distance between the two locations (O − I). The fuel is represented as an available resource (R), of which a certain amount is used (RU) during the activity of moving the car. Therefore the efficiency (η) is described in the ratio of distance travelled to fuel used (M+ : RU), often given in kilometres per litre.
Assuming design is a knowledge processing activity (Ak) (Figure 5.14), the difference between the output (O) and the input (I) defines the knowledge gain from the activity (K+). The cost6 of the activity may be determined by measuring the amount of resource knowledge used (RU). Therefore, the efficiency of this activity may be depicted as in Figure 5.14 and formulated as a ratio:

η(Ak) = K+ : RU, where K+ = O − I
Where:
η(Ak): Efficiency (η) of an Activity (Ak)
I: Input (Knowledge)
O: Output (Knowledge)
K+: Knowledge Gain
RU: Resource (Knowledge) Used

This formalism assumes that a quantitative comparison of the input and output knowledge (I and O) can be carried out, resulting in a description of the level of knowledge gained in the activity (K+). Similarly, it is assumed that the level of knowledge used in the activity may be measured and that the relationship between both quantities may be expressed in a meaningful form.
5.4.1.1 Efficiency Metrics
In practice a variety of metrics are used to determine efficiency, reflecting different aspects of the input, output or resource knowledge. For example, the cost of using a designer within an activity may be measured to reflect the amount of financial resource used in utilising this knowledge source. The following example describes the use of time and cost to establish levels of efficiency within two activities, A1 and A2 (Figure 5.15). The example illustrates that different types of efficiency may be measured and that an overall efficiency value should encompass these different types.
Figure 5.14 Efficiency (η).

Figure 5.15 Activity comparison.
Assume that the input and output knowledge is the same for both activities (A1 and A2) and therefore the knowledge gain is equal, i.e.

(O1 − I1) = (O2 − I2)
K1+ = K2+

Using Time (t) consumed as a metric to determine time-based efficiency, where t1 and t2 are the durations of activities A1 and A2 respectively:

η1(t) = K1+ : t1 and η2(t) = K2+ : t2
if t1 > t2 then η1(t) < η2(t)

Using Cost (c) as a metric to determine cost-based efficiency, where c1 and c2 are the costs of activities A1 and A2 respectively:

η1(c) = K1+ : c1 and η2(c) = K2+ : c2
if c1 < c2 then η1(c) > η2(c)

In the example described above the efficiency of A1 is less than the efficiency of A2 with respect to time7, while it is greater than that of A2 with respect to cost. Thus, different types of efficiency may be determined for an activity based on the specific metrics used. However, establishing an overall efficiency value requires the use of an objective function, which reflects the weighting placed on metrics such as time, cost, etc., as used in general optimisation problem solving [148]. This results in a value of efficiency that allows a direct comparison between activities A1 and A2. That is, in the case of the example given above:

Overall Efficiency:
η1 = K1+ : R1 and η2 = K2+ : R2

Where:
R1: Total Resource Used in Activity A1
R2: Total Resource Used in Activity A2
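As a rough illustration of these ratios (the weights and numeric values below are invented for the example and are not taken from the book), an overall efficiency can be formed by combining the metric-specific ratios with a simple weighted objective function:

```python
def efficiency(knowledge_gain: float, resource_used: float) -> float:
    """Efficiency as the ratio of knowledge gained to resource used (K+ : RU)."""
    return knowledge_gain / resource_used


def overall_efficiency(knowledge_gain: float, resources: dict, weights: dict) -> float:
    """Combine metric-specific efficiencies (time-based, cost-based, ...) using a
    weighted objective function, one of many possible combination rules."""
    return sum(weights[m] * efficiency(knowledge_gain, used)
               for m, used in resources.items())


# Both activities produce the same knowledge gain (K1+ = K2+ = 1.0).
A1 = {"time_h": 10.0, "cost_k": 2.0}   # slower but cheaper
A2 = {"time_h": 6.0,  "cost_k": 3.0}   # faster but dearer
w = {"time_h": 0.5, "cost_k": 0.5}

print(overall_efficiency(1.0, A1, w))   # 0.5*(1/10) + 0.5*(1/2) = 0.30
print(overall_efficiency(1.0, A2, w))   # 0.5*(1/6)  + 0.5*(1/3) = 0.25
```

Under these invented weights A1 is the more efficient activity overall, even though A2 is more efficient with respect to time, mirroring the trade-off described above.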
This analysis of efficiency can be applied to both design and design management activities, although it may be difficult to distinguish between these efficiencies in some cases (see Section 5.5.2). Therefore:

η(Ad) = DK+ : DRU, where DK+ = DO − DI

Where:
η(Ad): Efficiency (η) of Design Activity (Ad)
DI: Design Input (Knowledge)
DO: Design Output (Knowledge)
DK+: Design Knowledge Gain
DRU: Design Resource (Knowledge) Used
And:

η(Am) = DAK+ : DARU, where DAK+ = DAO − DAI

Where:
η(Am): Efficiency (η) of Design Management Activity (Am)
DAI: Design Activity Input (Knowledge)
DAO: Design Activity Output (Knowledge)
DAK+: Design Activity Knowledge Gain
DARU: Design Activity Resource (Knowledge) Used

Given the scenario of using time and cost metrics described above, it is evident that the efficiency of a particular activity can be determined by the type of metrics used and how they are combined for an overall measure. It should be noted that the efficiency of an activity is considered here to exist irrespective of whether it is measured or not, i.e. it is an inherent property of the activity/resource relationship. The selection and application of metrics to determine efficiency allows particular views of efficiency to be created, e.g. cost-based efficiency. The metrics used must therefore not only closely reflect the type of efficiency phenomenon that is required to be measured (viewed) but also the actual types of efficiencies that are inherent in the activity.
5.4.2 Effectiveness
Effective: having a definite or desired effect [147].

Activities are generally performed in order to achieve a goal, i.e. to have a desired effect. However, the result obtained from performing an activity may not always meet the goal. The degree to which the result (output) meets the goal may be described as the activity effectiveness. That is:

Γ(A) = rC(MO, MG)

Where:
Γ(A): Effectiveness (Γ) of Activity (A)
rC: Relationship (Comparative)8
MO: Material Output
MG: Material Goal

For example, a goal for a manufacturing activity might be to have ten or fewer defective parts produced during a week of production (MG). The number of defective parts produced (MO) may be measured and compared with the goal (using rC) to establish the manufacturing activity effectiveness. Clearly, in this case, a more effective activity is characterised by fewer defective parts.

The effectiveness of activities in design, where knowledge is processed (Ak), is focused on the nature and comparison of the knowledge expressed in goals and outputs [149, 150]. Therefore, activity effectiveness is depicted in Figure 5.16 and can be expressed as:

Γ(Ak) = rC(O, G)
Figure 5.16 Effectiveness (Γ).
Where:
Γ(Ak): Effectiveness (Γ) of Activity (Ak)
rC: Relationship (Comparative)
O: Output (Knowledge)
G: Goal (Knowledge)

This formalism assumes that the output knowledge (O) and goal knowledge (G) may be described in a manner which allows a direct comparison between them, and a relationship to be determined which indicates how closely they match.
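A minimal Python sketch of one possible comparative function rC is given below; the scoring rule (one minus the normalised deviation from a target, clipped to the range 0 to 1) is an assumption chosen for illustration, not the book's prescription:

```python
def r_c(output_value: float, goal_value: float, tolerance: float) -> float:
    """Comparative relationship rC: a score in [0, 1] describing how closely
    the output knowledge matches the goal knowledge.
    1.0 means the goal is met exactly; 0.0 means it misses by >= tolerance."""
    deviation = abs(output_value - goal_value)
    return max(0.0, 1.0 - deviation / tolerance)


# A design activity with a lead-time goal of 20 working days that actually took 23,
# judged against an assumed 10-day tolerance band:
print(r_c(23.0, 20.0, 10.0))   # 0.7
```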
5.4.2.1 Effectiveness Metrics
In design, multiple goals are likely to exist, with different priorities (as discussed in Section 5.1.1). Therefore the activity effectiveness with respect to a single goal will often represent only a partial view of effectiveness, measured using a particular metric. That is, effectiveness may be described in relation to a particular type of goal, e.g. cost goals, in isolation from other goals such as time, artefact aesthetics, etc.

For example, within an individual activity (A) (Figure 5.17) there could exist two goals, derived from a higher level goal of meeting the specification (G), which relate to the life in service (G1) and the dimensional accuracy (G2) of the design. Therefore, specific elements (O1 and O2) of the overall output knowledge (O) must be evaluated (using time and dimension based metrics, for example) and compared (rC) to determine the respective levels of effectiveness (Γ1 and Γ2). That is:

Γ1 = rC(O1, G1)
and
Γ2 = rC(O2, G2)

where it is assumed that a common comparative function, rC, may be used to establish both levels of effectiveness.
Figure 5.17 Effectiveness types.
In some instances the type of comparative function used may vary and is dependent on whether a constraint or a goal is represented within G1 and G2.

In the case described here the effectiveness of a particular design proposal may be high with respect to life in service but low with respect to dimensional accuracy. The determination of overall effectiveness for the activity, Γ(A), presents a multiple criteria decision making problem, similar to that described in Section 5.4.1. That is, to fully evaluate the design proposal some value of overall effectiveness must be established. There are a variety of techniques that may be adopted to support optimisation [148], although the nature of the design activity is such that obtaining values with which to carry out optimisation is difficult, e.g. determining values for aesthetic elements. The weighting method [54, 148] presents one of the simpler approaches to this type of problem. For example, if we consider the priority (or weight) of the life in service goal (G1) to be W1 and that of the dimensional accuracy goal (G2) to be W2, then we could say:

Overall Effectiveness = Γ(A) = (Γ1 × W1) + (Γ2 × W2)9
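Continuing the illustrative Python, the weighting method can be sketched as follows; the weights and effectiveness values are invented, and, as note 9 warns, such a calculation is a simplification:

```python
def overall_effectiveness(effectiveness: dict, weights: dict) -> float:
    """Weighted combination of goal-specific effectiveness values,
    e.g. Gamma(A) = Gamma1*W1 + Gamma2*W2."""
    return sum(weights[g] * e for g, e in effectiveness.items())


gamma = {"life_in_service": 0.9, "dimensional_accuracy": 0.4}
w = {"life_in_service": 0.6, "dimensional_accuracy": 0.4}
print(overall_effectiveness(gamma, w))   # 0.9*0.6 + 0.4*0.4 = 0.70
```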
It may be seen that the effectiveness metrics relate to the specific goals that are present, e.g. some time-based metric is required to establish the life in service of the design proposal. The use of a particular metric provides a view of effectiveness without providing the complete picture. That is, a view of effectiveness with respect to life in service is obtained in isolation from other views. Therefore a holistic approach to the determination of metrics, based on the activity goals and priorities, should be adopted to provide comprehensive analysis in this area. Such a comprehensive and fundamental approach is currently lacking in the field, where there is a lack of coherence and integration between activity modelling, goal specification and metric determination. This results in metrics being applied without a full description of the relation between the effectiveness obtained and the overall effectiveness, i.e. performance may be described in relation to all the specified goals, but not holistically.
The DAM model has shown that two types of goal exist within a MA, i.e. design goals (DG) and design activity goals (DAG), which relate to design activities and design management activities respectively. Consequently, two types of effectiveness may exist within the DAM model (Figure 5.18). The effectiveness of the design activity is referred to here as design effectiveness, while the effectiveness of the design management activity is termed design management effectiveness10. The example of overall effectiveness for an activity discussed above focuses on design goals (DG). A similar example could be stated for an activity having a number of design activity goals (DAG). For example, there may be goals relating to the cost and duration of the activity, with different priorities. Again, the overall effectiveness of the activity with respect to these goals may be determined in the same way, taking into account the individual effectiveness values and goal priorities. Therefore:

Γ(Ad) = rC(DO, DG)

Where:
Γ(Ad): Effectiveness (Γ) of Design Activity (Ad)
rC: Relationship (Comparative)
DO: Design Knowledge Output
DG: Design Knowledge Goal

And:

Γ(Am) = rC(DAO, DAG)
Figure 5.18 Design and design management effectiveness.
Where:
Γ(Am): Effectiveness (Γ) of Design Management Activity (Am)
rC: Relationship (Comparative)
DAO: Design Activity Knowledge Output
DAG: Design Activity Knowledge Goal

Establishing the overall effectiveness of a MA will involve the analysis of both design effectiveness and design management effectiveness, and the use of some form of objective function [148], be it formal or informal, to take account of design and design activity goal priorities.

To ensure objectivity in measuring effectiveness it is necessary to express both output and goal knowledge at the same level of abstraction and using the same units of measurement. For instance, in the design of a gearbox shaft it is not appropriate to describe the design activity output (DO) as a "small shaft" when a shaft size of 20 +/- 2 mm has been specified as a design constraint (DG). To determine the level of effectiveness in this case it is necessary to (a) express the design constraint as small, medium or large, (b) provide a specific size of the shaft in millimetres, or (c) transform both DO and DG to some intermediate common description.
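A small illustration of expressing output and goal at the same level of abstraction is given below, comparing a numeric shaft diameter against the 20 +/- 2 mm constraint; the helper name and boolean return convention are assumptions made for the example:

```python
def satisfies_constraint(value_mm: float, target_mm: float = 20.0, tol_mm: float = 2.0) -> bool:
    """Constraint check: the design output (DO) and the design goal (DG) are both
    expressed in the same units (millimetres), so they can be compared directly."""
    return abs(value_mm - target_mm) <= tol_mm


print(satisfies_constraint(20.5))   # True: 20.5 mm lies within 20 +/- 2 mm
print(satisfies_constraint(23.0))   # False: outside the constraint band
```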
In many cases it may be difficult to establish whether maximum effectiveness has been achieved with respect to particular goals. For example, if the design goal is to maximise the life in service of the product it is difficult to determine if/when maximum effectiveness has been achieved in the design activity. Such a goal expresses a desirable direction for the value of life in service, i.e. greater is better, but does not specify a target which is considered to be the maximum. Such a goal may be used as a basis for the comparison of different design proposals, i.e. to support the measurement of relative effectiveness and decision making among alternatives [15, 40].
5.4.3 Relating Efficiency and Effectiveness
Efficiency and effectiveness focus on related, yet contrasting, performance elements. Efficiency is inherent in the behaviour of a particular activity/resource combination. It may be measured without any knowledge of the activity goals, although the goals may influence the behaviour of the resources used in the activity and consequently the level of efficiency resulting from their use (see Section 6.1.2).

Effectiveness, in contrast, cannot be measured without specific knowledge of the activity goals. As is the case in measuring efficiency, the measurement of effectiveness involves the analysis of the activity output (O). However, effectiveness is obtained through analysing a specific element of the output knowledge, i.e. that which relates to the goal(s) of the activity.

In certain cases there exists a direct relationship between effectiveness and efficiency. This relationship exists when the specific element of the output knowledge, which is evaluated to establish effectiveness, also describes an element of the resource used. For example, a design management activity may have a specific cost-related goal of minimising the activity cost, i.e. DAGj: C = Min.
Therefore the element of the output knowledge (DAO) that must be evaluated is the cost knowledge (DAOC). However, determining the cost-based efficiency of the activity also involves the analysis of the cost incurred (DARU-C) in carrying out the activity as part of the overall resources used (DARU). In this particular instance the element of output knowledge used to establish effectiveness is the same as that used to establish efficiency. Therefore:

ΓC = rC(DAOC, DAGC)
and
ηC = DAK+ : DARU-C
but
DAOC = DARU-C

Therefore, if DAGj: C = Min, then a reduction in cost, C, will result in an increase in both efficiency and effectiveness:

C↓ ⇒ ηC↑
C↓ ⇒ ΓC↑
therefore
ηC↑ ⇒ ΓC↑

That is, an increase in the cost-based efficiency of the activity will also result in an increase in the cost-based effectiveness of the activity, given an activity goal of minimising cost. In cases such as this the efficiency of the activity can provide insight into why a particular level of effectiveness has been obtained.
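A tiny numeric illustration of this coupling (the values and the budget-based scoring rule are invented for the example): when the activity goal is to minimise cost and the evaluated output element is the cost itself, lowering the cost raises both the cost-based efficiency and the cost-based effectiveness.

```python
def cost_efficiency(knowledge_gain: float, cost: float) -> float:
    """Cost-based efficiency: knowledge gained per unit of cost incurred."""
    return knowledge_gain / cost


def cost_effectiveness(cost: float, budget: float) -> float:
    """Cost-based effectiveness against a 'minimise cost' goal, scored here as
    the unused fraction of an assumed budget (an illustrative choice of rC)."""
    return max(0.0, 1.0 - cost / budget)


for cost in (8.0, 6.0, 4.0):   # the same activity run at decreasing cost
    print(cost, cost_efficiency(1.0, cost), cost_effectiveness(cost, 10.0))
# cost 8 -> efficiency 0.125, effectiveness 0.2
# cost 6 -> efficiency ~0.167, effectiveness 0.4
# cost 4 -> efficiency 0.25,  effectiveness 0.6  (both rise together)
```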
In other cases a direct relationship between efficiency and effectiveness is not evident. Such cases exist where the specific element of the output knowledge that is evaluated to establish effectiveness has no relationship to the resource knowledge used in an activity. For example, where the goal of a design activity is to maximise the dimensional accuracy of the artefact, DG(s) = Max(s), the element of the output knowledge (DO) that must be evaluated is the knowledge of the dimensional accuracy (DO(s)). It is clear that this knowledge provides no indication of the resource knowledge (DR) used in the activity. Therefore an increase in dimensional accuracy will give increased effectiveness with respect to this goal, but there is no direct relationship with efficiency in this case.
5.4.4 Performance
Design effectiveness, design management effectiveness and activity efficiency reflect different, and in some cases directly related, aspects of performance. Design effectiveness illustrates how well the design goals have been met but does not indicate the resource cost. Design management effectiveness indicates whether the design activity goals, such as resource cost, have been met.
Figure 5.19 Performance model E2 [1].
However, these goals could be achieved by an inefficient activity if they were defined in the absence of knowledge of the potential (or inherent) efficiency of the activity and set at easily achievable levels. Efficiency describes the inherent behaviour of design and design management activities and, although it does not indicate goal achievement, it can provide insight into design management effectiveness. Therefore, to obtain a fully informed view of activity performance it is critical that both efficiency (η) and effectiveness (Γ) are evaluated (Figure 5.19). Therefore:

Design Development Performance ≡ Efficiency (η) and Effectiveness (Γ)

That is, the performance of a MA, incorporating design and design management activities, is completely described within the elements of efficiency and effectiveness as presented above.
5.5 A Model of Performance Measurement and Management (PMM)
The concept of managed activities, having both design and design management activities, has been presented in Section 5.2. These managed activities are the fundamental elements of the design process, i.e. the design process consists of a number of managed activities with relationships such as those discussed in Section 5.3. The managed activities that make up the design process may be defined formally or informally as a result of planning activities. A formal definition might involve a specific phase of project planning and result in the generation of an overall project plan with activities, goals, resources, etc. Informally, planning activities can occur at any stage in the design process, e.g. the statement of a short term strategy for the location of specific information [36]. These planning activities are aimed at developing strategies for how to proceed based on knowledge of available resources (DAR), potential efficiencies of activity/resource combinations, previous levels of performance, etc.
Having established the design and design activity goals, the focus subsequently moves to ensuring these goals are achieved [36], i.e. optimising overall effectiveness. In an informal sense, a designer will continually evaluate the effectiveness of their activities, e.g. checking their watch to assess time elapsed (design management effectiveness), evaluating the aesthetic strengths of a particular concept (design effectiveness), etc. More formally, effectiveness may be reviewed through simulating product behaviour and evaluating results at specific stages, as represented within many of the phase models of the design process discussed in Chapter 4.
5.5.1 Measuring and Managing Effectiveness
The definition of efficiency and effectiveness presented in the E2 model may be applied to both design activities and design management activities. However, E2 does not describe how these activities, or their performance, are related in order to ensure that the overall performance of the MA is optimised. The measurement of design and design management effectiveness is presented here as a critical part of controlling a MA. The following highlights how the E2 model may be applied within the DAM model to realise a process model for Performance Measurement and Management (PMM) in design development. The description below focuses on a typical sequence of events in evolving the state of the design from DI to DO, highlighting the main decision points.

1. The design activity (Ad) takes DI as input and, directed by knowledge of the specific design goal (DG), produces an output (DO) aimed at meeting the goal. Based on the definition of effectiveness described earlier, this output will be compared against the goal to determine the level of design effectiveness, Γ(Ad), achieved in the activity (Figure 5.20).
3011
2. The resulting level of design effectiveness (Ad) is used as an input of 1
control knowledge into the design management activity (Figure 5.21). The 2
description of design effectiveness may describe how well a design goal has 3
been met or whether a constraint has been satised or not (see Section 4
5.1.1). 5
3. The design management activity analyses design management effective- 6
ness, (Am), using knowledge (including meta-knowledge) of the resources 7
being used in both the design and design management activities. This 8
knowledge is primarily time and cost based i.e. it refers to the time con- 9
sumed or cost incurred during a particular activity-resource relationship. 4011
1
2
(Ad ) 3
DG 4
5
DI DO 6
Design
7
Activity
Ad 8
9
50
Figure 5.20 Design effectiveness. 51111
Figure 5.21 Effectiveness input.
Figure 5.22 Design management effectiveness.
4. Utilising design activity resource (DAR) knowledge, the design management activity evaluates the relationship between design and design management effectiveness and decides on the controlling action, if any11, which must be taken in an attempt to optimise overall effectiveness. This controlling action will typically involve changing the goals or resources in order to achieve a change in effectiveness.
Figure 5.23 is evolved from Figure 5.8 to illustrate the decision points and flow of control knowledge (shown as dashed lines) within a MA and serves to summarise the steps described above. That is, the model describes the process of measuring and managing performance in relation to both design and design activity goals. The following outlines the types of controlling action that may result from the evaluation of design and design management effectiveness:

• At decision point ci the decision options are to terminate the activity, having established satisfactory levels of design and design management effectiveness, or to continue with the activity.
• At decision point cj the decision options are to redefine goals and/or alter resource allocation.
• At decision point ck the decision options are to redefine design goals (DG) and/or design activity goals (DAG). For example, the outcome of the design management activity may be to set a new launch date for the project. In contrast, it may be more appropriate to reduce the targets specified in some design goals, e.g. life in service, while maintaining the original planned launch date.
• At decision point cl the decision options are to alter design resources (DR) and/or the design activity resources (DAR). For example, the outcome from the management activity may be to allocate additional design resources to achieve increased design effectiveness, with a probable negative impact on design management effectiveness.
Figure 5.23 Performance Measurement and Management (PMM) process model.
The focus of the steps described above has been to optimise overall effectiveness based on goal priorities, as discussed in Section 5.4.2. However, the controlling actions are based on design activity resource (DAR) knowledge, which includes the inherent efficiency of design activity/resource combinations based on previous activities. This knowledge of efficiency must be acquired as part of the performance measurement activity.
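The control loop implied by these steps can be sketched as follows; the threshold, the decision rule and the action strings are placeholders invented for the illustration, not the book's prescription:

```python
def pmm_step(design_eff: float, mgmt_eff: float, eff_target: float = 0.8) -> str:
    """One pass of the PMM loop: compare design effectiveness (Gamma_Ad) and design
    management effectiveness (Gamma_Am) against a target and choose a controlling
    action, loosely corresponding to decision points ci/cj/ck/cl in Figure 5.23."""
    if design_eff >= eff_target and mgmt_eff >= eff_target:
        return "ci: terminate the activity, goals satisfied"
    if design_eff < eff_target and mgmt_eff >= eff_target:
        return "cl: allocate additional design resources (DR)"
    if design_eff >= eff_target and mgmt_eff < eff_target:
        return "ck: relax design activity goals (DAG), e.g. extend lead time"
    return "cj: redefine goals and/or reallocate resources"


print(pmm_step(0.9, 0.6))   # design goal on track, time/cost goal slipping
```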
5.5.2 Measuring Efficiency
The efficiency of an activity/resource combination in achieving a degree of effectiveness provides an indication of the resources (e.g. time, money, etc.) required in design development. This knowledge, which is often implicit, supports decision making, such as the allocation of resources, within the design management activity in optimising overall effectiveness. Efficiency may be measured for both the design and design management activities using the formalism provided in the E2 model. However, in practice it may be difficult to clearly distinguish design and design management activities. The individual designer may be continually switching between design and design management activities, focusing on design and design activity goals respectively.


2 difcult for a designer to do and also complex to record. Therefore, it may be
3 only practical to measure the efciency of the overall managed activity in
4 some cases. This may be carried out as part of a formal project review and
5 evaluation [16]. The knowledge of achieved efciency (and effectiveness)
6 becomes part of Design Activity Resource (DAR) knowledge available for use
7 in future design management activities. Knowledge is accumulated in this way
8 through learning activities in design [130].
9
1011
5.5.3 Achieving Coherence
The definition of efficiency and effectiveness presented above applies to the measurement of activity performance at any level within the design process. Given that activities will generally exist within an overall process, and relationships such as decomposition will exist between them, the measurement of isolated activity performance has limited value. The need for alignment when measuring performance among multiple activities has been highlighted within Chapter 4. Given that a Goal Breakdown Structure (GBS) can be created and maintained, where relationships between goals at different levels are defined, the E2 model may be applied in a manner which supports alignment within design and/or design activity goals.

For example, we may establish weightings for the selection and configuration activity goals (DGS and DGC) in Figure 5.10 with respect to the synthesis activity goal (DG), based on the use-value analysis approach [12]. This knowledge represents a relationship between goals at different levels and may be used to support alignment. That is, if (using factors ranging from 0, reflecting the lowest priority, to 1, reflecting the highest) the priorities are assumed to be 0.7 and 0.3 for selection and configuration respectively, it can be seen that an increase in the effectiveness of selection (ΓS) will have a greater overall impact than a similar increase in configuration effectiveness (ΓC). This type of knowledge of relationships in the GBS allows performance improvements that are sub-optimal to be identified.
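A small sketch of this alignment check is given below; the 0.7/0.3 priorities come from the text, while the function name and the uniform improvement of 0.1 are assumptions made for the example:

```python
def synthesis_effectiveness(gamma_selection: float, gamma_configuration: float,
                            w_selection: float = 0.7, w_configuration: float = 0.3) -> float:
    """Roll sub-activity effectiveness up the Goal Breakdown Structure using the
    goal priorities (0.7 for selection, 0.3 for configuration, as in the text)."""
    return w_selection * gamma_selection + w_configuration * gamma_configuration


base = synthesis_effectiveness(0.5, 0.5)                 # 0.50
print(synthesis_effectiveness(0.6, 0.5) - base)          # ~ +0.07: improving selection
print(synthesis_effectiveness(0.5, 0.6) - base)          # ~ +0.03: improving configuration
```

The same 0.1 improvement in effectiveness therefore contributes more to the synthesis goal when it is achieved in selection, which is what the GBS weightings make visible.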
The PMM model includes both design and design management goals, and the discussion in Section 5.5.1 highlighted how the trade-off between these goals may be managed. The controlling actions taken within the PMM model are based on the relative priorities of design and design management goals, and therefore the model supports the alignment of performance across these two goal types. Through supporting alignment both within and across goals, the E2 model may be applied in a manner that provides a degree of coherence in performance measurement.
5.6 Summary and Review
This chapter has introduced a model of design as a knowledge processing activity, clearly defining its inputs and outputs. Further, a Design Activity Management (DAM) model has been presented that distinguishes design activities from design management activities and relates them to design and design activity goals respectively.
A generic model of design performance (E2) is presented which fully describes the phenomenon in terms of efficiency and effectiveness. The E2 and DAM models are integrated within a further model for Performance Measurement and Management (PMM) in design development, focused on controlling activities to achieve optimum effectiveness. The key elements developed in the chapter are summarised here:

• Design is modelled as a knowledge processing activity generating an output (O) from a given input (I), under the direction of goals and constraints (G), using varied resources (R).
• Two key types of activity take place at any level within design development: design activities (Ad), aimed at satisfying design goals (DG), and design management activities (Am), aimed at meeting design activity goals (DAG). These activities are inextricably linked, although this link is often managed informally at the level of individual designer activities (e.g. sketching, calculating, etc.).
• Design development performance, relating to the design and the design management activities, is fully described in the concepts of efficiency (η) and effectiveness (Γ).
• The efficiency of an activity (design or design management) is a ratio between the quantity of knowledge gained (K+) and the level of resources used (RU), i.e. it is a measure of how economical (in terms of time, money, etc.) the activity is.
• The effectiveness of an activity (design or design management) is formalised as the comparative relationship (rC) between the output achieved (O) and the specific goal(s) (G) being addressed.
• Although the overall goal in design development may generally be to contribute positively to profit, two types of goal are present within different levels of activity, i.e. those that relate to the design (artefact) and those that relate to the activity by which the design is realised. Both design goals (DG) and design activity goals (DAG) must be considered for a complete analysis of design performance.
• The measurement of effectiveness may be carried out against design goals, for design activities, and design activity goals, for design management activities, at any level of activity in design development. The values for effectiveness are used to support the local control of the activities as described within the process model for Performance Measurement and Management (PMM).
• Measuring design and design management effectiveness, Γ(Ad) and Γ(Am), gives a comprehensive view of performance with respect to design and design activity goals. However, a complete description of performance must include the related measure(s) of efficiency (η), which gives further insight into effectiveness.
• The definition of efficiency provided in E2 may be applied to individual design and design management activities. However, in practice it may be more feasible to measure the efficiency of an overall MA.

The overview of the work in design performance, provided in Chapter 3, highlighted some weaknesses in this field and identified areas that need to be addressed. The following briefly summarises the key features of the concepts presented in this chapter.
1. Formalism of Performance: The E2 model (Figure 5.19) presents a generic and comprehensive description of performance in relation to a goal directed activity such as design. The model details and relates the key elements of performance, efficiency (η) and effectiveness (Γ). Other elements such as inputs, outputs, goals and resources are described within this model and their relation to performance is determined.

2. Relating Design and its Management: Design and design management activities are distinguished within the Design Activity Management (DAM) model (Figure 5.8). Design activities address design goals while design management activities are aimed at design activity goals and the overall goal of the MA. The DAM model describes these activities and their related inputs and outputs. The E2 and DAM models provide a basis for defining a process model for Performance Measurement and Management (PMM), presented in Figure 5.23. The PMM model relates design and design management activities through performance measurement and management. The measurement of effectiveness is seen to be critical in providing decision support for the local control of such activities. The model presented is again generic in nature and may be applied at any level of design to support measurement and management.

3. Requirement for Coherence: The models focus on the performance of activities in design development. The E2 and PMM models may also be used to support performance analysis where the focus may be on a number of activities, for example within a design phase/stage, the overall design process, etc. The decomposition of activities and goals, as discussed in Section 5.3.1, highlights that the E2 performance model may be applied at any activity level within such a decomposition. The existence of a Goal Breakdown Structure, and maintenance of the relationships within it, allows the E2 model to be applied in a manner that supports alignment within design or design activity goals. In addition, the DAM and PMM models support explicit definition of design and design activity goals and the alignment of performance across these goals, i.e. they allow trade-offs to be identified and managed in achieving optimum overall effectiveness. Therefore, given the correct conditions, such as the existence of a GBS, goal priorities, etc., the models developed here may be applied to give a degree of coherence in performance measurement.

4. Influences on Performance: The primary influences on performance are modelled as resources within the E2 representation and are therefore clearly distinguished from output. Such an influence could be a CAD system, an expert designer, or a whole approach such as Concurrent Engineering (CE)12. The degree of influence is given by the change in efficiency and/or effectiveness resulting from the incorporation of a new tool, method, etc. Activities themselves may act as resources, i.e. the output knowledge of some activities may be used as resource knowledge within another activity.

Efficiency and effectiveness are the key elements of performance and it is proposed that, although effectiveness provides direct knowledge of goal achievement, efficiency provides further insight into performance and supports planning activities in design. The key characteristics of efficiency and effectiveness are summarised in Table 5.1.
Table 5.1. Efficiency and effectiveness characteristics

Efficiency:
• May be difficult to distinguish for separate design and design management activities.
• Efficiency does not indicate the degree of goal achievement1 and is therefore not focused on the specific context of goals and their priorities.
• Given that design is a knowledge processing activity, the measurement of efficiency is complex, as it requires the quantity of knowledge gained to be established. Knowledge of activity/resource efficiencies supports the planning of activities more than the control of activities that have been planned.

Effectiveness:
• Explicitly relates to the goals and priorities that exist for activities, giving a clear indication of goal achievement.
• Provides critical knowledge required to control activities once they have been planned and their goals established.
• Measurement of effectiveness is simpler than efficiency as the element of knowledge that must be evaluated is specifically related to the goal knowledge. Therefore, where goals have been defined, the output knowledge element which must be evaluated is indicated.

1 Efficiency may be used to determine effectiveness where the goal is expressed as an efficiency goal but it does not, in isolation, indicate the degree of goal achievement.
It can be seen from the table above that efficiency is an inherently complex property of a knowledge processing activity and, although the knowledge resource used in activities may be established using metrics based on time, cost, etc., evaluating the amount of knowledge gained is more difficult. It was considered that, although a fundamental representation of efficiency has been proposed, the modelling of knowledge in a manner that supports quantitative measurement warrants considerable investigation. In addition, the nature (or content) of the knowledge is seen here as more important than the quantity (volume) of it, a view supported by Nonaka and Takeuchi [151] (p. 58) in their discussion on the use of information to support knowledge creation. Consequently, Chapters 6 and 7 focus on effectiveness and the development of an approach and tool to support its analysis.
Notes
1 Design is used in this instance as a noun to describe the artefact or product.
2 Although a number of related activities may be described as a process, activities are fundamental elements of processes. This book focuses on the activity level and, therefore, the term design activity knowledge is used to include both individual activity and multiple activity (process) knowledge.
3 Note that the goal expression used here establishes which side of the datum is desirable and the direction of preference with regard to p, i.e. the positive side of zero with greater value meaning better goal satisfaction.
4 It is proposed here that all activities, taking place within the context of a process, can have both design and design management sub-activities.
5 It should be noted that it is not the intention here to accurately define synthesis, as there may be other activities within a definition of synthesis. The discussion here is aimed at illustrating decomposition in general rather than accurately defining specific activity relationships or prescribing a process. The approach can be applied equally to any decomposition of synthesis, or any other activity.
6 Cost is used here as a general metric to describe the level of time, money, material, etc. used in the activity.
7 Although time may not be viewed as a resource per se, when allocated to a particular activity it may be analysed as a resource in establishing time-based efficiency.
8 This relationship will vary depending on whether a goal or constraint is represented within G and may be in the form of evaluation or validation.
9 In practice this type of calculation is an over-simplification and in many cases the elements of such an equation cannot be defined easily.

86
A Formalism for Design Performance Measurement and Management

1 10 In some of the literature this has been described as product and process effectiveness.
2 However process effectiveness does not focus on the effectiveness of an individual activity
but on a collection of activities dened within a process.
3 11 It may be desirable to take no controlling action, i.e. to maintain all goals, resources, etc. as
4 they currently are and allow the managed activity to continue.
5 12 Although CE may be seen simply as carrying out activities in parallel the knowledge of
6 how to achieve CE may be modelled as resource knowledge here, which might be used at a
planning phase of design.
6 The Impact of Resources on Effectiveness

The concept of activity effectiveness has been described in Chapter 5 as the degree to which the output meets the goal(s) and/or satisfies the constraint(s). In Section 5.5.1 it has been shown that the analysis of effectiveness provides a key element in the measurement and management of design performance. Within the Performance Measurement and Management (PMM) model, knowledge of both design and design management effectiveness supports decisions aimed at controlling design development in order to optimise overall effectiveness.

In addition to supporting the immediate control of a managed activity, the measurement of effectiveness can support continuous improvement of performance on a more long-term basis. That is, it can form an important part of the process of identifying, implementing and reviewing performance improvement initiatives. In this case performance improvement initiatives are defined as the introduction of new resources (e.g. tools, methods, approaches) and/or the further exploitation of existing resources aimed at improving performance. The measurement of effectiveness and the influence of different resources upon it provide an indication of where improvement initiatives should be targeted. In addition, establishing the current effectiveness provides a baseline to allow comparisons with future effectiveness and therefore the impact of any improvement initiatives to be assessed.

The remainder of Part 2 focuses on the analysis of activity effectiveness with respect to design and design activity goals and on identifying the means by which the degree of effectiveness may be improved. Specifically, the relationship between the resources, i.e. approaches, methods and tools, used in design and their impact on effectiveness is identified as a key area of focus. The industrial trial reported in Chapter 4 identified a lack of clarity on the nature of resource impact. This chapter provides an overview of key principles that underpin an approach to analysing this impact; the approach itself is presented in Chapter 7.
6.1 Influences on Effectiveness

The factors that influence effectiveness are varied and can range from the state of the world, e.g. the current political and economic climate, to the use of particular design tools by an individual designer. External influences1 such as the economy are considered beyond the scope of analysis in this work. Such influences can affect the company's strategy, reflected through the business goals. In Part 1 some key influences on performance were identified, including strategy, goals and resources. These are further discussed here in relation to the models developed in Chapter 5.
6.1.1 The Influence of Strategy

The development of a strategy may be defined as a high-level planning activity that results in a broad definition of how to achieve particular goals. This type of activity can occur at any stage or level in the design process, i.e. from the formal definition of an overall business strategy to the statement of a short-term strategy for locating information to support a specific design activity [36]. The choice of strategy can influence the resulting effectiveness in each of these cases. The influence discussed here is one that has been widely reported in the literature, e.g. [152], aimed at defining the link between the result of the high-level planning activities, e.g. a new product strategy [95], and the output achieved from implementation of the strategy.

Once an organisation defines a particular strategy the focus moves to the delivery of this strategy using the available resources, i.e. its implementation. The relationship between planning a strategy and implementing it is not a unidirectional one [33, 153] and often, when planned intentions do not produce the desired results, strategies must be redefined/evolved [154]. The nature of current business environments, such as those influenced by changes in technology, means that strategies are much more dynamic than before [33, 151]. It has been shown in Chapter 5 that the measurement of effectiveness provides key information to support the control of a managed activity. In a similar way, at a higher level, effectiveness can provide critical feedback to inform the continual evolution of strategy in order to control implementation of that strategy. Figure 6.1 illustrates a feedback/control loop between planning and implementation. Although this figure depicts the development and implementation of a new product strategy [95] the scope may vary from overall business strategy development to local strategy development, e.g. the strategy of a particular project team.

Figure 6.1 Planning and implementation of strategy (a feedback/control loop between planning, which considers factors such as level of innovativeness, market size and growth, technological maturity and spend on R&D, and implementation).

The time period between the definition of a new product strategy, its implementation within design and development and the measurement of the effectiveness of the eventual output (e.g. the product sales) may be up to a number of years in some cases. During this time it is difficult to take account of the large number of internal and external factors that affect the activities involved in taking a product from concept to mass production [9]. Therefore, establishing a causal link between strategy and effectiveness has proven difficult in many of the studies reported in the literature [64].
6.1.2 Influence of Goals

Goals may both represent the starting point for the development of a strategy and be a component of the results. That is, a new product strategy may be created to achieve particular high-level goals such as profit and subsequently further goals may be specified that are related to the delivery of elements of the strategy, such as levels of product innovativeness, R&D budget, etc. As highlighted in Chapter 5 the categories of knowledge (i.e. input, output, resources and goals) are defined by the activity to which they relate. For example, in the case of strategic planning activities (Figure 6.2) the input knowledge may represent the overall goals of the organisation. The goal of the strategic planning activity may be to define a coherent plan, which is based on achieving those goals. Therefore, the output could be activities and sub-goals defined within such a plan. That is, although goals influence the output of an activity, this output may itself be composed of activities and/or goals.

Figure 6.2 Strategic planning activity (an organisational goal as input, planning resources, the goal of a coherent plan, and activities/goals as output).

The definition of goals and measurement of effectiveness with respect to these goals has an impact on the behaviour of resources, e.g. in the case of human resources goals may provide motivation to the individual. The observation of Johnson [155] that "What you measure is what you get" reflects the view that when people are aware of being measured they tend to respond through changes in their behaviour. However, it is proposed here that it is not measurement, per se, that directly influences behaviour but the goal that is linked with the measure(s) and the knowledge that performance against this goal will be assessed. For example, if the time taken for a particular activity is being measured there is often a goal (implicit or explicit) of minimising the duration of that activity. It is this goal, and the knowledge that performance against the goal may be visible to peers, superiors, etc., that often influences the behaviour of those involved in the activity. Therefore communication of goal knowledge (G) to the individual (R) involved in an activity (A) will influence the output obtained (O) and consequently the level of effectiveness (ε), as illustrated in Figure 6.3.
6.1.3 Influence of Resources

Although goals (G) may have an indirect impact on the output (O), via the behaviour of the resources (R) as presented above, it is the resources themselves that have a direct influence as presented in the E2 model. Essentially, an activity cannot take place, and realise any level of effectiveness, in the absence of a resource. Considerable effort is spent on ensuring the right resources are used at the right time, to carry out the right activities for the right reasons, to give the right results [156]. Resources are seen as a critical influence on the effectiveness of design activities.

Figure 6.3 The influence of goals (goal knowledge G communicated to the resource R influences the output O of the activity A).

The work presented here is concerned with further analysing the influence of resource knowledge on activities in design. This knowledge (R) is actively engaged in transforming the state of the design/design activity knowledge. Given two similar activities (Ai and Aj) with the same input (I) and goals (G) it is the resource knowledge that directly affects the output (O) and consequently the level of effectiveness.
6.2 Resource Knowledge (R)

Within the E2 model resource knowledge is explicitly represented as an element (i.e. a type of input) of the formalism and the carrier of this knowledge may be in various forms. That is, the formalism does not take account of the characteristics of the resources themselves, only the knowledge represented in the resource. Resource knowledge could typically exist within people such as designers, organisational approaches such as Concurrent Engineering (CE), explicit methods such as Quality Function Deployment (QFD) and tools such as Computer Aided Design (CAD) software. For example, particular rules to support the parallel execution of activities in a CE based environment [157] may be seen as resource knowledge that can be utilised within activities. According to Fadel [158]:

". . . being a resource is not an innate property of an object, but is a property that is derived from the role an object plays with respect to an activity."

That is, as discussed in Section 5.2.1, an element of knowledge may only be described as resource knowledge when related to a specific activity. With reference to the E2 model, resources provide all the knowledge inputs required to support the execution of the activity with the exception of the initial state of the design/design process (I) and the goals/constraints (G). Although the designer is a critical resource and acts as a source of both design (DR) and design activity knowledge (DAR), there are an increasing number of other resources available to support design development activities, including:

- Approaches: such as Concurrent Engineering [143] and Team Engineering [159]
- Methods: such as Quality Function Deployment [112] and Function Analysis [15]
- Tools: such as Computer Aided Design (CAD) systems [160] and morphological charts [12]

Resources in each of these categories may support the capture and representation of knowledge and/or contain existing knowledge, both design (DR) and design activity knowledge (DAR). For example, CAD systems allow the capture of the design knowledge (geometric, spatial, etc.), but also may contain rules for the automation of some design activities such as clash detection (DR). Scheduling and resource allocation rules (DAR) are represented in project planning systems such as Microsoft Project, which often allows the representation of design activity knowledge in graphical form (e.g. a PERT chart). QFD also provides specific mechanisms for the capture and representation of design knowledge (DR), such as the House of Quality, but also contains an inherent process (DAR) for developing customer needs through to a product design. Approaches such as Concurrent Engineering (CE) are ill defined in the literature but may be literally interpreted as advocating the parallel execution of activities and could therefore be interpreted as design activity knowledge (DAR). Particular methods and tools, such as those reported in [45] and [145], support the CE approach.

A significant amount of research has been carried out in the area of resources to support design, particularly in the classification of these resources [3, 161-164]. Araujo [78] and Cantamessa [161] distinguish between resources that directly address design goals (DG) and those that address design activity goals (DAG). Although these classifications contribute to the understanding of resources in design and development there exists little consensus on an appropriate classification. Indeed, from the discussion above it may be seen that resources such as QFD often address both design and design activity goals. It is not the focus of this work to define a detailed resource classification but to provide a means for analysing the impact of resources on the effectiveness achieved with respect to a specific goal(s), within any classification scheme. It will be shown in Chapter 7 that the developed solution provides the flexibility to adapt to any classification scheme for resources, including that developed within a particular organisation.

In certain cases the output knowledge (O) from an activity may represent resource knowledge (R) in relation to another activity. Figure 6.4 describes the situation where the activity, Ar, acts as a provider of a resource to another activity, Ai. For example, during the design of a gearbox (Ai) an information search (Ar) may be undertaken to investigate latest advances in gearbox design. The output of this search activity may not directly alter the state of the design and consequently, the activity itself may be viewed as a resource activity in this context2, i.e. with respect to the gearbox design activity.

Figure 6.4 Activity output as resource (the output of activity Ar is used as a resource by activity Ai).
6.3 Relating Resource Use and Effectiveness

Resources are used within an activity in order to deliver an output that meets the goals and/or satisfies the constraints of the activity, i.e. to achieve a certain level of effectiveness. However, the level of effectiveness that may be achieved by particular resources is not inherent in the resource but will depend on the activity/context in which the resource is used. For example, resources may be more/less effective in situations where the focus (i.e. goals) is on innovative as opposed to more routine development [76].
6.3.1 Resource Impact

The relationship between the use of a particular resource and the effectiveness achieved in relation to a specific goal is described here as the Impact (Im) of that resource. Given an activity (A) with an input (I) and a goal (G), the impact of a resource, Im(R), describes its capability to act upon the input (i.e. a function of the input) in a manner that achieves the desired output, i.e. that which best meets the goal. That is:

Im(R) = f(I) : O → G

Therefore an increase in the impact (Im) of a resource (R) will result in an increase in the effectiveness (ε) achieved, i.e.:

Im(R)↑ ⇒ ε(A)↑

Figure 6.5 illustrates a comparison of two similar design activities (A1 and A2), for example the activity of draughting, utilising different resources (DR1 and DR2). If we assume the same goal of maximising dimensional accuracy (DG) and the same input of a concept sketch (DI), then the use of CAD (DR1) in one case versus manual draughting techniques (DR2) in another may produce different dimensional accuracy in the output. If we assume that a CAD system produces greater dimensional accuracy3, then the output of the draughting activity using CAD (DO1) will meet the goal better than the output of the activity using manual techniques (DO2). Therefore, the impact on effectiveness of CAD, Im(DR1), is greater than the impact of manual techniques, Im(DR2), i.e.:

Im(DR1) > Im(DR2)

and therefore:

ε1 > ε2

Figure 6.5 Comparing impact of resources (two similar design activities A1 and A2, with the same input DI and goal DG, using different resources DR1 and DR2 to produce outputs DO1 and DO2).

Although no absolute measure of effectiveness is obtained here, establishing the relative impact allows a comparison of effectiveness impact across similar activities using different resources. This can support decisions regarding the selection of the most appropriate resources, i.e. tools, methods and/or approaches, to achieve specific goals for a particular activity [3].
6.3.2 The Resource Impact Model

In the discussion of resource impact above there has been an implicit degree of exploitation assumed. In reality the full capability of a resource may not be realised in its use, i.e. it may not be fully exploited. For a particular resource, full exploitation (i.e. 100%) of its potential may often be impossible; rather a potential exploitation (Expt) is estimated. For example, the full potential of a CAD system may only be exploited if full training is given on all aspects of functionality, the user is fully competent and uses all the features, it is fully integrated with other systems, etc. In addition, the actual exploitation (Exal) may be less than the estimated potential in particular situations. That is, in the case of CAD systems not all users will work to full capacity, be equally competent, use the correct procedures, etc. It is proposed here that such a difference in exploitation will result in a difference between the potential effectiveness (εpt) and the actual effectiveness (εal) that may be achieved using a particular resource.

Figure 6.6 presents a Resource Impact model illustrating the relationship between exploitation of a resource and effectiveness achieved with respect to a specific goal. The relationship between the change in exploitation (Ex) of a resource and the effectiveness (ε) achieved is defined here as the Impact Profile. This relationship is assumed to be linear in order to aid clarity, though its true nature may be quite different (see Section 6.3.3). Therefore, any increase in the exploitation will result in a corresponding increase in the effectiveness achieved. Having established the impact profile of a resource, its potential and actual exploitation, the scope for improvement (S) in effectiveness can be determined. This scope for improvement indicates the increase in effectiveness that may be obtained through further exploiting a resource from actual to potential exploitation. If the resource has not yet been introduced, i.e. its actual exploitation is zero, then the scope for improvement is represented as Smax.

The case described in Figure 6.6 presents an impact profile where 100% exploitation of the resource will directly result in 100% effectiveness being obtained. However, this might not be the case in many situations, i.e. the achievement of 100% effectiveness with respect to a goal in a specific activity may not be attributable to a single resource used within that activity. The impact profile shown in Figure 6.7 illustrates that full exploitation of the resource will result in approximately 60% effectiveness being achieved. Consequently, given the same actual and potential exploitation as presented in Figure 6.6, the change of impact profile results in a decrease in effectiveness and a reduced scope for improvement.

Figure 6.6 The Resource Impact (RI) model (a linear impact profile relating exploitation, from Exal to Expt, to effectiveness, from εal to εpt, and showing the scope for improvement S and Smax).

Figure 6.7 Effect of change in resource impact (a shallower impact profile, where full exploitation yields only around 60% effectiveness, reduces the scope for improvement).
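To make the Resource Impact model concrete, the short Python sketch below computes the effectiveness predicted by a linear impact profile and the resulting scope for improvement. It is an illustrative reading of Figures 6.6 and 6.7 only, not part of the formalism; the function names, the max_effectiveness parameter and the example exploitation values are assumptions introduced here.

    def effectiveness(exploitation, max_effectiveness=1.0):
        """Linear impact profile: effectiveness achieved at a given exploitation
        (both expressed as fractions of the ideal, i.e. 0..1)."""
        return max_effectiveness * exploitation

    def scope_for_improvement(ex_actual, ex_potential, max_effectiveness=1.0):
        """Scope for improvement S: the additional effectiveness available by moving
        from actual to potential exploitation (equal to Smax when ex_actual is 0)."""
        return (effectiveness(ex_potential, max_effectiveness)
                - effectiveness(ex_actual, max_effectiveness))

    # Figure 6.6-style profile: full exploitation would give 100% effectiveness.
    print(scope_for_improvement(0.3, 0.8))                          # 0.5
    # Figure 6.7-style profile: full exploitation gives only ~60% effectiveness.
    print(scope_for_improvement(0.3, 0.8, max_effectiveness=0.6))   # ~0.3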
6.3.3 Nature of Impact Profile

The relationship between the exploitation of a resource and the effectiveness, i.e. the impact profile, may not be a linear one. Resources may show a high impact on effectiveness during initial exploitation, whereas further exploitation may give less dramatic results. For example, the initial use of a CAD system may significantly impact the level of effectiveness with respect to particular goals. However, as the exploitation continues to increase the corresponding increase in the effectiveness obtained may be smaller (e.g. R1 in Figure 6.8). Each resource may have a different impact profile, e.g. R1, R2 and R3 in Figure 6.8.

The nature of the impact relationship will affect the scope for improvement through additional exploitation of a resource. For example, in comparing Figure 6.7 to Figure 6.9, it can be seen that for the same increase in the exploitation the degree of increase in effectiveness may be different. The actual impact of a resource on effectiveness may only be truly evaluated after the activity, i.e. once the output has been produced and compared to the goal. This requires the measurement of effectiveness achieved by the same activity with and without the resource being used to allow a comparison of effectiveness to be established. To create a full impact profile would involve the study of effectiveness achieved for different degrees of exploitation and would require extensive research into the use of the resources in practice over a period of time. The unique and non-repeatable nature of many activities in design means that it is difficult to apply such a measurement procedure. Although the model developed here provides a basis for this analysis, the determination of exact impact profiles of specific resources against goals is a topic for further investigation.
Figure 6.8 Nature of impact relationship (non-linear impact profiles for three resources R1, R2 and R3).
Figure 6.9 Impact profile and scope for improvement (a non-linear impact profile changes the effectiveness gained, and hence the scope for improvement, for the same increase in exploitation from actual to potential).
6.3.4 Assessing Relative Impact

Resources may have inherent attributes that indicate the effectiveness they can deliver in a particular activity. For example, within a simulation activity, the ability of a 3D CAD system to provide a sophisticated representation of an artefact may be assumed to negate the need for a physical prototype. This could be seen to address the overall design activity goal (DAG) of reducing cost. It is proposed here that, given a number of different resources and a similar level of exploitation, the relative impact on effectiveness may be estimated. Assuming that the relationship between exploitation and effectiveness is a linear one, the impact profile is therefore also established, e.g. the profiles for resources R1, R2 and R3 presented in Figure 6.10 as High (H), Medium (M) and Low (L). This provides a basis for an approach to analysing effectiveness as defined within Chapter 7.

The simplicity of the approach used here, where a linear relationship is assumed, allows a comparison of the impact of resources to be determined within a relatively short time in comparison to more detailed data collection. The time and resources consumed in performance measurement are considered to be a barrier to the implementation of performance measurement in industry [44, 165]. Having established the impact profiles for various resources against goals and the actual and potential exploitation, comparisons may be made in terms of scope for improvement. This supports the identification of resources where further exploitation may deliver the greatest benefit.
Figure 6.10 Relative impact of resources (linear impact profiles estimated as High, Medium and Low).
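As an illustration of how such qualitative estimates might be used, the sketch below ranks a set of resources by their scope for improvement, assuming linear profiles whose slopes correspond to the High/Medium/Low ratings of Figure 6.10. The slope values and the example resources are hypothetical and chosen purely for illustration.

    # Assumed slopes for the qualitative impact profiles of Figure 6.10.
    PROFILE_SLOPE = {"H": 1.0, "M": 0.6, "L": 0.3}

    def scope(profile, ex_actual, ex_potential):
        """Scope for improvement under a linear profile with the given rating."""
        return PROFILE_SLOPE[profile] * (ex_potential - ex_actual)

    # Hypothetical resources: (profile, actual exploitation, potential exploitation).
    resources = {
        "CAD system": ("H", 0.5, 0.9),
        "QFD":        ("M", 0.2, 0.8),
        "Email":      ("L", 0.7, 0.9),
    }

    for name, args in sorted(resources.items(), key=lambda kv: scope(*kv[1]), reverse=True):
        print(f"{name}: scope for improvement = {scope(*args):.2f}")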
6.3.5 Resource Effectiveness

Many resources in design will influence the achievement of more than one goal. For example, the use of 3D CAD may improve the effectiveness with respect to design goals (DG) while also reducing activity cost, which may be specified as a design activity goal (DAG). In addition to multiple goals, the priorities of the goals or the goals themselves may change over time. Therefore, assessment of the impact of resources on effectiveness must take account of the overall environment in which the resource is being used in order to increase the validity of the results [94]. For example, in Figure 6.11, a resource (R) may be used to achieve two different goals (G1 and G2) and will therefore impact the effectiveness achieved with respect to each of these goals (ε1 and ε2). The priorities (or weightings) of these goals may vary and therefore, to provide further insight into the impact on effectiveness in a situation where there is impact on more than one goal, these priorities should be taken into account. The weighted impact is defined here as the Resource Effectiveness (εR) and it allows the overall impact of a resource, within a situation where there are multiple goals/priorities, to be assessed. For example, the resource effectiveness of resource R in relation to goal G1 in Figure 6.11 is defined as:

Resource Effectiveness = εR(G1) = Im(G1)(R) × W1

Figure 6.11 Impact on multiple goals (a resource R used within an activity A impacts outputs O1 and O2, and therefore the effectiveness achieved against goals G1 and G2, which are sub-goals of a higher level goal GO).

The resource effectiveness in relation to the higher level goal (GO) will be a combination of the weighted impact on effectiveness achieved with respect to each of the goals (ImG1 and ImG2). For example, if the priorities (or weightings) of goals G1 and G2 in relation to the higher level goal (GO) are W1 and W2 respectively, then the overall resource effectiveness in relation to GO may be determined as follows4:

Overall Resource Effectiveness = εR(GO) = (Im(G1)(R) × W1) + (Im(G2)(R) × W2)
The use of a CAD system to carry out a draughting activity provides a typical example of the situation described above. In this case the overall goal (GO) may be the production of a drawing providing a 2-dimensional representation of the product. Within this overall goal, sub-goals related to dimensional accuracy (G1) and adherence to ISO5 standards (G2) may exist, each with particular priorities (W1 and W2 respectively). Within the overall output (O), i.e. the completed drawing, there will be outputs representing the dimensional accuracy (O1) and the adherence to ISO standards (O2). Therefore the impact of the CAD system with respect to each of these goals can be represented as:

Im(G1)(R) = f(I) : O1 → G1

and

Im(G2)(R) = f(I) : O2 → G2

and the overall Resource Effectiveness may be represented as:

εR(GO) = f(I) : O → G

or

εR(GO) = (Im(G1)(R) × W1) + (Im(G2)(R) × W2)
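A minimal numerical reading of the overall Resource Effectiveness calculation is sketched below. The weighted-sum form comes from the definition above; the particular impact and priority values (borrowing the 1/3/9 and 1/3/5 scales introduced in Chapter 7) and the goal names are assumptions for illustration only.

    def resource_effectiveness(impacts, weights):
        """Overall resource effectiveness against a higher level goal: the sum of
        the resource's impact on each sub-goal weighted by that sub-goal's priority."""
        return sum(impacts[goal] * weights[goal] for goal in impacts)

    # Hypothetical CAD draughting example: impact on each sub-goal (1/3/9 for L/M/H)
    # and the priority of each sub-goal (1/3/5 for L/M/H).
    impacts = {"G1 dimensional accuracy": 9, "G2 ISO adherence": 3}
    weights = {"G1 dimensional accuracy": 5, "G2 ISO adherence": 3}
    print(resource_effectiveness(impacts, weights))   # 9*5 + 3*3 = 54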
The use of resource effectiveness to take account of relative priorities of goals/sub-goals supports the concept of alignment with respect to design goals (DG) or design activity goals (DAG) as highlighted in Chapter 5. That is, the existence of a goal breakdown structure (GBS) can allow goal/sub-goal relationships to be defined, such as the relative weightings relationship illustrated above. Although the goal/sub-goal relationship may be more complex than simple weightings, its definition allows the resource effectiveness in relation to higher level goals to be determined having established the impact on sub-goals for a specific activity. Consequently, having established the impact on goals for the draughting activity discussed above, the resource effectiveness in relation to higher level goals, such as those defined at a project level, may be determined. This supports the analysis of alignment within the goal structure and allows sub-optimal use of resources to be identified.
6.4 Principles of Analysing Impact of Resources

Resources have been identified as a key factor influencing the effectiveness of activities in design and the analysis of their impact has been discussed. A number of principles have been developed in this chapter that contribute to an approach for analysing resource impact. These principles are reviewed here:

1. The impact of a resource, Im(R), describes the capability of the resource to act upon the input in a manner that achieves the desired output, i.e. that which best meets the goal.
2. The analysis of resource impact must be based on the particular context in which the resource is used, i.e. the impact (Im) on effectiveness (ε) of a resource (R) is established for a given activity (A) with a particular goal (G) and input (I).
3. Resources may be exploited to different degrees within activities in design. The Resource Impact model illustrates the relationship between exploitation and effectiveness, i.e. the impact profile. This profile is assumed to be linear though its true nature requires further investigation. Establishing impact profiles provides a means to identify the scope for improvement (S), i.e. the additional effectiveness that may be achieved through further exploitation of a resource.
4. True objective assessment of a resource impact may only be carried out on a retrospective basis and, due to the non-repeatable nature of many design activities, may be of limited use in predicting future impact. That is, knowledge of previous effectiveness may not indicate future effectiveness within a given context. It is proposed here that the relative impact of resources may be assessed to allow the identification of resources where further exploitation may provide greatest benefit.
5. In practice, a resource may be used in more than one activity, and/or to achieve more than one goal. The concept of resource effectiveness is introduced here to take account of the weighted impact of resources. Resource effectiveness provides a means to support alignment across design or design activity goals.
6.5 Summary

The measurement of effectiveness provides a key element of information concerning the performance of an activity. In addition, knowledge of the factors influencing this aspect of performance is equally important in that it provides the insight necessary to identify areas for performance improvement. Resources such as approaches, methods and tools provide significant influence over the eventual effectiveness of activities as they act directly on the input to deliver the output. The impact (Im) of such resources describes their capability to create an output, from a given input, which meets the goals. Thus, an increase in the impact of a resource will result in an increase in the effectiveness achieved for a given degree of resource exploitation.

This chapter has focused on the nature of resources as influences on effectiveness, and a Resource Impact model has been presented to illustrate the relationship between the exploitation of a resource and the effectiveness obtained. This serves to clarify the concept first introduced in the industrial trial reported in Chapter 4, where confusion surrounding resource impact was evident. The context in which resources are used will be specific to individual activities, departments, organisations, etc. The principles described here provide the underpinning knowledge required in order to develop an approach for measuring the impact of resources, and therefore the effectiveness they deliver, within any context. Such an approach is defined within the following chapter.
Notes

1 External influences are defined here as those that are not within the direct control of the organisation, such as government policy, legislation, technological changes, etc.
2 It should be noted that in a different context, i.e. in relation to other activities, the information search output may not be viewed as a resource.
3 Note: the reverse may also be the case.
4 More sophisticated techniques may be required to determine the overall weighted impact in such a situation, such as those reported in [148].
5 International Standards Organisation.
7 The PERFORM Approach

The previous chapter explored the basic principles in modelling a specific area of performance, i.e. the impact of resources on activity effectiveness, and presented the Resource Impact model describing relationships between key elements. This formalises elements of the approach used in an industrial trial (see Chapter 4) and forms the basis for the development of an overall structured approach, PERFORM, to analysing resource impact in an organisational context. The PERFORM approach is described in this chapter by means of an initial overview, followed by more detailed discussion on the different phases of the approach. The use of a facilitator(s) and computer-based support to enhance the approach is also described. A discussion at the end of the chapter reviews the key characteristics of the approach and offers a comparison against requirements established in Part 1.
7.1 Overview

The overall consideration of this book is on the analysis of performance to identify its means of improvement. The PERFORM approach is based on the view that activities in design are aimed at achieving goals and various resources may be used to carry out these activities. The approach involves an analysis of how different resources impact upon the goals in design and the identification of those resources that should be exploited further to achieve better performance. The aim is not to obtain absolute values per se, but to determine the means by which performance may be improved. Therefore, the approach taken here is to focus on the relative impact of specific approaches, methods and tools on the PD goals in an effort to identify areas for improvement.

The implementation of the approach utilises many of the principles and tools of Quality Function Deployment (QFD) [112]. QFD was originally invented in Japan to support the design process within the shipbuilding industry in the late 1960s. It has become widely used within product design [96] and its use has extended to a more general application of defining the best means, from a number of alternatives, to achieve a set of goals. PERFORM is similarly intended for use in defining the best means (i.e. resources) to achieve effectiveness in relation to specific goals in design.

Although the scope of analysis in PERFORM may be confined to a small number of activities, it is described here in terms of its application within the context of the overall design development process. Such an analysis will often require knowledge of different aspects such as marketing, design/development and production and will involve individuals from different functional areas of an organisation [14]. The following description of PERFORM is based on its use with an analysis team. The approach is implemented with the support of one or more facilitators who have experience in using the approach and may provide additional input where they are knowledgeable in a particular area1. In addition, a software system has been developed to support the approach.
The PERFORM approach is implemented within a number of phases, each producing particular outputs (Figure 7.1):

1. Specification: Which results in the definition of the scope of analysis, the goals related to this scope and their priorities, and the resources that are currently used or may be used to support these goals.
2. Assessment: Where the exploitation of the resources is defined and the relationship between the resources and the achievement of goals (i.e. the impact) is established.
3. Analysis: Which involves the use of a matrix approach and specific analysis measures to provide a number of different analyses on the effectiveness of the resources.
4. Presentation: Where the results of the analysis are transformed to a graphical format.
5. Review: The results are reviewed and discussed by the participants in the analysis to identify any errors that may have been made in earlier phases, revise the data if necessary and implement what-if scenarios.
The results of the PERFORM approach provide the basis for the selection of target areas (i.e. resources) that should be introduced and/or exploited further to achieve improved effectiveness. An overview of the key elements of the approach is presented in Figure 7.2.

Figure 7.1 Process model of the PERFORM approach (the Specification, Assessment, Analysis, Presentation and Review phases with their respective outputs).

Figure 7.2 Overview of PERFORM approach (an analysis team, supported by facilitator(s) and software, works through goal definition, resource definition, assessment and analysis to the presentation of results, review/further analysis and the selection of target areas).
7.2 Specification

Phase 1 involves the specification of the area of analysis in terms of its overall scope, the goals being addressed and the resources to achieve these goals. This phase requires significant input from participants, and methods such as Affinity Diagrams, Brainstorming, Tree Diagrams, etc. [15, 112] are used to encourage this input and provide structure to it.
7.2.1 Scope

Before estimating and analysing any impact on effectiveness, it is necessary to fully define the overall scope of the analysis. The team involved in the analysis may have an implicit definition of the scope but this must be discussed and agreed explicitly before proceeding. As discussed in Chapter 2, the analysis of performance may be carried out within different scopes, from a Product Development programme to an individual activity. The scope establishes both the range and level of activities and provides a boundary for the analysis. Where the results of the analysis are being communicated beyond the team actively involved in the process, the scope allows the results to be placed in context by others.
7.2.2 Goal Definition

The scope of the analysis provides a basis for defining a prioritised list of goals for the activities that exist within this scope. For example, where the scope is defined as a project, the goals of the project would be defined here. Although goals such as minimising cost and reducing time are often implicit within the activities in design, they must be explicitly defined here in order to ensure a common understanding exists within the analysis team and support a detailed analysis. That is, the resources may be analysed for their impact against specific goals in order to define those that make the best impact. In some cases the goals may already be defined within the organisation and are fully understood and agreed by the analysis team. Where this is not the case a process of brainstorming and consolidation may be carried out to provide a list of agreed goals as follows:

- Brainstorming: The definition of goals is initiated using brainstorming where all the individuals in the team contribute and establish a large number of possible goals that may be applicable. No judgements are made by the group at this stage on the suitability of the goals.
- Consolidation: In this stage the team members describe and clarify the meaning of those goals they have defined (also known as "scrubbing the data") to ensure that a common understanding exists. In many cases individuals in the analysis team may define the same goals using slightly different terminology. A consolidation exercise allows the group to agree common definitions of particular goals and eliminate duplication.

This process results in a set of design and design activity goals that relate to the scope of analysis.
7.2.3 Goal Prioritisation

In using PERFORM it is assumed that the different goals that have been defined may not all be of equal priority (in relation to some higher level goal) and therefore these priorities must be defined. The determination of goal priorities may be carried out using a number of methods. Existing methods available to support the weighting of goals include simple ranking, pair comparison [102], prioritisation matrices [112] and the Analytical Hierarchy Process (AHP) [100]. These methods are aimed at the transfer of tacit knowledge (in relation to goal importance) of the team members into explicit knowledge and arriving at consensus within a team. The methods range in complexity and consequently in speed and ease of use. All of the methods are, to some extent, subjective and as such there is no best method that may be unanimously adopted for every situation [15, 166]. It is not the intention here to analyse prioritisation methods or propose a method based on the results of this analysis, as any of the above methods may be used within PERFORM. Rather, the focus is on further formalising the principles developed in Chapters 5 and 6 in order to provide a means for their implementation. For the purposes of illustration the prioritisation method adopted in PERFORM is a relatively simple one where participants are asked to select the importance ranking from a three-level scale of Low (L), Medium (M) and High (H). An example of goals and priorities for an actual design project carried out in practice is shown in Table 7.1. The topic area is used here as shorthand to allow abbreviated reference in presentation and discussion during the remaining phases of the approach.

Table 7.1. Specification of goals and priorities

Goal   Topic Area   Description                                      Priority
G1     Risk         Minimise development work and technology risk    M
G2     Rework       Reduce rework                                    H
G3     Programme    Meet the programme                               H
...    ...          ...                                              ...
7.2.4 Resource Definition

Following the definition of goals and priorities it is then necessary to establish the resources, i.e. approaches, methods and/or tools, which are currently being or might be used to achieve these goals. The definition of resources will be dependent on the initial scope of the analysis as defined earlier, e.g. if the focus is on the design phase of Product Development various manufacturing resources may not be considered.

Some organisations may have a detailed list of the resources they use within Design and Development and may be able to directly relate this list to the scope in which the analysis is being carried out. However, such a list may not be available and also, through brainstorming and facilitation, new resources can be identified and incorporated within the analysis. The process of resource definition may therefore begin with a blank sheet or be facilitated through the introduction of a predefined list of resources, represented within areas, such as those shown in Table 4.1, defined in [111]. The use of such a list must be open to amendment to fit the needs of the particular analysis.

The definition of resources is carried out using a process similar to that used to define the goals but incorporating the use of Affinity Diagrams [112] to allow resource groups to be created. The following are the key steps:

- Brainstorming: An initial list of resources is brainstormed. This list includes those resources currently used and those that have not been introduced in the organisation but are felt suitable. No judgements are made by the group at this stage on the suitability of the resources.
- Consolidation: As described in section 7.2.2 the results of the initial brainstorming are discussed and the data is scrubbed to arrive at a final list of resources.
- Grouping of resources: Having defined a number of individual resources it is then often possible to group these resources within a common Resource Area. For example, Resource Management and Process Modelling may both be seen to be part of a common Resource Area of Design Management in Table 4.1. This process may be repeated for the resulting areas to arrive at a smaller number of (higher level) Resource Areas if required. Within PERFORM the creation of such areas allows analysis of the impact of a Resource Area, in addition to individual resource impact. The level at which the resource is defined will depend on the scope of analysis, e.g. QFD may be defined as a resource while particular tools that are considered to be part of QFD may not be explicitly represented. Analysis of resource impact at a high level facilitates increased speed and quickly indicates areas for further analysis.

The specification phase of the PERFORM approach provides a prioritised list of goals and a list of resource areas with individual resources specified within them. This forms the basis for the assessment phase of the process.
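As an illustration of the kind of data the specification phase produces, the Python fragment below records a prioritised goal list (the goals and priorities follow Table 7.1) and a set of resources grouped into Resource Areas; the area names and groupings are hypothetical examples only.

    # Hypothetical output of the specification phase: a prioritised goal list
    # (cf. Table 7.1) and resources grouped within Resource Areas.
    goals = {
        "G1 Risk":      ("Minimise development work and technology risk", "M"),
        "G2 Rework":    ("Reduce rework", "H"),
        "G3 Programme": ("Meet the programme", "H"),
    }
    resource_areas = {
        "Design Management": ["Resource Management", "Process Modelling"],
        "Design Tools":      ["CAD", "QFD"],
    }
    for goal, (description, priority) in goals.items():
        print(f"{goal}: {description} [{priority}]")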
7.3 Assessment

In the assessment phase values are assigned that provide further information on resources and enhance the analysis. The relationships between the individual resources and the goals are also defined within this phase. The following presents the approach taken in establishing these values and relationships.
7.3.1 Resource Exploitation

The concept of exploitation (Ex) of a resource was introduced within Chapter 6 and related within the Resource Impact model. Having defined the resources, the potential (Expt) and actual2 (Exal) exploitation of these resources may now be assessed by the analysis team. The exploitation is expressed as a percentage, with 100% representing the ideal exploitation. Therefore potential and actual resource exploitation may be considered as a percentage of the ideal exploitation3.

The potential and actual degrees of exploitation allow the scope for improvement to be established. However, this provides no indication of the effort required to improve the exploitation from actual to potential. That is, for two resources with the same scope for improvement it may be easier for the organisation to further exploit one over another. To enhance the analysis carried out in PERFORM the concept of ease of exploitation is introduced as described in section 4.2.3.
7.3.2 Resource Impact

The capability of the resource to act upon the input in a manner that achieves the desired output was defined as the resource impact in Chapter 6. The relative impact of resources, and consequently their impact profiles, can be estimated based on knowledge of inherent resource attributes. The representation of impact is based on that normally used within Quality Function Deployment (QFD) to establish the relationships between customer needs and characteristics of the design concept. The impact may be described here as being Low (L), Medium (M) or High (H)4, resulting in typical impact profiles as represented in Figure 7.3.

Figure 7.3 Impact assessment (typical linear impact profiles for Im = H, M and L).

As discussed in Chapter 6 a resource may have an impact on the achievement of more than one goal and this impact may be varied. Within the analysis discussed here a large number of resources and goals may be defined and therefore the number of impact relationships may be large. The PERFORM matrix is introduced here in order to provide a framework in which these relationships may be defined. Within this matrix the resources are represented in rows while the goals are represented in the columns. Where an impact relationship exists between a resource and a goal it may be represented, using the notation defined above, at the intersection between the row and the column, as shown in Figure 7.4.

Figure 7.4 Modelling relationships within a matrix (resource Ri has a Medium impact on goal Gj, recorded at the intersection of row Ri and column Gj).
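One possible (hypothetical) encoding of such a matrix is sketched below: impact relationships are stored in a nested mapping keyed by resource and goal, with an absent entry meaning that no relationship has been identified. The resource and goal names are invented for illustration.

    # Qualitative impact of each resource (row) on each goal (column); an absent
    # entry means no impact relationship has been identified.
    impact_matrix = {
        "CAD":            {"G1 Risk": "L", "G2 Rework": "M", "G3 Programme": "H"},
        "QFD":            {"G2 Rework": "H"},
        "Design reviews": {"G1 Risk": "H", "G3 Programme": "L"},
    }

    def impact(resource, goal):
        """Return the qualitative impact recorded at the row/column intersection."""
        return impact_matrix.get(resource, {}).get(goal)

    print(impact("CAD", "G2 Rework"))   # M
    print(impact("QFD", "G1 Risk"))     # None: no relationship recorded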
7.4 Analysis

Having specified the various elements such as resources and goals, assessed the relationships, exploitation of the resources, etc., the next phase involves the analysis of this data in order to provide greater insight. This analysis provides different views of the data that allow various aspects to be considered, e.g. comparing the impact of different resources, establishing where the greatest potential for further exploitation exists, identifying goals that are not adequately supported, etc. The following section outlines the basic approach and representation used to describe the data before developing a number of measures that may be applied to give the different views required.
7.4.1 Analysis Approach

Within a simple matrix, where there are a small number of cells, it may be possible to easily obtain an overview of the data and highlight areas of interest. For example, in Figure 7.5 it can be seen that resource R2 is having the greatest overall impact and that goal G1 is only being addressed by one resource. However, if the number of cells is large then obtaining such an overview is difficult. Further analysis of the data is required to easily identify resources with greatest impact, goals being addressed by only a small number of resources, etc.

Figure 7.5 Data overview (a small example matrix of resources R1-R4 against goals G1-G4 with L/M/H impact entries).
A simplistic approach to analysis would be to calculate the sum of the rows and columns within the matrix and in this way obtain an overview of the data. However, this assumes that all the goals are of equal priority and does not incorporate the ease or degree of exploitation in the analysis. The type of problem being addressed here is one that relates closely to those described within the field of multiple criteria decision making [128]. That is, the various goals may be described as criteria, which have various weightings, while the multiple resources represent alternative means of achieving the goals/satisfying the criteria. Different methods may be adopted to support such decision making, such as the rank-sum rule [15] and the datum method [129]. The method used within this work is based on the prioritisation matrix commonly found within implementations of QFD. This method is widely used in defining the most appropriate means (i.e. resource) to achieve particular objectives (i.e. goals). Within PERFORM, it allows the goal priorities to be taken into account in determining the effectiveness of resources through using the product of the individual impacts and priorities. In addition, the method supports coherence using multiple interrelated prioritisation matrices. That is, those resources identified as a focus for performance improvement using the PERFORM approach may be transposed to the goals of a further analysis at a lower level (Section 7.7).
3 7.4.2 Data Representation
4
5 To support the mathematical analysis of the data, quantitative values are
6 assigned to the qualitative values for Ease, Impact and Priority. Table 7.2
7 provides an overview of the data elements and values used.
8 The values dened here are based on those commonly used within the area
9 of QFD. Although a scientic basis for the use of these values does not cur-
3011 rently exist they are widely adopted by QFD practitioners [112].
1 The various data elements are represented within a complete PERFORM
2 matrix to provide a framework in which they can be analysed (Table 7.3). This
3 framework also provides the basis for the implementation of the computer-
4 based support tool described in Chapter 8.
5
6
7 Table 7.2. Representations of data elements
8 Data Element Notation Numerical Value Meaning
9
Ease (E) L/M/H 1, 3, 5 Ease of Exploitation of Resource
4011 Exploitation (Ex) % 0100 Exploitation of Resource
1 Impact (Im) L/M/H 1, 3, 9 Impact of Resource on Goal
2 Priority (W) L/M/H 1, 3, 5 Priority of Goal
3
4
5
6 Table 7.3. The PERFORM Matrix
7 Gj Gm
8 Wj ... Wm
9 Ri Ei Exi(al) Exi(pt) Imij Imim
... ... ... ...
50 Rn En Exn(al) Exn(pt) Imnj Imnm
The matrix notation used is as follows:

Ri = Resource i, i = 1, . . ., n;
Gj = Goal j, j = 1, . . ., m;
Wj = Priority of Goal j;
Imij = Impact of Resource i on Goal j;
Ei = Ease of Further Exploitation of Resource i;
Exi(al) = The Actual Exploitation of Resource i;
Exi(pt) = The Potential Exploitation of Resource i.
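One possible in-memory representation of a row of the PERFORM matrix is sketched below. The class and field names are hypothetical and the values are illustrative; the sketch simply mirrors the notation above.

```python
# Hypothetical representation of the PERFORM matrix of Table 7.3: each resource
# row holds Ei, Exi(al), Exi(pt) and its impact values Imij per goal.
from dataclasses import dataclass, field

@dataclass
class ResourceRow:
    name: str                                   # Ri
    ease: int                                   # Ei (1, 3 or 5)
    ex_actual: float                            # Exi(al), as a fraction 0..1
    ex_potential: float                         # Exi(pt), as a fraction 0..1
    impacts: dict = field(default_factory=dict) # {goal name: Imij}

goal_priorities = {"Gj": 5, "Gm": 3}            # Wj ... Wm (illustrative)
rows = [
    ResourceRow("Ri", ease=3, ex_actual=0.4, ex_potential=1.0, impacts={"Gj": 9, "Gm": 3}),
    ResourceRow("Rn", ease=5, ex_actual=0.8, ex_potential=0.9, impacts={"Gj": 1}),
]
print(rows[0].impacts["Gj"] * goal_priorities["Gj"])   # one Wj * Imij term -> 45
```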
7.4.3 Analysis Measures

A number of measures are defined here, based on those used within the prioritisation matrix, which may be applied to the data to give results that provide additional insight. A brief description of each measure and the type of information that results from their application is presented here. The elements of the matrix that are considered in the measure are highlighted next to its definition. The measures may be categorised as:

1. Resource Effectiveness Measure (REM).
2. Analysis Measures (AM).
3. Improvement Measures (IM).
4. Return on Investment (RoI).

An overall measure is also defined that allows the data to be normalised to support comparison against a common datum.
Resource Effectiveness Measure (REM): a single measure is defined here that provides the basis for the definition of other measures in the analysis. This measure takes account of the impact (Imij) of a resource (Ri) on a goal (Gj) and the priority of that goal (Wj). The measure is defined as follows:
PRi(Gj) = Wj × Imij
Measure 1 REM
The application of this measure allows:

• A comparison of the resource effectiveness for a specific resource (Ri) against different goals (Gj . . . Gm).
• A comparison of the effectiveness of different resources (Ri . . . Rn) in relation to the same goal.
• Those goals that are not impacted by a resource, and those most impacted, to be identified (through assuming equal weighting for all goals).
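Measure 1 reduces to a single multiplication, as in the sketch below (illustrative values, using the numeric scales of Table 7.2).

```python
# Sketch of Measure 1 (REM): P_Ri(Gj) = Wj * Imij.
def rem(goal_priority: int, impact: int) -> int:
    """Resource Effectiveness Measure for one resource against one goal."""
    return goal_priority * impact

print(rem(5, 9))   # a High-impact resource against a High-priority goal -> 45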
Analysis Measures (AM): these measures (in addition to the core measure) provide an analysis of resource effectiveness (both individual resources and groups) based on the assumption that all resources are equally exploited. Three initial measures are defined here to provide this analysis. The goals that are defined in the specification stage will be sub-goals of some higher level goal (GO) such as profit. The Resource Effectiveness in relation to this higher level goal may be calculated through summing the individual values obtained using Measure 1 for a particular resource, i.e. across a row of the matrix.
PRi(GO) = PRi(Gj...m) = Σ(j=1..m) Wj × Imij
Measure 2 AM1
This measure can be applied in a similar manner to that of Measure 1 to allow:

• A comparison of the impact of particular resources (Ri) on the overall goal (GO).
• Those resources that have a high impact on GO to be distinguished from those with little or no impact, and therefore critical or redundant resources to be identified.
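Measure 2 is simply the row sum of the Measure 1 values, as the illustrative sketch below shows (goal names and values are hypothetical).

```python
# Sketch of Measure 2 (AM1): a resource's effectiveness against the higher level
# goal GO is the row sum of Wj * Imij across all goals it addresses.
def am1(goal_priorities: dict, impacts: dict) -> int:
    return sum(w * impacts.get(g, 0) for g, w in goal_priorities.items())

print(am1({"G1": 5, "G2": 3}, {"G1": 9, "G2": 1}))   # 5*9 + 3*1 = 48
```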
The specification phase of the PERFORM approach can support the creation of Resource Areas, e.g. Design Management, which is a number of resources grouped together. The Resource Effectiveness for a Resource Area may be determined through summing the individual resource effectiveness values. That is, summing a part of a column in the matrix provides an analysis of the effectiveness of a number of resources against a particular goal.
PRi...Rn(Gj) = Σ(i=1..n) Wj × Imij
Measure 3 AM2
The application of this measure allows:

• A comparison of the effectiveness of different Resource Areas (Ri . . . Rk, Rl . . . Rn) against a particular goal (Gj), e.g. comparing the effectiveness of Design Management resources to the effectiveness of Decision Support resources in relation to the goal of meeting the programme.
• A comparison of the effectiveness of a particular resource area (Ri . . . Rn) across different goals, e.g. comparing the effectiveness of Design Management resources against a goal of meeting the programme to their effectiveness against a goal of reduced rework.
Similarly the effectiveness of a number of resources against the higher level goal (GO) may be defined as:

PRi...Rn(GO) = PRi...Rn(Gj...Gm) = Σ(i=1..n) Σ(j=1..m) Wj × Imij
Measure 4 AM3
The application of this measure allows:

• A comparison of the effectiveness of particular Resource Areas (Ri . . . Rn), e.g. comparing the effectiveness of Design Management to that of Decision Support against the higher level goal (GO).
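Measures 3 and 4 are the corresponding column and whole-area sums. A minimal sketch, with a hypothetical "Design Management" area and illustrative values, is given below.

```python
# Sketch of Measures 3 and 4 (AM2, AM3): summing Wj * Imij down part of a column
# gives a Resource Area's effectiveness against one goal; summing over all goals
# and all resources in the area gives its effectiveness against GO.
area = {                                  # illustrative "Design Management" area
    "R1": {"G1": 9, "G2": 3},
    "R2": {"G1": 1},
}
priorities = {"G1": 5, "G2": 3}

am2_g1 = sum(priorities["G1"] * imp.get("G1", 0) for imp in area.values())
am3_go = sum(w * imp.get(g, 0) for imp in area.values() for g, w in priorities.items())
print(am2_g1, am3_go)                     # 50 and 59
```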
Improvement Measures (IM): the measures defined above provide a basic analysis of the goal/resource relationships and the identification of anomalies, such as a resource that does not address any goals. However, further analysis is necessary to allow the identification of opportunities for performance improvement. The IMs are aimed at identifying the difference between the actual effectiveness and the potential effectiveness of resources/resource groups. The potential effectiveness is that which may be obtained through further exploitation of the resources beyond their actual (or current) level of exploitation. Eight additional measures are defined within this area.
The AMs do not incorporate any information regarding the exploitation of the resource and therefore effectively assume 100% exploitation. As discussed in Chapter 6, the exploitation (Ex) of the resource may vary and in many cases further exploitation is possible, resulting in increased effectiveness. By incorporating the exploitation within the analysis, further variations on the basic measures may be defined. That is, measures may be defined to determine the potential effectiveness based on the potential exploitation of the resource (Expt) and the actual effectiveness based on the actual exploitation (Exal).
Therefore, the potential and actual effectiveness of a resource in relation to a specific goal may be determined as follows:
Potential PRi(Gj) = Wj × Imij × Exi(pt)
Measure 5 IM1
Actual PRi(Gj) = Wj × Imij × Exi(al)
Measure 6 IM2
Similarly, a resource will have potential and actual effectiveness with respect to a higher level goal (GO) that may be determined by:

Potential PRi(GO) = PRi(Gj...m) = Σ(j=1..m) Wj × Imij × Exi(pt)
Measure 7 IM3
Actual PRi(GO) = PRi(Gj...m) = Σ(j=1..m) Wj × Imij × Exi(al)
Measure 8 IM4
The analysis of the effectiveness of a number of resources (Resource Areas) against a particular goal can also be carried out on the basis of their potential or actual exploitation. That is:
Potential PRi...Rn(Gj) = Σ(i=1..n) Wj × Imij × Exi(pt)
Measure 9 IM5
Actual PRi...Rn(Gj) = Σ(i=1..n) Wj × Imij × Exi(al)
Measure 10 IM6
An analysis may also be carried out relating a number of resources to a higher level goal in relation to potential and actual exploitation.
Potential PRi...Rn(GO) = PRi...Rn(Gj...Gm) = Σ(i=1..n) Σ(j=1..m) Wj × Imij × Exi(pt)
Measure 11 IM7
Actual PRi...Rn(GO) = PRi...Rn(Gj...Gm) = Σ(i=1..n) Σ(j=1..m) Wj × Imij × Exi(al)
Measure 12 IM8
These measures may be applied in a similar manner to the first four measures. The results from the potential and actual situations may be compared to allow the identification of instances where gaps exist between them. The existence of such a gap indicates that the resource(s) may be exploited further to achieve higher levels of effectiveness with respect to a particular goal(s). The gap between potential and actual effectiveness is defined here as the Scope for Improvement, i.e.:

Scope for Improvement = Potential P − Actual P
Measure 13 IM9
Therefore, Scope for Improvement:

• of a resource in relation to a specific goal = Measure 5 − Measure 6
• of a resource in relation to a higher level goal = Measure 7 − Measure 8
• of a resource group in relation to a specific goal = Measure 9 − Measure 10
• of a resource group in relation to a higher level goal = Measure 11 − Measure 12
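The relationship between the Analysis and Improvement Measures can be made concrete with the small sketch below; all priorities, impacts and exploitation levels are illustrative.

```python
# Sketch of the Improvement Measures: actual and potential effectiveness weight
# the basic Wj * Imij terms by the actual and potential exploitation, and the
# Scope for Improvement (Measure 13) is their difference.
def effectiveness(priorities, impacts, exploitation):
    return sum(w * impacts.get(g, 0) for g, w in priorities.items()) * exploitation

priorities = {"G1": 5, "G2": 3}
impacts = {"G1": 9, "G2": 3}
ex_actual, ex_potential = 0.35, 1.0       # illustrative exploitation levels

actual = effectiveness(priorities, impacts, ex_actual)        # Measure 8 style
potential = effectiveness(priorities, impacts, ex_potential)  # Measure 7 style
print(potential - actual)                                     # scope for improvement
```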
Although the measures described above allow the scope for improvement to be defined, they do not allow any analysis of how easy (or difficult) it may be to improve the effectiveness. That is, they do not indicate the degree of difficulty in closing the gap between actual and potential effectiveness. The ease of further exploiting an existing or new resource is captured in the assessment phase of PERFORM as discussed earlier (Section 7.3.1). It may be presented in the results through the use of specific diagrams, for example the use of a scatter diagram to relate the ease to the resource effectiveness (e.g. Figure 9.2 (c)). However, it is also incorporated within the analysis here to assist in highlighting those resources where further exploitation may give the greatest return.
Return on Investment (RoI): the Return on Investment (RoI) measure is defined here to provide an indication of the return, in terms of increased effectiveness, from the investment of time, people, money, etc. to further exploit a resource. This investment could be in the form of training, installation of new equipment, etc. The measure is defined here based on the effectiveness of the resource in terms of the higher level goal (GO), as in many cases the resource will impact on more than one of the specified goals. It is defined as follows:
RoIRi(GO) = Σ(j=1..m) Wj × Imij × Ei
Measure 14 RoI
This measure may be applied across all the individual resources to allow the comparison of the RoI in each of them. The results provide an indication of areas where it may require more investment to gain increased effectiveness and those areas where less investment is required.
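A sketch of Measure 14 follows; since the ease value Ei does not depend on the goal index, it simply scales the row sum. Values are illustrative.

```python
# Sketch of Measure 14 (RoI): the row sum of Wj * Imij multiplied by the ease of
# further exploiting the resource (Ei), indicating where extra investment in a
# resource is likely to give the greatest gain in effectiveness.
def roi(ease: int, priorities: dict, impacts: dict) -> int:
    return ease * sum(w * impacts.get(g, 0) for g, w in priorities.items())

print(roi(5, {"G1": 5, "G2": 3}, {"G1": 9, "G2": 3}))   # 5 * (45 + 9) = 270
```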
Normalising the Results: in order to be able to compare all the results against a common datum, the data resulting from the analysis is normalised. That is, the effectiveness of resources against goals is normalised using the Total Effectiveness of all the resources against the higher level goal (GO), calculated using the following:

PRi...Rn(GO) = PRi...Rn(Gj...m) = Σ(i=1..n) Σ(j=1..m) Wj × Imij
Measure 15 Total Effectiveness

where:
n = total number of resources in the analysis scope
m = total number of goals in the analysis scope
Therefore, as an example of this normalisation, the effectiveness of a particular resource against a higher level goal (GO) would be normalised as follows:

Normalised Ri(GO) = [Σ(j=1..m) Wj × Imij] / [Σ(i=1..n) Σ(j=1..m) Wj × Imij]
Equation 1
The use of normalisation in this case aids comparison of values obtained from the application of the different measures. For example, the effectiveness of a particular area of resources could be directly compared with the effectiveness of an individual resource, where both are expressed as a percentage of the Total Effectiveness. This allows the contribution of that resource to the overall resource area to be assessed. Although various different sets of results may be derived from using the PERFORM approach, they may be expressed on a common scale, i.e. percentage of Total Effectiveness, through the use of normalisation.
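The normalisation of Equation 1 can be sketched as follows; the matrix of Wj × Imij products used here is hypothetical.

```python
# Sketch of the normalisation in Equation 1: each result is expressed as a
# percentage of the Total Effectiveness (Measure 15) so that resources, resource
# areas and goals can be compared on a common scale.
matrix = {                                     # illustrative Wj * Imij products
    "R1": {"G1": 45, "G2": 9},
    "R2": {"G1": 5, "G2": 27},
}
total = sum(v for row in matrix.values() for v in row.values())

for r, row in matrix.items():
    print(r, f"{100 * sum(row.values()) / total:.1f}% of Total Effectiveness")
```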
7.5 Representation

The application of the measures to the data captured within the PERFORM matrix allows different views of the data to be generated and represented in various formats. The results can be represented within a table to allow detailed and accurate comparison of data. However, such an analysis will provide limited benefit given that the approach defined here is aimed more at ease of use and the provision of rapid feedback, e.g. using simple scales of 1, 3 and 5 for Priority, than on detailed assessment. The representation of the results in a graphical format provides a more easily interpreted comparison of values, allowing relative effectiveness to be more quickly assessed. The PERFORM approach relies primarily on the use of graphical representation in the form of standard graph types, such as bar charts, scatter diagrams and radar diagrams. The creation of such charts is supported by software tools such as Microsoft Excel 97, which has been integrated within the PERFORM system (see Section 7.8). The results can be presented to the analysis team in a variety of forms depending on requirements. For example, they could be presented at the time of the analysis (i.e. on the day that the team is brought together) using electronic presentation equipment. Alternatively, they could be presented in the form of a printed report at a later date.
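As a rough indication of the kind of chart the approach relies on, the sketch below produces a grouped bar chart of potential versus actual effectiveness. It uses matplotlib purely for illustration (the published system used Microsoft Excel 97), and the resource names and values are hypothetical.

```python
# Illustrative bar chart of potential vs actual effectiveness per resource.
import matplotlib.pyplot as plt

resources = ["R1", "R2", "R3"]
actual = [12.0, 30.5, 8.2]       # illustrative actual effectiveness (% of total)
potential = [20.0, 34.0, 25.0]   # illustrative potential effectiveness (% of total)

x = range(len(resources))
plt.bar([i - 0.2 for i in x], potential, width=0.4, label="Potential")
plt.bar([i + 0.2 for i in x], actual, width=0.4, label="Actual")
plt.xticks(list(x), resources)
plt.ylabel("% of Total Effectiveness")
plt.legend()
plt.show()
```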
7.6 Review

The presentation of the results provides an opportunity for the team to revisit the data defined in the assessment phase of the process. That is, the team is given the opportunity to review and discuss the results and identify any apparent anomalies. Data can be easily altered at this point and re-analysed to allow what-if scenarios to be performed and support sensitivity analysis [127]. The development of the software system to support the approach greatly simplifies this phase and allows different scenarios to be investigated at the time of the analysis with the team present. That is, the changes can be input and results provided in a typical time of 5 minutes (Note 5).
7.7 Further Analysis

Where the initial scope of analysis is at a high level in terms of the definition of goals and resources, further analysis may be appropriate at a lower level. This analysis would follow the same process as described above with the exception that the areas of focus that have been defined in Phase 1 would become the goals of Phase 2. This reflects the approach developed in Enhanced Quality Function Deployment (EQFD), where multiple prioritisation matrices are generated and linked to support in-depth analysis [112].
Figure 7.6 Alignment in levels of analysis: resources highlighted in one level of analysis (e.g. R1.2) become goals in the next (e.g. G2.3), linking successive matrices within a Goal Breakdown Structure.
For example (Figure 7.6), in the first phase of analysis Design for Manufacture (DFM) (R1.2) may be highlighted as a resource that has high effectiveness in relation to the goal (G1.2) of reducing rework. Subsequently a goal (G2.3) may be specified for Phase 2 focusing on further exploitation of DFM. More detailed resources would be specified within this phase, such as particular DFM tools or techniques (R2.3), and analysed for impact in the same manner as described above. Ultimately, the PERFORM approach will allow the identification of very specific areas of focus, e.g. the purchase of a specific software application. The linking of different levels of analysis in this way supports the alignment of goals within a Goal Breakdown Structure (GBS). That is, the high level goal of reducing rework (G1.2) is achieved through further exploiting DFM (G2.3) and ultimately through the goal of implementing a software tool (G3.3).
In addition to further analysis at different levels, the analysis may also be repeated at different times. That is, having established the existing levels of resource effectiveness, selected areas for improvement and implemented those improvements, the PERFORM approach may be used to conduct a further analysis to establish the degree of improvement achieved. This allows a form of benchmarking [25] to be carried out internally in the organisation and supports an approach of continuous improvement [167].
7.8 Supporting the Approach

The early phases of specification and assessment are supported by a facilitator(s) experienced in the implementation of the approach. The approach is enhanced where the facilitator provides detailed knowledge of the subject area, i.e. resources that may be used to support design, including those considered by industry to be leading edge. The facilitation of the team using an experienced facilitator(s) also supports the generation and resolution of conflict in a constructive manner to arrive at consensus. Achieving consensus in this manner provides an important element of ownership on the part of the team involved. That is, the team view the results of the analysis as their results, based on their input data and not generated by an external party.
A computer-based system has been developed to support the implementation of the approach, particularly in the analysis and presentation phases. In general the primary facilitator is supported by a co-facilitator who is also responsible for the data input and operation of the PERFORM software system. The later phase, analysis and presentation, needs less facilitation as additional support is provided by the computer-based software tool (Figure 7.7). This tool allows results to be quickly generated and presented on the day of the analysis.

Figure 7.7 Supporting the PERFORM approach: facilitation and computer support applied across representation of scope, relationship modelling and analysis/presentation of results.
7.9 Discussion

When multiple alternatives and multiple criteria are present, decision makers often base decisions on the application of rules of thumb, and decision making may be improved through the use of explicit methods [15]. PERFORM provides a structured approach for the assessment of multiple resource/goal relationships (i.e. impacts). The use of a structured approach encourages decision making based on explicit criteria and provides a mechanism for recording decision rationale in an easily understood form [16].
The analysis approach defined here supports performance measurement, performance management and decision making for performance improvement. That is, this approach allows the user to measure the impact on effectiveness of a particular resource. It provides an indication of the level of effectiveness currently being achieved and that which might be achieved. Further, it supports the management of performance through providing insight into means for performance improvement and further analysis to evaluate the result of implementing those means. It supports decisions by providing the information required to justify performance improvement initiatives in one area over another and allowing the selection of particular focus areas from many alternatives, with respect to multiple criteria (i.e. goals).
The PERFORM approach forms a component of the methodology for design performance modelling and analysis. This methodology is presented in the following chapter, drawing together the elements presented within Chapters 5, 6 and 7 and concluding Part Two of this book.
Notes

1 The approach may be enhanced in certain areas by such input but can be successfully implemented without it.
2 Where the resources identified at this stage have not yet been introduced in the organisation the actual degree of exploitation will be 0 (zero).
3 In some cases the team may assign a value of 100% to the potential or actual exploitation where they feel the ideal may be, or has been, reached.
4 In certain cases a relationship between a resource and effectiveness in relation to a particular goal may not exist.
5 This is based on results obtained through industrial implementation of the approach as discussed in Chapter 12.
8 A Methodology for Performance Modelling and Analysis in Design Development
Chapters 5, 6 and 7 have focused on providing new insights into the phenomenon of design performance and developing means to support its analysis. Each of the chapters provides components that support the modelling of performance to support its measurement and management and the analysis of performance in order to achieve improvement. Chapter 5 presents a knowledge-processing model of activities in design development. A fundamental model of design development performance is presented and related to the activities of design and design management. The focus of the work presented in Chapter 6 is on a particular aspect of performance. The Resource Impact model is developed relating the use of resources and the effectiveness achieved. Finally, in Chapter 7, an approach is defined that integrates the concepts described in Chapters 5 and 6 within a structured approach for the analysis of performance to identify means for improvement.
The following chapter introduces an overall methodology for design performance modelling and analysis to achieve improvement. The methodology is composed of the different components presented in the previous three chapters. The overall methodology is outlined initially, followed by a brief review of key elements illustrating their evolution and relationship to each other.
8.1 Overview of Methodology

The methodology for design performance modelling and analysis is developed from an understanding of the design performance phenomenon. A formalism for design development performance provides a basis for modelling performance in any situation and deriving measures for its evaluation. Having established an initial model of performance, an analysis approach, PERFORM, provides a means to identify areas for performance improvement. The degree of improvement may be established through further application of the formalism to model and measure performance. The methodology therefore presents a cycle for achieving continuous performance improvement (Figure 8.1). The key elements of the methodology are:
1. Design Development Performance Formalism (Chapter 5): A formalism for modelling performance in design development to support its measurement and management is defined. This captures the fundamental nature of performance and its relation within the activities in design development. The models developed here provide a comprehensive representation of performance (defined as efficiency and effectiveness) and provide a basis for measuring these. This provides a basic understanding of design development performance that is necessary to conduct further analysis in this area. The models provide a context for the definition of an area of focus in the remainder of the book.
2. Resource Impact Model (Chapter 6): Having established a comprehensive view of design performance, an area of focus is defined and addressed. One aspect of performance, effectiveness, is selected. Further, an area of influence on performance is identified for analysis, i.e. the impact of resources. The nature of this influence is explored and modelled within the Resource Impact model. Further elements are derived as part of a framework to support an analysis of resource effectiveness, such as Potential/Actual Exploitation and Ease of Exploitation.
3. Analysis Approach (Chapter 7): The PERFORM approach brings together the elements of the framework identified in Chapter 6 within an overall analysis approach to be carried out in a facilitated workshop environment. This approach utilises a matrix as a core tool to allow the representation of relationships between resources and goals, i.e. impact. The results of the analysis allow the identification of resources that may be exploited further to achieve greatest performance improvement.

Figure 8.1 A methodology for design performance modelling and analysis: the PERFORM cycle of Specification, Assessment, Analysis, Presentation and Review, fed by measures and values of efficiency and effectiveness and yielding areas for improvement in efficiency and effectiveness.

The methodology presented here provides the knowledge, principles and approach necessary to understand design development performance, to model an aspect of it for a particular organisation and to derive a means for its improvement.
8.2 Design Development Performance Formalism

Within this area of the methodology the nature of performance and its relation to design activities is analysed. The development of a fundamental understanding of design performance involves a number of key areas:

1. The development of a Design Activity Management (DAM) model, based on an analysis of the design activity from a knowledge-processing viewpoint and aimed at distinguishing and relating design and design management activities.
2. The development of a Design Performance model (E2), which identifies efficiency and effectiveness as the core elements of performance and relates them within the basic knowledge-processing model of the design activity.
3. The definition of a process model for Performance Measurement and Management (PMM) based on the elements developed in 1 and 2 above. The description of performance provided in E2 is related to the DAM model in order to describe the process of measuring and managing performance as part of design and design management activities.
Design is viewed here as the processing of knowledge in order to transform an input to an output, under the direction of goals/constraints, using particular resources (Figure 8.2 (a)). It is recognised within this description that the goals and/or constraints can relate to the design itself (i.e. the artefact) or the activity of designing. The DAM model distinguishes those activities addressing design goals, i.e. design activities, from those addressing goals related to the activity of designing, i.e. design management activities. These activities are modelled within an overall activity, defined here as a managed activity (Figure 8.2 (b)).
The modelling of design activities from a knowledge-processing viewpoint also provides the basis for the definition of design performance. Design performance is defined here within the E2 model as efficiency and effectiveness, where efficiency relates input, output and resource knowledge within the activity and effectiveness describes the relationship between the actual and intended output knowledge (i.e. the output and the goal) (Figure 8.2 (c)).
Figure 8.2 Design development performance formalism: (a) the design activity as knowledge processing, with input I, output O, goals G and resources R; (b) the managed activity, relating design and design management; (c) effectiveness (relating output to goals) and efficiency (relating input, output and resources) within the activity; (d) the use of efficiency and effectiveness in performance measurement and management.
The definition of efficiency and effectiveness provides a basis for determining appropriate metrics to use in their evaluation.
From an understanding of design and design management activities and the key elements of performance, i.e. efficiency and effectiveness, the PMM model was developed. This model describes the use of efficiency and effectiveness to support decision making and control (represented as dashed lines) of design/design management activities under dynamic conditions (Figure 8.2 (d)).
8.3 Area of Performance Analysis

The analysis of effectiveness provides an important element of controlling design and design management activities and relating them within the PMM model. Although there are a number of factors influencing effectiveness, such as strategy and goals, the resources used to carry out design/design management activities are considered to have the most direct influence. The area of performance analysis that is investigated is the relationship between resources, such as approaches, methods and tools, and the level of effectiveness achieved.
Figure 8.3 Resource impact model: effectiveness plotted against exploitation (Ex), with a linear impact profile relating the actual (Exal) and potential (Expt) levels of exploitation to the effectiveness achieved.
The impact of a resource has been described here as the capability of the resource to act upon the input in a manner that achieves the desired output. Different resources used in design may be exploited to different degrees and therefore realise different levels of effectiveness. The relationship between the exploitation of a resource and the effectiveness achieved is modelled within the Resource Impact model as the impact profile. Although in many cases this relationship may be quite complex, it is assumed to be linear for the purpose of simplification (Figure 8.3).
The Resource Impact model provides a basis for analysing the impact of various resources. Having identified the actual and potential exploitation of a resource and its impact profile in relation to a particular goal, the scope for improvement in terms of effectiveness may be defined. The relative estimation of impact allows the comparison of scope for improvement across a number of goals. In addition the resource effectiveness may be defined, which indicates the weighted impact. This supports the analysis of resource impact in an environment where multiple goals with different priorities are addressed.
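The linear impact profile assumed in Figure 8.3 can be sketched as follows; the impact and exploitation values are illustrative only.

```python
# Sketch of the linear impact profile: effectiveness grows in proportion to
# exploitation, so the scope for improvement of a resource is the profile
# evaluated at Expt minus the profile evaluated at Exal.
def effectiveness_at(exploitation: float, impact: float) -> float:
    """Effectiveness achieved at a given exploitation level (0..1), linear profile."""
    return impact * exploitation

impact, ex_actual, ex_potential = 9.0, 0.35, 1.0   # illustrative values
print(effectiveness_at(ex_potential, impact) - effectiveness_at(ex_actual, impact))
```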
8.4 Analysis Approach

The elements developed within Chapter 6 provide the key principles that underpin the implementation of the PERFORM approach. It assumes that the impact of a resource on a goal may be estimated using the judgement of individuals with appropriate knowledge. PERFORM provides a structured approach for capturing the knowledge of these individuals. The approach incorporates many of the principles defined within the method of Quality Function Deployment, which is widely used in industry.
The overall approach takes the form of a facilitated workshop where key individuals involved in design development assume an active role in the analysis. This analysis team define elements within the PERFORM matrix (Figure 8.4), which provides a framework in which the Resource Impact model is implemented. Within this matrix resources are defined within rows while the goals that they address are represented within the columns. The intersection of a row and a column provides a cell where the Impact (Im) of a resource on a goal may be defined. Further elements identified within Chapter 6 are defined within the matrix to create a more comprehensive model. The completed matrix includes a definition of:
• Resources (Ri . . . Rn)
• Goals (Gj . . . Gm)
• Goal Priorities (Wj . . . Wm)
• Actual and Potential Exploitation of the Resources (Exi(al) and Exi(pt))
• Ease of Exploitation of the Resources (Ei)
• Impact Relationships between Resources and Goals (Imij)
                                 Gj      ...    Gm
                                 Wj      ...    Wm
Ri    Ei    Exi(al)   Exi(pt)   Imij    ...    Imim
...   ...   ...       ...       ...     ...    ...
Rn    En    Exn(al)   Exn(pt)   Imnj    ...    Imnm

Figure 8.4 The PERFORM Matrix.
Having defined the various elements within the PERFORM matrix, the data may be analysed, using the software reported in Chapter 12, to allow the representation of the impact of resources in a number of ways. For example, the impact of a number of resources on a specific goal or the impact of a specific resource on a number of goals may be presented.
The results derived from implementing the PERFORM approach support decision making with respect to selecting the means to improve performance. That is, those resources that have a high impact on a number of goals may be identified and their current exploitation assessed. The effect of further exploitation can be simulated to allow the scope for improvement to be illustrated. Those resources with the greatest scope for improvement may be selected for focus depending on the ease of further exploitation, which will also be defined and compared. In this way, the selection of areas of improvement is supported by PERFORM but not automated. That is, the implementation of the approach can provide different presentations, or views, of the results to support decision-makers.
8.5 Summary

A methodology for design performance modelling and analysis has been presented in this chapter. The methodology is composed of a number of elements, i.e. a formalism of design development performance, modelling of relationships between resource use and effectiveness, and an approach to analysis based on these models. The definition of these elements ensures that the PERFORM approach is implemented on a sound basis and provides a clear context for the approach within the broader area of performance. The results of the implementation of the PERFORM analysis approach support the ongoing management of performance by providing insight into how resources impact on effectiveness in design. That is, resources may be utilised in such a way as to maximise the impact on effectiveness and improve performance.
Part 3 of this book presents applications and summarises key features of the methodology. This is presented in a number of sections describing different types of application, each aimed at demonstrating a particular aspect of the methodology.
PART III

Application and Key Features
Part III Prelude

Part 2 presented an insight into design development and its performance, including models that describe performance at the activity level and outlined its measurement and management. Further, an approach was presented, based on an understanding of factors influencing performance, which provides a means to improve performance. The overall methodology presented in Chapter 8 therefore provides a basis for design development performance to be formalised, measured, managed and improved. The following chapter presents a validation of the concepts using a variety of means.
Part 1 of this book provided an analysis of research and practice in design development performance and highlighted some shortcomings in the area. The methodology presented in Part 2 is composed of a number of elements that address the shortcomings:

1. The E2 Model: describing the performance of a knowledge based activity in terms of efficiency and effectiveness for any scope of analysis.
2. The DAM Model: presenting a managed activity that distinguishes and relates design and design management activities.
3. The PMM Model: describing performance measurement and management within design development and alignment across design and design activity goals.
4. Resource Impact Model: illustrating resources as key influences on effectiveness and describing the nature of that influence.
5. The PERFORM Approach: based on the principles of analysing resource impact and providing a structured approach to the identification of means to improve performance.
These elements are the subject of validation by a number of approaches, each focusing on specific, and sometimes overlapping, elements. The following types of validation were used:

• Worked Example: information from previously reported protocol studies allowed a worked example to be developed, illustrating the application of all the elements presented in Part 2.
• Metrics Review and Analysis: a review of metrics presented in the literature allows an analysis of the applicability of the E2, DAM and PMM models through assessing whether their principles are violated by existing metrics.
• Industrial appraisal: presentation of the work to individuals from industry and academia with a view to obtaining a critical appraisal from experts in the area of design development, performance measurement and management, and risk management. Such appraisals assess all of the elements described above although the focus may alter based on the expertise of the appraiser.
• System Development: the development of a computer-based system to evaluate the realisation of the PERFORM approach. That is, such a development determines if the approach may be described logically within an operational software architecture.
• Industrial Implementation: the implementation of the PERFORM approach within an industrial environment to assess the degree to which it is applicable within such an environment and its ability to identify areas for performance improvement.
The table presented below provides an overview of how the different elements of the methodology are addressed by the validation approaches (the coverage marks follow the descriptions above):

                            E2 Model   DAM Model   PMM Model   Resource Impact Model   PERFORM Approach
Worked Example                 ✓           ✓           ✓                ✓                     ✓
Metrics Review/Analysis        ✓           ✓           ✓
Industrial appraisal           ✓           ✓           ✓                ✓                     ✓
System Development                                                                            ✓
Industrial Implementation                                                                     ✓
Part III discusses the validation approaches and results. The book concludes with a review of the developed methodology, leading to key insights.
9 Worked Example

The Delft workshop Research in Design Thinking II - Analysing Design Activity, that took place in 1994, focused on data collected from design experiments. The experiments were carried out with an individual and a team of three designers working on the design of a fastening device. A number of design researchers analysed the data and the results of their analysis are presented within a series of papers [87]. The outline of the experiment and researcher observations is used here as a basis for applying the work presented in this book. The following describes how performance in the design experiment may be formalised and its key elements/relationships clarified using the models presented in Chapter 6. Further, the PERFORM approach is applied to illustrate its application in identifying areas for improvement.
9.1 Overview of the Delft Assignment

The design assignment presented to the participants in the protocol experiment is outlined below:
HiAdventure Inc is a fairly large US firm (some 2000 employees) making backpacks and other hiking gear. They have been very successful over the last ten years, and are well known nationwide for making some of the best external-frame backpacks around. Their best selling backpack, the midrange HiStar, is also sold in Europe. In the last one and a half years, this European activity has suffered some setbacks in the market; in Europe internal-frame backpacks are gaining a larger and larger market share.
As a response, HiAdventure has hired a marketing firm to look for new trends and opportunities for the European market. On the basis of this marketing report, HiAdventure has decided to develop an accessory for the HiStar: a special carrying/fastening device that would enable you to fasten and carry the backpack on mountain bikes.
The device would have to fit on most touring and mountain bikes and should fold down, or at any rate be stacked away easily. A quick survey has shown that there is nothing like this on the European market.
This idea is particularly interesting for HiAdventure, because the director, Mr. Christiansen, has a long-standing private association with one of the chief product managers at the Batavus bicycle company (one of the larger bicycle manufacturers in northern Europe, based in Holland). Mr. Christiansen sees this as a good opportunity to strike up a cooperation and to profit from the European marketing capabilities of Batavus.
The Batavus product manager, Mr Lemmens, is very enthusiastic about putting a combination product on the market, a mountain bike and a backpack that can be fastened to it. The idea is to base the combination product on the Batavus Buster (a midrange mountain bike) and to sell it under the name Batavus HikeStar.
The design department at Batavus has made a preliminary design for the carrying/fastening device, but both Mr. Christiansen and Mr. Lemmens are not very content with it. The user tests performed on a prototype also showed some serious shortcomings.
That is why they have hired you as a consultant to make a different proposal. Tomorrow there is going to be a meeting between Mr. Christiansen and Mr. Lemmens, scheduled as the last one before presenting the idea to the board of Batavus. Before then, they need to have a clearer idea of the kind of product it is going to be, its feasibility and price.
You are hired by HiAdventure to make a concept design for the device, determining the layout of the product, concentrating on:

• Ease of use
• A sporty, appealing form
• Demonstrating the technical feasibility of the design
• Ensuring that the product stays within a reasonable price range

You are asked to produce annotated sketches explaining your concept design.

Good Luck!
The assignment was to be carried out within 2 hours and an experimenter (designated EXP1) was made available to the individual and team as a support person who provided information to be used during the design activity.
9.2 Formalising Performance

In developing a formalism of performance from the protocol studies two cases may be considered, i.e. the case of an individual designer (Case #1) versus the case of a team of three individuals (Case #2). The overall scope of the performance analysis is bounded by the initial description of the task, i.e. it involves the design and development of a concept in accordance with the brief given. The brief provides both background information on the state of the problem (i.e. current design state), which may be considered as input knowledge (I), while also providing detail on goals (G) such as the desired form of the product and the time allowed to carry out the task (2 hours). The overall goal (GO) may therefore be seen as the delivery of a product concept in 2 hours. Both Design and Design Management activities will take place over the two hour period as described within the DAM model in Section 5.2.2. An overview of the DAM model as related to the Delft case is provided in Figure 9.1. The overall inputs, outputs, goals and resources are further detailed and related to design and/or design management activities as follows:

Figure 9.1 DAM model of the Delft case: overall goal G0 (design development of a concept within a 2 hour period), input I (problem background defined in the brief), output O (fully defined concept and activity description, e.g. resources used) and resources R (information resources, designer(s), support people, etc.), related to design and design management activities.

Inputs and Outputs: The initial brief constitutes the state of the design at the start of the activity (DI) and the anticipated output at the end of the activity is a concept, represented by annotated sketches (DO). The brief presents an isolated case and therefore knowledge of previous activity costs, time elapsed, etc. is not applicable as design activity input (DAI) at the beginning of the activity. However, as the activity progresses knowledge of the time elapsed becomes a key input to the design management activity, e.g. in one case a schedule is prepared, which is subsequently monitored by the timekeeper.
Goals: The goals described within the brief refer only to the design (DG), i.e. ease of use, appearance/form, technical feasibility and price. In addition a time limit of 2 hours is set, which may be categorised as a design activity goal (DAG).
Resources: One of the key distinctions between the protocol data reported is in relation to the resources used to carry out the activities. Case #1 considers the completion of the task by an individual designer, while in Case #2 the task is completed by a team of designers. Where an individual designer is used, that individual constitutes both a design (DR) and a design activity resource (DAR), as the individual will carry out the design activities to create the concept and the design management activities such as time management. In the data reported for the team case, a specific individual (timekeeper) was allocated the task of time management, i.e. managing the design activity with respect to the time goal, in addition to contributing to the design. Therefore, two of the individuals constitute design resources (DR) only while the remaining designer constitutes both a design and design activity resource (DAR).
The above outlines key elements of a formalism that may be used to describe performance of the activities and serve as a basis for analysing performance and improving it. These elements are presented within Table 9.1. It should be noted that the elements described here are those that are mentioned explicitly in the text. Additional elements are implicit within the case, e.g. the timekeeper may use a watch as a design activity resource (DAR), the completeness of the information provided by the annotated sketches may be seen as a design goal (DG), additional information elements will be represented in the output such as the approximate size (DO), etc.

Table 9.1. Elements of performance in Delft study

Element    CASE #1 Individual                     CASE #2 Team
Inputs
DI1        Design Brief                            Design Brief
DAI1       No initial Input                        No initial Input
Outputs    Concept description including           Concept description including
           information on:                         information on:
DO1        Ease of use                             Ease of use
DO2        Appearance                              Appearance
DO3        Form                                    Form
DO4        Technical feasibility                   Technical feasibility
DO5        Estimated price                         Estimated price
DAO1       Time elapsed                            Time elapsed
Goals
DG1        Ease of use                             Ease of use
DG2        Sporty appearance                       Sporty appearance
DG3        Appealing form                          Appealing form
DG4        Technically feasible                    Technically feasible
DG5        Reasonable price                        Reasonable price
DAG1       Complete within 2 hours                 Complete within 2 hours
Resources
DR1        Individual Designer                     Team member 1
DR2                                                Team member 2
DR3                                                Team member 3 (part)
DR4        Information resources via EXP1          Information resources via EXP1
DAR1       Individual Designer                     Team member 3 (timekeeper)
9.3 Measuring Performance

Performance may be measured through developing efficiency and effectiveness measures based on the understanding gained from formalising performance in the manner described in Chapter 6. For example, the efficiency could be determined through some comparison of the quantity of the information provided in the annotated sketches in relation to the time consumed and/or number of people used. That is, the knowledge gained (K+) in design development may be estimated through some assessment of completeness of the annotated sketches, e.g. a comparison of the number of information elements could be carried out. Information on the resources used is not readily available with the exception of the time and number of people in each case, and therefore efficiencies may be measured and compared as follows:

Efficiency in terms of time = No. of information elements : time used
Efficiency in terms of people = No. of information elements : no. of people
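These two ratios are trivial to compute once the counts are available; the sketch below shows the arithmetic. The element count is hypothetical, since the protocol data does not report it directly.

```python
# Sketch of the two efficiency measures for the Delft case (illustrative counts).
information_elements = 24      # hypothetical count of elements in the sketches
time_used_hours = 2.0
people = 3                     # Case #2 (team)

efficiency_time = information_elements / time_used_hours    # elements per hour
efficiency_people = information_elements / people           # elements per person
print(efficiency_time, efficiency_people)
```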
From the information made available on the experiments it is not possible to distinguish the efficiency of the design activity from that of the design management activity. However, if the time used in checking and managing the schedule were recorded then the efficiency of this management activity could be analysed. An example of a measure would be to compare the number of scheduling related instructions (DAO) and the time required to define them (DAR).
The design effectiveness may be determined from a comparison of the information provided in the sketches with the specific design goals. The design management effectiveness may be determined through relating the time elapsed to the time goal (DAG). The effectiveness of design development in each case may be determined in relation to the goals as follows:

Effectiveness in terms of ease of use = rC(DO1, DG1)
Effectiveness in terms of appearance = rC(DO2, DG2)
Effectiveness in terms of form = rC(DO3, DG3)
Effectiveness in terms of feasibility = rC(DO4, DG4)
Effectiveness in terms of price = rC(DO5, DG5)
Effectiveness in meeting the time goal = rC(DAO1, DAG1)
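In practice each comparison rC(DO, DG) would have to be reduced to a score; the sketch below simply records such scores on a 0..1 scale. All values are hypothetical, reflecting the subjective assessment the text notes is required.

```python
# Sketch of recording effectiveness comparisons rC(DO, DG) as subjective scores.
goals = ["ease of use", "appearance", "form", "feasibility", "price", "time"]
scores = {"ease of use": 0.8, "appearance": 0.6, "form": 0.7,
          "feasibility": 0.9, "price": 0.5, "time": 1.0}   # 1.0: the 2 h goal was met

for g in goals:
    print(f"effectiveness ({g}): {scores[g]:.1f}")
```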
The manner in which the design goals are specified makes such comparisons difficult, e.g. the appearance of the product is specified as sporty and appealing. Assessing the degree to which these goals are met is likely to require subjective measures. Similarly, the specification of the price as within a reasonable range requires further clarification for a more accurate analysis of performance. This highlights the difficulty in analysing performance at early stages in design, where specific goals are difficult to determine, and the need to invest time improving clarity where possible. Applying efficiency and effectiveness metrics as described above provides an assessment of performance of the activities once the experiment is completed. The following section looks at how the PERFORM approach may be used to identify how to improve future performance.
9.4 Applying PERFORM to Improve Performance

Within the case reported in the protocol studies the focus has not been on improving performance in design but on gaining a better understanding of how designers design. The design task is created as an experiment and viewed as an isolated case. However, in an industrial setting the individual designer or the three person team are likely to carry out similar tasks in the future. Therefore the approach presented in Chapter 8 could be employed to further analyse performance with a view to targeting areas for improvement in future activities. The following describes the steps that are necessary to carry out a performance analysis using PERFORM.
Specification: to define the scope, goals/priorities and resources. The scope of the analysis has been discussed above and the goals have been specified. Further definition is required to specify priorities for the goals (i.e. Low, Medium or High) and define a specific list of resources. The example used here focuses on the use of design resources (DR), as the resources used in the design management activity (DAR) are not easily distinguished and are relatively small in comparison. Within the experiments it is suggested that the information resources are not fully exploited. The analysis of performance presented here is therefore focused on the impact of these resources in an effort to target areas for improvement. In addition, although the PERFORM analysis allows grouping of resources to compare the effectiveness of different groups, only one overall group is considered here, i.e. the information resources (DR4). These resources are broken down into individual resources that were used in the Delft experiment:
• DR4.1 Marketing Research
• DR4.2 Details on use
• DR4.3 Trials evaluation
• DR4.4 Batavus design
• DR4.5 Drawing of bike
• DR4.6 Drawings of Buster
• DR4.7 HiStar product information
• DR4.8 Drawing of HiStar
• DR4.9 Comparable products
Assessment: in the assessment phase the potential (Expt) and actual degree of exploitation (Exal) is estimated for each resource. To simplify this example the potential degree of exploitation is assumed to be 100% in relation to all resources. The ease of further exploiting each resource within the design and development of the concept is also estimated as being High (H), Medium (M) or Low (L). Each individual resource is then assessed in terms of its perceived ability to assist in achieving the desired output, i.e. its impact, again expressed as H, M or L. Table 9.2 provides an example of an assessment using the resources and goals of the Delft example. The result of this phase of the assessment would then be input to the PERFORM system for further analysis.
Analysis: using the analysis approach, data representation technique and measures described within Section 7.4, results are generated automatically using the PERFORM system. Those measures used to relate the effectiveness of resource groups are not considered here as there is assumed to be just one group of resources due to the small number. Rather, a comparison of the effectiveness of the individual resources is considered, both in relation to individual goals and the overall goal (GO). Measures 2, 7, 8, 9, 10 and 14 as described in Section 7.4.3 are applied within the analysis.
7
Representation: the PERFORM system allows the data from the analysis results to be automatically represented in graphical format to ease interpretation. In addition the raw data may be numerically ordered to provide views that show increasing or decreasing magnitude. A sample of the results is presented in Figure 9.2 as follows:
Figure 9.2 (a) presents a bar graph comparing the effectiveness of all the
3
4 resources against individual goals. The data presented here is created using
5 Measures 9 and 10. This allows the identication of particular goals where
6 there is much scope for improvement in terms of further exploiting the
7 resources to meet these goals (e.g. DG2 and DG3). Also, it highlights goals
8 that are most and least impacted upon by the resources. This may identify
9 the need for additional resources in order to achieve the goals.
• Figure 9.2(b) presents a radar diagram representing the results of applying the Return on Investment (RoI) measure (Measure 14) to the data. This measure takes account of the ease of exploiting the resource, the impact on goals and the goal priorities to provide an indication of the increased effectiveness that may result from further exploitation. The results shown here are based on an analysis of the RoI of exploiting each resource in relation to the overall goal (GO). Clearly Resource 4.2 (details on the use of the product) provides the greatest return from increased use.
Table 9.2. Assessment of resource exploitation and impact

                                                        Goals and priorities (W)
Resources (R)                   Ease (E)  Ex(Pt)  Ex(Al)  Ease of Use (M)  Sporty (H)  Appeal (H)  Feasibility (M)  Price (L)
4.1 Marketing Research          M         100%    20%     H                H           L           L                H
4.2 Details on use              H         100%    35%     H                H           H           L                H
4.3 Trials evaluation           H         100%    50%     M                M           H           H                M
4.4 Batavus design              M         100%    90%     H                M           H           M                H
4.5 Drawing of bike             L         100%    10%     H                H           H           H                H
4.6 Drawings of Buster          M         100%    80%     M                L           M           H
4.7 HiStar product information  H         100%    34%     M                M           M           H
4.8 Drawing of HiStar           L         100%    35%     L                H           H           H                H
4.9 Comparable products         L         100%    65%     L                H           L

Ex(Pt) and Ex(Al) denote the potential and actual degree of exploitation.
Figure 9.2 Presentation of analysis results: (a) potential and actual effectiveness of the resources against each design goal (% of total ideal); (b) radar diagram of the Return on Investment measure for each resource; (c) scatter diagram of ease of exploitation against potential effectiveness; (d) potential and actual effectiveness of each resource against the overall goal, ordered by potential effectiveness.
• Figure 9.2(c) also incorporates the ease of exploitation, in a scatter diagram comparing the ease with the (potential) effectiveness of the different resources against the overall goal (Measure 2). Those resources plotted in the top right quadrant provide the most desirable balance between ease and effectiveness, i.e. they are both relatively easy to exploit and the resulting effectiveness is high.
• Figure 9.2(d) illustrates the comparison of potential and actual effectiveness for individual resources against the overall goal (Measures 7 and 8), ordered by values of potential effectiveness. It provides an indication of the resources having greatest overall effectiveness and where the greatest scope for improvement is possible.
Other views of the data may be represented here, depending on the needs of those involved in the analysis, and used to support discussion during review.
Review: the nature of the activities carried out during the review of the data will depend on the individuals involved in the analysis. That is, a number of individuals may wish to review and discuss the results together, whereas in other cases the performance analysis may be carried out by an individual and this type of review may be inappropriate. The case presented here is unlikely to be carried on into further, more detailed analysis (i.e. looking at specific features of the resources used), and the results presented in Figure 9.2 may be used to directly support decision making on performance improvement.


From the results it may be seen that the greatest scope for improvement through further exploitation exists in DR4.5 (i.e. making greater use of the information in the drawing of the bike), as evident in Figure 9.2(d). However, further exploiting DR4.2 (the details on use) is seen to be easier (see Figure 9.2(c)) than doing so with DR4.5, while still having a significant scope for improvement. It is also seen to provide a greater return on investment (see Figure 9.2(b)). The results provide support for decision making on improvements but do not automate the process; final decision making is carried out by the individual or team.
Having established target areas for performance improvement, e.g. the further exploitation of DR4.2 and DR4.5, this decision may be implemented in future activities. The formal model of performance and the assessment measures described earlier may be reapplied using the results of future activities to assess the degree of improvement realised. The assessment of performance and subsequent analysis to identify areas for improvement constitutes a cycle of continuous improvement (Figure 8.1).
9.5 Summary
The example given here is based on a small experimental scenario and, although somewhat simplistic in nature, it serves to illustrate how the concepts presented in Part 2 may be applied. The key points resulting from the case are:
• The modelling formalisms presented in Chapter 6 allow a model to be developed that describes the key elements of performance in the Delft case and allows efficiency and effectiveness to be distinguished and measures to be established.
• It is possible to identify and distinguish design and design management activities within the experiments, i.e. the actual design of the artefact and the time keeping, supporting the principles described in the DAM and PMM models. However, from the information available on the Delft experiments it is not possible to readily identify measures that distinguish the efficiency of the design activity from that of the design management activity. This may not be significant here due to the somewhat artificial environment resulting in comparatively little time spent on design management, as the management task is a relatively small one (i.e. mainly timekeeping). However, this could be a critical area in larger projects where design management requires considerable resource.
• The efficiency of the overall managed activity could be considered to be the relationship between the completeness of the information provided in the concept and the time taken (assumed to be 2 hours). Similarly, the overall effectiveness could be assessed as the degree to which the overall goal of producing a design concept within 2 hours was met, i.e. it will be a combination of the design and design management effectiveness. This illustrates the support that the E2 and DAM models provide for performance measurement.
• The PERFORM approach was applied in a scenario investigating the effectiveness of the information resources and resulted in the identification of particular resources for increased exploitation. This supports the concepts described within the Resource Impact model and the overall approach as defined in Chapter 8.
• Continuous improvement may be implemented and measured using a combination of the performance formalisms and the PERFORM approach, as illustrated in Figure 8.1, which describes how all the elements may be related within a methodology for performance modelling and analysis.
10 Analysis and Critique of Metrics
In Chapter 5, activities in design were considered to exist within two categories: design activities aimed at achieving design goals (DG) and design management activities focused on design activity goals (DAG). Efficiency and effectiveness were related to both design and design management activities to give four aspects of performance for consideration, i.e.:
1. Design Efficiency: relating the knowledge gained in the design activity to the knowledge used.
2. Design Management Efficiency: relating the knowledge gained in the design management activity to the knowledge used.
3. Design Effectiveness: comparing the output and goal knowledge of the design activity.
4. Design Management Effectiveness: comparing the output and goal knowledge of the design management activity.
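Read literally, each aspect pairs two quantities: gained versus used knowledge for the two efficiencies, and output versus goal knowledge for the two effectivenesses. In symbols (the notation below is introduced here only for compactness, and a simple ratio is just one reading of "relating" gained to used knowledge; the formal definitions remain those of the E2 and DAM models):

```latex
% K^{+} = knowledge gained, K^{u} = knowledge used, O = output knowledge,
% DG / DAG = design goals and design activity goals (notation assumed here).
\eta_{D}  \sim \frac{K^{+}_{D}}{K^{u}_{D}}, \qquad
\eta_{DM} \sim \frac{K^{+}_{DM}}{K^{u}_{DM}}, \qquad
E_{D}  = \mathrm{compare}(O_{D},\ DG), \qquad
E_{DM} = \mathrm{compare}(O_{DM},\ DAG).
```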
Metrics that reflect different aspects of performance in design have already been presented in the literature, but a definitive list does not exist that could be readily applied in design and give sufficient insight into performance. In many cases the aspect of performance represented by a metric is not clear. The E2 and DAM models are proposed as providing fundamental and comprehensive descriptions of performance. It should therefore be possible to describe established performance metrics in relation to the models. Further, the principles defined within the models, such as the definitions of efficiency and effectiveness, should not be violated by such metrics. The following presents the results of a review of metrics to establish whether they may be described in relation to the models presented here and whether they violate their principles.


10.1 Approach
Haffey [168] presents a review of metrics in design, listing a range of metrics that have been proposed in the literature relating to performance. These metrics were analysed in relation to the four views of efficiency and effectiveness listed above to establish which view(s) a metric described. The detailed results of this exercise are presented in Appendix A.
The true meaning behind the metrics quoted by different authors is often not readily apparent and a number of assumptions had to be made within the exercise. These assumptions are documented as part of the analysis results. In addition, the following points are intended to clarify some of the results described in the table presented in Appendix A:
• A metric such as customer satisfaction is considered to ascertain the degree to which the final product meets or exceeds the initial requirements of the customer. Within the scope of the models presented here this can be seen to (partially) measure design and/or design management effectiveness, where it is assumed that customer satisfaction is measured in relation to design (artefact) or design activity goals. For example, the metric may illustrate whether the product functionality satisfies the customer (design effectiveness) or how satisfied the customer is with the date on which the product was delivered (design management effectiveness). It is assumed here that the goals were accurately defined from the customer requirements and therefore meeting the goals also means meeting the requirements.
• Some of the metrics listed in the review are not directly attributable (NDA) to design development activity performance. For example, the values obtained using a metric such as market share achieved could be partly attributed to artefact performance and considered to reflect design effectiveness. However, the values could equally be attributed to the corporate image of the organisation promoting the product, the effectiveness of their supply chains, etc. The metric is therefore considered too vague to be directly attributed to design development performance.
• In some cases the exact meaning of the metric is unclear from the source and it is therefore not possible to categorise or analyse it fully in this review.
• A metric measures design development performance if it can be seen to relate to effectiveness (design or design management) and/or efficiency (design or design management). Where this is not the case an effort is made to define the metric in relation to the broader work presented here, e.g. it may be a measure of design activity output (DAO).
10.2 Results
The results of the analysis provided support for the concepts presented in this book, both in terms of the validity of the E2 and DAM models and increased insight into design development metrics.
10.2.1 Supporting E2 and DAM Models

From an analysis of the metrics a number of observations can be drawn:


• The simple statement of a metric often provides insufficient understanding of its application within design development performance, i.e. the metrics existing in the literature are open to various interpretations. The assumptions/comments provided against each metric, which are based on the understanding presented in Chapter 5 and the categorisation in relation to the E2 and DAM models, highlight that the concepts presented in this book provide increased clarity.
• The comprehensive nature of the models is illustrated by the fact that all types of efficiency and effectiveness were evident in the metrics, i.e. the four aspects described above.
• The E2 and DAM models illustrate that the efficiency metrics analysed in the review relate to design and design management efficiency and do not distinguish them.
• The models served to clarify the metrics. That is, many of the metrics could not be attributed directly to efficiency and/or effectiveness but could be explained in relation to the models, e.g. some were seen to measure output only while others described attributes of resources.
• The generic nature of the models was verified. For example, some metrics were related to manufacturing activities and not design development. However, it was possible to describe these metrics in relation to the E2 and DAM models where the activities are considered to be manufacturing and manufacturing management.
• All of the metrics within the scope of design development were explained in relation to the E2 and DAM models and therefore the principles described within the models were supported (i.e. not violated) through the analysis.
10.2.2 Insights into Design Development Metrics
In addition to providing support for the principles described within the E2 and DAM formalisms, the review provided an opportunity to draw on these formalisms in terms of providing increased understanding of design development metrics. That is, a number of observations could be made in the review, based on the new insight provided in this book:
1. Metrics currently existing in the research do not distinguish between design and design management efficiency. That is, a metric such as development cost reflects cost-based efficiency, where the expression is taken to mean the cost required to complete development and produce an artefact (output). The metric is not presented in sufficient detail to distinguish between the cost of design activities and that of design management activities and therefore may only be assumed to reflect the combined efficiency of these activities. This limits the insight into performance in design development, e.g. it does not allow the identification of inefficient design management activities being compensated by highly efficient design activities. Identifying areas for efficiency improvement is more difficult where such metrics are used.
2. Many of the metrics, by definition, only measure output but are assumed to reflect design effectiveness in most cases. For example, measuring reliability does not determine design effectiveness per se, as it is not clear if a high or low degree of reliability is desired. Therefore design effectiveness may only be determined when this value of output is compared to the goal. In the analysis carried out here metrics such as reliability are considered to represent design effectiveness, as it is assumed that reliability is established as a design goal. The exercise highlights the importance of clearly expressing design goals where possible.
3. A metric such as cost, if discussed in isolation, measures the amount of a particular resource (money) consumed. The cost of development is a more complex metric as it relates the cost to the achievement of a particular output, i.e. it is assumed that the use of the word development means an artefact is produced. The cost of development is therefore considered to be a measure of efficiency.
4. Measures such as adherence to schedule may provide values that are influenced by efficiency, but this is a direct measure of design activity effectiveness only. That is, a schedule could be prepared poorly, allowing an inefficient activity to adhere well (high effectiveness) or a very efficient activity to adhere poorly (low effectiveness).
5. Metrics relating to rework are difficult to classify without some detail on what rework means. Rework could be the result of a design error, i.e. a failure to meet design goals relating to manufacturing requirements, and therefore represent design effectiveness. However, rework could also be caused by late changes (to goals) introduced by the customer and therefore does not reflect the design effectiveness. Also, design is an iterative process and the iterations could be considered by some to represent rework, although this may be a necessary process to achieve the required output. That is, design may be carried out as a process of trial and error in some cases.
6. Many metrics do not reflect efficiency or effectiveness but an aspect that influences them. For example, the level of skill of designers is an attribute of the resources used and will have an influence on the performance of design development. If the nature of this influence can be determined then an assessment of the designers' skills may allow efficiency and/or effectiveness to be predicted in certain situations.
7. Many metrics are applicable to an activity/process which is outside the scope of design development, such as profit, break-even time, etc. It is difficult to determine the relationship to design without using empirical analysis and application of the metrics over a long period of time within a stable environment. Such a stable environment rarely exists due to the novel and non-repeatable nature of design development, and such a study is considered beyond the scope of the book. Therefore these measures do not measure design development performance per se, but the values obtained using them will be affected by design development performance.
8. The metrics that are described within the literature are treated in an isolated manner, i.e. the relationships between the metrics are not formalised. These relationships are a key element in achieving coherence in performance measurement, e.g. understanding the trade-off between the cost of development and the quality of the final output.
9. Complexity and novelty are commonly expressed influences on design development performance and are generally related to the design (artefact). Complexity can be considered to relate to the quantity of K+ that has been transformed within an activity. That is, a highly complex product will involve a greater quantity of knowledge during its design development. Novelty may be related to the nature of the knowledge that is transformed, i.e. a highly novel product will involve more new knowledge than a less novel one. These two dimensions require further research to establish their influence on design development performance.
10.3 Summary
The analysis and critique found that the metrics presented by Haffey were further explained using the E2 and DAM models and could be placed within one or more of the following categories:
• Design Effectiveness
• Design Management Effectiveness
• Design/Design Management Efficiency
• Influences on design development performance
• Performance of activities outside the scope of design development and not directly attributable to design development performance.
The exercise supports the view that the E2 and DAM models provide a fundamental understanding of performance that is not violated by current performance metrics. That is, of the metrics reviewed, none of them reflected an area of performance that is not described within the models and the related descriptions in Chapter 5.
In addition to providing an evaluation of the E2 and DAM models, the exercise allowed the use of these models to provide further analysis of metrics and gain improved insight into the area. It was found from the work that a substantial degree of ambiguity exists within the literature concerning the meaning of different metrics and that the models provided a basis for reducing this ambiguity. The exercise also highlighted areas where further work is necessary, e.g. to determine whether metrics distinguishing design and design management efficiency may be defined, to establish a means for measuring complexity and its influence, etc.
11 Industry Appraisal
Five key industrial practitioners and an academic researcher were invited to provide a critical appraisal of the key concepts, and the implementation of those concepts, presented within the book. Their areas of expertise are shown in Table 11.1.

Table 11.1. Appraiser Expertise

Key areas of expertise:
• Design and Product Development
• Design Development and Performance Management
• Design Development, Process Development/Improvement
• Mathematical Modelling, Operational Research, Risk Management
• Statistical Analysis and Operational Research

The results of the appraisals are summarised below in two sub-sections: the first focusing on the key questions or issues raised during the appraisal and documenting how these were addressed; the second highlighting the degree to which the concepts being presented reflected and supported industrial practice.

11.1 Issues Raised

The following outlines the issues raised by the appraisers in relation to industrial practice. These were addressed by the authors through clarification of the elements being discussed at that time or an appropriate response. Table 11.2 presents the results in a chronological order that describes the subject being discussed, the issue raised and the resolution of the issue at various points in the discussion.


Table 11.2. Discussion on Key Issues

1. Subject: E2 Model
   Question/issue raised: "... we would use requirements and acceptance criteria instead of goals/constraints but mean the same thing ... does quality relate to effectiveness? ... yes, process effectiveness is what people miss in their analysis ..."
   Response/resolution: Quality is generally seen to relate to design (artefact) effectiveness, i.e. how well are you achieving your design goals? However, effectiveness here also covers the effectiveness of the activity (and process).

2. Subject: E2 Model
   Question/issue raised: "... the goals must exist at various levels in Product Development and there will be different goals for the project manager than for the designer ..."
   Response/resolution: Different levels were discussed and the need for coherence through goal alignment was identified. The Design Activity Model distinguishes between Design and Design Activity Goals. Within certain levels of activity these goals may be addressed by the same person, but at a project level they may be distinguished between the designer (design goals) and the project manager (design activity goals).

3. Subject: E2 Model
   Question/issue raised: "... the difference between the input and output knowledge may be represented as the degree of clarification ..."
   Response/resolution: This is related to the definition of efficiency by Andreasen et al. identified within Section 3.2. The degree of clarification can be seen as a metric for measuring the knowledge gained (K+), as discussed in Section 5.5.1, and may therefore be used to support measurement of efficiency in a knowledge-based activity.

4. Subject: Design Activity Management (DAM) Model
   Question/issue raised: "... the Design Activity Knowledge output might be a new plan ..."
   Response/resolution: The Performance Measurement and Management (PMM) model, discussed later in the conversation, highlighted the situation where the control output of the Design Management activity may be to alter goals and/or resources. This is represented as a new plan for a subsequent activity. However, where the output is intended to be an explicit plan, the activity involved is a planning activity and not a design activity per se, i.e. the PMM model would contain Planning and Planning Management.

5. Subject: Design Activity Management (DAM) Model
   Question/issue raised: "... must fit the design development process within an overall model of the business, i.e. design development output must contribute to the business goals ..."
   Response/resolution: This was considered later in the discussion, where the Scope of Analysis was used to illustrate how the E2 and DAM models may be applied at any level and coherence was required to ensure overall goals were related to design development goals.

6. Subject: Need for Alignment and Coherence
   Question/issue raised: "... management of design is about navigating between product and process, i.e. it is about balance and trade-off. You must ensure that you finish and not just end up with a beautiful product when it's too late."
   Response/resolution: This balancing and trade-off was described within the PMM model, where both design (product) and design activity (process) goals were represented and the decisions taken within the management activity were based on the effectiveness of the design (e.g. the beauty of the product) and of the design activity (e.g. the adherence to schedule).

7. Subject: Design Activity Model
   Question/issue raised: "... the designer in his head uses soft goals, i.e. goals that are not specified but are interpreted and created implicitly ..."
   Response/resolution: The DAM model applies equally to soft goals, i.e. at the level of specific activities the design and design management activities may occur as part of an individual designer's mental activities. The designer, when checking the time, has moved from a design activity to a design management activity. One or both of these activities may be directed by soft goals.

8. Subject: Design Activity Model
   Question/issue raised: "... design management must specify achievable goals ..."
   Response/resolution: Although design management may endeavour to specify achievable goals, this is not always possible. In practice the designer uses the design activity resource knowledge (DAR) that is available. This knowledge may contain elements such as previous activity/resource efficiencies, design effectiveness, etc. in an implicit or explicit form. This may be used as a guide to what will be achievable in the future and help specify achievable goals. However, there may be a great deal of uncertainty depending on the similarity of future activities to those that have occurred.

9. Subject: Design Activity Model
   Question/issue raised: "... how does the model deal with external influences ..."
   Response/resolution: The model boundary (scope) may be defined for any specific situation. External influences such as changes in the marketplace may be translated, using higher level activities, to changes in the goals/constraints of design/development. Inputs may be affected by the performance of activities that provide input to design development. Similarly, changes to resources outside the boundary may be reflected in changes to design development resources.

10. Subject: Design Activity Model
    Question/issue raised: "... must remember that not just the artefact is an output but sometimes goals are designed or redesigned within PD ..."
    Response/resolution: Where the explicit aim of the activity is to design the PD goals, the activity is a particular type of activity, e.g. planning, creating a specification, etc., and will have a management activity (see point 4 above). However, having established a project plan or created a specification, the subsequent activities may alter the goals via the management activity. For example, the design specification may contain certain quality goals that are subsequently altered by a management activity due to trade-offs with activity goals.

11. Subject: Design Activity Model
    Question/issue raised: "... how does the model deal with risk and probability ..."
    Response/resolution: The degree of risk in achieving particular goals will be represented within the resource knowledge of any activity concerned with creating and/or altering goals. Further investigation is required to relate risk and probability within the model.

12. Subject: Relating Resources and Goal Achievement (effectiveness)
    Question/issue raised: "... sometimes you must train people in Integrated Product Development to make them more effective."
    Response/resolution: The people are represented as resources in this diagram. Within the models described, the knowledge represented in the resources influences the output and therefore the effectiveness. If the knowledge of the resources is increased, through training, the effectiveness is likely to be improved. The PERFORM approach (discussed later) shows how to identify a need for specific training to increase the overall effectiveness in relation to the PD goals.

13. Subject: Relating Resources and Goal Achievement (effectiveness)
    Question/issue raised: "... it would be very difficult to define the effectiveness of a CAD station in my organisation ..."
    Response/resolution: The appraiser agreed that this would be possible if asked to define relative effectiveness in comparing resources.


It was found during the appraisals that the issues raised were centred primarily on the understanding and insight into design development provided by the modelling formalisms, i.e. the E2, DAM and PMM models. The issues raised by the appraisers were mostly addressed by the work presented and in some cases confirmed the need for further investigation. Additional comments helped clarify some key concepts and terminology.
The following illustrate the support provided for the modelling formalisms:
• Their comprehensiveness in terms of describing design development is illustrated in a number of points, e.g. points 1, 2, 6, 9 and 11.
• The generic nature of the formalisms is illustrated in points 4, 5, 7 and 10.
• The relationship to existing research is evident in point 3.
• The support provided for coherence, goal specification and the management of resources is described in points 6, 8 and 12 respectively.
• Point 13 provides support for the relative assessment used within the PERFORM approach.
11.2 Industrial Practice
During the discussions with the industrial practitioners the appraisers offered comments on how the concepts related to their work, i.e. industrial practice. The comments in Table 11.3 have been taken from the conversations.
Table 11.3. Relation to Industrial Practice

1. Research Review
   "... In Company X¹ there has been considerable effort expended seeking a generic model/formalism for Product Development performance but without success, i.e. it was agreed that such a model did not exist ..."
   "... the company gave up on the use of empirical/statistical analysis as the process changes so much, e.g. a whole new process was introduced in using 3D CAD and therefore previous performance was not applicable and trends were not possible to establish ..."
   "... the proliferation of metrics is something which is prevalent within the industry, as the company uses many metrics without an understanding of why they are using them. Some metrics are a legacy of many years past and have no relationship to the type of activities currently carried out ..."

2. E2
   "... the model provides clarity on the distinction between efficiency and effectiveness and also illustrates how they relate ... the fundamental model says it all to me."
   "... I feel the validity [of the E2 Model] is proven and it shows cohesiveness in relation to an area where there is no model existing ..."

3. Coherence
   "... this is called a cause-effect relationship within the company and is the focus of much work aimed at determining the nature of this cause-effect. This maps directly to the definition of goal/sub-goal relationships and the need for alignment and coherence. The problem of sub-optimisation is a substantial one within the organisation ..."
   "... the organisation set up a separate team to develop and manage a goal breakdown (cause-effect) structure ..."
   "... without the links (relationships) management may pursue sub-optimisation to make their area of the business perform well but with a sub-optimal effect overall. This occurs frequently in the organisation ... alignment/coherence is achieved by gut feel in the industry ..."

4. DAM and PMM
   "... overall the models highlight and clarify the areas of design and design management and the need to understand its efficiency and effectiveness." The organisation is currently trying to distinguish design and design management throughout the design development process. However, this model distinguishes the efficiency and effectiveness of these activities. The organisation is currently not distinguishing and measuring the performance of the management activities very well although they account for 10-30% of all activities.
   The model may be used to distinguish iteration, rework and change as used in the organisation, i.e.:
   • Iteration represents the repeating of an activity within the scheduled time (i.e. within the originally specified DAG time). That is, when the design management effectiveness is assessed and it is found that the time goal has not been reached/exceeded, the activity is continued in an effort to improve design effectiveness.
   • Rework takes place where the output is released but manufacturing request a change. That is, in relation to the PMM model the decision is taken to stop based on an acceptable level of design and design management effectiveness. The requirement to repeat the activity is a result of a subsequent activity not accepting the output and altering the goal(s) for the design activity. Reasons for this may vary, e.g. the goals may have been incorrectly specified or the decision to release the output may have been taken without meeting the design goals.
   • Change is considered to take place when the goals are altered by the customer. It is therefore similar to rework, with the exception that the source of the change is an external one.
   Although the model is presented at a very fundamental level, i.e. the design and design management activities taking place as mental activities carried out by a single individual, there is an attempt within Company X to draw this distinction at a very high level.
   "... when mapped to a higher level the PMM model accurately describes the phase review process within the organisation where both types of goals are assessed and a decision is reached on whether to stop the activities, i.e. hand over to manufacturing, or continue and improve the design ..."
   "... the models provide a very useful understanding of how decisions are made within design and design management and relate well to the idea of iteration ..."

5. Implications
   "... agree that there is a proliferation of metrics which isn't helpful and what is presented provides a framework for developing a consistent array of metrics, ensuring the goodness and relevance of the metrics ..."

¹ Details of organisations are withheld due to confidentiality requirements.
The comments highlight the ability of the work to clarify and provide insight into design development from the viewpoint of performance measurement and management. The work is seen to relate closely to industry needs and findings, in particular:


• Point 1 provides evidence of the design problem identified in Part 1, i.e. that a formalism for design development performance is lacking in industry. Further, this point supports the findings of a review of the field in terms of the weakness of empirical research and the problems of establishing relevant metrics.
• Point 2 provides strong support for the clarity and validity of the E2 model in presenting a comprehensive model of performance.
• Point 3 suggests that the aim of the work relates very closely to the aims of those in industry and that the performance formalism further clarifies performance measurement requirements for the appraisers. The ability of the model to describe and clarify actual industrial practice is also evident and provides an improved understanding of the industrial processes and practices such as iteration, rework and change.
• Point 4 suggests that the work presented here provides a basis for establishing metrics in practice.
12 The PERFORM System

Section 7.8 identified the use of both experienced facilitator(s) and a software system to support the implementation of the PERFORM approach. The software provides a key tool to allow the analysis and presentation phases to be largely automated. Implementation of a software system such as this requires a logical definition of all elements and can therefore be seen as a type of logical test and validation of the concepts described within Chapter 7.
In addition to providing a means to evaluate the realisation of the PERFORM approach, it is suggested that such a computer-based tool provides necessary support for the implementation of the approach in all but the simplest cases. In many cases a large number of relationships may exist, being the product of the number of goals and the number of resources that can impact upon those goals. For example, with 6 goals and 25 individual resources there are 150 relationships to be analysed. Having defined the relationships based on the data representation methods presented in Section 7.4.2, further analysis of the data would require significant time using conventional calculation methods such as a pocket calculator. In addition, the ability to visualise trends in the results of the analysis would be difficult within such a complex matrix. The PERFORM system provides real-time analysis and presentation of the results in various formats and supports the implementation of what-if scenarios through altering the data. The system is summarised in the following section.
12.1 System Overview
The PERFORM system has been implemented using existing software applications. Much of the system is implemented within Microsoft Excel 97, utilising the Visual Basic object model [169] to build customised modules and using dynamic links to Microsoft PowerPoint 97 for presentation of results. A portable personal computer (laptop) and data projector comprise the hardware platform.
The system architecture is presented in Figure 12.1, showing the key elements of the system and the links between these elements. The following describes these elements in more detail and outlines how they are related within the system.
The system libraries contain the following elements:
• Measures: which are applied within the analyser module to establish values of resource effectiveness. These measures have been defined within Section 7.4.3.
• Data-Ordering Templates: used to support the ordering of input data and results in a manner that can be more readily interpreted.
• Graphical Templates: used within the representer module to generate different graphical representations (views) of the results (e.g. those in Figure 9.2).
• Presentation Templates: used to support the presenter module in developing an overall presentation of results.
Four program modules have been developed within the system:
• Relationship Modeller: is a computer-based implementation of the PERFORM matrix and contains all the elements described in Table 7.3.
Figure 12.1 PERFORM system architecture: the system libraries (measures, data-ordering, graphical and presentation templates) and the four program modules (relationship modeller, analyser, representer and presenter), linked to the user through the interface.


• Analyser: incorporates all the analysis, data filtering and ordering rules for the system and transforms the input data to output based on the measures and data-ordering templates.
• Representer: takes output data from the analyser and applies graphical templates to produce graphical output to aid interpretation.
• Presenter: applies a presentation template to all the graphical output to produce a logically structured presentation.
An interface, implemented via Microsoft Excel, provides a means by which the user may enter the various elements of the PERFORM matrix. The entry of values other than those defined in Section 7.4.2 for Ease, Degree of Exploitation, Priority and Impact is prevented by the relationship modeller through the implementation of data input rules. The interface also provides a visual presentation of the data to the user. Figure 12.2 illustrates the interface, including the representation provided for elements such as Ease, Degree of Exploitation, Priorities and Impact. A number of command buttons are provided that initiate the program modules. The Hide and UnHide buttons allow the user to make rows and columns that are not in use invisible/visible. That is, when asked to hide, the system will automatically select and hide those rows and columns without data. The PERFORM! button triggers the analyser and representer modules to produce graphical output results from the data within a matter of seconds.

Figure 12.2 PERFORM system interface.
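The following is a minimal sketch of the kind of data input rules and matrix bookkeeping described above. It is illustrative only and is not the Excel/Visual Basic implementation: the function names and the exact checks (H/M/L ratings, exploitation expressed as a percentage, actual not exceeding potential) are assumptions made here for the purpose of the example.

```python
# Illustrative sketch of relationship-modeller style data input rules; not the
# actual Excel/Visual Basic implementation described in the text.
VALID_RATING = {"H", "M", "L"}   # allowed values for Ease, Impact and goal Priority

def check_goal(priority):
    assert priority in VALID_RATING

def check_resource(ease, potential, actual, impacts):
    """impacts: one H/M/L rating (or None for a blank cell) per goal."""
    assert ease in VALID_RATING
    assert 0 <= actual <= potential <= 100   # degrees of exploitation (%); ordering assumed
    assert all(i is None or i in VALID_RATING for i in impacts)

def relationship_count(n_goals, n_resources):
    # every resource may impact every goal, hence goals x resources relationships
    return n_goals * n_resources

print(relationship_count(6, 25))   # -> 150, the example quoted earlier in this chapter
```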
The use of an earlier version of the system in an industrial environment was reported in Chapter 4. This demonstrated that the general principles incorporated within the system supported industrial practice. The system described above has evolved from the earlier version, based on the results of the industrial trial and the increased insight developed within Part 2 of this book, in particular:
• The system is based on the concept of variable resource exploitation as opposed to changes in the impact of resources. This required a change in the software and results in a system which, although more complex in structure, provides a more simplified approach for the user.
• The system is based on a more generic concept of the relationship between resources and goals, i.e. it is not limited to the use of the Design Coordination concept. The definition of resources and groupings is flexible within the system, i.e. the system has the capability to define up to 150 resources and 10 groupings. Having defined the particular resources and groupings, the system automatically eliminates any redundant cells from the analysis.
• The analysis and formatting of graphs within the system has been completely automated, with a single command required from the user. The original system required a number of commands to be executed. This has improved the flexibility of the system and the ability to support what-if scenarios and generate results within a matter of seconds.
The implementation of the PERFORM approach illustrates the realisation of the Resource Impact (RI) model and the key principles defined in Chapter 7 within a logical computer-based environment, i.e.:
• The relationship between degree of exploitation and effectiveness, as described in the RI model, is modelled within the system using the PERFORM matrix and the measures defined in Section 7.4.3.
• The PERFORM interface provides a means to define and illustrate the multiple impacts of resources on goals.
• All analysis measures defined in Section 7.4.3 are defined as formulae within the software to support automated analysis.
• The graphical representations discussed within Section 7.5 are implemented and their formatting automated.
The system therefore illustrates the realisation of the elements defined within Chapters 7 and 8 in a dynamic model relating ease, exploitation, goals/priorities and impact. Manipulation of the model is provided via the system interface, allowing sensitivity analysis and the generation of what-if scenarios. Testing of the system with sample data demonstrated the ability of the system to maintain and support such a model.
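As a small illustration of the kind of what-if manipulation this supports, the following sketch copies the assessment data, raises the actual degree of exploitation of one resource, and recomputes an overall effectiveness value so that the two scenarios can be compared. The two resource rows are taken from Table 9.2; the numeric coding and the effectiveness formula are the same illustrative assumptions used in the Chapter 9 sketch, not the system's own calculations.

```python
from copy import deepcopy

HML = {"H": 3, "M": 2, "L": 1}
# Goal priorities and two resource rows from Table 9.2 (coding/formula assumed, as before).
priorities = {"Ease of Use": "M", "Sporty": "H", "Appeal": "H", "Feasibility": "M", "Price": "L"}
data = {
    "DR4.2 Details on use": (0.35, {"Ease of Use": "H", "Sporty": "H", "Appeal": "H",
                                    "Feasibility": "L", "Price": "H"}),
    "DR4.4 Batavus design": (0.90, {"Ease of Use": "H", "Sporty": "M", "Appeal": "H",
                                    "Feasibility": "M", "Price": "H"}),
}

def overall_effectiveness(d):
    """Sum of each resource's exploitation-weighted, priority-weighted impacts."""
    return sum(ex * sum(HML[priorities[g]] * HML[i] for g, i in impacts.items())
               for ex, impacts in d.values())

# What-if: raise the exploitation of DR4.2 from 35% to 80% and compare the scenarios.
scenario = deepcopy(data)
scenario["DR4.2 Details on use"] = (0.80, data["DR4.2 Details on use"][1])
print(f"baseline={overall_effectiveness(data):.1f}  what-if={overall_effectiveness(scenario):.1f}")
```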
12.2 Application of PERFORM in Industry
The previous section described the realisation of the PERFORM approach within a computer-based system. The basic approach is that initially presented in Chapter 4 and formalised in Chapter 7. An additional industrial case study was carried out to evaluate the new approach, which overcomes some of the weaknesses described in Section 4.2.4. The following section presents an overview and the results of the case study, carried out in a design development business unit of a global organisation.
12.2.1 Approach
The case study was carried out at Company B, a company engaged in the design of engineering products of widely varying scale and complexity for both internal and external customers. The participants were first provided with a presentation of the overall concepts and approach. The analysis data was then created through a facilitated workshop involving the Business Unit Manager and the Quality/Information Services Manager. The overall approach used is that defined in Chapter 7 and therefore only the results are reported here.
12.2.1.1 Specification
The scope was defined as the activities of a specific business unit in the organisation and focused on a particular product type that constituted the majority of the unit's work. Approximately 70 people, many with specialist technical product knowledge, are employed full-time within the unit, although particular projects may involve up to 200 staff on a temporary basis. The goals and their priorities are presented in Table 12.1.
The organisation carries out specialist design work and the resources specified for the analysis reflect this area of work. The list of resources is presented in Table 12.2. Similar resource names are used within the areas of Expertise (people) and Technical Tools; these are distinguished using P and T respectively in the analysis.
Table 12.1. Goals and priorities: Company B

Goal  Topic Area        Description                                                 Priority (L/M/H)
G1    Growth            Growing the Business Year on Year                           L
G2    CN (People)       Serve the Customer Need in terms of People                  M
G3    CN (Studies)      Serve the Customer Need in terms of Studies                 M
G4    External Clients  Meet the Needs of External Clients                          M
G5    Inc. Expertise    Incubate (grow) the Expertise within the Department         H
G6    Customer          Achieve Customer Excellence by Closer Relationships         M
G7    Performance       Meet/Exceed Performance Targets (Budgets)                   H
G8    Innovation        Demonstrate Innovative Capability                           L
G9    Partnering        Form Relationships (formal/informal) with other Companies   L
Table 12.2. Resource groups and resources: Company B

Communications Processes: Response; Team Talks; Learning Forums; Roadshows; Corporate Brochures
Expertise: Naval Arch. P; Signatures P; Dynamics/Shock P; Marine Eng. P; CAD P; Project Mgmt/Risk P; Support P
Management Tools: SAP; CIP; CSRs; LCycle Mgmt; Quality System; Knowledge Mgmt
Technical Tools: Naval Arch. T; Signatures T; Dynamics/Shock T; Marine Eng. T; CAD T; Project Mgmt/Risk T
IT Systems: Hardware; Applications; Infrastructure


12.2.1.2 Assessment
The participants carried out the assessment using templates based on the PERFORM matrix to define:
• Ease of exploitation
• Potential and Actual Degree of Exploitation
• Impact of Resources on Goals.
These results were transferred from the paper-based templates to the PERFORM system for analysis. The data is shown in Figure 12.3.
Figure 12.3 Analysis data (system view): Company B.


12.2.1.3 Analysis/Presentation
The system described in Section 12.1 provided an automated analysis of the results with direct transformation to graphical output. A sample of the results is provided in Figure 12.4, showing some of the key graphs. The graphs presented here illustrate:
(a) The RoI for Management Tools: the results indicate that investment in CIP would provide a greater return than an area such as Knowledge Management, based on the overall goal of the business unit.
(b) The Ease of Implementation versus Effectiveness for Expertise: further exploitation of the expertise in Naval Architecture and Marine Engineering would be relatively easy, yet provide a significant impact on effectiveness in relation to the overall goal.
(c) The comparison of Potential and Actual Effectiveness of all resources against each of the goals: the results suggest that the resources being used can make the greatest impact on Incubating Expertise and Meeting/Exceeding Performance Targets and that there is significant scope for improvement in relation to these goals. That is, the difference between actual and potential effectiveness is substantial.
(d) The comparison of Potential and Actual Effectiveness of each individual resource against the overall goal: this graph gives overall results for specific resources and highlights areas where significant improvements may be made. Also, the ease of further exploiting a resource is included in the presentation. The greatest scope for improvement is seen to be in further exploiting the Information Technology (IT) infrastructure, which is moderately easy. Further exploiting SAP would not provide as great an improvement but would be easier to do.
Figure 12.4 Analysis results: Company B. (a) Return on Investment for Management Tools; (b) ease of exploitation versus potential effectiveness for Expertise; (c) potential and actual effectiveness of all resources against each goal; (d) potential and actual effectiveness of individual resources against the overall goal.
is moderately easy. Further exploiting SAP would not provide as great an 3
improvement but would be easier to do. 4
5
6
12.2.1.4 Review 7
8
The results generated signicant discussion and the participants requested 9
some what-if scenarios to be carried out. These scenarios resulted in a number 1011
of small changes to the input data and nal agreement was reached on the 1
results. Although some target areas were evident and agreed among the par- 2
ticipants they were not conrmed as target areas by them. It was agreed that 3
the results would be presented to other key members of staff (management 4
team) for consideration of improvement actions. 5
12.2.2 Industrial Results
The Resource Impact model defined within Chapter 6 provides the underlying principles for the approach defined in Chapter 7 and implemented here in an industrial environment. The earlier version of this approach, described in Chapter 4, has been replaced with this new approach, which is based on the further understanding of key principles such as resource exploitation and effectiveness. That is, the relationship between a resource and a goal, i.e. the impact, is assumed to be constant (high, medium or low), while the degree of exploitation of that resource is assumed to be variable and is defined as the actual and potential degree of exploitation. The difference in resource effectiveness achieved with actual and potential exploitation is considered to be the scope for improvement. This new approach to the analysis of resource impact is therefore evaluated in the case study.
The initial presentation of the concepts, including the modelling formalism, provided a new and comprehensive understanding of performance for the participants, which they were able to relate to their practices. The participants provided support for the principles presented and no aspect of the modelling formalism was considered inappropriate or invalid.
The concepts introduced within the analysis approach were implemented and results were obtained as described above. The industrial practitioners understood and were able to apply the concepts presented, such as ease/degree of exploitation, impact of resources, etc. The approach successfully established the effectiveness of the resources being used and identified the areas where significant improvements could be obtained. Results were provided in real time, providing a basis for significant discussion. The following were the key points of feedback from the participants:
12.2.2.1 Strengths
8
The approach provided a comprehensive framework in which the partici- 9
pants could analyse their product development activities in a structured 50
manner. 51111

166
The PERFORM System

1 The process of creating the initial data, e.g. specifying/prioritising goals,


2 establishing resources, etc. reveals a great deal about what we do and
3 how we do it. The capture of this data was considered to be of great
4 importance in understanding the business unit.
5 The results provide key information to support decision making for
6 performance improvement and this approach could be widely used within
7 the department as a decision support tool.
8 Although the data created during the analysis is subjective this subjectiv-
9 ity is greatly decreased (in comparison to current practice) through using
1011 such a structured approach.
1 The presentation of the data and graphical results provides the rationale
2 behind decisions on performance improvement. Although many of the
3 results conrmed existing hunches this rationale is difcult to dispute
4 and therefore provides important support in achieving buy-in to perfor-
5 mance improvements.
12.2.2.2 Weaknesses
• The analysis would have benefited from the participation of more members of the departmental management team, who would bring different perspectives. As stated above, the intention was to present the results to other members of staff. A further check on the results may be carried out at this point and the PERFORM system may be used to generate some further what-if scenarios based on wider participation.
• Many graphs of results were presented to the participants, so there was a lot of information to absorb in a relatively short time. It would have been more beneficial if more time could have been spent absorbing these results and understanding exactly what they meant. Further what-if scenarios would then have been useful.
• In establishing the impact of resources a significant number of relationships (243) were defined. It is possible that there were inconsistencies across the analysis, created by fatigue, and a re-check of all data following the analysis would be useful.
When asked if the overall approach could achieve performance improvement, it was confirmed that the approach provided substantial insight into the actual and potential performance of the department and, although it could not improve performance per se, it provided an important step in achieving this improvement. That is, the results provided the basis upon which to select target areas for performance improvement.
12.3 Validation of Results
Five different applications have been presented in the preceding chapters of Part 3. Each relates to different and sometimes overlapping elements of the methodology defined in Part 2. The following summarises the outcome of each application exercise:
• The worked example, based on the Delft experiment, demonstrates that the overall methodology presented in Chapter 8 may be applied in the

Table 12.3. Mapping between requirements, solution and evaluation approaches

Requirements (columns): Performance Formalism; Design & Management; Coherence; Improvement

Elements of Methodology
• Performance Formalism: E2 provides a formalism of a knowledge processing activity and defines its performance as efficiency and effectiveness, related within the formalism.
• Design & Management: DAM distinguishes design and design management activities and their relative elements (e.g. goals). PMM relates these activities and describes the process of measuring and managing performance.
• Coherence: PMM illustrates alignment across goals in managing performance. The use of the performance formalism within a GBS supports alignment within goals.
• Improvement: The PERFORM approach, based on the performance formalism and the RI model, and implemented within the PERFORM system, provides a means to identify and quantify the influence of resources and to establish the best means for improvement.

Evaluation approaches

Worked Example
• Performance Formalism: The formalism provides the basis for defining efficiency and effectiveness measures. The experimental data is used to determine performance measures based on the formalism.
• Design & Management: The DAM model is used to further distinguish the various inputs, outputs, etc., and determine the elements of design and design management effectiveness measures. The scenario illustrates how the process of performance measurement and management is modelled in the PMM model, i.e. the management of time while carrying out the design activities.
• Coherence: The example illustrates how design and design activity goals may be modelled, i.e. the design goals of form, price and the design activity goal of time. The trade-off in effectiveness for each of those goals demonstrates alignment across design and design activity goals.
• Improvement: The PERFORM approach is applied using elements described within the experiment and additional test data. The use of the approach to identify target areas for improvement is clearly illustrated.

Metrics Review/Analysis
• Performance Formalism: The distinction between efficiency and effectiveness remained valid in the review of metrics, i.e. the metrics that measured performance could be distinguished in these two areas.
• Design & Management: The distinction between design/design activity effectiveness is reflected in the metrics review. Distinguishing metrics for design and design activity efficiency measurement was not possible, although the principle was not violated. Further insight into metrics is gained through the performance formalism.

Industrial Appraisal
• Performance Formalism: The questions/feedback provided an evaluation of the comprehensiveness of the E2 formalism. The E2 principles were supported. The formalism was also seen to reflect and clarify industrial practice.
• Design & Management: The DAM and PMM models were shown to be comprehensive and generic in their ability to formalise design performance. The PMM model in particular was seen to closely reflect and clearly describe industrial practice.
• Coherence: The support provided in the PMM model for coherence was described in relation to the problem of trading off the beauty of a product with the launch date. This demonstrates alignment across goals.
• Improvement: The principles underlying the relationship between the exploitation of resources and effectiveness were described and supported by the appraisers. The use of relative assessment was considered feasible in industry.

System Development
• Coherence: Both design and design activity goals and their priorities are modelled within the system. The system
• Improvement: The PERFORM system provides a logical realisation of the PERFORM approach and illustrates how the relationships may be modelled and
provides a dynamic model of analysed to identify target areas for
the trade-off through improvement. This realisation
calculating the weighted provides a degree of validation of
effectiveness in relation to the the approach.
overall goal (GO).
Industrial The E2 model was presented as a The industrial participants supported Both design and design activity The implementation of the
Implementation basis for implementing the the principles of the DAM and PMM goals were specied in the PERFORM approach provided
PERFORM approach. It was seen models although more time would be analysis data although many signicant insight into the design
to clarify performance and required for participants to fully were combined goals. All goals development process of the
comprehensively describe it. understand the implication of PMM. were seen to contribute to a organisation and successfully
higher level organisational created results identifying areas
goal. for improvement.

169
Application and Key Features

modelling and analysis of performance and provides a basis for continu- 1


ous performance improvement. The methodology allows the performance 2
in design to be distinguished from that of design management in terms of 3
effectiveness but would require additional data to draw this distinction for 4
efciency. 5
The list of metrics presented by Haffey [168] were shown not to violate the 6
E2 and DAM models and the models provided insights into these metrics. 7
The appraisals provided an evaluation of all elements of the overall method- 8
ology and it was shown that points raised and addressed provided support 9
for different elements of the methodology. In particular the generic and 1011
comprehensive nature of the E2, DAM and PMM models was evident. 1
Further, the relationship between these models and research/industrial 2
practice was assessed, showing that such practice does not violate any of 3
the principles dened within the models. 4
5
The implementation of the PERFORM system provided a logical test and
6
realisation of the principles dened in Chapters 6 and 7. The system forms 7
a dynamic computer-based model of the elements and relationships
8
dened within the PERFORM approach. 91
The industrial implementation of the PERFORM approach provided 2011
further validation of the overall approach and the system used to support 1
it. The case study conrmed the ability of the approach to model the inu- 2
ence of resources on effectiveness and identify areas for improvement. The 3
strengths and weaknesses of the approach were revealed. 4
Chapter 3 highlighted a list of requirements in the form of areas to be 5
addressed in developing the methodology. Table 12.3 provides a summary 6
of how the application approaches relate to the methodology and initial 7
requirements. 8

Notes

1. The data represented here is arbitrary data used to illustrate the functionality of the system.
2. The full results of the analysis are confidential to the company.
13
Methodology Review
The overall methodology presented in this book includes a formalism for performance modelling, presented in Chapter 5, which provides a basis for analysing performance. An area of performance analysis, i.e. the relationship between the resources used and the effectiveness obtained, is developed within Chapter 6, resulting in a model of resource impact. The PERFORM approach is presented within Chapter 7 as a means to analyse resource impact within an industrial setting. Chapters 9 to 12 describe the application of all the elements of the methodology using a number of approaches.
The methodology developed is distinctive in two key areas:

• The formalisation of design development performance using the E2, Design Activity Management (DAM) and Performance Measurement and Management (PMM) models.
• The analysis of the influence of resources based on the principles defined within the Resource Impact (RI) model and the PERFORM approach.

The following presents a discussion of the strengths and weaknesses of the work and avenues worth further investigation.

13.1 Design Development Performance Formalism

The models presented within Chapter 5 are aimed at providing a clear definition of performance in design development. The modelling formalism is focused on the type of activities existing in design development and, in particular, on distinguishing and relating design and its management. The work presents a novel insight into performance in design development.

13.1.1 Generality

The E2, DAM and PMM models adopt a knowledge processing viewpoint of the activities, which are characterised by their inputs, outputs, goals and resources. This viewpoint is based on a widely used and relatively simple approach to modelling activities, processes or functions, i.e. IDEF0, to which many in industry can relate. It may be possible to define various other activities in terms of the knowledge processed, suggesting that the formalism could be used to model the performance of activities other than those in design development. It has been shown in the review of existing metrics that the models may apply to manufacturing activities in terms of distinguishing manufacturing from its management. Similarly, it may be possible to model the performance of the overall business from the knowledge processing perspective, i.e. where the overall goal may be the profit of the business, the resources are those of the complete business, etc.
The general applicability of the formalism means that it does not provide a model of actual performance in a particular situation, only an abstraction of that model. That is, to model performance within a particular department, project, etc. requires the boundary to be defined and the inputs, outputs, goals and resources to be established. The formalism has been used successfully to describe performance in design development, and its application beyond this has not been tested.
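
As a minimal illustration of this knowledge-processing viewpoint (not the authors' implementation; the class and all example data are assumptions), an activity can be represented in an IDEF0-like structure by its inputs, outputs, goals and resources:

```python
# A minimal sketch of the IDEF0-like activity view described above.
# Not the authors' implementation; names and example data are assumptions.
from dataclasses import dataclass, field


@dataclass
class Activity:
    name: str
    inputs: list[str] = field(default_factory=list)     # knowledge transformed
    outputs: list[str] = field(default_factory=list)    # knowledge produced
    goals: list[str] = field(default_factory=list)      # controls/constraints directing the activity
    resources: list[str] = field(default_factory=list)  # mechanisms: people, tools, methods


# Defining the boundary of a particular department or project then amounts to
# instantiating the activity with its own inputs, outputs, goals and resources.
embodiment = Activity(
    name="embodiment design",
    inputs=["concept sketches", "requirements list"],
    outputs=["layout drawings"],
    goals=["meet target cost", "release layouts by week 40"],
    resources=["design team", "CAD system"],
)
print(embodiment.name, embodiment.goals)
```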

13.1.2 Comprehensiveness

The E2 model provides a clear and uncomplicated representation of design development performance utilising the IDEF0 approach. Performance is described comprehensively within the concepts of efficiency and effectiveness, and the E2 model formalises each of these elements and their relationship. The measurement of efficiency, in accordance with the E2 model, requires an analysis of the amount of knowledge gained in a design or design management activity. An approach or method for analysing the quantity of knowledge has not been explored in this book, although the ability to measure knowledge would enhance the understanding of efficiency. Although efficiency and effectiveness have been related to performance in existing research, this formalism provides a novel model of their nature and relationship that is not evident in other work. It therefore provides the basis for further work in areas such as efficiency measurement.
The DAM and PMM models provide new insight into design and its management through distinguishing these activities and relating them within a process model that highlights decision making in relation to performance at the activity level. This serves to distinguish the performance of the design (artefact) from the activities involved in its creation. The effectiveness of design/design management is seen to provide a key element in support of decision making within the PMM process model. The PMM model captures the essence of design management, i.e. the continual trade-off between the achievement of design and design management goals. While design and design management have been the subject of research, their performance relationship at the activity level has not been formalised as described here.
The role of efficiency in supporting control decisions within the PMM model is not fully clear. From the industrial appraisal it was shown that efficiency may be used to support planning decisions but may not support the control decisions in the PMM model in the same way as effectiveness. Further, the worked example and review of metrics presented in Chapter 9 highlight the difficulty in distinguishing the efficiency of design from that of design management using the information available. That is, due to the continued alternation (often mental) between design and design management it may be difficult, in practice, to distinguish the resources used in design from those used in design management.
Resources are modelled as sources of knowledge within the formalism presented in this work. The behaviour of these resources is not investigated as part of the work. That is, areas such as motivation, leadership, culture, etc., in relation to the use of human resources in design are not investigated, although these have an impact on performance.
The work provides an understanding of how performance measurement, in particular the measurement of effectiveness, supports the design development process as described in the PMM model. It does not directly prescribe metrics with which to measure performance. It is considered here to be impractical to provide a set of metrics that may be used within any situation, as each situation is to some degree unique in design development, and it is more appropriate to describe how those metrics may be determined. The work presents a basis for deriving metrics through developing an instance of the models in relation to a particular area of analysis. Having developed such models, metrics may be defined to analyse the relationship between outputs and goals (effectiveness) and between knowledge gained and resources used (efficiency).
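
Purely as a hedged illustration (the symbols and functional forms below are assumptions, not the book's own notation), such metrics might be written as:

```latex
% Sketch with assumed notation: \Gamma for effectiveness, \eta for efficiency,
% and K^{+} for the quantity of knowledge gained in the activity.
\[
  \Gamma = f(\text{outputs},\ \text{goals}), \qquad
  \eta = \frac{K^{+}}{\text{resources used}}
\]
```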

13.1.3 Supporting Coherence

Achieving coherence requires alignment within and across goals, as discussed in Part 1. The DAM and PMM models distinguish and relate Design and Design Activity goals. The measurement and management of performance discussed in Chapter 5 provides a description of the decision making involved in the trade-off between design and design activity goals in an effort to ensure alignment. This reveals the nature of design management itself and describes how a degree of coherence across design and design activity goals may be achieved.
Achieving coherence within design or design activity goals requires the definition and maintenance of relationships between the goals. This may be realised within a goal breakdown structure, i.e. the definition of some form of network of relationships for a given situation. Therefore, the formalism itself does not achieve coherence within goals per se, only when it is applied within a situation where goal relationships are maintained. As identified above, the modelling formalism may be applied in relation to the complete business, but this raises the issue of alignment and coherence between design development and business performance.

13.2 Design Development Performance Analysis

The second aspect of the methodology involved the development of key principles underlying the relationship between resources and the effectiveness achieved in relation to design or design activity goals, and their realisation within the PERFORM approach.

13.2.1 Generality

The analysis of performance as presented in the PERFORM approach is based on the principles defined in the Resource Impact model, relating the degree of exploitation of a resource to the effectiveness obtained in using it. The concepts of ease and degree of exploitation and their relationship to impact are defined specifically to support the analysis of resource effectiveness.
Many of the principles of the PERFORM approach are indicative of more general applications in the area of decision support. That is, the use of the approach can support the process of evaluation and decision making among a set of alternatives (resources) in relation to certain criteria (goals) [15]. Therefore it is feasible to suggest that the PERFORM approach could be adapted for use in any situation where alternatives and criteria may be specified and relationships defined between them.

13.2.2 Comprehensiveness

The analysis of performance is focused on a particular element of performance, i.e. effectiveness, and on a specific area of influence, i.e. that of resources. This indicates the scope of the work and identifies that the approach developed does not comprehensively address performance in the same way as that provided by the modelling formalism. However, analysis of effectiveness provides a key element in controlling design development, and the influence of resources was highlighted as being of considerable interest within Chapter 6.
Relationships between resources and between goals are not explicitly modelled within the current PERFORM approach and system. That is, the use of a particular resource may impact positively or negatively on the use of another resource. For example, the use of DFX and XTD methods may be viewed as complementary. Further, the achievement of particular goals such as reducing rework may impact on the achievement of goals such as cost of development. The ability to model these relationships would enhance the approach.

13.2.3 Supporting Coherence

The PERFORM approach may be applied at different levels of detail, and the relationship between each level of analysis is based on the use of multiple matrices as used in Enhanced Quality Function Deployment and discussed in Section 7.7. This provides alignment within the goals established at each level of the analysis.
Within an application of the approach, both design and design activity goals may be defined in the specification stage. Through analysing resource effectiveness against the overall goal (GO), a degree of alignment is achieved across the design and design activity goals as their respective priorities are taken into account.
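
A minimal numerical sketch of such a weighted assessment is given below; it is not the PERFORM implementation itself, and the goal names, priorities and impact judgements are illustrative assumptions.

```python
# Sketch: weighted effectiveness of one resource against the overall goal (GO),
# combining goal priorities with judged resource/goal impact values.
# All names and figures are illustrative assumptions, not case-study data.
goal_priorities = {              # design and design activity goals, priorities summing to 1.0
    "reduce lead time": 0.5,
    "improve product quality": 0.3,
    "reduce development cost": 0.2,
}
impact = {                       # judged impact of one resource (e.g. a CAD system) on each goal, 0..1
    "reduce lead time": 0.8,
    "improve product quality": 0.4,
    "reduce development cost": 0.6,
}

weighted_effectiveness = sum(goal_priorities[g] * impact[g] for g in goal_priorities)
print(f"Weighted effectiveness against GO: {weighted_effectiveness:.2f}")  # 0.64
```

Repeating such a calculation for each resource would give the relative ranking from which target areas for improvement might be selected.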

13.2.4 Sensitivity

The Resource Impact model illustrates that the relationship between the exploitation of a resource and the resulting effectiveness obtained may be a complex one, although it is assumed to be linear as a basis for the PERFORM approach. Such an assumption may introduce a degree of error within the analysis and the identification of the scope for improvement, as discussed in Section 6.3.3. However, the ability to conduct what-if scenarios, allowing a degree of sensitivity analysis, mitigates the effect of this error. The definition of actual impact profiles presents a potential area of research building on the work presented here.

13.2.5 Subjectivity

The collection of objective data on design performance has limited benefits and is confined to those areas of design that are readily measurable. The impact of resources on goals is not such a readily measurable area, and therefore the PERFORM approach relies on the knowledge of individuals involved in design to act as measuring instruments. It is suggested that the impact may be estimated and captured using the judgement of individuals with appropriate knowledge:

    Although one cannot capture human preferences and approximate reasoning with the precision of algebraic quantities, there is a positive enhancement and amplification of judgement when one is able to formulate and use an appropriate prudential algebra [128].

Such a prudential algebra supports the interaction between human judgement and a formal model and reflects the implementation of PERFORM, i.e. the interaction of individuals in the analysis team with the model defined within the PERFORM matrix. The numerical representation of judgements within PERFORM and the use of a multidisciplinary team support a combination of individual judgement tempered with a level of objectivity [127]. Subjectivity is minimised through the use of group consensus, as opposed to individual input, and the incorporation of multiple disciplines within the analysis team membership [16, 23, 170]. The use of the approach in a complex analysis with a single individual may therefore increase the risk of error unless that individual has a comprehensive understanding of the resource/goal relationships being analysed.
Although subjective judgements are used to establish the data for the PERFORM matrix, actual relationships, if determinable, could be modelled within the system to remove this area of subjectivity.

13.2.6 General Characteristics

The specific principles identified in relation to the analysis of impact developed in Chapter 6 have provided the basis for the PERFORM approach. Some desirable characteristics of systems to support performance measurement were generalised in Chapter 4. The PERFORM approach is briefly reviewed against these here:

1. Measurement should relate to the Strategy and Specific Objectives of the Organisation. The objectives of the organisation, in relation to the scope of analysis, are explicitly defined within the approach and represented here as goals. The relationship to strategy is not defined as part of the approach. However, the overall business strategy should be manifest in the high-level goals of the organisation. Alignment between these goals and the product development goals ensures the analysis is directly aligned with strategy.
2. Simplicity and Ease of Use. The use of many proven QFD tools and techniques, such as matrices, affinity diagrams, etc., combined with facilitation and computer support to provide graphical outputs, promotes simplicity. A complex set of relationships may be modelled within the PERFORM matrix, which may be broken down into sections to support ease of analysis, i.e. problem decomposition.
3. Allow Comparison and Focus on Improvement. One of the key outputs from the PERFORM approach is the identification of key target areas where performance improvement efforts should be focused. PERFORM therefore acts as a predictive tool in terms of estimating future effectiveness. Further use of the approach, having addressed the target areas, supports continuous improvement.
4. Users' Influence on Criteria. The users of PERFORM have complete control in defining the goals and prioritising them. Therefore, they establish the criteria by which resource effectiveness may be assessed before carrying out the assessment. The results generated from the approach are owned by the analysis team members and not defined by external parties.
5. Variety of Measures/Types of Measures. The PERFORM approach focuses on measuring effectiveness. Further measures and analysis of efficiency are required for a full understanding of performance. The models defined within Chapter 5 provide a basis for this analysis, although it requires further investigation as discussed above.
6. Rapid Feedback. The use of automated analysis, data transformation and graphical output via a data projector provides almost instantaneous feedback of results during the analysis process. This was identified as a particular strength of the approach in the evaluation.

13.3 Further Investigation

Through carrying out the work and writing this book, a number of additional insights into the area of design development performance were obtained. These are briefly presented here as areas for further investigation.

13.3.1 Complexity and Novelty

An analysis of metrics and their relation to the E2 and DAM models identified novelty and complexity as key areas of measurement. The measurement of complexity and novelty does not reflect performance per se, i.e. it does not represent efficiency or effectiveness. Rather, it highlights influences on performance. It is reasonable to assume that a highly complex and/or novel design project will require more resources to complete than one that is more simple and familiar. The degree of novelty in a design activity may be reflected in the nature of the knowledge processed during that activity. That is, the knowledge required to carry out design development may not be present within existing resources, and new knowledge must be obtained. This can be achieved through training existing designers, the introduction of new methods, etc. The complexity of the design development activities may be related to the quantity of knowledge processed (i.e. K+) during the activity. That is, a highly complex project may require more knowledge to be processed in comparison to a less complex one. The application of metrics within design development, and subsequent comparison or benchmarking with other projects, organisations, etc., should take account of novelty and complexity as key influencing factors.
Two key areas of research are evident: firstly, to establish a means for measuring complexity and novelty in design development and, secondly, to determine the relationship between these phenomena and aspects of design development performance. Achieving these tasks would greatly enhance our ability to plan in design development and more accurately compare performance across projects/organisations, i.e. benchmarking.

13.3.2 Resource Impact Profiles

It has been identified in Section 6.3 that resource impact profiles are unlikely to be linear. The work presented here provides a framework for the further investigation of the nature of these graphs. The creation of such graphs may require detailed analysis of resource use over a considerable period of time to determine the true nature of the relationship.
In certain cases the impact profile may be such that the rate of increase in effectiveness decreases from a certain point in the exploitation of the resource (Figure 13.1). For example, the introduction and initial exploitation of a CAD system may have a significant impact on effectiveness in relation to a time goal. However, beyond a certain degree of exploitation of the CAD system functionality (A-B), additional exploitation may not provide corresponding increases in effectiveness of the same magnitude (B-C). This may be a reflection of the degree of ease in exploiting the resource, i.e. the amount of resources required to further exploit the CAD system from point B, in terms of training, customisation, etc., may be much greater than in the early stages.
Further research is required to determine the nature of impact profiles for different resource/goal combinations. This may involve the analysis of performance over a considerable period of time to establish sufficient data, although the weaknesses of such empirical analysis in design development have already been highlighted and should be addressed. It may be possible through such research to determine the actual relationships between resources and the effectiveness that may be achieved in using them. Such relationships would enhance the model developed within the PERFORM matrix as discussed above.

[Figure 13.1 Relating ease to the impact profile. Effectiveness (0-100%, vertical axis) is plotted against exploitation, Ex (0-100%, horizontal axis); points A, B and C mark the profile, which rises steeply between A and B and flattens between B and C.]
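
The sketch below contrasts the linear profile assumed in the PERFORM approach with a diminishing-returns profile of the kind illustrated in Figure 13.1; the functional form and its parameter are assumptions chosen only to show the shape of the effect, not relationships established in this work.

```python
# Sketch: two assumed impact profiles relating exploitation (0..1) to effectiveness (0..1).
import math


def linear_profile(exploitation: float) -> float:
    """Effectiveness rises in direct proportion to exploitation (PERFORM's working assumption)."""
    return exploitation


def saturating_profile(exploitation: float, k: float = 4.0) -> float:
    """Effectiveness rises steeply at first, then flattens (assumed diminishing-returns form)."""
    return (1 - math.exp(-k * exploitation)) / (1 - math.exp(-k))


for ex in (0.25, 0.5, 0.75, 1.0):
    print(f"Ex={ex:.2f}  linear={linear_profile(ex):.2f}  saturating={saturating_profile(ex):.2f}")
```

Comparing the two columns of output indicates the size of error the linear assumption could introduce at a given degree of exploitation, which is the kind of discrepancy the what-if scenarios are intended to mitigate.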

13.3.3 Knowledge-Based Efficiency

The efficiency of a knowledge-based activity is considered here to exist irrespective of whether it is measured or not, i.e. it is an inherent property relating an activity and its resources. The selection and application of metrics to determine efficiency allows particular views of efficiency to be created, e.g. cost-based efficiency, time-based efficiency, etc. Further work is required to establish a means for assessing the quantity of knowledge gained in an activity (K+).
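
Purely as a sketch, with assumed symbols (K+ for knowledge gained, C for cost incurred, T for time consumed), such views might be written:

```latex
% Assumed notation only; the book does not prescribe these symbols.
\[
  \eta_{\text{cost}} = \frac{K^{+}}{C}, \qquad
  \eta_{\text{time}} = \frac{K^{+}}{T}
\]
```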

13.3.4 Use of Efficiency

The role of efficiency in supporting design and/or design management activities is not clear from this work. From the worked example and industrial appraisals it would seem that its primary role is in supporting planning-type decisions such as the allocation of resources. That is, estimating the level of resources required to achieve particular design activity goals may be supported by knowledge of the previous efficiency of the resources. Efficiency is an important component of performance in design development, providing insight into the effectiveness of design management activities. Establishing efficiency within the knowledge processing activities of design development requires the quantity of knowledge processed or used to be established. This presents an important area for further investigation.

13.3.5 Establishing Performance Trends

The PERFORM approach has not yet been extensively used in industry. Further use in industry may allow particular trends to be identified and norms to be established to support a degree of benchmarking between organisations. The PERFORM system could be modified to incorporate such norms and allow comparison of specific results in relation to these norms.

13.3.6 The PERFORM Approach and System

The following are potential developments which may be related to the PERFORM approach and system to increase its functionality and extend its area of application in industry:
• The PERFORM system could be integrated with the existing management information system (MIS) of the organisation to provide objective performance data as a starting point. That is, information on current performance against the goals specified in the analysis could be established. The degree of change in performance could then be tracked following the analysis and identification of areas for improvement. This could provide objective data on the improvements obtained from implementing PERFORM.
• It may be possible to deploy the PERFORM system via a company intranet following the initial analysis. This would allow repeat analyses to be carried out from individual desktops, although the degree of consensus achieved from using the approach as described in Chapter 7 may be at risk.
• It would be desirable to further develop the approach and system to include relationship matrices, which allow relationships between resources and between goals to be defined. The principles of using such matrices are described within the Quality Function Deployment approach, i.e. the "roof" of the house [112]. However, the integration of the values placed in such matrices within the overall analysis requires further work; a speculative sketch of one possible adjustment is given after this list.
• The charts currently generated within PERFORM are based on the perceived needs of users in terms of how they wish to view results, and they have been considered acceptable in the evaluation. However, additional charts may be defined and implemented within PERFORM to support any new user requirements.
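
The following is a speculative sketch of how such a "roof" matrix might be folded into the analysis; since this integration is noted above as requiring further work, the adjustment rule, the names and all the values below are assumptions for illustration only.

```python
# Speculative sketch: adjusting resource/goal impact values with a QFD-style
# "roof" matrix of resource-resource correlations. All data are illustrative.
import numpy as np

resources = ["DFX methods", "CAD system", "design reviews"]
goals = ["reduce lead time", "improve quality"]

# Judged impact of each resource on each goal (rows: resources, columns: goals).
impact = np.array([
    [0.6, 0.8],
    [0.7, 0.5],
    [0.4, 0.9],
])

# Symmetric resource-resource correlations; positive values mark complementary use.
roof = np.array([
    [1.0, 0.2, 0.3],
    [0.2, 1.0, 0.0],
    [0.3, 0.0, 1.0],
])

# One possible (assumed) rule: let complementary resources reinforce each other's
# impact, then rescale back onto the 0..1 judgement scale.
adjusted = roof @ impact
adjusted = adjusted / adjusted.max()
print(np.round(adjusted, 2))
```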
14
Summary and Insights
The overall aim of the work presented in this book was to develop a methodology for performance modelling and analysis that supports performance improvement in design development. This methodology is presented in Chapter 8, providing a new insight into design development performance.

14.1 Key Elements

Figure 14.1 presents an overall summary of the work, highlighting the development of key elements and their relationships. The following sections present a brief summary.

14.1.1 Nature of Performance in Design Development

The process of improving performance in design through its assessment, analysis, etc. is in some ways comparable with that aimed at improving any business process. For example, the need for coherence in performance throughout an organisation is identified as a common requirement in performance analysis. However, the nature of design development is such that many of the concepts applied in other processes are not appropriate within this area. The metrics used to establish efficiency in manufacturing are less appropriate for activities in design development, which are characterised by the fact that they are knowledge-based, non-repeatable, novel, etc. Further, performance in design development is characterised by two areas: the performance of the design (artefact) and of the activities involved in its creation.

[Figure 14.1 Summary of the methodology. The figure relates the nature of design development performance, current practice and existing research, the problems/deficiencies identified, and the requirements for performance modelling and analysis to the primary elements of the methodology: the performance modelling formalism (comprising the E2, DAM and PMM models) and the PERFORM approach (based on the Resource Impact model and the principles of analysing resource impact), realised in the PERFORM system and evaluated through subject experts, the worked example, system development and the industrial case study. Key: E2 = Efficiency and Effectiveness; DAM = Design Activity Management; PMM = Performance Measurement and Management; RI = Resource Impact.]

14.1.2 Requirements for Performance Modelling and Analysis

The requirements for performance modelling and analysis in design development were derived from an initial understanding of its nature. These requirements include the need for a comprehensive formalism as the basis for work in this area and the need to distinguish and relate design and design activity performance. There is also a need to reflect general requirements of performance analysis identified in the literature, such as the need for coherence and for analysing influencing factors.
An approach to analysing performance was adopted from current research, and a computer-based system was developed to support this approach. This was used in industry to gain insight into the issues and understanding of industry. The general approach was found to meet industry needs, but there existed confusion over terminology and the nature of resource-goal relationships, a lack of ability to customise resources/groups and limited flexibility in terms of performing what-if scenarios.

14.1.3 Problems/Deficiencies in Performance Modelling and Analysis

Although there is significant research in the general area of performance, and in particular in business processes such as manufacturing, the performance of design development is comparatively undeveloped from a research perspective. There is a lack of consistency in the definition of performance, and research in this area is carried out in the absence of a fundamental and comprehensive formalism. Design research has produced a range of activity/process models of design, but they lack a comprehensive treatment of performance at the activity level. Approaches to performance analysis have been reviewed and their deficiencies highlighted in relation to the support they provide. These deficiencies included the lack of a common framework to relate different areas of performance and enhance understanding, the ill-defined relationship between design and its management, limited knowledge of how to achieve coherence and the unsuitability of empirical studies in a design environment.

14.1.4 Methodology for Performance Modelling and Analysis

The methodology for performance modelling and analysis is derived from both a Performance Modelling Formalism and the PERFORM Approach to analysing performance, and is presented in Figure 8.1, illustrating its application to continuous performance improvement.
The performance modelling formalism provides a new and fundamental insight into design development performance and incorporates:

• The E2 Model, which describes the nature of performance at a fundamental level. The model is based on a knowledge processing view of design where inputs are transformed to outputs using resources and under the direction of goals/constraints. Efficiency and effectiveness describe performance in any situation, and these are distinguished and related within the model.
• The DAM model, which distinguishes design and design management activities within a managed activity. Design activities are aimed at achieving design goals, while design management activities address design activity goals. From this model it is possible to distinguish effectiveness and efficiency further in terms of design and design management.
• The PMM model, which describes the process of measuring and managing performance within a managed activity. Design and design management effectiveness are measured and provide key information in support of controlling decisions such as the allocation of resources, changing of goals or stopping the activity.

The models and principles that comprise the formalism may be applied at any level in design, from an overall project/programme level, where design and its management are formally defined, to that of fundamental activities, including those that are carried out mentally by an individual designer. Design and its management are distinguished at all levels.
The nature of design development performance, revealed through the development of the formalism, provided a basis upon which to establish an approach for its analysis. The Resource Impact model has been developed to illustrate the influence of resources on performance, and Principles of Analysing Resource Impact have been established. The PERFORM Approach provides a structured process for analysing design development performance and identifying the best means by which it may be improved.
The PERFORM System has been developed within this work to support and automate parts of the PERFORM Approach, e.g. to automatically calculate the results of the analysis and generate graphical output. The system allows real-time analysis and feedback of results to participants in the PERFORM approach.
approach. 3011
1
2
14.1.5 Industrial Relevance 3
The methodology provides a basis for the treatment of design development performance incorporating many elements. A number of approaches were adopted for checking the ideas, addressing different and sometimes overlapping elements:
• Worked Example: Based on data collected during an experiment to support design protocol studies, a worked example was provided of the complete methodology in terms of its application to formalise and analyse performance in a particular situation. The example illustrated the application of the methodology and its potential to identify areas to be targeted for performance improvement. A full application of the methodology in an industrial environment could only be done over a considerable period of time, to allow the results of performance improvement efforts to be realised.
• Metrics Review and Analysis: The literature on performance was used to identify the range of metrics reflecting aspects of performance. These metrics were related to the E2 and DAM models to establish whether they could be described in relation to the models and whether the principles defined within the models were violated in any way by such metrics. These principles were supported by the review, and an increased insight into the metrics was obtained. That is, metrics could be classified into areas such as design effectiveness, design management effectiveness, influences on performance, etc.
• Industrial Appraisal: Experts in design development, performance management, risk management, etc. provided a basis for critical appraisal of the ideas. The review of these appraisals provides substantial support for the work, primarily focused on the modelling formalism, while also allowing the relationship of the work to industrial practice to be revealed and reported.
• System Development: The implementation of the PERFORM System provided a realisation of key elements of the PERFORM Approach, based on the Principles for Analysing Resource Impact, within a computer-based model. This provided a logical test of the concepts developed in the approach and allowed the utility of the system to be demonstrated within the industrial implementation of the PERFORM approach.
• Industrial Implementation: The PERFORM approach was implemented within an industrial case study, and results were established using the approach that indicated areas of focus for performance improvement. The review of the case study provided support for the approach in practice, i.e. the participants commented that the approach provided significant insight into their design development performance and provided an important step towards realising performance improvement.

The various means of testing the ideas presented in this book allowed Strengths and Weaknesses of the Methodology to be defined and areas of Future Work to be proposed. The strengths of the work included its comprehensiveness and its ability to clearly distinguish and relate the performance of design and design activities. A weakness of the methodology was that the approach focuses on effectiveness and that further analysis of efficiency is required for a complete view of performance. The strengths/weaknesses and areas of future work are further defined in Chapter 13.

14.2 Axioms of Performance

From the work presented in this book the following Axioms¹ of Performance can be argued.

Axiom 1: Activity performance

Activities are the fundamental means that create performance.

Activities are the fundamental elements that transform input to output and are the basic components of processes, phases, projects and higher-level activities. Other aspects of performance influence it, such as the type, definition and behaviour, but its actual creation is through an activity. That is, no other aspect related to performance creates it.

Axiom 2: Efficiency and effectiveness

All performance can be described by efficiency and/or effectiveness.

That is, no matter the metric(s) or aspect(s) under consideration, all measures of performance, no matter how general or specific, will indicate either efficiency or effectiveness. Thus, while there may be multiple metrics for a particular activity or process (an amalgamation of activities), they can be categorised into three fundamental types: efficiency, effectiveness, or a combination of these. Of course, once any goal is defined, including an efficiency goal, the relevant performance is an indication of effectiveness.

Axiom 3: Activity and management

Activities and their management are inextricably linked.

Carrying out an activity will always involve an element of management. Thus, every activity, even at an individual cognitive level, will involve its management. Performance measurement must ensure that the correct metric is being used for the actual activity or its management. Conflicts and misdirected effort ensue if these are not clearly and correctly identified.

It is suggested that some of these fundamental Axioms of Performance are either neglected or misunderstood when considering performance management practice.

14.3 Implications for Design Development

Much of the work carried out in the area of design development is ultimately aimed at improving the performance of the process. That is, it is aimed at developing better understanding, approaches, tools, etc., which may ultimately have an impact on the performance of design development in industry. To predict or determine the nature of this impact requires an understanding of the phenomenon of performance. This book contributes to this understanding through presenting a methodology which incorporates a new and more comprehensive formalism for design performance modelling and an approach for analysing the influence of resources on a particular area of performance.

Note

1. Self-evident truths (The Concise Oxford Dictionary, 1995).

References

1. Elzinga D.J., et al., Business Process Management: Survey and Methodology. IEEE Transactions on Engineering Management, 1995. 42(2): p. 119–128.
2. Leifer R. and W.J. Burke, Organisational Activity Analysis: A Methodology for Analysing and Improving Technical Organisations. IEEE Transactions on Engineering Management, 1994. 41(3): p. 234–244.
3. Araujo C.S. and A.H.B. Duffy. Assessment and Selection of Product Development Tools. in International Conference on Engineering Design. 1997. Tampere, Finland.
4. Design Council, Annual Review 1997. 1997, London: Design Council.
5. Wheelwright S.C. and K.B. Clark, Revolutionizing Product Development. 1991, New York: Free Press.
6. Duffy A.H.B. and F.J. O'Donnell. A Design Research Approach. in Workshop on Research Methods in AI in Design. 1998. Lisbon, Portugal.
7. Neely A., M. Gregory, and K. Platts, Performance Measurement System Design: A Literature Review and Research Agenda. International Journal of Production and Operations Management, 1995. 15(4): p. 80–116.
8. Gilbreath R.D., Working with Pulses Not Streams: Using Projects to Capture Opportunity, in Project Management Handbook, D.I. Cleland and W.R. King, Editors. 1988, Van Nostrand Reinhold: New York. p. 3–15.
9. Cusumano M.A. and K. Nobeoka, Strategy, Structure, and Performance in Product Development: Observations from the Auto Industry, in Managing Product Development, T. Nishiguchi, Editor. 1996, Oxford University Press, Inc. p. 75–120.
10. Roozenburg N.F.M. and N.G. Cross, Models of the Design Process: Integrating Across the Disciplines. Design Studies, 1991. 12(4): p. 215–220.
11. French M.J., Conceptual Design for Engineers. 3rd ed. 1998, London: Springer.
12. Pahl G. and W. Beitz, Engineering Design: A Systematic Approach. 2nd Revised Edition, ed. K. Wallace. 1996, London: Springer-Verlag.
13. Smithers T., et al., Design as Intelligent Behaviour: An AI in Design Research Programme. Artificial Intelligence in Engineering, 1990. 5(2).
14. Andreasen M.M. and L. Hein, Integrated Product Development. 1987: IFS (Publications) Ltd and Springer-Verlag, London, UK.
15. Roozenburg N.F.M. and J. Eekels, Product Design: Fundamentals and Methods. A Wiley Series in Product Development: Planning, Designing, Engineering. 1995, Chichester, England: John Wiley and Sons. 408.
16. Ulrich K.T. and S.D. Eppinger, Product Design and Development. 1995: McGraw-Hill, Inc.
17. Angelis V.A., The Organisational Performance Re-considered: The Case of the Technical and Social Subsystems Synergy. Business Process Management Journal, 1999. 5(3): p. 275–288.
18. Neely A., The Performance Measurement Revolution: Why Now and What Next? International Journal of Operations and Production Management, 1999. 19(2): p. 205–228.
19. Pritchard R.D., Measuring and Improving Organisational Productivity: A Practical Guide. 1990, New York: Praeger Publishers.
20. Kaplan R.S. and D.P. Norton, Using the Balanced Scorecard as a Strategic Management System. Harvard Business Review, 1996. January–February.
21. Mehra S., Perpetual Analysis and Continuous Improvements: A Must for Organisational Competitiveness. Managerial Finance, 1998. 24(1).
22. Sinclair D. and M. Zairi, Effective Process Management Through Performance Measurement: Part 3 – An Integrated Model of Total Quality-Based Performance Measurement. Business Process Re-engineering and Management Journal, 1995. 1(3): p. 50–65.
23. Drongelen C.K.-van, Systematic Design of R&D Performance Measurement Systems. 1999, University of Twente: Enschede.
24. Kald M. and F. Nilsson, Performance Measurement at Nordic Companies. European Management Journal, 2000. 18(1): p. 113–127.
25. Rolstadas A., ed. Benchmarking: Theory and Practice. 1995, Chapman and Hall.
26. Kaplan R.S. and D.P. Norton, The Balanced Scorecard: Translating Strategy into Action. 1996, Boston: Harvard Business School Press.
27. Duffy A.H.B. and F.J. O'Donnell. A Model of Product Development Performance. in Designers: Key to Successful Product Development. 1997: Springer.
28. Bititci U.S., Measuring Your Way to Profit. Management Decision, 1994. 32(6): p. 16–24.
29. Dixon J.R., A.J. Nanni, and T.E. Vollman, The New Performance Challenge: Measuring Operations for World-Class Competition. 1990, Homewood, IL: Dow Jones-Irwin.
30. Kaplan R.S. and D.P. Norton, The Balanced Scorecard: Measures That Drive Performance. Harvard Business Review, 1992. 70(1): p. 71–79.
31. Chiesa V., P. Coughlan, and C.A. Voss, Development of a Technical Innovation Audit. Journal of Product Innovation Management, 1996. 13(2): p. 105–136.
32. Meyer M.W. and V. Gupta, The Performance Paradox. Research in Organisational Behaviour, 1994. 16: p. 309–369.
33. de Haas M. and A. Kleingeld, Multilevel Design of Performance Measurement Systems: Enhancing Strategic Dialogue Throughout the Organisation. Management Accounting Research, 1999. 10: p. 233–261.
34. Tsang A., A. Jardine, and H. Kolodny, Measuring Maintenance Performance: A Holistic Approach. International Journal of Operations and Production Management, 1999. 19(7): p. 691–715.
35. Finger S. and J.R. Dixon, A Review of Research in Mechanical Engineering Design. Part 1: Descriptive, Prescriptive, and Computer-Based Models of Design Processes. Research in Engineering Design, 1989. 1: p. 51–67.
36. Ullman D.G., T.G. Dietterich, and L.A. Stauffer, A Model of the Mechanical Design Process Based on Empirical Data. AI EDAM, 1988. 2(1): p. 33–52.
37. Akin O. and C. Lin, Design Protocol Data and Novel Design Decisions. Design Studies, 1995. 16(2): p. 211–236.
38. Dorst K. and J. Dijkhuis, Comparing Paradigms for Describing Design Activity. Design Studies, 1995. 16(2): p. 261–274.
39. Hubka V., Principles of Engineering Design. WDK. 1980, Zurich: Springer-Verlag. 118.
40. Pugh S., Total Design: Integrated Methods for Successful Product Engineering. 1990: Addison-Wesley Publishers Ltd.
41. Zhang Y., Computer-based Modelling and Management for Current Working Knowledge Evolution Support, in Department of Design, Manufacture and Engineering Management. 1998, University of Strathclyde: Glasgow.
42. VDI-2221, Systematic Approach to the Design of Technical Systems and Products. 1987, Dusseldorf: Verein Deutscher Ingenieure.
43. Griffin A., The Effect of Project and Process Characteristics on Product Development Cycle Time. Journal of Product Innovation Management, 1997. 34(1): p. 24–35.
44. Brookes N.J. and C.J. Backhouse, Measuring the Performance of Product Introduction. Journal of Engineering Manufacture, 1998. 212(1): p. 1–11.
45. Lawson M. and H.M. Karandikar, A Survey of Concurrent Engineering. Concurrent Engineering: Research and Applications, 1994. 2: p. 1–6.
46. Tomiyama T., Concurrent Engineering: A Successful Example for Engineering Design Research, in The Design Productivity Debate, A.H.B. Duffy, Editor. 1998, Springer-Verlag Publications. p. 175–186.
47. Cantamessa M. Design Best Practices at Work: An Empirical Research upon the Effectiveness of Design Support Tools. in ICED97. 1997. Tampere.
48. Griffin A., PDMA Research on New Product Development Practices: Updating Trends and Benchmarking Best Practices. 1997, Institute for the Study of Business Markets, The Pennsylvania State University.
49. Schrijver R. and R. de Graaf, Development of a Tool for Benchmarking for Concurrent Engineering. Concurrent Engineering: Research and Applications, 1996. 4(2): p. 159–170.
50. Brown S.L. and K.L. Eisenhardt, Product Development: Past Research, Present Findings, and Future Directions. Academy of Management Review, 1995. 20(2): p. 343–378.
51. Hales C., Managing Engineering Design. 1993: Longman Scientific and Technical.
52. Cooper R.G., New Products: The Factors that Drive Success. International Marketing Review, 1994. 11(1): p. 60–76.
1 53. PM 2000: Performance Measurement Past, Present and Future. 2000. Cambridge.
2 54. Chang Z.Y. and K.C. Yong, Dimensions and Indices for Performance Evaluation of a Product
Development Project. International Jounal of Technology Management, 1991. 6(1/2):
3 p. 155167.
4 55. McGrath M.E., The R&D Effectiveness Index: Metric for Product Development Performance.
5 Journal of Product Innovation Management, 1994. 11: p. 201212.
6 56. Morbey G.K., R&D: Its Relationship to Company Performance. Journal of Product Innovation
Management, 1988. 5: p. 180190.
7 57. Terwiesch, C., C. Loch, and M. Niederkoer, When Product Development Makes a Difference:
8 A Statistical Analysis in the Electronics Industry. Journal of Product Innovation Management,
9 1998. 15(1): p. 315.
58. Eccles R.G., The Performance Measurement Manifesto. Harvard Business Review, 1991. 69(January–February).
59. Johnson H.T. and R.S. Kaplan, Relevance Lost: The Rise and Fall of Management Accounting. 1987, Boston, MA: Harvard Business School Press.
60. Nickell S., The Performance of Companies. 1995, Oxford: Blackwell Publishers. p. 120.
61. Bititci U.S., T. Turner, and C. Begemann, Dynamics of Performance Measurement Systems. International Journal of Operations and Production Management, 2000. 20(6).
62. Dixon J.R., Design Engineering: Inventiveness, Analysis, and Decision Making. 1966, USA: McGraw-Hill Book Co.
63. Lockamy III A., Quality-focused Performance Measurement Systems: A Normative Model. International Journal of Operations and Production Management, 1998. 18(8): p. 740–766.
64. Montoya-Weiss M.M. and R. Calantone, Determinants of New Product Performance: A Review and Meta-Analysis. Journal of Product Innovation Management, 1994. 11: p. 397–417.
65. Cordero R., The Measurement of Innovation Performance in the Firm: An Overview. Research Policy, 1989. 19: p. 185–192.
66. Dwight R., Searching for Real Maintenance Performance Measures. Journal of Quality in Maintenance Engineering, 1999. 5(3): p. 258–275.
67. Neely A., et al., Getting the Measure of Your Business. 1996, Horton Kirby: Findlay Publications.
68. Rolstadas A., Enterprise Performance Measurement. International Journal of Operations and Production Management, 1998. 18(9/10): p. 989–999.
69. Clark K.B. and T. Fujimoto, Product Development Performance: Strategy, Organisation and Management in the World Auto Industry. 1991: Harvard Business School Press.
70. Doz Y., New Product Development Effectiveness: A Triadic Comparison in the Information-Technology Industry, in Managing Product Development, T. Nishiguchi, Editor. 1996, Oxford University Press, Inc. p. 13–33.
71. Emmanuelides P.A., Towards an Integrative Framework of Performance in Product Development Projects. Journal of Engineering and Technology Management, 1993. 10: p. 363–392.
72. Moseng B. and H. Bredrup, A Methodology for Industrial Studies of Productivity Performance. Production Planning and Control, 1993. 4(3).
73. Drongelen C.K.-v. and A. Cook, Design Principles for the Development of Measurement Systems for Research and Development Processes. R&D Management, 1997. 27(4): p. 345–357.
74. Christiansen R.T., Modelling Efficiency and Effectiveness of Coordination in Engineering Design Teams. 1993, Stanford University.
75. Griffin A., The Impact of Engineering Design Tools on New Product Development Efficiency and Effectiveness. 1996, Institute for the Study of Business Markets, The Pennsylvania State University: University Park, PA.
76. McDonough E.R. and A. Griffin, The Impact of Organisational Tools on New Product Development Efficiency and Effectiveness. 1996, Institute for the Study of Business Markets, The Pennsylvania State University: University Park, PA.
77. Wakasugi R. and F. Koyata, R&D, Firm Size, and Innovation Outputs: Are Japanese Firms Efficient in Product Development? Journal of Product Innovation Management, 1997. 14: p. 383–392.
78. Araujo C.S. and A. Duffy, Product Development Tools, in 3rd International Congress of Project Engineering. 1996. Barcelona.
79. Beitz W., Design Science: The Need for a Scientific Basis for Engineering Design Methodology. Journal of Engineering Design, 1994. 5(2): p. 129–133.
80. Duffy A.H.B., Design Productivity, in The Design Productivity Debate, A.H.B. Duffy, Editor. 1998, Springer-Verlag Publications.
81. Griffin A. and A.L. Page, An Interim Report on Measuring Product Development Success and Failure. Journal of Product Innovation Management, 1993. 10: p. 291–308.
82. Bain D., The Productivity Prescription: The Manager's Guide to Improving Productivity and Profits. 1982, New York: McGraw-Hill.
83. Goldschmidt G., The Designer as a Team of One, in Analysing Design Activity, N. Cross, H. Christiaans, and K. Dorst, Editors. 1996, John Wiley and Sons.
84. Blessing L.T.M., A Process-Based Approach to Computer-Supported Engineering Design. 1994, University of Twente: Enschede, The Netherlands.
85. Pugh S., Design Activity Models: Worldwide Emergence and Convergence. Design Studies, 1986. 7(3): p. 167–173.
86. Radcliffe D.F., Concurrency of Actions, Ideas and Knowledge within a Design Team, in Analysing Design Activity, N. Cross, H. Christiaans, and K. Dorst, Editors. 1996, John Wiley and Sons.
87. Cross N., H. Christiaans, and K. Dorst, eds. Analysing Design Activity. 1996, John Wiley and Sons: Chichester, West Sussex. 463.
88. Bititci U., Modelling of Performance Measurement Systems in Manufacturing Enterprises. International Journal of Production Economics, 1995. 42: p. 137–147.
89. Lockamy III A. and M.S. Spencer, Performance Measurement in a Theory of Constraints Environment. International Journal of Production Research, 1998. 36(8): p. 2045–2060.
90. Martin R., Do We Practise Quality Principles in the Performance Measurement of Critical Success Factors? Total Quality Management, 1997. 8(6): p. 429–444.
91. APQC, Corporate Performance Measurement: Benchmarking Study Report. 1996, American Productivity and Quality Centre: Houston.
92. Clark K.B., High Performance Product Development in the World Auto Industry. International Journal of Vehicle Design, 1991. 12(2): p. 105–131.
93. Loch C., L. Stein, and C. Terwiesch, Measuring Development Performance in the Electronics Industry. Journal of Product Innovation Management, 1996. 13: p. 3–20.
94. Cantamessa M., Design Best Practices, Capabilities and Performance. Journal of Engineering Design, 1999. 10(4): p. 305–328.
95. Cooper R.G., The Strategy-Performance Link in Product Innovation. R&D Management, 1984. 14(4): p. 247–259.
96. Griffin A., Evaluating QFD's Use in US Firms as a Process for Developing New Products. Journal of Product Innovation Management, 1992. 9: p. 171–187.
97. Cooper R.G. and E.J. Kleinschmidt, Determinants of Timeliness in Product Development. Journal of Product Innovation Management, 1994. 11(5): p. 381–396.
98. Zirger B.J. and J.L. Hartley, The Effect of Acceleration Techniques on Product Development Time. IEEE Transactions on Engineering Management, 1996. 43(2): p. 143–152.
99. Bititci U.S., A.S. Carrie, and P. Suwignjo, Quantitative Models for Aggregating and Prioritising of Performance Measures, in 10th International Working Seminar on Production Economics. 1998. Austria.
100. Saaty T.L. and L.G. Vargas, The Logic of Priorities, in International Series in Management Science/Operations Research. 1982, Kluwer Nijhoff Publishing.
101. Elmasri R. and S.B. Navathe, Fundamentals of Database Systems. 1989, California, USA: The Benjamin/Cummings Publishing Company Inc.
102. Globerson S., Issues in Developing a Performance Criteria System for an Organisation. International Journal of Production Research, 1985. 23(4): p. 639–646.
103. KPMG, Information for Strategic Management: A Survey of Leading Companies. 1990: London.
104. Maskell B., Performance Measures of World Class Manufacturing. Management Accounting, 1989(May).
105. NIST, 1999 Criteria for Performance Excellence. 1999.
106. Guide to the Business Excellence Model. 1998: British Quality Foundation.
107. Ritchie L. and B.G. Dale, An Analysis of Self-assessment Practices Using the Business Excellence Model. Proceedings of the Institution of Mechanical Engineers, 2000. 214(B): p. 593–602.
108. DTI, Innovation: Your Move. 1994, Department of Trade and Industry.
109. Duffy A.H.B., et al., Design Coordination for Concurrent Engineering. International Journal of Engineering Design, 1993. 4(4): p. 251–265.
110. Duffy A.H.B., M.M. Andreasen, and F.J. O'Donnell, Design Co-ordination, in International Conference on Engineering Design. 1999. Munich.
111. Duffy A.H.B., Ensuring Competitive Advantage with Design Coordination, in The Second International Conference on Design to Manufacture in Modern Industry (DMMI'95), Bled, Slovenia, 29–30 May 1995. 1995. Faculty of Mechanical Engineering, University of Maribor, Maribor, Slovenia.
112. Cohen L., Quality Function Deployment: How to Make QFD Work for You. Engineering Process Improvement Series. 1995: Addison-Wesley Publishers Ltd.
113. O'Donnell F.J. and A.H. Duffy, DC Performance Analysis in Company A. 1996, CAD Centre, University of Strathclyde: Glasgow.
114. O'Donnell F.J. and A.H. Duffy, DC Performance Analysis in Company A: 1996–1999 Review. 1996, CAD Centre, University of Strathclyde: Glasgow.
115. Colquhoun G.J., R.W. Baines, and R. Crossley, A State of the Art Review of IDEF0. International Journal of Computer Integrated Manufacturing, 1993. 6(4): p. 252–264.
116. Persidis A. and A. Duffy, Learning in Engineering Design, in Selected and Reviewed Papers and Reports from the IFIP TC/WG5.2 Third Intelligent Workshop on Computer Aided Design, Osaka, Japan, September 1989. 1991. Elsevier Science Publishers B.V. (North-Holland), Amsterdam, The Netherlands.
117. Andreasen M.M., Modelling: The Language of the Designer. Journal of Engineering Design, 1994. 5(2): p. 103–115.
118. Frankenberger E., P. Badke-Schaub, and H. Birkhofer, eds. Designers: Key to Successful Product Development. 1997, Springer. 319.
119. Eynard B., P. Girard, and G. Doumeingts, Control of Engineering Processes Through Integration of Design Activities and Product Knowledge, in Proceedings of the 1999 International CIRP Design Seminar. 1999. University of Twente, Enschede, The Netherlands.
120. Kosanke K., CIMOSA: Overview and Status. Computers in Industry, 1995. 27(2): p. 101–109.
121. Coyne R., Logic Models of Design. 1988, London: Pitman Publishing.
122. Koopman P.J., A Taxonomy of Decomposition Strategies Based on Structures, Behaviours and Goals, in DE-Vol. 83, 1995 Design Engineering and Technical Conferences. 1995, ASME.
123. Maher M.L., Engineering Design Synthesis: A Domain Independent Representation. AI EDAM, 1988.
124. Zhang Y., K.J. MacCallum, and A.H.B. Duffy, A Product Knowledge Modelling Scheme for Function Based Design, in AID '96 Workshop on Function Modelling in Design. 1996: Stanford, USA.
125. Baykan C., Design Strategies, in Analysing Design Activity, N. Cross, H. Christiaans, and K. Dorst, Editors. 1996, John Wiley and Sons.
126. Bayus B.L., Speed-to-Market and New Product Performance Trade-offs. Journal of Product Innovation Management, 1997. 14(6): p. 485–497.
127. Edwards W. and J.R. Newman, Multi-attribute Evaluation, ed. R.G. Niemi. 1982: Sage Publications.
128. Zeleny M., Multiple Criteria Decision Making. McGraw-Hill Series in Quantitative Methods for Management, ed. M.K. Starr. 1982, New York: McGraw-Hill Book Company.
129. Pugh S., Concept Selection: A Method That Works, in Proceedings of the International Conference on Engineering Design, Rome, Italy, 1981. 1981.
130. Sim S.K. and A.H.B. Duffy, A Foundation for Machine Learning in Design. Artificial Intelligence for Engineering Design, Analysis and Manufacturing, 1998. 12: p. 193–209.
131. Eppinger S.D., et al., A Model-based Method for Organising Tasks in Product Development. Research in Engineering Design, 1994. 6(1): p. 1–13.
132. Gruninger M. and M.S. Fox, An Activity Ontology for Enterprise Modelling. 1994.
133. Murmann P.A., Expected Development Time Reductions in the German Mechanical Engineering Industry. Journal of Product Innovation Management, 1994. 11(3): p. 236–252.
134. Osmond E. and D.F. Sheldon, Whole Life Cost Reduction by Design, in International Conference on Design for Competitive Advantage: Making the Most of Design, Coventry, 23–24 March 1994. 1994. Institute of Mechanical Engineers.
135. Smith P.G. and D.G. Reinertsen, Developing Products in Half the Time. 1991, New York: Van Nostrand Reinhold.
136. Cordero R., Managing for Speed to Avoid Product Obsolescence: A Survey of Techniques. Journal of Product Innovation Management, 1991.
137. Lockyer K., Critical Path Analysis and Other Project Network Techniques. Fourth Edition. 1988, London: Pitman Publishing.
138. Boehm B.W., Software Engineering Economics. 1981: Prentice Hall.
139. Kusiak A. and J. Wang, Decomposition of the Design Process. Journal of Mechanical Design, 1993. 115: p. 687–695.
140. Maher M.L., Structural Design by Hierarchical Decomposition. 1989, Engineering Design Research Centre, Carnegie Mellon University: Pittsburgh, PA.
141. Srinivas K., et al., MONET: A Multi-media System for Conferencing and Application Sharing in Distributed Systems. 1992, Concurrent Engineering Research Center, West Virginia University: Morgantown, W.Va., USA.
142. Kusiak A. and J. Wang, Qualitative Analysis of the Design Process, in DE-Vol. 66, Intelligent Concurrent Design: Fundamentals, Methodology, Modelling and Practice. 1993. New Orleans, Louisiana: ASME, New York.
143. Prasad B., Concurrent Engineering Fundamentals: Integrated Product and Process Organisation. Prentice Hall International Series in Industrial and Systems Engineering. Vol. One. 1996, New Jersey: Prentice Hall PTR. 478.
144. AitSahila F., E. Johnson, and P. Will, Is Concurrent Engineering Always a Sensible Proposition? IEEE Transactions on Engineering Management, 1995. 42(2): p. 166–170.
145. Molina A., et al., A Review of Computer-Aided Simultaneous Engineering Systems. Research in Engineering Design, 1995. 7: p. 3.
146. Smith R.P. and S.D. Eppinger, Characteristics and Models of Iteration in Engineering Design, in International Conference on Engineering Design, ICED 93. 1993. The Hague.
147. The Concise Oxford Dictionary. Seventh ed., ed. J.B. Sykes. 1995: Oxford University Press.
148. Balachandran M., Knowledge Based Optimum Design, ed. C.A. Brebbia and J.J. Connor. 1993: Computational Mechanics Publications.
149. Stauffer L.A. and D.G. Ullman, Fundamental Processes of Mechanical Designers Based on Empirical Data. Journal of Engineering Design, 1991. 2(3).
150. Gaither N., Production and Operations Management: A Problem-solving and Decision Making Approach. 1980, Hinsdale, Illinois: The Dryden Press.
151. Nonaka I. and H. Takeuchi, The Knowledge-Creating Company. 1995, Oxford: Oxford University Press.
152. Buzzell R.D. and B.T. Gale, The PIMS Principles. 1987: The Free Press.
153. Henderson R. and W. Mitchell, The Interactions of Organisational and Competitive Influences on Strategy and Performance. Strategic Management Journal, 1997. 18(Summer Special Issue): p. 5–14.
154. Mintzberg H., Crafting Strategy, in Strategy: Seeking and Securing Competitive Advantage, C.A. Montgomery and M.E. Porter, Editors. 1991, Harvard Business School. p. 403–420.
155. Johnson H., Relevance Regained: From Top-Down Control to Bottom-Up Empowerment. 1992, New York: Free Press.
156. Andreasen M.M., et al., The Design Co-ordination Framework: Key Elements for Effective Product Development, in The Design Productivity Debate. 1998: Springer-Verlag Publications.
157. Kusiak A. and K. Park, Concurrent Engineering: Decomposition and Scheduling of Design Activities. International Journal of Production Research, 1990. 28(10): p. 1883–1900.
158. Fadel F.G., M.S. Fox, and M. Gruninger, A Generic Enterprise Resource Ontology, in Third IEEE Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE '94). 1994. Morgantown, West Virginia.
159. Evans S., Implementation Framework for Integrated Design Teams. Journal of Engineering Design, 1990. 1(4): p. 355–363.
160. Shepherd R. and T. McCarthy, AutoCAD 2000 Productivity Study, in Engineering Designer. 1999. p. 47.
161. Cantamessa M., Investigating Productivity in Engineering Design: A Theoretical and Empirical Perspective, in The Design Productivity Debate, A.H.B. Duffy, Editor. 1998, Springer-Verlag Publications.
162. Cross N., Engineering Design Methods. 1989: John Wiley and Sons.
163. Hubka V., Design Tactics = Methods + Working Principles for Design Engineers. Design Studies, 1983. 4(3).
164. Jones J.C., Design Methods. 1992, New York: Van Nostrand Reinhold.
165. Neely A. and J. Mills, Manufacturing in the UK: Report on a Survey of Performance Measurement and Strategy Issues in UK Manufacturing Companies. 1993, Manufacturing Engineering Group: London.
166. Green G., Towards Integrated Design Evaluation: Validation of Models. Journal of Engineering Design, 2000. 11(2): p. 121–132.
167. Kaye M., Continuous Improvement: The Ten Essential Criteria. International Journal of Quality & Reliability Management, 1998. 16(5): p. 485–506.
168. Haffey M., Approaches to the Definition of Performance Metrics in Design Development. 2000, CAD Centre, University of Strathclyde: Glasgow.
169. Jacobson R., Microsoft Excel 97/Visual Basic Step by Step. 1997.
170. Brown M.G. and R.A. Svenson, Measuring R&D Productivity. Research-Technology Management, 1988. July–August: p. 11–15.
Appendix: A Review of Metrics in Design Development
The following presents detailed results of the review of metrics in design development referred to in Chapter 10. The metrics are selected from a report by Haffey [168], which reviews approaches to the definition of metrics in this area. The results of the review are presented in a table with comments inserted to illustrate particular assumptions that were made in relation to each metric. A tick is used to denote the area that the metric relates to, i.e. one of the following:

1. Design Efficiency: relating the knowledge gained in the design activity to the knowledge used.
2. Design Management Efficiency: relating the knowledge gained in the design management activity to the knowledge used.
3. Design Effectiveness: comparing the output and goal knowledge of the design activity.
4. Design Management Effectiveness: comparing the output and goal knowledge of the design management activity.

The following points should also be considered in interpreting the results:

1. A metric such as customer satisfaction is considered to ascertain the degree to which the final product meets or exceeds the initial requirements of the customer. Within the scope of the models presented here this can be seen to (partially) measure design and/or design management effectiveness, where it is assumed that customer satisfaction is measured in relation to design (artefact) or design activity goals. For example, the metric may illustrate whether the product functionality satisfies the customer (design effectiveness) or how satisfied the customer is with the date that the product was delivered (design management effectiveness). It is assumed here that the goals were accurately defined from the customer requirements and therefore that meeting the goals also means meeting the requirements.
2. Some of the metrics listed in the review are not directly attributable (NDA) to design development activity performance. For example, the values obtained using a metric such as market share achieved could be partly attributed to artefact performance and considered to reflect design effectiveness. However, the values could equally be attributed to the corporate image of the organisation promoting the product, the effectiveness of its supply chains, etc. The metric is therefore considered too vague to be directly attributed to design development performance.
3. In some cases the exact meaning of the metric is unclear from the source, and it is therefore not possible to categorise or analyse it fully in this review.
4. A metric measures design development performance if it can be seen to relate to effectiveness (design or design management) and/or efficiency (design or design management). Where this is not the case an effort is made to define the metric in relation to the broader work presented here, e.g. it may be a measure of design activity output (DAO).
5. The review is not intended as an in-depth analysis of the metrics presented by Haffey; rather, it presents a selection of the metrics in order to illustrate their relation to the model. Although a more detailed review would provide additional insight into the metrics, it would provide little additional contribution to the evaluation of this work.
6. Citations to the literature from which the metrics are defined are contained within the table as cited by Haffey. The list of sources used is presented at the end of this section.
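To make the four areas above concrete, the following sketch (a minimal, purely illustrative Python encoding, not part of the original review) shows one way the classification in Table A-1 could be represented: each metric maps to the set of areas it relates to, and an NDA metric maps to an empty set. The function and data-structure names, and the particular example classifications, are assumptions for illustration only, drawn from the comments in the table.

```python
from enum import Enum, auto

class Area(Enum):
    """The four areas of design development performance used in the review."""
    DESIGN_EFFICIENCY = auto()          # knowledge gained vs. knowledge used in the design activity
    DESIGN_MGMT_EFFICIENCY = auto()     # knowledge gained vs. knowledge used in design management
    DESIGN_EFFECTIVENESS = auto()       # design activity output compared with design (artefact) goals
    DESIGN_MGMT_EFFECTIVENESS = auto()  # design management output compared with design activity goals

# Illustrative classifications following the comments in Table A-1.
# An NDA (not directly attributable) metric maps to an empty set of areas.
example_metrics = {
    "Customer satisfaction": {Area.DESIGN_EFFECTIVENESS, Area.DESIGN_MGMT_EFFECTIVENESS},
    "Product reliability": {Area.DESIGN_EFFECTIVENESS},            # reliability expressed as design goals
    "Timeliness of product availability": {Area.DESIGN_MGMT_EFFECTIVENESS},  # meeting a time goal
    "Engineering hours per project": {Area.DESIGN_EFFICIENCY, Area.DESIGN_MGMT_EFFICIENCY},
    "Market share achieved": set(),                                 # NDA: too vague to attribute
}

def measures_design_development_performance(metric: str) -> bool:
    """Point 4 above: a metric measures design development performance if it
    relates to at least one effectiveness or efficiency area."""
    return bool(example_metrics.get(metric, set()))

if __name__ == "__main__":
    for name, areas in example_metrics.items():
        label = ", ".join(a.name for a in areas) or "NDA"
        print(f"{name}: {label}")
```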
Table A-1. Columns: Effectiveness (Design; Design Mgmt.), Efficiency (Design; Design Mgmt.), Assumptions/Comments.
Customer
Success measures for new to world strategies [16]
Degree to which products are accepted by customer and satisfy them   acceptance and satisfaction in relation
after use to design and design activity goals.
Number of NDA
customers
Level of customer satisfaction [14]   satisfaction in relation to both design
and design activity goals

Customer
satisfaction [3]
Customer satisfaction survey
Product
Function  functions expressed as design goals
Usability  usability expressed as design goals
Reliability  reliability expressed as design goals
Performance  performance expressed as design goals
Serviceability  serviceability expressed as design
goals
Ergonomics  ergonomics expressed as design goals
Aesthetics  aesthetics expressed as design goals
Timeliness of availability of  on-time means meeting time goal
product
Perceived Quality  quality refers to product not delivery
time
Product return rates  where returns due to failure to meet a
design goal
Cust. complaint levels  complaints reflect failure to meet design
goals.
Customer Return Factor NDA
Capture & loss of new NDA
customers
Contracts and markets NDA
Customer loyalty NDA


Level of customer acceptance [14]


Met revenue goals NDA
Met share goals NDA
Met unit volume goals NDA
Customer satisfaction   satisfaction in relation to both design
and design activity goals
Product life length  product life expressed as a design goal
Met revenue growth NDA
Market position NDA
Field return rate  returns due to not performing functions
- functions expressed as design goals
Customer retention NDA
Number of customers NDA
Worth to retailer NDA
Sales force accept NDA
Purchase intent rate NDA
Sales Vs plan NDA
Price/Value to customer NDA
Purchase repeat rate NDA
Sales in year X (5) NDA
Customer trial rate NDA
Test market rating NDA

Customer
acceptance [14]
* Desired Revenue Goals NDA
Revenue growth NDA
Market share goals met NDA
*Customer Satisfaction   satisfaction in relation to both design
and design activity goals
*No of Customers NDA
*Price/value of customers NDA

Meeting the needs of the customer


[14] Market share NDA
Customer   acceptance in relation to both design
acceptance and design activity goals
Customer   satisfaction in relation to both design
satisfaction and design activity goals
No of customers adopting to product NDA
Test market trial NDA
rate

Customer Value [3]


Product  performance expressed as design goals
performance
Features  features expressed as design goals
Expected Features  features expressed as design goals
Typical Features  features expressed as design goals
Exciting Features  features expressed as design goals
Ease of Use  ease of use expressed as design goals
Reliability  reliability expressed as design goals
Performance  performance expressed as design goals
Longevity  longevity expressed as design goals
Length of Service  length of service expressed as design
goals
Mainten  maintenance expressed as design
ance goals
Reqts   could be design and design activity
goals
Product Price NDA
Initial Cost
Material
Cost
Material in  material cost expressed as design goal
product
Scrap Costs NDA - could be manufacturing
performance
Labour
Costs

Value added   resource cost
lab.
Wasted   resource cost
labour
Overhead Costs   resource cost
Distribution Costs NDA
Life Costs
Maintenance Cost  maintenance cost expressed as design
goals
Warrant  warranty cost expressed as design
y Costs goals
Replacement Cost  replacement cost expressed as design
goals
Delivery, Installation NDA
& Service
Product Availability
Anticipation of market NDA
Time to   where reaching market means an
Market assumed output
Ease of Installation  ease of installation expressed as design
goals
Ease of Service  ease of service expressed as design
goals
Firm/Organisation

Firm level metrics [8] [14]


% current sales by new products [24] NDA
% resources going to un/successful ventures NDA
[5]

Metrics at organisation level [20]


No of projects   where this is a measure of output over
completed a time period
Customer   satisfaction in relation to both design
satisfaction and design activity goals
% sales due to new NDA
products
Time to market   where reaching market means an
assumed output
Program

Internal Business Perspective [25]


agreed milestones/objectives met  where milestones/objectives are design
activity goals
No.   where this is a measure of output over
projects/products a time period
completed
Speed   time based efficiency
Efficiency/keeping  where budget is expressed in design
within budget activity goals - to be clarified
Quality of  where quality is expressed as a design
output/work goal
Behaviour (in group) Influence - aspect of resource
Planning accuracy  where accuracy of plan is expressed as
a design goal

Internal Business Performance [24]


Hours with customer influence - resource measure
on new work
Tender success rate NDA
Rework  if rework due to design error

Safety incident NDA


index
Project performance NDA
index
Project closeout NDA
cycle

Productivity indicators (R&D) [4]


Resources
% key skill areas influence - resource attribute
learned by personnel
Project
management

% tech specs met or  where tech spec is list of design goals
exceeded, averaged
across completions
% completion dates met  where dates are expressed as design
or exceeded activity goals
People
management
% fully satisf.or above influence - resource attribute
personnel resigning/year
Planning No Eng. change orders influence - where spec changes are
due to spec changes requested by external "customer"
before product release
New tech.study and No. patents/total no of  where patenting is a design goal
development R&D employees
Outputs
No of complaints/   where complaints are for not meeting
product year ,ave across design or design activity goals
project, with 3 month
rolling ave.
Division
results/outcomes
%sales from products NDA
released in last 3 years
Score on annual R&D NDA
scorecard survey
Annual sales/total NDA
budget

Program level measures [14]


Bold - Survey of actual core Program hit 5 year  where program objectives are
measures objectives expressed as design activity goals
used by companies Program exceeds  where program objectives are
our objectives expressed as design activity goals
Impact of new product program on corporate   requires alignment of corporate goals
performance with NPP goals
ROI for new product development process NDA
New product NDA
program profitability
Appendix
New product NDA
program sales
Subjective importance of new product NDA
program

Management Complexity [2 & 10]


35% of total design effort Communication influence - resource attribute
spent on Patterns
communication. Team size influence - resource attribute

Design Management Complexity [30]


Number of influence - resource measure
Designers
Number of CAD influence - resource measure
Tools
Design Complexity influence - level of knowledge
processed (K+)

Design team complexity [2]


Team Experience/ influence - resource attribute
Individual influence - resource attribute
Experience

Project / Product
Cycle Time [15 & 17]
Product Complexity influence - level of knowledge
processed (K+)
Management
Complexity
Team size [2] influence - attribute of a resource
Communication Pattern [2] influence - attribute of a resource
Degree of Change influence

Cycle Time [15]


Projects inherent influence - level of knowledge
complexity processed (K+)

Degree of change in product from generation (Newness in product or influence - level of knowledge
process) processed (K+)

Development Time [15]


% change across product generations influence - level of knowledge
processed (K+)
complexity influence - level of knowledge
processed (K+)
Formal process influence - use (or not) of an approach
utilised (yes/no)

Quality [6]
Relating quality and rework Rate of production   time based efficiency
[2] of drawings
No. of engineering change requests  assuming changes due to design errors
Average time   efficiency of "change resolving" activity
needed to resolve

Product quality [13]


% Re-use influence - indicator of novelty and level
of knowledge processed (K+)
% DFM/DFA influence - degree of use of a method
No of new parts influence - novelty
No of tests passed /  if testing achievement of Design goals
failed
No of design influence - attribute of an approach
reviews

Innovation and learning perspective [25]


# Patents  where establishing a patent is
expressed in design goals
# Ideas/findings  where no. of ideas/findings is
expressed as a design goal
Creativity  where creativity/innovation is expressed
/innovation level as a design goal
Network building Influence establishing links with other
areas of expertise

Innovation and learning perspective [20]


Technology # patentable discoveries / spent  where establishing high technology in
leadership products is a design goal
Long term focus % Budget spent int. & ext. on  influence - quantity of a resource
basic and applied research
High absorptive % projects in co-operation with 3rd influence - use of external resource
capacity party
Learning % of project evaluation ideas applied in new influence - use of knowledge gained in
Organisation projects previous projects

Degree of change [15]


Degree of change in influence - novelty, level of knowledge
product processed (K+)
Degree of change in manufacturing process influence - novelty, level of knowledge
processed (K+)

Product Complexity [15]


[19 & 29] all suggest No. functions influence - complexity, level of
performed knowledge processed (K+)
complexity should be based No. technologies/functional specialities influence - complexity, level of
on functional decomposition. knowledge processed (K+)

Product Complexity [15]


Functional influence - complexity, level of
Complexity knowledge processed (K+)
Technical Difficulty
[2]
Effort to design   resource used
Size of function influence - complexity, level of
knowledge processed (K+)
Function repetition influence - novelty, level of knowledge
processed (K+)

Key Performance measures [26]


Clarke & Fujimoto (5) Engineering Eng. hours per normalised project   resource to achieve output (project)
productivity

Time to market for normalised project   where reaching market means an
assumed output
Total product quality conformance quality  conformance means meeting design
goals
product integrity  where integrity is expressed as design
goal(s)
a high level measure of product performance  where product performance is
expressed in design goals
consistency with an overall company identity unclear from source
Cusumano & Nobeoka [9] No of products   where this is a measure of output over
introduced a time period
Design  where manufacturability is expressed
manufacturability as a design goal
Market share/growth NDA
Return on R&D NDA

Metrics at individual level[20]


Working Speed  
Accuracy  where accuracy is expressed as a
design goal
Ability to work in influence - resource attribute
teams
Achievement of  where milestones are expressed as
Mile stones design activity goals

Project level Metrics [8]


New product NDA
profitability
Time to market   where reaching market means an
assumed output
Market share NDA
achieved

Key PD performance measures


Product/Project
success [14]
Development cost   assuming development means
achieving a predefined output
Launch time schedule met  where schedule is expressed as design
activity goals
Speed to market   where reaching market means an
assumed output
Product performance level  where product performance is
expressed in design goals
Quality Guidelines met  where quality guidelines are expressed
in design goals
*Innovativeness  where innovativeness is expressed as a
design goal
*Quality guidelines met  where quality guidelines are expressed
as a design goal

Success measures for new to world strategies [16]


Degree to which project met market share NDA
goals
Degree to which project met revenue goals NDA
Degree to which project met revenue growth NDA
goals
Degree to which project met unit volume NDA
goals
Degree to which project met performance  where project performance is
spec expressed in design activity goals
Degree to which project met quality spec  where quality spec is expressed in
design goals
Development cost of project   where development means achieving a
predefined output
Degree the project provides a competitive NDA
advantage
Level of innovativeness of project  where innovativeness is expressed in
design goals
Projects ability to launch on time 
Speed to market: development length   where predefined output is assumed

Product Performance [14]


Performance  where performance is expressed as a
design goal

Speed to market   where reaching market means an
assumed output
Completed in  where budget is expressed as design
budget activity goals
Subjectively unclear from source
successful
Technically  where technical success is defined as a
successful design goal
Got to market on 
time

Measure for product improvement Strategies [16]


Profits NDA
Competitive NDA
advantage
Customer satisfaction in relation to both design
satisfaction and design activity goals
Market share NDA
Met revenue growth NDA
targets

Product (project) level measures [14]


Ease to automate the production process  where ease of automation is expressed
as a design goal (i.e. dfx)
Competitive reaction unclear from source
Provides a sustainable competitive advantage unclear from source
Meets cost goals  where cost goals are expressed in
design activity goals
Cost of developing   cost based efficiency
product
Development  
efficiency
Ease of  where ease of manufacture is
manufacture expressed as a design goal (i.e. dfm)
Launched in budget  where budget is expressed in design
activity goals
Level of innovation achieved  where innovativeness is expressed in design goals
Launched on time  where on-time launch is expressed as a
design activity goal
Technical perf. of product (performs to spec)  where spec. is expressed in design
goals
Relative product  where product performance is
performance expressed as design goals
Probability of Unclear from source
success
Development project progress Vs milestones  where progress against milestones is
expressed in design activity goals
Met quality  where quality is expressed in design
guidelines goals
Speed to market   where reaching market means an
assumed output
Management subjective assess. of success NDA
Ability to accrue political support within firm NDA
Team satisfaction influence - resource attribute
Product rewarded denoting tech. excellence  where technical excellence is a result of
achieving design goals
Technical success  where technical success is a result of
of product achieving design goals
Impact on sales on other products % Unclear from source
cannibalised
Product yield rate through manufacturing  where product yield is expressed as
process design goals

Process
Development process [15]
Type of process Formal/Informal (Stage influence - use of an approach
used gate)
Delivery of customer   customer needs expressed as
needs design/design activity goals

Process quality [13]


Turnover NDA

% Concurrency influence - attribute of an approach
(CE)
No of requested changes + (no of specific unclear
changes)
Time lost to other unclear
projects
No of on time influence - attribute of an approach
milestones (design management)

Process Control [31]


Manufacturing
efficiency
Manufacturing Yield   manufacturing not design development
Scrap rates  manufacturing not design development
Line rejects  manufacturing not design development
Defects  manufacturing not design development
Rework levels  manufacturing not design development
Cycle times   manufacturing not design development
Order intervals manufacturing not design development
Inventory levels  manufacturing not design development
Inventory turnover  manufacturing not design development
Assets NDA

Process Capability Prediction [3]


Accuracy of
Prediction
Cost  where cost prediction accuracy is a
design goal
Development schedule  where development schedule prediction
accuracy is a design goal
Manufacturing schedule  where development schedule prediction
accuracy is a design goal
Product reliability  where reliability prediction accuracy is a
design goal
Product performance  where performance prediction accuracy is a
design goal
Sales forecast not design
Variation in
Prediction
Variation between Programs NDA
Variation between Quarters NDA

Process productivity and resource Consumption


[3]
Indicators of
efficiency
Staffing   use of (people) resources
Materials   use of (material) resources
Energy   use of (energy) resources
Capital   use of (capital) resources
Equipment   use of (equipment) resources
Waste creation  
levels
Expense/ Revenue
ratios
By Function NDA
By Product NDA
General
Staffing levels  
Task volumes  
Equipment use  
Early use of 0/1 for using prototype in specification stage of influence - use (or not) of method
prototypes project
Concurrence of Time project was simultaneously in more than 1 influence - attribute of an approach
project phases phase, in % of total duration
Change of No. of changes in detailed spec influence - changes due to external
specifications over project duration activities
Number of No. of milestones used in project influence - attribute of an approach
Mile stones
Number of design No of reviews used in project influence - attribute of an approach
reviews

Crossfunctional Integration Ext sources of ideas influence - use of external resource
[26] knowledge
Early market influence - attribute of an approach
involvement
Reverse influence - attribute of an approach
Engineering
Preferred parts list influence - attribute of an approach
DFM information influence - attribute of an approach
Value engineering influence - use (or not) of a method
Early manufacturing influence - use (or not) of approach
involvement
Design complexity influence - attribute of resource
knowledge
Early purchasing influence - use (or not) of approach
involvement
Early supplier influence - use (or not) of approach
involvement
Joint supplier influence - use (or not) of approach
designs
Cooperation with influence - use (or not) of approach
basic research

People management and Team rewards 0/1 for rewards based influence - use (or not) of approach
learning [26] on team not individual
performance
Job rotation No of engineers involved influence - attribute of an approach
in rotation at business
level (%)
Training per Active training per influence - level of knowledge of
employee employee resources
Cross- functional Prop. engineers involved influence - attribute of an approach
training in cross functional
training
Sources of Metrics

The following references are cited throughout Table A-1 as the sources for the particular metrics listed:

1. Bahill A.T. and W.L. Chapman, Case Studies in System Design, in International Symposium and Workshop on Systems Engineering of Computer Based Systems. 1995.
2. Bashir H.A. and V. Thomson, Metrics for Design Projects: A Review. Design Studies, 1999. 20: p. 263–277.
3. Beaumont L.R., in The PDMA Handbook of New Product Development, J.M.D. Rosenau, A. Griffin, G.A. Castellion, and N.D. Anschuetz, Editors. 1997, John Wiley and Sons, Inc. p. 463–485.
4. Brown W.B. and D. Gobeli, Observations on the Measurement of R&D Productivity: A Case Study. IEEE Transactions on Engineering Management, 1992. 39(4): p. 325–331.
5. Clark K.B. and T. Fujimoto, Product Development Performance. 1991: Harvard Business School Press.
6. Cooke J.A., C.A. McMahon, and M.R. North, Metrics in the Engineering Design Process. Proceedings of the Institution of Mechanical Engineers, 1999. 213(Part B): p. 523–526.
7. Cooper R.G., Debunking the Myths of New Product Development. Research Technology Management, 1994. 37: p. 40–50.
8. Cooper R.G. and E.J. Kleinschmidt, Benchmarking the Firm's Critical Success Factors in New Product Development. Journal of Product Innovation Management, 1995. 12: p. 374–391.
9. Cusumano M.A. and K. Nobeoka, Strategy, Structure, and Performance in Product Development: Observations from the Auto Industry, in Managing New Product Development, T. Nishiguchi, Editor. 1996, Oxford University Press. p. 75–120.
10. Fenton N.E., Software Metrics: A Rigorous Approach. 1991: Chapman and Hall.
11. Foster R.N., et al., Improving the Return on R&D: Part I. Research Management, 1985. 28: p. 12–17.
12. Foster R.N., et al., Improving the Return on R&D: Part II. Research Management, 1985. 28: p. 13–22.
13. Goldense B.L., Rapid Product Development Metrics. World Class Design to Manufacture, 1994. 1(1): p. 21–28.
14. Griffin A. and A.L. Page, An Interim Report on Measuring Product Development Success and Failure. Journal of Product Innovation Management, 1993. 10: p. 291–308.
15. Griffin A., Metrics for Measuring Product Development Cycle Time. Journal of Product Innovation Management, 1993. 10: p. 112–125.
16. Griffin A. and A.L. Page, PDMA Success Measurement Project: Recommended Measures for Product Development Success and Failure. Journal of Product Innovation Management, 1996. 13: p. 478–496.
17. Griffin A., The Effect of Project and Process Characteristics on Product Cycle Time. Journal of Marketing Research, 1997. 34(1): p. 24–35.
18. Hauser J.R. and F. Zettelmeyer, Metrics to Evaluate R, D&E. Research Technology Management, 1997. July–August 1997: p. 32–38.
19. Hubka E., Theory of Technical Systems. 1988: Springer-Verlag.
20. Hultink E.J., et al., New Consumer Product Launch: Strategies and Performance. 1999, University of Strathclyde: Glasgow.
21. Jacome M.F. and V. Lapinskii, NREC: Risk Assessment and Planning of Complex Designs. IEEE Design & Test of Computers, 1997. January–March 1997: p. 42–49.
22. Kan S.H., Metrics and Models in Software Quality Engineering. 1995: Addison-Wesley.
23. Kaplan R.S. and D.P. Norton, The Balanced Scorecard: Measures that Drive Performance. Harvard Business Review, 1992. Jan–Feb 1992: p. 71–79.
24. Kaplan R.S. and D.P. Norton, Putting the Balanced Scorecard to Work. Harvard Business Review, 1993. Sep–Oct 1993: p. 134–147.
25. Kerssens-van Drongelen I.C. and J. Bilderbeek, R&D Performance Measurement: More Than Choosing a Set of Metrics. R&D Management, 1999. 29(1): p. 35–46.
26. Loch C., L. Stein, and C. Terwiesch, Measuring Development Performance in the Electronics Industry. Journal of Product Innovation Management, 1996. 13: p. 3–20.
27. McGrath M.E. and M.N. Romeri, The R&D Effectiveness Index: A Metric for Product Development Performance. Journal of Product Innovation Management, 1994. 11: p. 213–220.
28. Moser M.R., Measuring Performance in R&D Settings. Research Technology Management, 1985. 28(5).
29. Pahl G. and W. Beitz, Engineering Design: A Systematic Approach. 1988: The Design Council.
30. Smailagic A., Benchmarking an Interdisciplinary Concurrent Design Methodology for Electrical/Mechanical Systems, in 32nd Design Automation Conference. 1995.
31. Walrad C. and E. Moses, Measurement: The Key to Application Development Quality. IBM Systems Journal, 1993. 32(3): p. 445–460.
32. Zirger B.J. and M.A. Maidique, A Model of New Product Development: An Empirical Test. Management Science, 1990. 36: p. 867–883.
Index
311 Action 910 Decomposition relationships 6566
4 Activity management 6165 Delft study
5 Actual effectiveness 116 Design Activity Management model of
Actual exploitation 110 135, 139
6
Analyser 161 elements of performance in 140
7 Analysis 9 overview of 137139
8 in PERFORM approach 111121 Dependent activities 68
9 of performance 105 Descriptive models 11
2011 Analysis approach in PERFORM Design
1 approach 112113 characteristics of performance in
2 Analysis measures 115116 1314
3 in PERFORM approach 114121 knowledge-based model of 5660
4 Assessment 9 management and 6163
5 in PERFORM approach 110111 Design Activity Management (DAM)
Attainment of objectives 21 model 135
6
Axioms of performance 185186 complexity and novelty 176177
7 comprehensiveness 172173
8 Balanced scorecard 2629 of Delft case 139
9 Basic design cycle 12 generality 172
3011 Brainstorming 108 subjects discussed 154 157158
1 Business level approaches 38 supporting 148149
2 supporting coherence 173
3 Coherence 11 Design activity modeling 2326
4 achieving 83 Design activity performance 1213
5 in design performance 1517 Design cycle, basic 12
goal alignment for 1617 Design development
6
supporting 2629 nature of performance in 181
7 Computer-aided design 32 performance in 1117
8 Consolidation 108 term 2
9 Constraints 5758 Design development analysis 3941
4011 Cost 150 Design development metrics, insights
1 quality versus 24 into 149151
2 Cost reduction 42 Design development performance
3 Customer need 60 analysis for PERFORM approach
4 Customer satisfaction 148 174176
5 Design development performance
DAM model, see Design Activity formalism 126 127128
6
Management model Design effectiveness 76 147
7 Data input, ideal 45 Design efciency 147
8 Data-ordering templates 160 Design management effectiveness 76 147
9 Data representation in PERFORM Design management efciency 147
50 approach 113114 Design performance 12
51111 Datum method 113 application and key features 135186
coherence in 1517 Goal prioritisation in PERFORM 1


denition of 78 approach 108109 2
enhanced, methodology for 55131 Goals 5759 3
factors inuencing 1415 inuence of 9192 4
measurement of 1936 Graphical templates 160
5
nature of 718
need for 751 Ideal exploitation 110 6
Design performance model (E2 model) Impact 7
6979 135 relative, assessing 99100 8
complexity and novelty 176177 of resources, principles of analysing 9
comprehensiveness 172173 102 1011
effectiveness in 7377 Impact proles 111 1
efciency in 7073 nature of 9699 2
generality 172 Improvement measures 116119 3
subjects discussed 154 157 Independent activities 68 4
supporting 148149 Industrial appraisal 136 153158
5
Design performance modelling and Industrial insight 4150
analysis 125131 Industrial practice 3751 156 6
analysis approach 129130 Industrial relevance 184185 7
area of performance analysis 128129 Industry, application of PERFORM 8
methodology for 183184 approach in 162167 91
overview of methodology 125127 Inuencing factors, identication and 2011
problems/deciencies in 183 quantication of 2934 1
requirements for 183 Interdependent activities 68 2
subjectivity 175 3
Design process, phase model of 25 Key elements 181185 4
Development cost 149 Knowledge, resource 9394
5
Knowledge-based efciency 178
E2 model, see Design performance Knowledge-based model of design 5660 6
model Knowledge categories 5960 7
Ease 43 8
Effectiveness 30 Lead time 30 9
actual 116 3011
characteristics of 86 Managed activity 65 1
in design performance model 7377 Managed activity relationships 6568 2
impact of resources on 89103 Management, design and 6163 3
inuences on 9093 Management information systems 37 4
measuring and managing 8082 Manufacturing performance 2
5
potential 116 Methodology review 171179
relating efciency and 7778 Metrics review and analysis 136 147151 6
relating resource use and 94102 7
resource 100102 Objectives, attainment of 21 8
Effectiveness metrics 7477 Organizational performance 12 9
Efciency 4011
characteristics of 86 PERFORM approach 105124 127 135 1
in design performance model 7073 analysis approach in 112113 2
knowledge-based 178 analysis in 111121 3
measuring 8283 analysis measures in 114121 4
relating effectiveness and 7778 application of, in industry 162167
5
use of 178 applying, to improve performance
Efciency metrics 7173 142145 6
Empirical research 3033 assessment in 110111 7
comprehensiveness 174 8
Goal alignment for coherence 1617 data representation in 113114 9
Goal denition in PERFORM approach design development performance 50
108 analysis for 174176 51111
1 discussion 123124 PMM, see Performance Measurement


2 generality 174 and Management model
3 goal denition in 108 Potential effectiveness 116
4 goal prioritisation in 108109 Potential exploitation 110
implementation of 159170 Prescriptive models 11
5
overview 105106 107 Presentation templates 160
6 potential developments for 179 Presenter 161
7 process model 107 Prioritisation matrix 113
8 representation in 121 Productivity 30
9 resource denition in 109110
1011 resource impact in 110111 Quality, cost versus 24
1 specication 106 108110 Quality Function Deployment approach
2 supporting 122123 29
3 supporting coherence 174
4 system overview 160162 Rank-sum rule 113
validation of results for 167170 Relationship modeller 160
5
PERFORM matrix 113 Relative impact 105106
6 Performance 7879 assessing 99100
7 analysis of 105 Reliability 149150
8 applying PERFORM approach to Representation in PERFORM approach
9 improve 142145 121
2011 axioms of 185186 Representer 161
1 characteristics of, in design 1314 Research, empirical 3033
2 dening and modeling 2123 Resource denition in PERFORM
3 design, see Design performance approach 109110
4 in design development 1117 Resource effectiveness 100102
formalising 139141 Resource effectiveness measure 114115
5
measuring 141142 Resource exploitation 110
6 nature of, in design development 181 Resource impact 9596
7 Performance analysis, area of 128129 on effectiveness 89103
8 Performance drivers 27 in PERFORM approach 110111
9 Performance Measurement and principles of analysing 102
3011 Management Resource Impact model 96 126127 135
1 k model (PMM) 7983 135 sensitivity 175
2 comprehensiveness 172173 Resource impact proles 177 178
3 generality 172 Resource knowledge 9394
4 subjects discussed 157158 Resource use, relating, and effectiveness
supporting coherence 173 94102
5
Performance Measurement Resources, inuence of 9293
6 Questionnaire 2629 Return on investment 119120
7 Performance measurement system 810
8 Performance metrics 10 Strategy, inuence of 9091
9 Performance modelling and analysis, Strategy planning 24
4011 design, see Design performance
1 modelling and analysis Temporal relationships 6668
2 Performance research Theory of Constraints 26
3 overview of 1921
4 trends in 2021 Variant management 42
Performance trends, establishing 179
5
Phase model of design process 25 Worked example 136 137146