
ENGINEERING PRODUCTIVITY MEASUREMENT

By
Luh-Maan Chang
Maged E. Georgy
Lei Zhang

A Report to
The Construction Industry Institute
The University of Texas at Austin

Under the Guidance of
Research Team 156
Engineering Productivity Measurement

Purdue University
West Lafayette, Indiana

August 2001

© 2001 Construction Industry Institute™.
The University of Texas at Austin.
CII members may reproduce and distribute this work internally in any medium at no cost to internal recipients. CII members are
permitted to revise and adapt this work for internal use, provided an informational copy is furnished to CII.
Available to non-members by purchase; however, no copies may be made or distributed and no modifications made without prior
written permission from CII. Contact CII at http://construction-institute.org/catalog.htm to purchase copies. Volume discounts
may be available.
All CII members, current students, and faculty at a college or university are eligible to purchase CII products at member prices.
Faculty and students at a college or university may reproduce and distribute this work without modification for educational use.
Printed in the United States of America.

TABLE OF CONTENTS

LIST OF TABLES ............................................................................................................ vii
LIST OF FIGURES............................................................................................................ ix
EXECUTIVE SUMMARY..............................................................................................xiii
CHAPTER 1: INTRODUCTION ....................................................................................... 1
1.1 Background ....................................................................................................... 1
1.2 Research Objectives .......................................................................................... 2
1.3 Composition of the Research Team 156 ........................................................... 3
1.4 Research Contributions ..................................................................................... 5
1.5 Report Organization .......................................................................................... 6
CHAPTER 2: PREVIOUS CII STUDIES .......................................................................... 7
2.1 Engineering Project Control.............................................................................. 7
2.2 Input Variables Influencing Engineering Performance..................................... 9
2.3 Measuring Engineering Performance.............................................................. 12
CHAPTER 3: RESEARCH METHODOLOGY .............................................................. 17
3.1 Research Focus Areas ..................................................................................... 17
3.2 Research Methodology.................................................................................... 23
3.3 Evolvement of Engineering Productivity Measurement Approach ................ 28
CHAPTER 4: BASIC DEFINITIONS OF ENGINEERING DESIGN IN INDUSTRIAL
CONSTRUCTION PROJECTS........................................................................................ 31
4.1 Industrial Facility Delivery Process ................................................................ 31
4.2 Engineering Definition.................................................................................... 33
4.3 Engineering Disciplines .................................................................................. 40
CHAPTER 5: AN INDUSTRY-WIDE SURVEY OF ENGINEERING PERFORMANCE
IN INDUSTRIAL CONSTRUCTION PROJECTS.......................................................... 43
5.1 General ............................................................................................................ 43
5.2 Project Performance Measures ........................................................ 45
5.3 Project Changes and Field Rework .................................................... 48
5.4 Summary ............................................................................. 52
CHAPTER 6: CURRENT PRACTICES IN MEASURING ENGINEERING
PRODUCTIVITY ............................................................................................................. 53


6.1 Literature Review on Productivity Measurement for Engineering
and Other Fields .................................................................................................... 53
6.2 Engineering Control Systems in Industrial Construction................................ 62
6.3 Earned Value Concept for Measuring Engineering Performance ................... 63
6.4 Components of the Engineering Performance Control Systems..................... 65
6.5 Summary ......................................................................................................... 87
CHAPTER 7: ENGINEERING PRODUCTIVITY MEASUREMENT SYSTEM.......... 89
7.1 Basics .............................................................................................................. 89
7.2 Conceptual Framework for Measuring Engineering Productivity .................. 91
7.3 Software Development Productivity: Similarities and Benefits ...................... 94
7.4 Overview of the Conceptual Approach of Engineering Productivity
Measurement ......................................................................................................... 97
7.5 Data Collection.............................................................................................. 101
7.6 Regression Model Development ................................................................... 102
7.7 Remarks on the Implementation of the Conceptual Model........................... 116
7.8 Summary ....................................................................................................... 119
CHAPTER 8: INTEGRATED SCHEME FOR ASSESSMENT OF ENGINEERING
PERFORMANCE ........................................................................................................... 121
8.1 Research Approach ....................................................................................... 122
8.2 Generic System Architecture ........................................................................ 124
8.3 Neurofuzzy System Development................................................................. 126
8.4 Multiple Attribute Utility Function Development ........................................ 133
8.5. Generic System Verification and Validation ............................................... 140
8.6. Comparison with Statistical Approaches ..................................................... 144
8.7. Demonstrated Use ........................................................................................ 148
CHAPTER 9: SUMMARY AND CONCLUSIONS ....................................................... 161
9.1 Research Summary........................................................................................ 161
9.2 Research Conclusions ................................................................................... 163
9.3 Recommendations for Future Work.............................................................. 165

Appendix A: Research Proposal ..................................................................................... 169
Appendix B: Results of RT-156 Sub-teams.................................................................... 173
Appendix C: Review of Other Industry Practices........................................................... 203
Appendix D: Piping Engineering and Design ................................................................. 223
Appendix E: Questionnaire Survey................................................................................. 233


Appendix F: Illustrations for Engineering Control and Productivity Measurement....... 269
Appendix G: Project Data ............................................................................................... 281
Appendix H: Regression Analysis Results ..................................................................... 291
Appendix I: Basics and Illustrations for Neurofuzzy Systems and Multiple Attributes
Utility Functions ........................................................................................ 303
Appendix J: Sensitivity Analysis for the Project Input Variables Categories................. 323
Appendix K: Additions to the Benchmarking and Metrics Survey ................................ 335
Appendix L: Presentation Materials for the Plenary Session and the Implementation
Session at the 2001 CII Annual Meeting................................................... 343

REFERENCES................................................................................................................ 349


LIST OF TABLES

Table

Page

3.1 Engineering Productivity Drivers by Industry ............................................................ 21
4.1 Industrial Project Delivery Process (CII BM&M Committee) ................................... 30
4.2 Engineering Activities within Pre-Project Planning Phase ......................................... 34
4.3 Engineering Activities within Design Phase............................................................... 35
4.4 Engineering Activities within Materials Management (Procurement) Phase ............. 36
5.1 Percentage of Actual Phase Cost to the Total Project Cost......................................... 44
5.2 Performance Measures for Various Project Phases..................................................... 45
6.1 Sample Breakdown of Piping Workhours.................................................. 68
6.2 Tracking of Work Completed in Piping Engineering and Design .............................. 72
8.1 Assessment of UL and UH for Each Engineering Performance Measure .................. 134
8.2 Individual Utility Functions for Engineering Performance Measures ...................... 135
8.3 Preference Structure Based on the Comparative Judgment of the Participating
Members.......................................................................................................................... 138
8.4 Characteristics of Projects PVER1 and PVER2 .............................................................. 140
8.5 Targeted and Calculated Outputs for the Verification Projects ................................ 141
8.6 Characteristics of Projects PVAL1 and PVAL2 .............................................................. 142
8.7 Targeted and Calculated Outputs for the Validation Projects................................... 143
8.8 Linear Regression Models for Estimating the Engineering Performance Measures, y1,
y2, …, y10 ......................................................................................................................... 144


8.9 Predicted Performance Measures Based on Neurofuzzy System and Linear
Regression Models .......................................................................................................... 146
8.10 Estimated Total Engineering Performance, UT, for Project PVAL1 .......................... 148
8.11 Estimated Total Engineering Performance, UT, for Project PVAL2 .......................... 148
8.12 Characteristics of the Reference Project PREF ........................................ 151
8.13 Estimated Total Engineering Performance, UT, for Project PREF ........................ 152
8.14 Calculated Measures of Engineering Performance for Varying Conditions of the
Input Variable “Completeness of Scope Definition” ...................................... 153
8.15 Calculated Engineering Performance Measures for Various Cases of Project
Information Inputs........................................................................... 155
8.16 Maximum Improvement in Total Engineering Performance, ∆UT, for the Different
Project Input Variable Categories ................................................... 157


LIST OF FIGURES

Figure                                                                                Page

2.1 Objective Matrix Components .......................................................... 15
3.1 Value-To-Customers Scheme ............................................................ 18
3.2 Research Focus ....................................................................... 19
3.3 Research Methodology ................................................................. 24
3.4 Initial Approach of Engineering Productivity Measurement ............................. 29
3.5 Research Focus Regarding the Engineering Productivity Measurement .................... 30
5.1 Level of Influence of Project Phases ................................................. 42
5.2 Breakdown of $TIC by Project Phase ................................................... 44
5.3 Frequency of “Project Scope Changes” During the Various Project Phases ............... 47
5.4 Net Cost Impact of “Project Scope Changes” During the Various Project Phases ......... 47
5.5 Frequency of “Project Development Changes” During the Various Project Phases ......... 48
5.6 Net Cost Impact of “Project Development Changes” During the Various Project Phases ... 48
5.7 Sources of Field Rework .............................................................. 49
6.1 Levels of Tracking Engineering Performance ........................................... 65
6.2 Approaches for Estimating Engineering Workhours Budget at Project Authorization ...... 67
6.3 Overview of Engineering Control Systems Regarding Work-hours ......................... 69
6.4 Methods for Tracking Engineering Work Progress ....................................... 71
6.5 Overview of Engineering Control Systems Regarding Work Quantities .................... 74
6.6 Client/Owner Participation ........................................................... 76
6.7 Design Contract Type ................................................................. 76
6.8 Split Engineering Practices .......................................................... 77
6.9 Basic Approaches in Measuring Piping Engineering Productivity ........................ 78
6.10 Feasibility of Collecting Piping Engineering Productivity Data ...................... 79
6.11 Suitability of Piping Productivity Metrics .......................................... 80
6.12 Overview of Engineering Control Systems Regarding Productivity/Unit Rate ............ 81
6.13 Approaches for Calculating the Engineering Performance Factor ....................... 82
6.14 Overview of Engineering Control Systems Regarding Variance or Deviations ............ 83
7.1 General Approaches of Measuring Engineering Productivity ............................. 89
7.2 CII RT-156 Preliminary Scope Model ................................................... 90
7.3 Research Focus Regarding the Conceptual Approach ..................................... 98
7.4 Project Type ......................................................................... 102
7.5 Project Size ($TIC) .................................................................. 102
7.6 Measuring Productivity for Engineering Disciplines ................................... 103
7.7 Measuring Total Engineering Productivity on a Project ................................ 103
7.8 Measuring Annual Company-Wide Engineering Productivity ............................... 104
7.9 Project Industry Type ................................................................ 104
7.10 Piping Productivity Regression Model (PIPING 2) ..................................... 115
7.11 Piping Productivity Regression Model (PIPING 3) ..................................... 116
8.1 Effectiveness Concept ................................................................ 122
8.2 General Research Approach ............................................................ 124
8.3 Schematic Representation of Generic System Architecture .............................. 125
8.4 Fuzzy Representation of Numerical Variable “Project Size ($TIC)” ..................... 128
8.5 Fuzzy Representation of Numerical Variable “% Design Rework” ......................... 129
8.6 Fuzzy Representation of the Linguistic Variable “Completeness of Scope Definition” ... 130
8.7 Training Cycles for the Neurofuzzy System ............................................ 131
8.8 Influence Chart for the Input Variable “Completeness of Scope Definition” ............ 155
8.9 Influence Chart for Category “Project Information Inputs” ............................ 156

EXECUTIVE SUMMARY

Reliable Engineering Productivity Measurement is a critical element of predictable project performance and improvement. The effective delivery of engineering in the project cycle is a prime determinant of overall cost and schedule performance. Engineering costs have steadily risen to levels approaching 20% of total project cost on industrial projects. However, unlike construction labor productivity, little work has been conducted in the arena of Engineering Productivity measurement. Research Team 156 was established to better understand the present status of Engineering Productivity Measurement and to determine a valid measurement approach that will drive the improvement that present day design tools promise. Industry standard productivity measurements must first be established, and then applied to present day work processes, before significant improvement and predictability of performance can be made.

The research team concluded that a radically new model for measurement of Engineering Productivity that focuses on readily measurable, installed quantities is required. The model developed is similar to the software industry’s use of function point analysis. To be useful to the industry in driving continuous improvement, it is to be applied at the discipline level in the detail design phase of projects, using influencing factors appropriate to project characteristics and industry sector. Influence factors for the Input Quality (Front End Loading), System Complexity, and Design Effectiveness must be developed. Expert opinion was used to propose the variables of interest for each discipline. This model development is necessary since data that matches the model will be available only after its acceptance and use.

The measurement model concept was validated by conducting an in-depth study of the piping discipline in the heavy industrial sector. Collected project data supports the hypothesis that a correlation between installed quantities and hours spent on piping design exists. The few companies that are presently using “installed quantity” data during the design phase of projects are using it primarily for detection of “scope creep.” Data collection efforts revealed that many companies are presently ignoring this readily available data.

The research team went one step further to address the interaction between engineering performance and its driving project variables by introducing a generic system that incorporated neurofuzzy computing with multiple attribute utility function. Neurofuzzy systems were used for relating the measures of engineering performance to the project input variables identified to have the largest impact on such performance. The multiple attribute utility function provided a collective evaluation of the total engineering performance.

The research products include:

• A thorough analysis of the present state of engineering productivity methodology, with particular emphasis on the piping discipline.
• The application of learnings from other industries, especially the software industry’s use of function point analysis.
• A measurement model framework that addresses all identified shortcomings with present methods, complements present day work processes, focuses engineering resources on capital investment (installed quantities) and not solely on intermediate deliverables, and puts engineering and construction forces on an equivalent basis for project control purposes.
• Recommended installed quantity measures and disciplines which drive the engineering efforts, with standard descriptions and discipline definitions required for uniform data collection.
• Recommendation of initial influence factor assumptions and a data collection process to empirically refine the models for all disciplines into a uniform measurement system that will continuously improve with project usage.

CHAPTER 1

INTRODUCTION

1.1 Background

Improving productivity will reduce cost. This is certainly a widely accepted premise. However, at the same time that advancing technologies are theoretically improving engineering productivity, nearly every facility owner in America is alarmed about the rise in engineering costs. Post (1998) reported a survey of owners of facilities constructed over the then past two years, in which 25% rated design as the weak link in the process of developing a new facility. One-third of the projects were over budget and nearly half were delivered late. Delays in engineering schedules are not an uncommon event in the industry either. This contradiction between the advancing technologies and engineering performance is bewildering to many industry practitioners.

Undoubtedly, a reliable engineering productivity measurement system is a critical component in the project performance evaluation and improvement process. Owner and facility managers need a means to evaluate performance of internal design organizations or engineering contractors. Engineering contractors need a means to drive improvement in their organizations as engineering costs as a percentage of total project cost continue to rise. Furthermore, an established system for measuring engineering productivity will enhance the benchmarking effort on both an internal and external basis.

Contrary to construction labor productivity, relatively little work has been conducted in the definition, measurement, application, and reporting of productivity for engineers and designers. Presently applied engineering productivity measures primarily concentrate on hours expended on “deliverables” such as drawings and specifications. Several apparent drawbacks to these approaches exist:

• They often depend on subjective weighting to address the complexity of each deliverable.
• They depend on accurate documentation of all hours expended on each deliverable, which is a costly and time-consuming data collection effort.
• They focus on intermediate project products which in themselves may have little value to project outcome; e.g., drawing scale changes can increase the number of deliverables and “apparent” productivity with no actual benefit to project outcome.
• The database-driven engineering tools that are rapidly becoming the industry norm do not readily permit the capture of hours expended on discrete deliverables.

1.2 Research Objectives

The main goal of the research conducted by the CII Research Team 156 (CII RT-156) is to develop a valid, easy-to-use generic system for engineering productivity measurement that aids the industry in:

• Reducing engineering cost.

• Reducing Total Installed Costs (TIC).
• Maintaining project success factors.

In order to achieve the research goal, the following set of research objectives were identified:

• To understand past and current practices in measuring engineering productivity through literature search, surveys, site visits, and interviews.
• To establish measurements that presently are considered valid.
• To prioritize those measurements with most potential for project impact and collect project data related to those prioritized measures.
• To analyze collected data and to construct a framework that allows uniform application of measurements of engineering productivity.
• To develop implementation procedures for measurement of engineering productivity.

1.3 Composition of the Research Team 156

The CII RT-156 represents a wide spectrum of the industry. With a total of 19 members, the RT-156 has 9 members representing owner companies, 8 members representing contractor companies, and 2 members from academic institutions. The Chairman of the RT-156 is Robert Shoemaker from BE&K Inc., and the Principal Investigator is Luh-Maan Chang from Purdue University. Some of the RT-156 members withdrew from the team at different stages of the research activity due to various reasons.

Members serving in the team until its sunset:

• John K. Atwell
• Joseph Barba
• William Buss
• Dale Griffith
• Glen Hoglund
• Duane McCloud
• D. McNeil
• Foster (Pete) Moore
• Navin C. Patel
• John Rotroff
• Kenneth Walsh
• Andrew Warner
• Denny Weber
• J. G. Whatley
• Thomas Zenge

Members serving in the team for shorter periods before its sunset:

• Jorge L. Aguilar
• Linh J.

The member companies and institutions represented on the team included ABB Lummus Global, Air Products & Chemicals, ARCO Products Company, Arizona State University, Bechtel Construction, Black & Veatch, Bufete Industrial, Champion International Corporation, Chemtex International, Foster Wheeler Corporation, FPL Energy Inc., General Motors, Kvaerner Process, Ontario Hydro, Procter & Gamble, Union Carbide Corporation, and U.S. Steel.

1.4 Research Contributions

The research conducted by the RT-156 represents a radically new approach for measuring engineering productivity. It ties the productivity of engineers and designers to the installed quantities in a project, such as length of piping, number of equipment pieces, and so forth. This approach has several parallels to the methods used for measuring productivity in the software industry. Furthermore, it accounts for the variability in productivity among projects based on the quality of project inputs, project scope and complexity, and quality of the resulting engineered deliverables. The validity of the research approach has been tested in the piping discipline using appropriate statistical techniques. Recommendations are given with regard to the project variables to be considered when applying the model for the piping discipline and other engineering disciplines in an industrial project.

The research further introduces a utility-based neurofuzzy approach for assessment of total engineering performance in industrial construction projects. Such an approach handles the ill-defined nature of several project variables through its fuzzy-based variable description. Meanwhile, neuro-computing facilitates the learning capabilities of the system and further allows future accumulation of knowledge. The research demonstrates the potential of this approach for further consideration by the industry and academia in both performance assessment and questionnaire survey analysis.


CHAPTER 2

PREVIOUS CII STUDIES

Studies in the last few decades have focused greatly on addressing the management of the construction phase of a project more than its earlier phases. Nonetheless, there has been a recent growing realization by both academic and research institutes of the influence of a project's early stages on its successful implementation. The Construction Industry Institute (CII) has been particularly active in establishing research teams that study these earlier project phases. Review of the CII research activities addressing the early phases of a project revealed that a number of these studies are fundamental to the current study by the RT-156. This chapter will briefly review these relevant CII studies to lay a foundation for the concepts that are further presented in chapters 5, 6, 7, and 8.

2.1 Engineering Project Control

Under the guidance of the earlier CII Cost/Schedule Task Force, a study on the topic of project control in design engineering was conducted in the 1980s. Realizing the fact that engineering is one of the most common reasons for a project being completed late and over budget, the task force decided to commission a study of the then current practices in engineering project control. The primary concern of the study was to examine two basic measures that characterize the performance of an engineering task, i.e., productivity and production (Diekmann and Thrush, 1986).

According to the authors, productivity is a measure of whether resources are being expended in an efficient manner, while production is a measure of whether resources are expended within the given time frame. Following an extensive interviewing of several design contractors, the researchers concluded that design contractors approach the performance/progress measurements in quite a similar manner. Most contacted contractors use a progress measurement system that is based upon counting documents (drawings and specifications) as the means for measuring progress. The evaluation is done in light of pre-determined milestones. Examples for these milestones are design, draft, check, issue for construction, and so forth.

A majority of the contacted engineering contractors do not track manhours to a particular document. The reason is that those firms find it unnecessary and time consuming to track the manhours to the individual-document level. Rather, manhours are charged to work packages or groups of documents. Also, firms usually apply an “Earned Value” weighting system to the manhours reported against the document in concern.

The study by Diekmann and Thrush (1986) demonstrated that the then currently used systems failed to address productivity for the complete set of activities associated with design engineering. The engineering control based on quantities is analogous to the progress measurement techniques used for construction activities. Diekmann and Thrush (1986) claim several potential benefits to adopting this system. For instance, much of the subjectivity would be removed from the measurement system by adopting a quantitative measure of progress. Also, a quantity-based engineering control system would put design and construction forces on an equivalent basis for control activities.
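As an illustration of the document-counting approach described above, the sketch below computes earned progress for a small work package from milestone weights. It is a minimal example assuming hypothetical milestone weights and document names; it is not taken from any of the control systems surveyed by Diekmann and Thrush.

```python
# Minimal sketch of milestone-based ("earned value") progress measurement
# for a design work package. Milestone names and weights are illustrative
# assumptions, not values from the surveyed engineering control systems.

MILESTONE_WEIGHTS = {
    "design": 0.40,                   # credit earned at each milestone
    "draft": 0.25,
    "check": 0.20,
    "issue_for_construction": 0.15,   # weights sum to 1.0 per document
}

def earned_progress(documents):
    """Return percent complete for a list of documents, where each
    document carries the set of milestones it has reached."""
    if not documents:
        return 0.0
    earned = sum(
        sum(MILESTONE_WEIGHTS[m] for m in doc["completed"])
        for doc in documents
    )
    return 100.0 * earned / len(documents)

work_package = [
    {"name": "DWG-001", "completed": ["design", "draft", "check"]},
    {"name": "DWG-002", "completed": ["design", "draft"]},
    {"name": "SPEC-101", "completed": ["design"]},
]

print(f"Work package progress: {earned_progress(work_package):.1f}%")  # 63.3%
```

A quantity-based control system would replace the document count in the denominator with installed quantities (for example, meters of pipe designed), which is what puts engineering and construction progress on the equivalent basis noted above.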

While measurement systems were used on a broader basis to track documents, only few design contractors approached the engineering activities associated with procurement, engineering studies/modeling, and design engineering cost/scheduling. In addition, the remaining design engineering activities not examined by the control system are subjectively assigned a “level of effort” status or an approximate percentage of completion. But this is only an estimate rather than a measure of actual performance.

2.2 Input Variables Influencing Engineering Performance

Early decisions made during the initial phases of the project constitute input variables which have far-reaching influences on design effectiveness. To study the effects of these early decisions upon design, the Design Task Force of the CII sponsored a study to identify the most significant input variables having impact on design effectiveness. Based on an extensive review of several project documents and personal interviews, Chalabi et al. (1987) identified ten input variables that have the greatest impact on design effectiveness. The ten variables are:

1. Completeness of scope definition (prior to the detailed design phase): This is primarily the responsibility of the owner organization. Scope definition identifies project objectives and limitations, ruling codes and standards, general facility process description, and budgets and schedules. This is believed by several interviewed personnel from both owner and design

organizations to be the factor having the greatest impact on design outcome.

2. Completeness of basic design data (prior to the detailed design phase): This is primarily an engineering responsibility that falls on the owner organization and/or design organization (according to the contract terms).

3. Completeness of project objectives and priorities (prior to the detailed design phase): In an industrial project environment, several objectives compete with each other. Objectives usually include safety, utility, cost and investment returns, plant capacity, timing and critical schedules, expandability, and a host of others. Owners have different interpretations of priorities for these objectives. Design organizations need to have full understanding of these priorities to gain the owner satisfaction with design outcomes.

4. Owner profile and level of participation: Owner organizations constitute a primary source of information for designers. Owner profile designates the owner managerial capabilities, level of technology, decision timing, management of changes, personnel continuity, and design/construction experience. Owner participation is considered to be the quality of the owner's contribution and his readiness when that contribution is required.

5. Pre-project planning effort: This effort is carried out by the owner organization and/or its representative to translate objectives and priorities into desired end results. It includes plant description, vendor data, environmental data, specific operation requirements, industrial process description, site location, and a host of others. Several industry personnel interviewed in the study advised to consider pre-project planning as a general category of input variables that includes scope definition, project objectives and priorities, designer selection and qualifications, project manager qualifications, constructor input, and type of contract.

6. Designer qualifications and selection: Several attributes of the design organization determine its ability to perform the assigned tasks to the satisfaction of the owner organization, including experience with similar industrial projects, design approaches, technological capabilities, organizational structure, etc.

7. Project manager qualifications: The project manager has an important responsibility of coordinating between project participants. This requires proven qualities of the project manager to handle such responsibilities. Nevertheless, this variable was rated low in its impact compared to other variables like scope definition.

8. Type of contract: The contract document formally establishes the contractual relationship between the owner and design organizations.

9. Equipment sources / reliability of vendor data: In addition to being sometimes responsible for portions of the design, vendors provide data that are key to development of the design by the primary design organization. Quality, completeness, timeliness, and firmness of vendor data are always of a great concern to designers.

10. Constructor input: Several constructability inputs prior to and parallel to detailed design include, among others, construction methods and technology, labor and material availability, and field conditions.

2.3 Measuring Engineering Performance

Engineering performance is an inconsistently used term by many industry practitioners. Engineering provides the means to transform owner requirements into engineered deliverables and documents. When thinking of engineering, one cannot limit himself/herself only to the tangible outcomes of the engineering and design activity. Regardless of the neatness and timeliness of engineering outcomes, and unless the owner is satisfied with how these engineering outcomes perform down-the-road in the project life cycle, the engineering job cannot be described as successful.

According to the study by Tucker and Scarlett (1986), the most common indicator of engineering performance in the construction industry is the ratio of design work-hours per drawing. Few researchers, however, have investigated more reliable engineering performance measures. Common to these studies is the introduction of engineering performance in the form of a value-added function, although implementation varies between one study and another.

Chalabi et al. (1987) introduced owner satisfaction as a means of measuring engineering performance through parameters such as cost, schedule, and quality. The concept assumes that the owner makes single comparisons between actual and specified performance on each of these attributes designating value to him/her, then accumulates the single evaluations to determine a comprehensive value of satisfaction measured by a function “S”. This can be formally represented as a multi-attribute value function. The general form of the objective function to optimize is:

S(t) = f(x1, x2, …, xn)    (2.1)

where t is the time at which the satisfaction function is measured and xi is the project parameter being observed. For an additive objective function, equation (2.1) can be written as

S(t) = α1x1 + α2x2 + … + αnxn    (2.2)

where the αi are coefficients reflecting owner priorities for each parameter i at time t, with the αi summing to 1 over i = 1, …, n.
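As a worked instance of equation (2.2), the sketch below evaluates the additive satisfaction function for three parameters. The weights and normalized parameter values are assumptions made for illustration; they are not taken from Chalabi et al.

```python
# Worked example of the additive owner-satisfaction function (eq. 2.2):
# S(t) = alpha_1*x_1 + alpha_2*x_2 + ... + alpha_n*x_n, sum(alpha_i) = 1.
# The weights and parameter values below are illustrative assumptions only.

# Owner priority weights at time t for cost, schedule, and quality.
alphas = [0.5, 0.3, 0.2]
assert abs(sum(alphas) - 1.0) < 1e-9   # coefficients must sum to one

# Observed parameters x_i, normalized here so that 1.0 means performance
# exactly meets the specified target (an assumed normalization scheme).
xs = [0.90,   # cost: 10% under the specified budget target
      1.00,   # schedule: on target
      0.85]   # quality: 85% of specified attributes met

S_t = sum(a * x for a, x in zip(alphas, xs))
print(f"S(t) = {S_t:.3f}")  # -> S(t) = 0.920
```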

It is further possible to encode a utility function µ to replace the single attribute xi for each parameter considered. Therefore, by applying the principle of utility theory, it is possible to evaluate the satisfaction function S(t) as follows:

S(t) = α1·µ(x1) + α2·µ(x2) + … + αn·µ(xn)    (2.3)

Tucker and Scarlett (1986) pursued another research activity, which addresses measuring engineering performance. The two researchers introduced the concept of design effectiveness. The definition of design effectiveness criteria constitutes the core of the research by Tucker and Scarlett. An objective matrix method was used to list, categorize, and weight design criteria having significant project impact. This method, as an evaluation procedure, is commonly used to measure and improve performance of those difficult-to-measure functions.

An objective matrix, as shown in Figure 2.1, consists of four main components: criteria, importance rating (weights), performance scale, and performance index. The criteria define what is to be measured. The weights determine the relative importance of criteria to each other and to the overall measurement objectives. The performance scale compares the project measured-value of the criterion to past performance and future goals. Using these three components, the performance index is calculated and used to evaluate and track performance.

Fourteen criteria were originally identified for the overall evaluation of design effectiveness. Based on the quantitative nature, availability of data, and timing of this availability, the original list was further refined to include only seven criteria to evaluate design effectiveness.

The criteria chosen are accuracy of design documents, usability of design documents, design costs, design economy, performance against schedule, constructability of designed facility, and ease of facility start-up.

Figure 2.1 Objective Matrix Components (Tucker and Scarlett, 1986)

After interviewing several construction industry personnel, Tucker and Scarlett assigned weights indicating the relative importance of each criterion to the remaining set of criteria. The criteria with their assigned weights were further used to construct an objective matrix with a 0-10 scale corresponding to each individual criterion. This scale uses 10 as indicator of optimal performance, 0 as indicator of poor performance, and 3 as

indicator of an average performance. For any project to be judged under this method, each criterion is evaluated and assigned a value on the scale. Based on the values and weights of the different criteria, a cumulative design effectiveness index is calculated.

Each criterion can be evaluated based on a subjective or quantitative measure(s). For instance, accuracy of design documents can be evaluated by measuring the amount of drawing revisions per total amount of drawings. The use of an objective matrix also allows each criterion to be measured by use of a separate sub-matrix. A sub-matrix employs sub-criteria to evaluate a specific criterion. This makes a sub-matrix final index correspond to a single value entry in the original objective matrix. The use of sub-matrices allows for a far better evaluation of a single criterion compared with a one-step approach (Tucker and Scarlett, 1986).
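The sketch below walks through the mechanics of such an objective matrix: each criterion's measured value is mapped onto the 0-10 performance scale and combined with its weight into a cumulative design effectiveness index. The criterion names echo those listed earlier, but the weights, scale anchors, and measured values are assumptions for illustration only.

```python
# Hedged sketch of an objective-matrix calculation in the spirit of
# Tucker and Scarlett (1986). Each measured value is converted to a 0-10
# score by linear interpolation between a "poor" anchor (score 0) and an
# "optimal" anchor (score 10); the weighted scores sum to an index.
# Weights, anchors, and measurements are illustrative assumptions only.

criteria = {
    # name: (weight, poor_anchor, optimal_anchor, measured_value)
    "accuracy of design documents": (0.30, 0.20, 0.00, 0.05),  # revisions per drawing
    "performance against schedule": (0.40, 1.30, 0.90, 1.10),  # actual/planned ratio
    "constructability":             (0.30, 2.00, 9.00, 6.00),  # survey rating
}

def scale_score(poor, optimal, value):
    """Interpolate a measurement onto the 0-10 scale and clamp the result."""
    score = 10.0 * (value - poor) / (optimal - poor)
    return max(0.0, min(10.0, score))

index = sum(w * scale_score(p, o, v) for w, p, o, v in criteria.values())
print(f"Design effectiveness index: {index:.2f} (on a 0-10 scale)")
```

A sub-matrix evaluation works the same way one level down: the sub-criteria scores roll up into a single value that becomes the measured value for the parent criterion in the main matrix.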

CHAPTER 3

RESEARCH METHODOLOGY

This chapter introduces the research methodology adopted by the CII RT-156. First, it presents the early activity of finalizing the research focus areas. For such purposes, the RT-156 was divided into four sub-teams, whose research findings were fundamental to defining the various design work packages and specifying piping to test the research hypotheses. Afterwards, the research methodology is elaborated on. Five major stages are recognized, that is, the initiation stage, literature search and knowledge gaining stage, experimental stage, data collection and analysis stage, and research deliverables and close-out stage. At last, the chapter provides a brief look at the evolvement process of the engineering productivity measurement approach.

3.1 Research Focus Areas

Value-to-customers represents a major instrument for evaluating the performance of the various entities participating in an industrial construction project. This value is recognized through a number of indicators such as schedule, cost, quality, and operating expenses. Engineering productivity plays a major role in assessment of the value to customers in constructing an industrial facility. In particular, the productivity of A/E design influences the resulting engineering costs in the project and consequently determines the value provided to the customer. According to a member of the RT-156, Figure 3.1 illustrates such a value-to-customers scheme.

Figure 3.1 Value-To-Customers Scheme (Zenge 1999)

In the context of value-to-customers and its relevance to engineering productivity, the CII RT-156 followed a structured procedure for identifying and refining the research scope and focus areas. The research proposal, Appendix A, identifies the research general scope to be engineering processes. The preliminary steps in identifying the research focus areas included the definition of engineering, choosing detailed design as the primary focus area, and identifying engineering disciplines that designate the various design work packages. This procedure is illustrated in Figure 3.2. Chapter 4 will elaborate on these basic definitions.

Figure 3.2 Research Focus

To pursue the subsequent stages in defining the research scope and focus areas, the RT-156 broke down the research into four packages of tasks and set up sub-teams responsible for their completion. They are:

1. Industry Driver Analysis

Objectives:
• Define list of industry groups to be considered in the research.
• Define list of key disciplines for each industry group that are significantly related to engineering productivity.

2. Input/Output (I/O) Matrix & Definition of “Engineering”

Objectives:

• Identify and define Inputs (project disciplines) and Outputs (engineering units) that will be used for engineering productivity measurement.
• Clarify and align the definition of engineering for the purpose of the research team progress.

3. Other Project Data

Objectives:
• Define “What Data” outside of productivity I/O values will be requested.
• Have enough project facts/data to normalize the I/O data.
• Keep additional data requests to “Critical Few” correlation factors.
• Leverage project data already collected through CII Benchmarking / IPA work.

4. Literature Search

Objectives:
• Review studies of productivity measurement and improvement in the construction industry.
• Explore the techniques employed for measuring productivity in other industries.

The findings of the sub-teams 1, 2, and 3 are illustrated in Appendix B. Furthermore, the literature review conducted by sub-team 4 is illustrated in Appendix C. These findings are utilized in finalizing the research scope and focus areas as explained in the following sections.

A number of engineering disciplines constitute major drivers of performance in several industry sectors. Realizing this fact, the industry driver sub-team developed an

I/O matrix that relates engineering disciplines and the various industry sectors for the purpose of identifying those engineering disciplines with the most industry impact. This matrix was intended to narrow down the RT-156 focus to a few selected engineering disciplines (inputs) for the later stages of the research. Table 3.1 illustrates the developed matrix, where ten engineering disciplines and six industry types are recognized. The industry types identified include heavy industrial (process), heavy industrial (mechanical), light industrial (process), light industrial (mechanical), and so forth. The “×” mark denotes an engineering discipline that is considered a driver of performance in the associated industry type.

Table 3.1 Engineering Productivity Drivers by Industry
(The matrix crosses the disciplines Civil/Structural; Architectural; Project Mgmt & Controls; Mechanical (HVAC, Utilities, Vessels); Piping (Design & Mechanical); Manufacturing Process (Mechanical); Manufacturing Process (Chemical); Electrical; Instrument/Controls/Automation; and Other against the industry types Heavy Industrial (Process), Heavy Industrial (Mechanical), Light Industrial (Process), Light Industrial (Mechanical), Buildings, and Infrastructure.)

Based on the developed I/O matrix, the piping discipline was found to have significant impact on an industry-wide basis.
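Held as a simple data structure, the driver matrix can be queried to rank disciplines by their industry reach, which is the kind of screening that singled out piping. In the sketch below, the row and column labels follow Table 3.1, but the entries shown are an assumed subset rather than the table's actual mark placements.

```python
# Sketch of the discipline-by-industry driver matrix as a queryable
# mapping. Labels follow Table 3.1; the entries shown are an assumed,
# partial illustration, not the table's actual "x" placements.

drivers = {
    "Piping (Design & Mechanical)": {
        "Heavy Industrial (Process)", "Heavy Industrial (Mechanical)",
        "Light Industrial (Process)",
    },
    "Civil/Structural": {"Infrastructure", "Buildings"},
    "Electrical": {"Heavy Industrial (Process)", "Buildings"},
}

def rank_by_reach(matrix):
    """Rank disciplines by how many industry types they drive."""
    return sorted(matrix.items(), key=lambda kv: len(kv[1]), reverse=True)

for discipline, industries in rank_by_reach(drivers):
    print(f"{discipline}: drives {len(industries)} industry type(s)")
```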

Accordingly, piping was selected as the experimental area for testing the RT-156 research hypotheses. Other rationales supporting the selection of the piping discipline include the following:

• Piping constitutes a significant portion of the costs in delivery of an industrial facility. O'Connor (1996) reported that improving the piping function could amount to an average saving of $3.8 million on a $190 million heavy industrial project. Furthermore, earlier surveys by the Business Roundtable have identified piping as one of the highest overall potential areas for improvement (Tucker, 1986).

• Several RT-156 companies are well-known organizations in the piping industry in terms of both experience and size of work. This provided the researchers with an extensive pool of experience needed for the subsequent stages of the research, especially the prioritizing of the focus areas, identification of the type and feasibility of data to be collected, experimenting with data gathering tools, and so forth.

• Literature provides a comprehensive background for the piping engineering and design practices. In the last few years, the CII conducted several research activities to address different aspects of the piping industry. Piping engineering and construction was the focal point of a number of the CII publications (Howell and Ballard 1996, O'Connor and Goucha 1996, O'Connor and Liao 1996).

• Focusing on piping in the research activity provides a better normalization of data collected in later stages compared with the case of general engineering activities taking place during the detailed design phase.

Piping engineering and design

practices was also the topic of several general publications (Bausbacher and Hunt 1993, Crocker 1992, Frankel 1996). The researchers, in their pursuit for a better documentation of findings from the piping engineering and design literature and the experience gained through materials or interviews with industry experts from RT-156 companies, prepared a concise document of the piping engineering and design process. This document, Appendix D, depicts the design process in the piping industry, its progression in a CAD environment, the major engineering and design deliverables produced throughout the process, and others. This provided the researchers with further background.

3.2 Research Methodology

After a thorough review of several research activities conducted by earlier CII research teams and based on the research proposal, Appendix A, in addition to the input provided by the RT-156 members, the researchers developed a comprehensive research methodology as depicted in Figure 3.3. The methodology flow chart acted as the road map for both the researchers and RT-156 members. Five major research stages were identified:

1. Initiation
2. Literature search and knowledge gaining
3. Experimental stage
4. Data collection and analysis
5. Research deliverables and close-out stage

3. ) Engineering / Design Productivity Approach Consolidate Research Findings RESEARCH DELIVERABLES & CLOSE-OUT STAGE Research Publications CII Annual Meeting Presentation Figure 3. . Research Methodology 24 Understanding Engineering Functions and Practices . . .General Research Scope and Limitations INITIATION STAGE Basic Definitions LITERATURE SEARCH & KNOWLEDGE GAINING Review of Data Gathering Tools Review of Productivity Measurement Studies Prioritize Focus Areas EXPERIMENTAL STAGE Tentative Data Collection Methodology Survey (RT-156 companies) Consolidate Data Collection Methodology Massive Questionnaire Survey DATA COLLECTION & ANALYSIS Statistical Analysis Validation (interviews.

3.2.1 Initiation Stage

This stage comprised the basic activities undertaken by the RT-156 members for a better identification of the research problem. The major outcomes of this stage include:

• Standardized industrial facility delivery process.
• Definition of engineering including both a generic definition and a list of engineering tasks within the pre-project planning, detailed design, and procurement phases of a project.
• Standardized definition of engineering inputs (engineering disciplines).
• Developing an I/O matrix that relates engineering disciplines and the various industry types for the purpose of identifying critical engineering disciplines (inputs).

3.2.2 Literature Search and Knowledge Gaining

This stage comprised the basic activities undertaken by the researchers for identifying the research approach and acquiring the knowledge needed to pursue the later stages of the research. The literature search and knowledge gaining focused on exploring three major areas:

• General studies for measuring productivity for engineers or similar professions.
• Engineering practices in industrial construction.

• Literature on data gathering tools such as interviews, questionnaires, workshops, etc.

Investigation of the data gathering tools highlighted a number of feasible alternatives for conducting the research during the experimental and massive data collection stages. Examples for data gathering tools are site visits, interviews, questionnaire surveys, and workshops. The RT-156 reached a consensus on using questionnaire surveys as the prime platform for data collection. However, the researchers also arranged for site visits to a number of RT-156 companies and conducted several interviews with experts in both engineering control/reporting systems and piping engineering and design. The site visits and interviews served in providing the researchers with a comprehensive review of the current practices in measuring and evaluating productivity for engineers in general and the piping discipline in particular.

3.2.3 Experimental Stage

The questionnaire survey prepared by the researchers has undergone several cycles of review and revisions by the RT-156 members. Also, several experts of the RT-156 companies provided their input and comments on the content and feasibility of the questions. The final questionnaire survey is grouped in two parts: Part A and Part B. Personnel to fill out the questionnaire are asked to choose a project that they recently completed and provide both general and piping-related data. Part A, which is to be filled out by the project manager, is to collect data relevant to project scope, complexity levels, input variables influencing engineering and design, measures of design effectiveness

among a host of others. Part B, which is to be filled out by the piping lead or project piping engineer, is intended to broaden industry-wide knowledge in overall engineering productivity measurement methods and to define methods for benchmarking measures in detailed engineering and design. Several types of data regarding the inputs and outputs of engineering productivity measures were further addressed and collected in the survey. The review of the current practices, as provided by the survey data, covers various areas. Examples include engineering control systems, productivity measures, piping practices, engineering inputs/outputs, and so forth. Refer to Appendix E for Part A and Part B of the questionnaire survey.

3.2.4 Data Collection and Analysis Stage

Following the refinement of the questionnaire survey, the final form was used for the massive data collection. The survey was sent to various CII members including the member companies of the RT-156, in two formats: a traditional paper-based questionnaire survey and an electronic questionnaire survey on the Internet. For normalization purposes, the projects considered in the survey were required to belong to the process industries and to be engineered in the US. Through two rounds of questionnaire survey, data of a total of 40 industrial projects were collected for the data analysis purposes. 18 of the 40 projects provided enough information for statistical analysis and were used in piping productivity model building. Regression models constitute the primary data-analysis tools used in the study for investigating and

establishing the engineering productivity measurement system. Because of the substantial number of variables considered in model development, statistical variable reduction techniques were employed for the development of appropriate statistical models and verification of the study's primary hypotheses. Other techniques such as fuzzy-based models and neurofuzzy systems were investigated for their applicability in the study, as will be illustrated in a later chapter.

3.2.5 Research Deliverables and Close-out Stage

This stage comprises the preparation of the research publications including this research report and the research summary, in addition to the presentation of the research findings during the 2001 CII Annual Meeting in San Francisco, August 8-9, 2001. The presentation materials for the plenary session and the implementation session are included in Appendix L.

3.3 Evolvement of Engineering Productivity Measurement Approach

Over the lifespan of the research, the general approach to measuring engineering productivity went through an evolvement process as stated below. The research team held lengthy discussions and debates on how engineering productivity should be measured. Two opinions prevailed. One was to measure actual productivity through metrics such as engineering work hours spent on unit finished product. The other tried to assess effectiveness of engineering productivity.

The effective productivity opinion pointed out that actual productivity, such as engineering work hours per finished product, could be quite misleading if considered on a standalone basis. For instance, a lower number of work hours expended on a job would not be favored if a substantial amount of design errors and omissions existed. Therefore, an approach to measuring effective productivity was put forward as shown in Figure 3.4. The approach interprets effective productivity by incorporating three factors: input quality factor, scope factor and complexity factor. Detailed explanation of the factors is included in Chapter 7.

Effective Productivity = Input Quality Factor x Scope Factor x Complexity Factor

Figure 3.4 Initial Approach of Engineering Productivity Measurement

A set of variables was selected for the input quality factor, scope factor and complexity factor, and 10 measures of engineering effectiveness were also identified. A neuro-fuzzy network system was utilized to establish the relationships between the effectiveness measures and the engineering input variables. In addition, a linear regression model was developed for each of the measures, and a comparison was made between the linear regression models and the neuro-fuzzy model. Although the scheme illustrated in Figure 3.4 had the potential of rendering a comprehensive interpretation of effective productivity, the research committee deemed its scope too broad for this research project and decided to focus on the chartered scope

outlined in the original research proposal (Appendix A). However, the above model is not broad enough if it is to measure and interpret effective engineering productivity. Therefore, the paradigm in Figure 3.5 acted as the framework for later data collection and analysis, which will be elaborated on in Chapter 7.

Effective Productivity = Input Quality Factor x Scope & Complexity Factor x Raw Productivity x Output Quality Factor
(Research Focus)

Figure 3.5 Research Focus Regarding the Engineering Productivity Measurement

Conforming to the research focus, a questionnaire survey was carried out and the data for 33 input, scope and complexity variables were collected, as well as the corresponding project raw productivity. A variable reduction technique was employed to filter out the variables that had marginal impact on productivity. The analysis arrived at a linear regression model that interpreted raw productivity using only the number of equipment pieces. The linear regression model serves to predict actual engineering productivity and provides engineering practitioners a general idea on applying the approach to other engineering disciplines. Chapter 7 contains the details of the analysis. As a complementary effort, Chapter 8 of this report expands the research focus in Figure 3.5 back to the one shown in Figure 3.4 and presents a generic system for assessing effective engineering productivity.
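To make the paradigm concrete, the sketch below computes effective productivity from the Figure 3.5 factors. It is a minimal illustration only: the coefficients, factor values, and function names are hypothetical placeholders, not figures from the RT-156 data set, and the linear form simply mirrors the equipment-pieces regression described above.

# Minimal sketch of the Figure 3.5 paradigm. All numbers are invented.

def raw_productivity(equipment_pieces, a, b):
    """Linear model of the form found by the analysis: raw productivity
    (work-hours per unit output) explained by the number of equipment
    pieces. Coefficients a and b must come from a fitted regression."""
    return a + b * equipment_pieces

def effective_productivity(input_quality, scope_complexity, raw, output_quality):
    """Effective Productivity = Input Quality Factor x Scope & Complexity
    Factor x Raw Productivity x Output Quality Factor (Figure 3.5)."""
    return input_quality * scope_complexity * raw * output_quality

# Illustrative use only; every input below is made up:
raw = raw_productivity(equipment_pieces=120, a=0.8, b=0.01)
print(effective_productivity(1.05, 0.95, raw, 0.90))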

CHAPTER 4
BASIC DEFINITIONS OF ENGINEERING DESIGN IN INDUSTRIAL CONSTRUCTION PROJECTS

This chapter covers several fundamental definitions of the engineering design activity in an industrial construction project. This includes the industrial facility delivery process, the engineering tasks throughout the various project phases, and the engineering disciplines to carry out such tasks.

4.1 Industrial Facility Delivery Process

During the early stages of the research, members of the RT-156 provided several examples of the industrial facility delivery processes used in their organizations. Analysis showed several variations between the processes followed in delivering these industrial facilities. Since a clear definition of the research scope required a standardized delivery process to be identified, the CII RT-156 adopted the definition of project phases developed by the CII Benchmarking and Metrics (BM&M) Committee. According to this definition, an industrial facility delivery process can be divided into five phases: pre-project planning, detailed design, procurement (material management), construction, and start-up and commissioning (CII 1997, CII 1998). Whenever a demolition activity exists in the delivery process of the facility, a sixth phase is added to the project phase structure to denote this demolition activity. Table 4.1 provides the CII BM&M Committee description of the various industrial project phases, along with their start/finish milestones and the prime activities and products typical of each phase.

Table 4.1. Industrial Project Delivery Process (CII BM&M Committee)

Pre-Project Planning
  Typical Participants: Owner Personnel; Planning Consultants; Constructability Consultant; Alliance / Partner
  Start: Defined Business Need that requires facilities
  Stop: Total Project Budget Authorized
  Typical Activities & Products: Options Analysis; Life-cycle Cost Analysis; Project Execution Plan; Appropriation Submittal Pkg; P&IDs and Site Layout; Project Scoping; Procurement Plan; Arch. Rendering

Detail Design
  Typical Participants: Owner Personnel; Design Contractor; Constructability Expert; Alliance / Partner
  Start: Design Basis
  Stop: Release of all approved drawings and specs for construction (or last package for fast-track)
  Typical Activities & Products: Drawing & spec preparation; Bill of material preparation; Procurement Status; Sequence of operations; Technical Review; Definitive Cost Estimate

Demolition / Abatement
  Typical Participants: Owner Personnel; General Contractor; Demolition Contractor; Remediation / Abatement Contractor
  Start: Mobilization for demolition
  Stop: Completion of demolition
  Typical Activities & Products: Remove existing facility or portion of facility to allow construction or renovation to proceed; Perform cleanup or abatement / remediation

Procurement
  Typical Participants: Owner personnel; Design Contractor; Alliance / Partner
  Start: Procurement Plan for Engineered Equipment
  Stop: All engineered equipment has been delivered to site
  Typical Activities & Products: Vendor Qualification; Vendor Inquiries; Bid Analysis; Purchasing; Expediting Engineered Equipment; Transportation; Vendor QA/QC

Construction
  Typical Participants: Owner personnel; Design Contractor (Inspection); Construction Contractor and its subcontractors
  Start: Beginning of continuous substantial construction activity
  Stop: Mechanical Completion
  Typical Activities & Products: Set up trailers; Site preparation; Procurement of bulks; Issue Subcontracts; Construction plan for Methods/Sequencing; Build Facility & Install Engineered Equipment; Complete Punchlist; Demobilize construction equipment; Warehousing

Start-up / Commissioning
  Typical Participants: Owner personnel; Design Contractor; Construction Contractor; Training Consultant; Equipment Vendors
  Start: Mechanical Completion
  Stop: Custody transfer to user/operator (steady state operation)
  Typical Activities & Products: Testing Systems; Training Operators; Documenting Results; Introduce Feedstocks and obtain first Product; Hand-off to user/operator; Operating System; Functional Facility; Warranty Work

4.2 Engineering Definition

The RT-156 realized that a comprehensive definition of engineering, or of what constitutes an engineering/design task, is a necessity in defining the research scope and limitations. In pursuit of this task, the research team members presented several thoughts about what constitutes engineering in nature. Continued discussions led to a generic definition of engineering as follows:

“Work of all contributing groups (except business planning, R&D, Construction, Procurement and Operations) required to convert a defined technology or facility change into the data and documents needed to procure, construct and start-up the new or revised facility.”

While this generic definition may be acceptable to several industry practitioners, defining what constitutes an engineering/design task can be quite challenging. Because projects are structured differently between different organizations, some of the engineering tasks would exist in some organizations but not others. To overcome this difficulty, the research team made use of the results of an earlier study conducted by the CII Research Team on Informative Management Impact (CII 1997). In the earlier study, a comprehensive activity list of items encountered in a typical Engineer-Procure-Construct (EPC) project was developed. This activity list primarily defines the various activities belonging to the five major industrial project phases cited earlier. The list is a

consolidation of ideas, comments, and suggestions from practitioners representing more than 40 CII member companies. Since the input was obtained from experienced personnel at both contractor and owner companies representing a wide range of facility types, the activity list is not expected to summarize any one company's process explicitly, nor be inclusive of every project undertaken. Rather, the list includes only those activities most commonly executed by a significant portion of the industry to complete a conventional project (CII 1997). This made the use of the developed activity list a plausible approach for the purpose of normalizing the definition of engineering by the RT-156.

For each of the five major project phases, two levels of detail are recognized. The first level defines broad subdivisions or categories of the work performed in each of the five primary phases. The second level is a further refinement of the hierarchical activity listing and defines specific project activities that commonly occur in a project. This level is highly detailed as it contains 164 activities in total across all five phases (CII 1997).

Through a comprehensive screening of the activities defined in the activity list, the RT-156 reached a near-consensus agreement on what constitutes engineering tasks among all activities listed. Only the first three phases of delivering an industrial project were considered in the engineering definition, while both construction and start-up phases were excluded. This is due to the fact that engineering involvement in the construction and start-up phases is limited compared to the preceding three phases.

4.2.1 Engineering Tasks During Project Phases

Pre-Project Planning Stage

This stage is defined as the process of developing sufficient strategic information with which project teams can determine preliminary project scope, address project risk, and determine contract and execution strategies to maximize the likelihood of a successful project outcome. Pre-project planning is a multi-disciplinary, multi-organizational process. The five major constituents of this phase are business plan, product technical plan, facility scope plan, project execution plan, and contract strategy, as depicted in Table 4.2. Only activities comprising facility scope planning, project execution plan, and contract strategy are included in the definition of engineering by RT-156, while business plan and product technical plan are entirely excluded.

Design Phase

This phase includes all activities, including numerical engineering analysis and obtaining third party input, required to produce construction documentation as a final product. This phase represents the core of engineering activity undertaken within an industrial project. The five major constituents of this phase are finalize scope, detailed design, detailed schedule, detailed cost estimate, and prepare work packages, as depicted in Table 4.3. All five items are included in the research team definition of engineering.

Table 4.2. Engineering Activities within Pre-Project Planning Phase
(Include in RT-156 Engineering Definition: YES / NO)

PPP.BP  Business Plan
  PPP.BP1   Define Business Objectives                       NO
  PPP.BP2   Define Facility Objectives/Capacity Demands      NO
  PPP.BP3   Conduct Market Research                          NO
  PPP.BP4   Establish Image and Public Relations             NO
  PPP.BP5   Finalize Site Selection                          NO
  PPP.BP6   Address Regulatory Issues                        NO
  PPP.BP7   Develop Funding Plan                             NO
  PPP.BP8   Raw Materials Sourcing                           NO
  PPP.BP9   Develop Labor Plan/Address Human Res. Issues     NO
  PPP.BP10  Define Start-up Requirements                     NO

PPP.TP  Product Technical Plan
  PPP.TP1   Conduct Technical Surveys and Process Analysis   NO
  PPP.TP2   Product Development                              NO
  PPP.TP3   Obtain Patent and Licenses and Analysis          NO
  PPP.TP4   Establish Security and Secrecy Agreement         NO

PPP.SD  Facility Scope Plan
  PPP.SD1   Process and Facility Planning                    YES
  PPP.SD2   Develop Utilities and Offsite Scope              YES
  PPP.SD3   Develop Environmental Plan                       YES
  PPP.SD4   Develop Site Plan                                YES
  PPP.SD5   Detail Work Breakdown Structure                  YES

PPP.PP  Project Execution Plan
  PPP.PP1   Develop Preliminary Design Criteria              YES
  PPP.PP2   Formulate Preliminary Organization               YES
  PPP.PP3   Complete Preliminary Estimates                   YES
  PPP.PP4   Establish Master Project Schedule                YES
  PPP.PP5   Address Quality and Safety Issues                YES
  PPP.PP6   Develop Preliminary Execution Plan               YES
  PPP.PP7   Summarize Project Scope                          YES
  PPP.PP8   Develop Start-up Plans                           YES

PPP.CS  Contract Strategy
  PPP.CS1   Develop Contract Strategy                        YES
  PPP.CS2   Develop Bid Package Scope                        YES
  PPP.CS3   Review Potential EPC Contractor Bidders          YES
  PPP.CS4   Select EPC Contractor Team                       YES
  PPP.CS5   Develop Labor Strategy                           YES

Table 4.3. Engineering Activities within Design Phase
(Include in RT-156 Engineering Definition: YES / NO)

D.FS  Finalize Scope
  D.FS.1   Finalize P&ID's and PFD's                                        YES
  D.FS.2   Finalize Facility Plans                                          YES
  D.FS.3   Define Major Equip. & Material Specs                             YES
  D.FS.4   Finalize Utilities and Offsite                                   YES
  D.FS.5   Design / Engineering Review                                      YES
  D.FS.6   Address Codes, Standards & Enviro. Require.; Acquire Permits and Approvals   YES
  D.FS.7   Conduct Site Evaluation                                          YES

D.DD  Detailed Design
  D.DD.1   Detail Engineering Discipline Drawings                           YES
  D.DD.2   Finalize Drawings and Specs                                      YES
  D.DD.3   Develop Bill of Materials                                        YES
  D.DD.4   Prepare Material Requisitions                                    YES
  D.DD.5   Review Changes & Approve                                         YES
  D.DD.6   Obtain Intermediate Owner Reviews & Approvals                    YES
  D.DD.7   Complete Constructibility Review                                 YES
  D.DD.8   Complete QA/QC Review                                            YES
  D.DD.9   Conduct Scope / Estimate Review                                  YES
  D.DD.10  Coordinate Vendor / Engineering Interface                        YES
  D.DD.11  Distribute Documents                                             YES

D.DS  Detailed Schedule
  D.DS.1   Detail Design Schedule                                           YES
  D.DS.2   Detail Materials Management Schedule                             YES
  D.DS.3   Detail Construction Schedule                                     YES
  D.DS.4   Detail Start-up Schedule                                         YES

D.DCE  Detailed Cost Estimate
  D.DCE.1  Estimate Equipment Cost                                          YES
  D.DCE.2  Estimate Installation Costs                                      YES
  D.DCE.3  Estimate Support Services Costs                                  YES
  D.DCE.4  Estimate Indirect Costs                                          YES
  D.DCE.5  Estimate Materials Costs                                         YES
  D.DCE.6  Estimate Other Costs                                             YES

D.PWP  Prepare Work Package
  D.PWP.1  Draft Project Plan                                               YES
  D.PWP.2  Conduct Cost and Schedule Review Analysis                        YES

Materials Management (Procurement) Phase

This phase is defined as the process of ensuring that the right materials are at the right place at the right time of a project. All activities necessary for the definition, procurement, expediting, shipping and receiving of all project materials are included in this process. The eight major constituents of this phase are bulk commodities, fabricated items, standard engineered equipment, specialized engineered equipment, services, documentation, field management, and field equipment management, as depicted in Table 4.4. The last three items are entirely excluded from the engineering definition, while only spec preparation, bid evaluation and vendor inspection are considered as engineering tasks among the rest of the phase constituents. Some tasks, like issue inquiry, receive vendor bid, award contract and release materials, could arguably be considered engineering or non-engineering activities; their placement within the definition would primarily depend on the individual organization's practices.

Table 4.4. Engineering Activities within Materials Management (Procurement) Phase
(Include in RT-156 Engineering Definition: YES / NO, with YES / NO marking arguable tasks)

MM.BC  Bulk Commodities
  MM.BC.1  Specify Materials          YES
  MM.BC.2  Issue Inquiry              YES / NO
  MM.BC.3  Receive Vendor Bid         YES / NO
  MM.BC.4  Evaluate Vendor Bid        YES
  MM.BC.5  Award Contract             YES / NO
  MM.BC.6  Release Materials          YES / NO
  MM.BC.7  Ship Materials             NO

MM.FI  Fabricated Items
  MM.FI.1  Finalize Materials Specs   YES

  MM.FI.2  Issue Inquiry              YES / NO
  MM.FI.3  Receive Vendor Bid         YES / NO
  MM.FI.4  Evaluate Vendor Bid        YES
  MM.FI.5  Award Contract             YES / NO
  MM.FI.6  Release Materials          YES / NO
  MM.FI.7  Fabricate Materials        NO
  MM.FI.8  Ship Materials             NO

MM.STE  Standard Engineered Equipment
  MM.STE.1  Specify Equipment             YES
  MM.STE.2  Issue Inquiry                 YES / NO
  MM.STE.3  Receive Vendor Bid            YES / NO
  MM.STE.4  Evaluate Vendor Bid           YES
  MM.STE.5  Award Contract                YES / NO
  MM.STE.6  Release Data Sheets           YES / NO
  MM.STE.7  Vendor Documents              YES
  MM.STE.8  Vendor Fabrication            NO
  MM.STE.9  Ship Equipment                NO

MM.SPE  Specialized Engineered Equipment
  MM.SPE.1  Specify Equipment             YES
  MM.SPE.2  Issue Inquiry                 YES / NO
  MM.SPE.3  Receive Vendor Bid            YES / NO
  MM.SPE.4  Evaluate Vendor Bid           YES
  MM.SPE.5  Award Contract                YES / NO
  MM.SPE.6  Vendor Document Management    YES / NO
  MM.SPE.7  Vendor Inspection             YES
  MM.SPE.8  Vendor Fabrication            NO
  MM.SPE.9  Ship Equipment                NO

MM.S  Services
  MM.S.1  Work Packaging / Scope of Services         NO
  MM.S.2  Prequalify Vendors/Subs and Issue Inquiry  NO
  MM.S.3  Receive Vendor/Sub Bid                     NO
  MM.S.4  Evaluate Vendor/Sub Bid                    NO
  MM.S.5  Award Contract                             NO

MM.DO  Documentation
  MM.DO.1  Prepare Final Report/Turnover Documents   NO

MM.FD  Field Management
  MM.FD.1  Receive and Inspect Materials             NO
  MM.FD.2  Inventory, Store, and Maintain Materials  NO
  MM.FD.3  Issue Materials                           NO
  MM.FD.4  Coordinate Materials Management           NO
  MM.FD.5  Conduct Accounting Activities             NO

MM.FEM  Field Equipment Management
  MM.FEM.1  Coordinate Materials Management Schedule  NO

4.2.2 Refined Research Focus

The majority of engineering labor is engaged in the delivery process of an industrial facility during the detailed design phase. The screening of the various project phases conducted by the RT-156 further demonstrated this issue. As most of the engineering work-hours expended, and consequently the associated engineering costs, are incurred

during the detailed design phase of a project, the RT-156 narrowed the research focus to this particular phase.

4.3 Engineering Disciplines

The various engineering disciplines in an organization constitute the main inputs to engineering work processes. These disciplines are the major drivers of engineering and design work throughout the project phases and in particular the detailed design phase. Because each industry and organization operates differently and defines work in a variable manner, the RT-156 members created standard descriptions and definitions of the engineering disciplines commonly encountered in engineering organizations. Ten different engineering disciplines are identified and described below:

1. Architectural includes building exterior presentation, site and building master plans, floor plans, interior design, material/people flow, security, etc.

2. Civil/Structural includes site development such as grading, drainage, roads, railroads, parking lots, landscaping, underground utilities, etc.; geotechnical studies and reports; analysis for special loading (reciprocating equipment, seismic, blast resistance); and structural elements such as foundations, building frames, major equipment supports, etc.

3. Mechanical includes building services (HVAC, plumbing, refrigeration), utility systems (steam, compressed air, potable water), process systems (rotating pumps, compressors, agitators, filters, pressure vessels), special vessel design (tanks, pressure vessels), etc.

4. Piping includes routing, supports, stress analysis, isometrics, detailing from Piping and Instrument Drawings (P&ID's), 3D presentation, insulation, specifications and piping materials, etc. Limited to those job hours spent during the engineering phase of the project through final issuance of all “issue for construction” documents. Does not include on-site follow up, field run testing or start-up of equipment.

5. Procurement & Materials Management includes inquiry, bid tabulation, contract negotiation, inspection, expediting, delivery/transportation, etc. Does not include on-site procurement or materials management job hours.

6. Project Management & Controls includes project execution planning, cost management, schedule management, design coordination, etc.

7. Manufacturing Process (Chemical) includes design and integration of the equipment that produces the intermediate or finished product of the facility where those processes are chemical conversion, reaction, thermodynamic, etc. Also known as chemical or wet processes. (CII—Engineering discipline responsible for process engineering. Prepares PFDs (Process Flow Diagrams), P&IDs, heat and mass balance sheets and process calculations.)

8. Manufacturing Process (Mechanical) includes design and integration of the equipment that provides the intermediate or finished product of the facility, where those processes are assembly, machining, mixing, packing, stamping, physical conversion, material movement, inventory control, quality monitoring, etc. The processes may also be known as dry or non-chemical.

9. Electrical includes design of power distribution systems, substations, motor control centers, power distribution panels, lighting, communications, etc.

10. Instrument/Controls/Automation includes design of process control systems, machine control systems, measurement, first and second level computer systems, etc. Prepares instrument logic diagrams and data sheets, among other things. (CII—Control Systems—Engineering discipline responsible for instrumentation.)

CHAPTER 5
AN INDUSTRY-WIDE SURVEY OF ENGINEERING PERFORMANCE IN
INDUSTRIAL CONSTRUCTION PROJECTS

The chapter presents a study conducted by the RT-156 to demonstrate the far-reaching impacts of the engineering activities in the delivery of an industrial facility. This
study is based on a comprehensive set of project data made available to the RT-156 by the
CII BM&M Committee. The study covers several performance indicators including
project cost, project schedule, project scope and development changes, and field rework.

5.1 General

Industrial facility construction constitutes a significant sector of the construction
industry in the United States. Projects belonging to this sector include electrical
generation, oil exploration/production, oil refining, pulp and paper processing, chemical
manufacturing, pharmaceutical manufacturing, microelectronics manufacturing, consumer
product manufacturing, metals refining/processing, natural gas
processing, automotive manufacturing, food processing, and others. The construction of
these facilities is an intricate undertaking requiring the cooperation of several
organizations. An industrial facility costing $250,000,000 might have 50 or more
organizations participating in its engineering, financing, regulation, and construction
activities (Fergusson and Teicholz, 1996).


It has long been realized that the influence of project phases is greatest at the
early project stages and diminishes as the project approaches completion. While the early
project phases exhibit lower expenditures, the decisions and commitments made during
that period have far greater influence on what later expenditures will in fact be (Barrie
and Paulson, 1992). Figure 5.1 illustrates the level-of-influence concept that is commonly
encountered in the literature.

Figure 5.1 Level of Influence of Project Phases (Barrie and Paulson 1992)

Realizing the significant importance of engineering activity in delivery of an
industrial facility and seeking a better understanding of the industry-wide engineering
performance, the RT-156 analyzed a comprehensive set of industrial project data from
the CII Benchmarking and Metrics (BM&M) Committee. The data was collected during
the years 1996, 1997 and 1998. The total number of projects used for analysis purposes
was 163 grass-roots industrial projects. Several statistical analyses that will be presented
later used a smaller set of data points. As is common with questionnaire surveys,
responses are frequently incomplete; this results in fewer data points available for data
analysis purposes.

5.2 Project performance measures

In industrial construction, the detailed design phase constitutes a significant
portion of the project costs. This emerges primarily from the level of complexity of the
engineering/design activities required for the industrial facility design. Figure 5.2 depicts
a breakdown of the actual costs among the various project phases based on a sample of
12 grass-roots industrial projects.
The detailed design phase accounted for an average of 14% of the total project
cost. Only the procurement and construction phases of the project consumed larger
proportions of the budget. Procurement costs for industrial facilities, according to CII
BM&M definition, include bulk material costs and mechanical equipment. Table 5.1
shows that design costs vary considerably among projects. The degree of variability in
design costs, depicted through the standard deviation, reaches 6.6%. With 95%
confidence level, the design costs in industrial facility construction would fall within the
range of [9.2%, 17.8%].
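The report does not state how its 95% ranges were derived; the sketch below assumes a two-sided t-interval on the mean phase cost, which approximately reproduces the quoted design-cost range.

# Hedged sketch: a two-sided 95% t-interval on the mean phase cost,
# assuming (not stated in the report) that this is how the range was derived.
from math import sqrt
from scipy import stats

mean, sd, n = 13.6, 6.6, 12       # design-cost figures (%) from Table 5.1
t = stats.t.ppf(0.975, df=n - 1)  # two-sided 95% critical value
half_width = t * sd / sqrt(n)
print(f"[{mean - half_width:.1f}%, {mean + half_width:.1f}%]")
# -> roughly [9.4%, 17.8%], close to the reported [9.2%, 17.8%]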


Figure 5.2. Breakdown of $TIC by Project Phase (Based on 12 grass-roots industrial projects): Construction 49%, Procurement 26%, Detailed Design 14%, Start-up 5%, Pre-project Planning 4%, Demolition 2%.

Table 5.1. Percentage of Actual Phase Cost to the Total Project Cost

                     Pre-project
                     Planning    Design   Procurement  Demolition  Construction  Start-up
Average                3.7%      13.6%      26.3%        2.4%        49.3%        4.7%
Standard Deviation     2.7%       6.6%      10.6%        1.7%        10.8%        3.5%

In evaluating the performance of industrial facility construction, various
performance measures can be considered including project costs, project schedule, safety,
reliability, compliance with production output specifications, among a set of other
business-oriented performance measures. The RT-156 evaluated a number of
performance measures in delivery of an industrial facility. During the preparation of a
project budget, contingencies are added for unpredictable cost deviations that are difficult
or impossible to identify prior to budget authorization. The contingency index is
commonly defined as:
Contingency Index = Contingency cost of a project phase / Budget of this project phase (including contingency cost)


The contingency index was calculated for nine grass-roots industrial projects. The results
in Table 5.2 show that the average contingency index for various project phases varied
between 1.9% and 7.3%. The detailed design contingency index averaged 6.0%. With
95% confidence level, the detailed design contingency index would fall within the range
of [1.6%, 10.4%].
Table 5.2. Performance Measures for Various Project Phases

                      Pre-project
                      Planning    Design   Procurement  Demolition  Construction  Start-up
Contingency Index       1.9%       6.0%       6.3%        4.0%         7.3%        4.0%
Cost Perf. Index       94.1%      99.6%      87.5%      103.8%        96.7%       99.2%
Schedule Perf. Index  115.4%     118.4%     116.0%      102.9%       108.4%      123.2%

A prime measure of project performance is the conformance of the project’s
actual costs with its budgeted costs (including contingencies). High deviations indicate a
potentially troubled project (Chalabi et al. 1987). The cost performance index is defined
as:
Cost Performance Index = Actual cost of a project phase / Budgeted cost of this project phase
The cost performance index was calculated for each phase of nine industrial projects. The
results in Table 5.2 show that the cost performance index for the project phases varied
between 87.5% and 103.8%. The cost performance index for the detailed design phase
averaged 99.6%. With 95% confidence level, the cost performance index of the detailed
design phase would fall within the range of [84%, 115.2%].
Another important measure of project performance is the conformance of the
project’s actual schedule with its planned schedule, which has been used as a measure of design effectiveness (Stull and Tucker, 1986). The schedule performance index is defined
as:
Schedule Performance Index = Actual schedule duration of a project phase / Planned schedule duration of this project phase
The schedule performance index was calculated for each phase of 42 industrial projects.
The results show (Table 5.2) that the schedule performance index for the various project
phases ranged between 102.9% and 123.2%, indicating, on average, a schedule delay
occurred in each phase of these projects. The schedule performance index for the detailed
design phase averaged 118.4%, which makes it one of the major sources of project
schedule delays. Only the start-up phase has a higher average schedule delay than
detailed design. With 95% confidence level, the schedule performance index of the
detailed design phase would fall within the range of [108%, 129%].
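Since the three indices defined in this section are simple ratios, they can be bundled into one small helper; the sketch below is illustrative only, and the argument names and sample numbers are invented rather than drawn from the benchmarking data.

# Minimal sketch of the three phase-level indices defined above.
# All argument names and sample values are illustrative.

def contingency_index(contingency_cost, budget_with_contingency):
    """Contingency cost / phase budget (including contingency)."""
    return contingency_cost / budget_with_contingency

def cost_performance_index(actual_cost, budgeted_cost):
    """Actual phase cost / budgeted phase cost; > 100% signals overrun."""
    return actual_cost / budgeted_cost

def schedule_performance_index(actual_duration, planned_duration):
    """Actual phase duration / planned duration; > 100% signals delay."""
    return actual_duration / planned_duration

# Example with made-up detailed-design numbers:
print(f"{cost_performance_index(4_980_000, 5_000_000):.1%}")  # 99.6%
print(f"{schedule_performance_index(320, 270):.1%}")          # ~118.5%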

5.3 Project changes and field rework

Project changes are frequent occurrences in industrial construction projects, and
can be classified as either scope changes or development changes. The former depict
changes in the base scope of work or the process basis. The latter depict the changes
required to execute the original scope of work or the original process basis with different
strategies. Figures 5.3 and 5.4 illustrate the frequency of scope and development changes
taking place during the various project phases. Analysis of more than 20 industrial
projects indicates that the detailed design phase emerges as the project phase having the highest frequency of both scope and development changes. On average, 48% of scope
changes and 56% of development changes take place during the detailed design phase.

Figure 5.3. Frequency of “Project Scope Changes” During the Various Project Phases (Based on 22 grass-roots industrial projects): Design 48%, Construction 43%, Start-up 5%, Procurement 4%, Pre-project Planning 0%, Demolition 0%.

Figure 5.4. Frequency of “Project Development Changes” During the Various Project Phases: Design 56%, Construction 33%, Procurement 9%, Start-up 2%, Pre-project Planning 0%, Demolition 0%.
The net cost impacts of both scope and development changes were further
analyzed. The analysis shows that changes taking place during the detailed design phase
are the ones that considerably affected the project costs, as shown in Figures 5.5 and 5.6.


Figure 5.5. Net Cost Impact of “Project Scope Changes” During the Various Project Phases (minimum, maximum and average impact per phase)

Figure 5.6. Net Cost Impact of “Project Development Changes” During the Various Project Phases (minimum, maximum and average impact per phase)

With 95% confidence level, the net cost impact of scope and development changes during the detailed design phase would fall within the ranges of [-1.0%, 3.2%] and [-0.4%, 2.2%], respectively.

Field rework depicts efforts expended to re-fabricate or re-construct portions of the facility due to changes, omissions and errors of the involved project parties. Field rework is a major cause of both project cost overruns and schedule delays. Based on the analysis

of 8 industrial projects, the cost overrun due to field rework was found to average 4.1% of the actual project cost. With 95% confidence level, the cost overrun due to field rework would fall within the range of [0.0%, 8.2%].

There are several sources of project field rework, such as owner change, designer change, constructor change, vendor change, designer error/omission, constructor error/omission, vendor error/omission, transportation error, and so forth. Figure 5.7 depicts a breakdown of field rework occurrences according to their prime sources. Design errors/omissions emerge as the major source of field rework, averaging 33% of the total field rework occurrences. If design changes are added, this value reaches 39% of the total field rework occurrences, thus surpassing owner and constructor field rework occurrences combined by a significant margin.

Figure 5.7. Sources of Field Rework: design error 33%, vendor error 23%, owner change 20%, others 9%, design change 6%, construct error 6%, construct change 2%, vendor change 1%, transport error 0%.

5.4 Summary

The study clearly demonstrated that the engineering activity of the detailed design phase is a key factor in the successful implementation of industrial construction projects. This detailed design activity has a far-reaching impact on several aspects of the project, such as cost, schedule, changes and field rework, along with other operating characteristics, significantly contributing to its successful/unsuccessful implementation. The findings of the analysis also support the rationale behind the research activity undertaken; that is, a reliable engineering productivity measurement system is a crucial component in the entire project performance evaluation and improvement system.

CHAPTER 6
CURRENT PRACTICES IN MEASURING ENGINEERING PRODUCTIVITY

This chapter covers the practices of measuring productivity in various industries, with emphasis on engineering organizations. First, an overview of productivity measurement in various fields such as manufacturing, health care, academics, service industries, and so forth, is presented. Afterwards, a thorough review of the control systems used in engineering organizations is given. The chapter particularly elaborates on engineering productivity metrics and their individual components, i.e., engineering work-hours and output quantity. Practices of measuring productivity for the piping discipline are used to demonstrate such concepts in a typical industrial construction project.

6.1 Literature Review on Productivity Measurement for Engineering and Other Fields

In its simplest form, productivity is a relationship between an output and the means to produce this output. Productivity is commonly defined as the efficient utilization of resources (inputs) in producing goods or services (output) (Sumanth 1984). Three basic types of productivity usually emerge in the literature, that is, partial productivity, total-factor productivity and total productivity. The scope of the research conducted by the RT-156 can be classified as a partial productivity study, as it relates the output to a single type of input. Another example of this class is the construction labor productivity that

relates the installed quantities for a construction activity (output) to the productive work hours utilized to deliver it.

According to Walsh (2000), there is surprisingly little work conducted in the definition, measurement, process, application, or reporting of productivity for the engineering profession (refer to Appendix C for the complete study conducted by Walsh on measuring productivity in various industries). The most common place where one finds reference to the general topic is in literature with a bias for a particular tool or technique. In such cases, one frequently finds references to the idea that, by making use of the technique under discussion, one can improve the productivity, quality, and/or performance of engineers. Examples of such claims are many, and can be found in all branches of engineering. A recent review uncovered discussion of potential performance improvements resulting from the use of concurrent engineering schemes (Isbell, et al. 1994), certain specialized software engineering techniques (Bidanda and Hosni, 1993; Jundt, 1996), particular software for integrated circuit or civil engineering design (Girczyc and Carlson, 1992; Winter, 1985; Benayoune and McGreavy, 1996), ergonomic office design (Hacker, 1984), or particular approaches to engineering management (Repic, 1990; Thamhain, 1997; Graham, 1996; Probert and Lew, 1993; Sackett and Evans, 1994). In these studies there is an abundance of conclusions and a lack of data. Fenton (1993) found that the claims of dramatic productivity improvements in the software engineering industry are usually not defensible when performance data are actually collected.

Productivity and performance measures are certainly common in other industries, and a rich econometric literature on the subject exists (see, for example, the excellent

summary collection by Christopher and Thor, 1993). However, measures in the econometric literature tend to focus on the measurement of manufacturing productivity. In the manufacturing environment, productivity can be simply expressed as the amount of production or output (perhaps measured in completed pieces) per some unit of input resources (perhaps time or quantity of input resource). Performance or quality measures are somewhat more complex, but are generally statistical interpretations of the product testing failure rate, the product return rate from the ultimate consumer, or some kind of survey-based customer satisfaction rating.

Construction is often likened to manufacturing, as there are many similarities and the primary differences relate to the relatively unique nature of a given project and the lack of control of the environment at the work space. In addition, many of the processes which take place on a construction site are, taken by themselves, quite repetitive, and as a result construction productivity and performance measurement builds heavily on manufacturing methodologies. Thomas and Kramer (1988) provide an excellent and frequently used summary of construction productivity measurement, although they define productivity as input/output instead of the more common output/input. In construction, productivity is generally defined as the output of work put in place for some unit of input (for example, yards of concrete finished per labor hour, or pipe hangars installed per labor hour). When multiple steps are required for a portion of the work, methods for allocating partial completion, usually based on a weighted earned-value analysis, are used. Performance measures are also commonly made, the most well-known being materials testing for QA/QC activities during construction. In this case, the quality of the performance itself can be measured, using an index such as the ratio of actual productivity to planned productivity, or the work completed to date compared to the scheduled completion as of that date.
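The weighted earned-value allocation of partial completion mentioned above can be sketched as follows; the work steps, weights, and completion fractions are invented for illustration.

# Hedged sketch of weighted earned-value allocation of partial completion.
# The steps, weights and completion fractions below are invented examples.

steps = [
    # (work step, weight in the item's budget, fraction complete)
    ("detail drawing issued",        0.40, 1.0),
    ("checked",                      0.30, 1.0),
    ("approved for construction",    0.30, 0.5),
]

percent_complete = sum(weight * done for _, weight, done in steps)
print(f"{percent_complete:.0%} of the item's budgeted work is earned")  # 85%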

This productivity measure is so important that it is a common field of study in construction management curricula in the United States and is commonly used in the industry. It is interesting to note that, with construction productivity measurement a clearly defined process, it has been possible to complete detailed evaluations of the factors which might affect it on a particular job, which is ironic given that it was not that long ago that the problem was considered too complex by practitioners (Thomas and Kramer, 1988). For example, Sanders and Thomas (1991) consider, using a statistical regression process, a number of factors which influence masonry productivity. Other studies have considered the impact of overtime (Thomas, et al. 1995), changes during construction (Hanna, et al. 1998), wait times between consecutive trade processes (Howell, 1992), and even geographic location (Proverbs, et al. 1998). This is by no means an exhaustive list, but serves to point out how the many complex factors which might affect construction productivity can be evaluated given an accepted measurement. The increase in understanding leads, in turn, to an improved ability to estimate, plan, and control construction projects (Thomas, et al. 1990, 1993).

The measurement of productivity and performance is considerably more complicated outside the manufacturing environment, where in many cases the product is more difficult to define, and the inputs to that product more nebulous. In many professions, complex processes are boiled down to a bottom-line measure of an individual. The clearest example comes from a number of studies of professional baseball managers (Porter and Scully, 1982; Ohtake and Ohkusa, 1994), in which the winning percentage of the team over the season is taken as the bottom-line measure of the manager's performance.

The measurement of productivity by measurement of bottom-line performance is fairly common in service and technical industry sectors, and there is a long tradition of measurement of individual productivity in service industries. The use of bottom-line performance also appears to be common in the professions frequently used in comparison with engineering. The productivity of doctors is often reported in terms of patients seen (or related measures such as billings or bookings) per given time period (Lee, et al. 1990; Reuben, et al. 1993; Wimo, et al. 1995). Attorney productivity may be measured using client billings per year (Rebitzer and Taylor, 1995). The productivity of professors is very complex, because the right output to measure is quite unclear (students, papers, research projects, service activities, etc.), but journal publications per year appears to be a fairly common measure (Pelz and Andrews, 1966; Stephan and Levin, 1993). Stephan and Levin (1997) point out that individual productivity may

have been studied so much because it is relatively easy to measure, but that this measure is inappropriate for projects requiring significant degrees of collaboration. Measurement of the productivity or performance of individual doctors or attorneys is sensible, as they often are in essentially sole charge of a given case or “project”, and perform or directly supervise nearly all activities related to it. As applied to engineering, however, measures such as those outlined above have significant limitations. Individual engineers handle by themselves only the very smallest projects; most engineering projects require the activities of dozens of engineers, usually employed in several different companies (Anderson and Tucker, 1994). As a result, the productivity of the individual engineer is of interest only to their direct supervisor in charge of continuing employment review, but certainly not to the owner of the project on which that individual toils.

As has been described previously, profit-based measures of productivity (such as billings/year) are very useful for making employment decisions about individuals, but less so for use in identifying avenues of possible improvement and understanding the quality of the performance. Profit-based measures, furthermore, do not provide a measure of quality. The measure of billings per year for attorneys, then, describes profitability but gives no indication of the number of cases for which hours were billed that the attorney ultimately won for their client. Wimo, et al (1993) point out this shortcoming for the measure of bookings/period for health care workers, and propose additional consideration of the patient mortality/period as a measure of the quality of physician practice.

However, a few attempts to make quantitative measurements of group performance have been made. Mairesse and Kremp (1992) measured company productivity on average in France, using the measure of average value-added (in essence, total corporate profit reported by engineering firms in the nation) per employee. This measure is interesting to owners of engineering companies, but does not provide a management tool for identifying good performance.

Fisher (1991) proposes a similar method for engineering company owners to use to identify strongly and weakly performing branch offices.

References to productivity and/or performance of engineers in the literature are typically qualitative in nature, and usually made in the context of a recommendation for a given product. These references chiefly relate to the productivity of individual engineers and their ability to do more work in a given time, or better work in the same time, and are not particularly useful to the owner. This literature provides no means of estimating the cost or schedule for new projects, describing the factors which influence performance, or differentiating better performing engineers.

Engineering quality is a field which has generated a great deal of literature, but again it is largely qualitative. For the most part, discussions of engineering quality can be described as study of the efficacy of certain management schemes for improving qualitative performance. For example, there is a significant body of literature which carries the Total Quality Management banner to engineers (Brown and Beaton, 1990; Burati and Oswald, 1993). Partnering has received a considerable amount of attention as a means to improve the quality of the engineering activity (Featham and Sensenig, 1994; Porter, 1996; Warne, 1994). Some research has tied project success to use of a number of management techniques during the design process or to the ability of the engineering manager (Anderson and Tucker, 1994; Russell, et al. 1996; Thompson, 1993; Packer, 1995). These techniques are undoubtedly important, but this literature also provides no means of evaluation or quantification.

Certainly the most rigorous evaluation of engineering performance to date arises from work sponsored by the Construction Industry Institute (CII) in the middle 1980's (Chalabi, et al. 1987; Stull and Tucker, 1986). Chalabi, et al (1987) identified seven general parameters which described the quality of the design work, based on an extensive survey of owners, engineers, and designers. The parameters they identified were: final project schedule, final project cost, quality of design, constructability, performance and safety, plant start-up,

and plant utilization. They then set about identifying what owner inputs from the planning process most directly influenced which of these parameters, also using a qualitative survey method. In this way, they pointed out the importance of the input of the owner and the work of previous phases in the procurement process to the performance of the engineer.

It was previously pointed out that the construction phase, which utilizes the output from the design phase in traditional delivery methods, is reasonably well quantified in terms of performance. Furthermore, there are quantitative methods for evaluating the scope definition and planning phase which creates the input to the design phase. Dumont, et al (1997) report the development of a Project Definition Rating Index (PDRI), which built on the work of Chalabi, et al (1987). The PDRI consists of 70 elements which describe the state of the project definition and scoping process, combined into a single index with a standardized weighting scheme. The index has been shown to be correlated to project success (with R2=0.4), and is now frequently used by a number of large industrial owners to guide their pre-project planning process. The correlation to project success was performed by comparing the PDRI to an index function of ratios of planned to actual schedule, budget, and plant capacity. Other similar indices exist, but are proprietary methods developed by corporations.

In recognition of the complexity of the design effort for an entire project, Stull and Tucker (1986) attempted to make a quantitative evaluation of the effectiveness of the design process itself. These authors pointed out the distinction between productivity and effectiveness, productivity relating to the use of resources to complete a task, effectiveness relating to the adequacy of work done on the task after it is completed. They considered only the piping design process and the subsequent construction of the piping described by the resulting design documents.

Piping was selected because it is a demonstrably important part of the overall project. They were aware of the ongoing work of Chalabi, et al. (1987), but because they were attempting to develop a quantitative measure they selected slightly different parameters to describe the effectiveness of the design, namely: accuracy of the design documents, usability of the design documents, cost of the design, economy of the design, constructability, performance against design schedule, and ease of start-up. They then attempted to evaluate these parameters. Some were evaluated quantitatively (e.g. “accuracy” = number of drawings requiring revision/total number of drawings), while others were evaluated using a 1-10 scale (e.g. constructability and usability). All of these factors were then entered into an objectives matrix, in which the value of each parameter was mapped to a standard for that parameter. Then all of the mapped parameter scores were combined into a single score ranging from 0-1000 by weighting the individual scores. Through this process, the performance of the design effort could be evaluated, at least semi-quantitatively.

However, this procedure suffers from the subjectivity of the scoring of individual parameters and the weighting of those parameters into a combined score. For example, on a small project, revision of 20% of the design documents might be acceptable, but a different threshold might be appropriate on a larger project, and certainly the complexity of the project and the readiness of the owner would be expected to influence this parameter regardless of the ability of the engineer (Glavan and Tucker, 1991; Dumont, et al. 1997). Nonetheless, this work does validate the idea of evaluating design effectiveness.
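A minimal sketch of such an objectives-matrix roll-up follows; the mapping function, weights, and sample values are hypothetical, not Stull and Tucker's actual standards or weighting scheme.

# Hedged sketch of an objectives matrix: each parameter value is mapped to a
# 0-10 score against a standard, then weighted into a single 0-1000 score.
# The mapping, weights and sample values below are hypothetical.

def accuracy_score(revised, total):
    """Map the revision ratio to 0-10 (0% revised -> 10, 50%+ revised -> 0)."""
    ratio = revised / total
    return max(0.0, 10.0 * (1 - ratio / 0.5))

scores = {
    "accuracy":         accuracy_score(revised=40, total=400),  # quantitative
    "constructability": 7.0,                                    # 1-10 judgment
    "usability":        8.0,                                    # 1-10 judgment
}
weights = {"accuracy": 40, "constructability": 30, "usability": 30}  # sum 100

combined = sum(scores[p] * weights[p] for p in scores)  # 0-1000 scale
print(combined)  # 8.0*40 + 7.0*30 + 8.0*30 = 770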

6.2 Engineering Control Systems in Industrial Construction

The literature cites little about engineering control systems compared with the abundance of articles that address the control of construction activities. In the 1980's, the CII Cost/Schedule Task Force conducted one of the few comprehensive studies on the topic of project control in design engineering (Diekmann and Thrush 1986). The study primarily concerns the tracking of engineering cost and schedule performance, as compared to construction activities. This study indicated that A/E firms, almost solely, track the production of drawings as their primary control tool. Eldin (1991) gave a concise description of the common practices in engineering/design phase control and management. It is primarily based on the work breakdown structure (WBS) for establishing a control budget. Eldin further proposed a management system that would complement the deficiencies in the then current practices, extended tracking for all types of engineering documents, and a quantitative method for progress measurement.

Apparently, engineering organizations lack a standalone system for measuring engineering productivity. Today, the engineering management and control systems that are applied in engineering organizations remain the primary platform for measuring and tracking engineering performance and productivity. Although engineering control in essence differs from engineering productivity measurement, the engineering control systems employed in engineering organizations provide the means for controlling cost and schedule, which are directly influenced by engineering productivity. The comprehensive review conducted by the RT-156 researchers, through site

visits and expert interviews, revealed that measuring engineering performance and productivity is commonly based on the Earned Value concept as part of a broader engineering control system. The basics of the earned value concept as applied to engineering control systems will be discussed in the subsequent section.

6.3 Earned Value Concept for Measuring Engineering Performance

The earned value is a technique for evaluating the performance of a control account. The earned value concept has been elaborated on in various publications (Kerzner 1998, McConnell 1985, Thamhain 1992). The earned value concept, as applied in engineering organizations, is quantified by means of three basic indicators (Wynton 1986). They are: actual work-hours of work performed, budgeted work-hours of work performed, and budgeted work-hours of work scheduled.

• Actual work-hours of work performed represent the work-hours actually incurred by engineers or designers in accomplishing the work performed, and they are directly collected from the time sheets.

• Budgeted work-hours of work performed (earned work-hours) signify the value of the completed work. Accordingly, this indicator reflects the progress made in comparison to the original plan. The budgeted work-hours of work performed for a control account can be calculated as follows:

Earned work-hours = estimated unit rate x actual quantity to date

where the estimated unit rate is the estimated time for completing an individual unit of the corresponding engineering deliverable or work task.

• Budgeted work-hours of work scheduled represent the original planned work-hours to be completed by the date of evaluation. This value signifies the planned progress and is calculated as follows:

Budgeted work-hours = estimated unit rate x budgeted quantity to date

where the estimated unit rate is the estimated time for completing an individual unit of the corresponding engineering deliverable.

Based on the indicators calculated above, the performance of a certain engineering control account can be evaluated. Among the common evaluators are the work-hours variance and the performance factor (PF), which are calculated as follows:

Work-hours variance to date = Actual work-hours of work performed - Earned work-hours of work performed

PF = Actual work-hours of work performed / Earned work-hours of work performed

Also, the PF is sometimes reported as:

PF = Earned work-hours of work performed / Actual work-hours of work performed
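A compact sketch of the indicators and evaluators defined above, with invented sample quantities:

# Minimal sketch of the earned value indicators defined above.
# All quantities are work-hours; the sample numbers are invented.

estimated_unit_rate = 6.0   # estimated hours per isometric (illustrative)
budgeted_qty_to_date = 50   # isometrics planned by the evaluation date
actual_qty_to_date = 45     # isometrics actually completed
actual_hours = 300.0        # collected from time sheets

earned_hours = estimated_unit_rate * actual_qty_to_date        # 270.0
scheduled_hours = estimated_unit_rate * budgeted_qty_to_date   # 300.0

variance_to_date = actual_hours - earned_hours   # 30.0 hours of overrun
pf = actual_hours / earned_hours                 # ~1.11; > 1 is unfavorable here
pf_inverse = earned_hours / actual_hours         # the alternative PF reporting

print(variance_to_date, round(pf, 2), round(pf_inverse, 2))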

unit rate.. i. a basic element of the system. the following predictions can be made for that account: Estimated work-hours at completion = Total budgeted work-hours + Work-hours variance to date Estimated work-hours to complete = Estimated work-hours at completion – Actual work-hours of work performed Review of several engineering control systems reveals that engineering productivity (input/output ratio) is not a common outcome of these systems. However. The terms unit rate and productivity are sometimes used interchangeably in literature. If productivity is defined as the ratio between input and output. In the case that productivity is defined as the ratio between output and input. the two terms coincide.e. 6. is equivalent to productivity in a sense that it is a ratio between productive work-hours and the quantity of work performed. the unit rate is the reciprocal of productivity.4 Components of the Engineering Performance Control Systems The success of the earned value concept in measuring engineering productivity or evaluating engineering performance depends 65 on the ability to accurately .Assuming the original unit rate will be followed for the remaining work of a certain engineering control account.

estimate/calculate several system elements. This section will discuss this issue in detail with some illustrated examples from the engineering control systems in use by RT-156 companies.

6.4.1 Level of Control

Similar to construction control systems, an engineering control system depicts a hierarchy of control accounts that are tied to the WBS of the project. The WBS primarily defines the various work packages required to deliver the facility. A basic characteristic of an engineering control system is to specify the level of detail at which information would be gathered. A hierarchy of engineering control accounts would contain the following levels:

• Total engineering on a project.
• Engineering disciplines, e.g., piping.
• Group of work tasks / deliverables, e.g., production of orthographic drawings.
• Specific activities (activity codes), e.g., all listed isometrics for a specific sub-model.
• Individual deliverables, e.g., isometrics for a specific 3D sub-model.

A significant portion of the reviewed RT-156 control systems gather both input and output information at either the group of work tasks/deliverables or the specific activity level. Afterwards, information can be accumulated to provide a means to evaluate performance of engineering design at higher levels of the hierarchy. Figure F.1 (Appendix F) illustrates a sample list of specific activities (activity codes) control accounts.

The results of the questionnaire survey reveal that most engineering organizations track performance of their engineering disciplines in each project. On the other hand, it is rare for engineering organizations to track performance of individual personnel in a certain project. Figure 6.1 illustrates the findings of the survey. Interviewed experts advised that engineering organizations not track work-hours of individual deliverables or individual personnel. They further pointed out that controlling by WBS work packages (equivalent to the specific activities level of control) is the sufficient and effective method in engineering project control.

Figure 6.1 Levels of Tracking Engineering Performance (reported frequencies: Discipline 91%, Total Engineering 77%, Group of work tasks/deliverables 68%, Specific Activities 59%, Individual deliverables 27%, Individual personnel 14%, Others 14%)

6.4.2 Engineering budget
At the early stages of a project, the earnable work-hours (i.e., those that can be earned) for the various project control accounts are estimated. The estimate is based on the simple equation:
Budgeted work-hours (control account) = estimated unit rate x budgeted quantity
where the unit rates associated with each deliverable/task are obtained from the company database/experience. Figure F.2 shows a sample list of a company-developed database of various unit rates. The accuracy of both estimates constitutes the mainstay of a successful control system.
Several approaches are common industry-wide for estimating the total earnable work-hours to be expended on a job. The general approaches for developing the engineering budget include:
A. Project Manager and/or Project Engineering Manager break down total budgeted work-hours among engineering disciplines.
B. A discipline Chief Engineer, with his/her principal engineers and design supervisors, estimates the work-hours needed for engineering work on the job; these estimates are then reviewed and accumulated with other disciplines by the Project Manager and/or Project Engineering Manager.

C. A mutual effort between the Project Manager, Engineering Manager and the discipline Chief Engineer, for internal use by engineering control personnel.
D. Design leaders provide the basic estimate, which is reviewed and approved by the Chief Engineer and then accumulated with other disciplines by the Project Manager and/or Project Engineering Manager.
E. A feedback process that starts from designers and engineers and moves up through the hierarchy.
The RT-156 survey indicates that the 2nd and 4th approaches, as listed above, are the more popular industry-wide. Figure 6.2 illustrates the findings.
[Figure 6.2: Approaches for Estimating Engineering Work-hours Budget at Project Authorization - frequency of approaches (A) through (E) and others]
The contract type for a job greatly determines the level of detail in the initial estimate of total earnable work-hours. For lump sum fixed price contracts, the utmost degree of detail is crucial, as opposed to cost reimbursable contracts. In all cases, however, an accurate estimate of engineering work-hours is crucial for the control of engineering tasks and activities to be performed in the project.

6.4.3 Input (Work-Hours)
Engineering work-hours that are expended to accomplish the work of a control account constitute the main input to the engineering and design process. Reporting of these work-hours is usually done through the time sheets of engineers and designers, which are filled out on a periodic basis, e.g., weekly. However, to maintain an efficient control system, the breakdown of engineering work-hours must be consistent with the work to be performed (Thomas and Kramer 1988). Table 6.1 illustrates the breakdown of piping work-hours as tracked in the engineering control system (at the group of deliverables/work tasks level of control) of one RT-156 organization, along with their estimated percentage of the total piping engineering work-hours.

Table 6.1. Sample Breakdown of Piping Work-hours
Piping Task            Percentage
Layout                 44%
Isometrics             20%
Stress analysis        15%
Material requisitions  5%
Underground piping     5%
Piping support         11%
Total                  100%

The survey further investigated the components of the engineering control system that are used for collection of the estimated, expended (actual), earned and predicted work-hours. The findings are illustrated in Figure 6.3.

[Figure 6.3: Overview of Engineering Control Systems Regarding Work-hours - frequency of tracking original budgeted, revised budgeted, expended (actual) this period and to date, earned this period and to date, remaining (forecast to be completed), projected % completion, and forecast work-hours at completion]

6.4.4 Output (Quantity)
Tracking of engineering quantities completed to date or during a certain period is the more challenging task of the engineering control system. Generally, engineering activities to be tracked in the control system are classified as either deliverables or tasks. Deliverables depict physical engineering products such as drawings, specifications, and material requisitions. Tasks are supporting activities that are not classified as deliverables, while being essential for the production of other engineering deliverables.

Design 3-D models are a typical example of an engineering task, since they constitute an inevitable engineering activity for all consequent engineering deliverable production processes, e.g., drawings and specifications. Yet design models are recently becoming more of a deliverable, especially with the interest of some owners in having the model submitted with the other conventional engineering deliverables.
To assist in measuring progress or the amount of work completed for a control account, the engineering control systems adopt the same set of techniques that are used for measuring construction activity progress. Experts interviewed pointed out that the common techniques include units completed, percent complete, level of effort, incremental milestones, and start/finish. In an earlier CII study, Thomas and Kramer (1988) provide a clear definition of each of the aforementioned techniques, as follows:
• Units completed: An actual counting of the units of work completed.
• Percent complete: A subjective approximation of the percent complete for relatively minor tasks and where development of a more complicated intermediate milestone or level of effort formula is not justified.
• Level of effort: A process of assigning a pre-determined percent complete to a task on the basis of completion of various measurable subtasks. Usually the effort of tracking those individual subtasks is considerable and costly.

• Incremental milestones: Similar to the level of effort method, except that it is used when only a few non-measurable subtasks exist, e.g., draft of calculations, system description, etc.
• Start/finish percentages: A process of assigning either 0% or 100% to a task with no intermediate milestones, e.g., piping isometrics, equipment list, etc.
The use of each of the listed methods depends primarily on the nature of the engineering activity under consideration. Tasks with intermediate points or defined subtasks are better candidates for methods like level of effort and incremental milestones. Minor tasks or tasks with no clear intermediate points are better candidates for methods like percent complete or start/finish. Figure 6.4 represents an overview of the industry in terms of the techniques used to estimate the amount of engineering work completed.
[Figure 6.4: Methods for Tracking Engineering Work Progress - frequency of use of units completed, percent complete, level of effort, incremental milestone, and start/finish]
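Both the level of effort and incremental milestone techniques amount to a weighted checklist of intermediate points. The sketch below illustrates the idea; the milestone names and weights are invented for illustration (the RT-156 guiding structures appear in Table F.1, Appendix F):

```python
# Percent complete from a hypothetical intermediate milestone structure.
# Each milestone carries a pre-assigned weight; progress is the sum of
# weights for the milestones reached so far.

pid_milestones = [                      # e.g., for a P&ID
    ("issued for internal review", 0.30),
    ("issued for client comment", 0.30),
    ("issued for design", 0.25),
    ("issued for construction", 0.15),
]

def percent_complete(milestones, reached):
    return sum(weight for name, weight in milestones if name in reached)

done = {"issued for internal review", "issued for client comment"}
print(percent_complete(pid_milestones, done))   # 0.6 -> 60% complete
```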

Based on the piping engineering and design process, a list of piping deliverables and work tasks was developed (Appendix D). A survey of the industry was conducted to investigate the extent to which each of the listed techniques for tracking the amount of engineering work completed is used for each of the piping deliverables and work tasks. Table 6.2 shows the industry-wide popular approaches for tracking engineering progress for piping.

Table 6.2 Tracking of Work Completed in Piping Engineering and Design
Method                  Engineering deliverables and work tasks
Units completed         Valve list
Percent complete        Initial draft of calculations; plant layout model / equipment general arrangement model; piping 3D model; orthographic drawings; underground piping drawings; stress analysis reports; pipe support drawings and specs
Level of effort         Piping isometrics; bill of materials
Incremental milestones  Process flow diagram (PFD); piping and instrumentation diagrams (P&IDs); equipment list; piping design criteria; piping design specifications; plant layout model / equipment general arrangement model; piping 3D model
Start/finish            System description; certified spool sheets

Constructing a control system for the piping deliverables and work tasks that are better tracked through the level of effort or intermediate milestone methods is more difficult. It requires the identification of an intermediate point structure that signifies the subtasks composing the deliverable or work task under consideration. Furthermore, a percentage of completion is required to accompany the identified intermediate points whenever these methods are used for the corresponding deliverable or work task. A sample of the level of effort and intermediate milestone structures adopted by one RT-156 company for the piping function is given in Figure F.3 (Appendix F). Expert interviews from RT-156 companies showed that these structures could differ significantly between organizations. Through the expert interviews and site visits to several RT-156 companies, the researchers developed a comprehensive guiding list of level of effort / intermediate milestone structures for various piping deliverables and work tasks. Refer to Table F.1 (Appendix F) for the listing of these level of effort/milestone structures.
Similar to engineering work-hours, the survey further investigated the components of the engineering control system that are used for collection of the estimated, actual, and remaining work quantities. The findings are illustrated in Figure 6.5.

[Figure 6.5: Overview of Engineering Control Systems Regarding Work Quantities - frequency of tracking original budgeted quantity, revised budgeted quantity, variance of quantity from revised budget, finished quantity this period and to date, actual % complete this period and to date, remaining quantity (to be completed), projected % complete, and quantity at completion]

6.4.5 Unit rate and productivity
The engineering process inputs, depicted by the work-hours, and its outputs, depicted by the completed quantities, represent the two components of productivity measurement and evaluation in engineering organizations. Yet it is important to recognize that engineering productivity in this context is part of an engineering control system rather than a measurement and improvement system. Expert interviews and surveys indicated that the unit rate is used more often to signify the ratio between the input and the output.

On the other hand, the term productivity is mistakenly used to signify what is known as the performance factor (PF) in the literature (presented in a preceding section).
As mentioned earlier, engineering work-hours and quantities can be collected at different levels of detail, and the collected information can be further accumulated to represent the higher levels of the hierarchy (e.g., engineering disciplines). As productivity measures constitute a base for both internal and external benchmarking, the RT-156 examined the use of productivity metrics for benchmarking purposes. The levels examined are the engineering discipline productivity, the total engineering productivity on a project, and the annual company-wide engineering productivity. A tentative metric was developed for each, as follows:
• Engineering discipline productivity, measured as the ratio between total billable hours for a discipline and the installed cost they designed on a project.
• Total engineering productivity on a project, measured as the ratio between total billable hours for engineering and the installed cost they designed on a project.
• Annual company-wide engineering productivity, measured as the ratio between total annual billable hours for engineering and the installed cost they designed each year.
The survey showed that the annual company-wide engineering productivity is the least tracked productivity metric among all three.

More than half the respondents indicated that their companies do not measure engineering productivity on an annual basis. Figures 6.6, 6.7 and 6.8 represent the findings of this survey. Furthermore, it was apparent that ratios relating expended work-hours, earned work-hours and budgeted work-hours are more dominant in measuring engineering productivity for either a discipline or total engineering in a project.
[Figure 6.6: Measuring Productivity for Engineering Disciplines - total billable hours/installed cost 15%; not tracked 15%; others 70%. Others include: earned hours / billable hours; earned hours; total billable hours / contracted hours; earned hours / expended hours; actual hours / budgeted hours; actual hours / earned hours]
[Figure 6.7: Measuring Total Engineering Productivity on a Project - total billable hours/installed cost 22%; not tracked 11%; others 67%. Others include the same ratios as in Figure 6.6]

[Figure 6.8: Measuring Annual Company-Wide Engineering Productivity - total billable hours/installed cost 17%; not tracked 55%; others 28%. Others include: total billable hours / contracted hours; actual hours / budgeted hours (on a monthly basis); actual hours / earned hours]
Although the tracking of engineering productivity through deliverables and work tasks is the more common practice industry-wide, interviews and surveys indicated that some organizations adopt other practices instead of tracking it at the more detailed levels. For instance, one of the companies visited uses the number of piping work-hours expended per length of pipe completed (in linear feet) as the basic measure for productivity of the whole piping package. The survey further investigated the various possible approaches that can be adopted in measuring the piping engineering productivity:
A) Engineering productivity = actual number of work-hours spent / actual quantity of engineering completed (e.g., drawings, isometrics, etc.)
B) Engineering productivity = actual number of work-hours spent / actual quantity of materials to be installed (e.g., linear feet of pipe, number of equipment pieces)
C) Engineering productivity = actual number of work-hours spent / production unit (e.g., barrels of crude oil refined)
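The three definitions differ only in the denominator, as this small sketch makes explicit (all quantities are hypothetical):

```python
# Piping engineering productivity under the three surveyed definitions.
hours_spent = 5200.0   # actual piping work-hours on a hypothetical job

prod_a = hours_spent / 260      # A) per isometric completed
prod_b = hours_spent / 12000    # B) per linear foot of pipe to be installed
prod_c = hours_spent / 50       # C) per production unit, e.g., per
                                #    thousand barrels of refining capacity
print(prod_a, prod_b, prod_c)
```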

Findings of the survey, illustrated in Figure 6.9, support the claim that using work-hours expended per deliverables or work tasks completed is the more common practice industry-wide.
[Figure 6.9: Basic Approaches in Measuring Piping Engineering Productivity - actual work-hours / actual quantity of engineering completed 35%; actual work-hours / actual quantity of materials to be installed 5%; actual work-hours / production unit 0%; others 32%; not tracked 5%; no response 23%. Others include: earned hours / billable hours; actual hours / earned hours; actual hours / completed lines]
Given this evident practice of using nontraditional metrics for measuring the engineering productivity of the piping engineering and design function, the RT-156 investigated several metrics that can be used for such a purpose. The refined tentative list of metrics that are currently being used, or can be used, for measuring the piping engineering productivity includes:
• Total Piping Work-hours / Drawing
• Total Piping Work-hours / ISO
• Total Piping Work-hours / Line
• Total Piping Work-hours / Line (2D Dwg)
• Total Piping Work-hours / Line (3D Dwg)
• Total Piping Work-hours / Equipment Piece

• Piping MTO Work-hours / Line, where MTO stands for material take-off
• Piping MTO Work-hours / Requisition
• Total Piping Work-hours / LF of Pipe
• Total Piping Work-hours / Valve Count
The survey investigated the feasibility of using one or more of these metrics in measuring the piping engineering productivity. This was done by identifying whether the data supporting these metrics are currently collected, or could be collected, in the various contacted companies. Figure 6.10 illustrates the responses to this investigation.
[Figure 6.10: Feasibility of Collecting Piping Engineering Productivity Data - for each of the ten metrics, the number of companies that currently collect, or could collect, the supporting data]

where total 82 . Figure 6. The survey requested identifying the top three most important/appropriate metrics for measuring piping engineering productivity. and then.11 Suitability of Piping Productivity Metrics Also. Total Piping Workhours/ LF of Pipe Piping MTO Workhours/ Line Rank 3 Rank 2 Total Piping Workhours / Line (3D Dwg) Rank 1 Total Piping Workhours / Line Total Piping Workhours / drawing 0 2 4 6 8 Frequency Figure 6.11.indicated that the data that can be used for calculating these metrics are either available or could be collected. Findings of this survey is given in Figure 6. ranking them from most important (1) to third in importance (3).10 illustrates the response for this investigation. the investigation focused on identifying the set of metrics that are more accurately represent the engineering productivity of the piping engineering and design.

Total piping work-hours per line and total piping work-hours per drawing were selected as the most important/appropriate metrics for measuring piping engineering productivity.
As with work-hours and quantities, the survey further investigated the basic productivity/unit rate measures in the engineering control systems. The findings are illustrated in Figure 6.12.
[Figure 6.12: Overview of Engineering Control Systems Regarding Productivity/Unit Rate - frequency of tracking the baseline, this period, to-date, forecast (to be completed), and forecast (at completion) measures, plus not tracked and no response]

6.4.6 Performance Evaluation
The review of the earned value concept earlier in the chapter illustrated that the evaluation of performance is the common end product of the engineering control systems. Both the work-hours variance and the performance factor can be calculated based on three indicators, i.e., actual work-hours of work performed, budgeted work-hours of work performed, and budgeted work-hours of work scheduled.

The preliminary reviews indicated that companies differ in the way they calculate these measures of performance, especially the performance factor. Four basic approaches were encountered:
A) PF = earned work-hours / budget work-hours (> 1 is GOOD)
B) PF = earned work-hours / actual work-hours (> 1 is GOOD)
C) PF = budget work-hours / earned work-hours (> 1 is BAD)
D) PF = actual work-hours / earned work-hours (> 1 is BAD)
The survey indicated that there is no specific trend for calculating the performance factor. Rather, the various approaches are used with varying degrees of popularity. Figure 6.13 shows the findings of the survey. A sample performance factor evaluation is given in Figure F.4 (Appendix F).
[Figure 6.13: Approaches for Calculating the Engineering Performance Factor - frequency of each of the four ratio conventions, plus not tracked and no response]
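The four conventions pair up as reciprocals: A and C compare earned against budget work-hours, while B and D compare earned against actual work-hours. A small sketch of restating any reported PF so that values above 1 are always favorable (the convention labels follow the list above):

```python
# Restate a performance factor so that > 1 always reads as favorable.

def normalize_pf(pf, convention):
    if convention in ("A", "B"):   # already defined with > 1 as GOOD
        return pf
    if convention in ("C", "D"):   # defined with > 1 as BAD: invert
        return 1.0 / pf
    raise ValueError("convention must be 'A', 'B', 'C', or 'D'")

print(normalize_pf(1.25, "D"))   # 0.8 -> unfavorable in either convention
```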

Variance or deviation is the other measure that signifies the engineering performance in an organization. Figures F.5 and F.6 (Appendix F) illustrate samples of calculating and visualizing deviations of engineering performance from members of the RT-156. The survey also investigated the variance-related measures in the engineering control systems. The findings are illustrated in Figure 6.14.
[Figure 6.14: Overview of Engineering Control Systems Regarding Variance or Deviations - $ overrun or underrun 64%; hours overrun or underrun 86%; quantity completed overrun or underrun 5%; log of deviations 36%; corrective action plan 27%; budget revisions 59%; others 5%; no response 9%]

6.4.7 Practical examples
The control systems used by the various interviewed/surveyed companies are quite similar. The basic operations performed by a typical control system are:
• Measure actual progress against planned.
• Measure performance of work executed.
• Identify deviations.

• Implement corrective actions.
• Forecast.
The engineering control system measures the engineering performance for a specific control account according to the steps listed below:
• Calculate the physical progress in producing engineering deliverables or performing engineering tasks for the control account. This value represents the quantity of work performed. The techniques used for this process would primarily belong to one of the five techniques mentioned earlier.
• Calculate both earned work-hours and budget work-hours, according to the following equations:
Earned work-hours = estimated unit rate x quantity completed to date
Budget work-hours = estimated unit rate x planned quantity to date
• Calculate the engineering % complete for the control account:
% complete = earned work-hours / budget work-hours
• Calculate the control account performance factor (PF), as follows:
PF = actual work-hours / earned work-hours, or one of the other conventions listed above.
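Strung together for a single hypothetical control account, the steps look as follows (every figure is invented for illustration):

```python
# End-to-end evaluation of one control account, following the steps above.
unit_rate = 8.0              # estimated hours per isometric
planned_qty_to_date = 150    # isometrics planned by the data date
completed_qty_to_date = 120  # isometrics finished (physical progress)
actual_hours = 1050.0        # hours charged to the account so far

earned = unit_rate * completed_qty_to_date   # 960 earned work-hours
budget = unit_rate * planned_qty_to_date     # 1200 budget work-hours

pct_complete = earned / budget               # 0.80 -> 80% complete
pf = actual_hours / earned                   # ~1.09 (> 1 is BAD here)
print(pct_complete, pf)
```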

The presented procedure is primarily followed in the two samples presented in Figures F.7 and F.8 (Appendix F). The samples are taken from different companies; one depicts the earned value concept applied to the piping engineering and design, and the other its application to the total engineering on the project.

6.5 Summary
The industry lacks a unified approach for measuring the productivity of engineers and designers. While productivity is sometimes addressed in the engineering control systems employed by engineering organizations, no stand-alone systems exist to specifically measure and track productivity. The common practice is to use metrics based on the engineered deliverables, such as isometrics, specification sheets, material requisitions, and so forth, to estimate and assess the productivity of engineering disciplines. Interviews with industry personnel revealed that such an approach is becoming outdated and unreliable, especially with the use of 3D CAD models.
With regard to the piping discipline, an industry survey conducted by the RT-156 showed that a number of productivity metrics can be more reliably used to measure the productivity of this discipline. These metrics depend primarily on engineered units such as length of piping in LF, number of lines, and so forth. The survey further indicated that companies have the capability to collect data pertinent to these metrics.

CHAPTER 7: ENGINEERING PRODUCTIVITY MEASUREMENT SYSTEM
This chapter introduces the new approach for measuring engineering productivity as defined by the RT-156. First, the conceptual framework for measuring the effective engineering productivity in an industrial project is presented. The chapter then covers the similarities between such an approach and methods used for productivity measurement in the software industry. Using industrial project data, statistical techniques are then employed to demonstrate the use of the new approach for measuring productivity in the piping discipline.

7.1 Basics
As introduced in chapter 6, productivity is a relationship between an output and the means to produce this output (input). The literature uses one of two forms to measure productivity: 1) productivity = output / input, or 2) productivity = input / output. Accordingly, the inputs and outputs of the specific system under consideration constitute the mainstay of measuring this system's productivity.
Measurement of engineering productivity in the industrial sector can be considered from various perspectives, whether as a stand-alone engineering productivity or as engineering productivity within total company/project productivity, each of which applies to a different level of control. This calls for varying inputs and outputs to be collected for productivity measurement purposes. An enumeration of the various levels of control for measuring engineering productivity would include:

I. Company's EPC Work Process
II. Project
III. Engineering Performance on a Project
IV. Engineering Discipline Performance
V. Discipline Activity Deliverables
VI. Individual Person's Performance on a Deliverable
Discussions among the RT-156 members led to a consensus that the research activity should primarily focus on level IV of the above hierarchy, and possibly level III. Measuring productivity at the more detailed levels, i.e., levels V and VI, would be limited to the summary of existing practices as presented in chapter 6. Figure 7.1 shows several illustrations of measuring engineering productivity at some of the levels of control listed above.
Consider more closely the measurement of engineering productivity at the discipline level. The total discipline work hours constitute the basic resource that is expended to deliver the various engineering products of that discipline. Thus, they serve as a prime measure of the engineering discipline inputs. Besides, total material take-off (MTO) hours were previously identified as another alternative for the input of the productivity measure. Contrary to the inputs, identifying the quantity delivered as representative of the output of the productivity measure can be more challenging.

Review of the current engineering practices showed that engineering deliverables, such as drawings and isometrics, are the major outputs tracked in the engineering control systems. Nevertheless, several of the interviewed industry practitioners, in addition to the RT-156 members, pointed out that engineering deliverables are probably misleading measures of the delivered product. The survey results shown in chapter 6 illustrated that other measurable outputs, such as the number of lines of piping, the length of lines of piping, and so forth, would represent a more reliable measure of the engineering output product.
[Figure 7.1: General Approaches of Measuring Engineering Productivity - e.g., hours per deliverable by project, person-hours per field quantity by project, and engineering & construction hours per field quantity compared across companies]

7.2 Conceptual Framework for Measuring Engineering Productivity
Figure 7.2 illustrates the preliminary scope model for the RT-156 study. In general, the study focuses on the engineering work processes and the engineering disciplines that carry out such processes. Both the inputs and the outputs of these work processes have been addressed in the earlier CII studies, as explained in a previous chapter. The research addresses the expendable inputs, the quantity of outputs of the engineering work processes, and the productivity measures that relate the two.

etc.PRODUCTIVITY = INPUT (Engineering Resources) OUTPUT (Quantity) WORK PROCESS ENGINEERING WORK PROCESS INPUT WORK PROCESS OUTPUT INPUT QUALITY RESOURCES QUANTITIES (INPUT) (OUTPUT) RESULTS CII Publication 8-1 Evaluation of Design Effectiveness CII Publication 8-2 INPUT VARIABLE IMPACT ON DESIGN EFFECTIVENESS Factors: * Accuracy of Design Documents * Usability of Design Documenrts * Cost of Design * Constructability * Economy of the Design * Performance against Schedule * Ease of Start-up Factors: * Scope Definition * Owner Profile and Participation * Pre-Project Planning * Project Objectives * Basic Design Data * Selection and Qualification of Designer * .2 CII RT-156 Preliminary Scope Model 90 . . . Figure 7.

Based on the preliminary scope model and the extended discussions held by the RT-156 members, several crucial issues concerning the measurement of engineering productivity were drawn into attention. Among these issues are the following:
• The ratio between inputs and outputs that signifies engineering productivity can differ significantly between projects. This is primarily due to the varying conditions that characterize each of these projects. Therefore, a true measure/estimate of engineering productivity must take into consideration the major drivers that cause productivity to differ among projects.
• Earlier studies by the CII (Chalabi et al. 1987) identified a set of input variables that are believed to be the major drivers of engineering performance. Examples are completeness of scope definition, level of owner experience with process technology, level of designer experience with process technology, basic design data, designer qualifications and capacity, project objectives and priorities, constructor input, impact of site and environmental conditions on design practices, and so forth. A later research activity of the CII (Gibson and Dumont 1996) resulted in developing an index that can be used in measuring the completeness of scope definition, also known as the PDRI score.
• Differences between projects are characterized not only by the project input variables but also by the scope and complexity of the work to be performed. Examples are project size ($TIC), number of mechanical equipment pieces, comparative level of complexity with projects of the same sector, and so forth.
• While productivity is a basic measure of performance, it can be quite misleading if considered on a stand-alone basis. For instance, a smaller number of work hours expended on a job would not be favored if a substantial amount of design errors and omissions existed. Another previous CII study addressed the issue of a comprehensive design performance index (Tucker and Scarlett 1986). Using an objective matrix approach, the study quantifies a design effectiveness index as representative of design performance.
Conceptually, we need to measure effective productivity, and this can be done by adjusting the raw productivity with several factors: Input Quality, Scope & Complexity, and Output Quality, or:
Effective Productivity = Input Quality Factor x Scope & Complexity Factor x Raw Productivity x Output Quality Factor
According to Zenge (1999), these factors recognize that the quality of project definition (front end load) and the type of project greatly influence the designer's personal productivity. They also recognize that a gain in design productivity should not come at the expense of lower construction productivity and high re-work cost. The main factors are as follows:
1) Input Quality Factor is an index, such as the PDRI score, which evaluates the quality of the project input variables that have a major impact on engineering performance and productivity. A designer's productivity is higher when the design intent is well defined, and is lower when less defined. Key input documents include Piping & Instrumentation Diagrams (P&IDs), piping specifications, line lists, tie-in lists, equipment arrangement drawings, and certified vendor prints for equipment and vessels.
2) Scope & Complexity Factor is an index that accounts for the differences between projects of the same industry sector. Examples of items considered for the piping discipline would be pipe size, average run length, congestion, and special piping such as sanitary, high pressure, etc. Similar factors for other disciplines can be further developed.
3) Raw Productivity is the design work hours per installed unit, e.g., 27 hours/1000 feet of 2-inch pipe.
4) Deliverables Quality Factor is an index that accounts for design effectiveness, such as the one developed by Tucker and Scarlett (1986). Apparent engineering design productivity must be modified to reflect the impact of the design quality on construction and total project success. As one member company's senior construction manager stated, "Every time I've encountered an engineering productivity improvement effort, I've had more work to do in the field." Note that planned choices to do portions of the design in the field are not penalties; rather, they should be reflected in the raw productivity input hours, or in the Scope component of the Scope & Complexity Factor.

Factors to consider are: design interpretation hours required in the field, loss of schedule, % re-work due to design errors and omissions, etc.

7.3 Software Development Productivity: Similarities and Benefits
Although the review of literature revealed that little work has been done in the area of measuring productivity for the engineering profession, some other fields proved helpful in providing a number of ideas for the research. In particular, many parallels have been identified between the research area and that of the software industry. The importance of software development productivity, i.e., the productivity of programmers, has been recognized for several years. Generally, productivity is defined as the ratio between the number of lines of code (including both data declarations and comments) and the number of hours expended to complete these lines of code (Duncan 1988). Such a measure of productivity is further adjusted to account for the number of defects and the level of complexity (through what is commonly known as the function point methodology).
The conclusions further drawn by RT 156, based on its assessment of the function point methodology and productivity measures in other industries, include the following (Walsh 2000, Appendix C):

• The drawbacks to using the lines of code measurement, which are widely understood in the software industry, are similar in many ways to those which can be described for drawings in the EPC industry. The measurement of drawings is ambiguous in the database design environment, and can be artificially inflated in much the same way that one can inflate lines of code with comment statements or inefficient coding. Furthermore, the drawings required can vary from owner to owner, even for similar projects. As a result of this ambiguity, RT 156 sought a better measure of size than the number of drawings.
• Development of the function point methodology for assessment of project size has required over 20 years and thousands of staff hours so far, and the standardization effort is not yet completed. This observation led RT 156 to conclude that the development of "engineering design points" is not likely to occur in the near future, and to seek instead a simpler measure of project size.
• Assessment of project complexity must be made. In the function point method, this assessment is hard-wired into the system through the overall complexity adjustment factor. It would also be possible to consider complexity by producing different ranges of productivity for projects of differing complexity.
• Measures of quality are typically not built directly into the productivity measurement in other industries, if ever, but may be determined using related calculations (e.g., cases won/billing, patient mortality/total patients, delivered defects/corrected function point).
• Families of measures are needed to assess the overall performance of the engineering effort. These related measures must include, as a minimum, the quality of the input to the engineering phase and the quality of the output from the engineering phase.

software firms usually deliver a final product (the application). Measure should show the effectiveness of designs using reusable code The software industry parallel was a key component in the selection of final installed quantities as part of the engineering productivity calculation. First.) Support different design methods (e. while design is an 96 . but has not to date really been thought of as a measure of project size for engineering.g. As important as the analogy to the software industry is. there are also a number of important differences between software development and the engineering process which must be kept in mind. manufacturing. 3-D database vs. Probably different benchmark values can be developed over time. Same as above None exist Probably different benchmark values can be developed over time Installed quantities should be independent of design method Quantities will increase with project size. etc. RT 156 looked at the Jones (1999a) requirements. traditional) Support all sizes of project Support all programming languages Support all sizes of application Support new or recycled code Workable for projects at end of scope definition Support design-from-scratch or reusable designs Met by Installed Quantities? Quantity take off is a relatively standardized process Some ambiguities depending on which quantities are selected None exists None exists Can estimate installed quantities at this stage and modify as design proceeds. However. process industry. and recognized that installed quantities served many of the criteria listed (modified to suit EPC): Jones (1999a) Criteria Standardized method for measuring size Unambiguous EPC Criteria Standardized method for measuring quantity. and for which quantities to measure Unambiguous rules for take-off Formal users group Adequate published data using metric Workable for new projects Formal users group Adequate published data using metric Workable for legacy systems Have conversion rules for related measures Deal with all deliverables Support all kinds of software Workable for retrofit/remodel Have conversion rules for related measures Not directly translatable Support all kinds of projects (e.g.• RT 156 recognized that the installed quantities at the job site are a measure of the size of the project. This measure is already available as part of the construction and as-built assessment process.

Second, there are usually fewer discipline-specific players (such as the engineering subcontractors or vendors in the EPC industry) in the software arena, and fewer companies involved. This means that counting and attribution of the function points is more straightforward (Walsh 2000, Appendix C). Third, functional specifications or requirements documents for software are generally much more detailed than the scoping documents produced for designers. The functional specifications usually include a very detailed data model, screen shots which show the content of every screen, and very detailed descriptions of the underlying algorithms. In general, the studies of software development productivity provide a reliable platform for the research activity to proceed with the basic conceptual approach as introduced earlier.

7.4 Overview of the Conceptual Approach of Engineering Productivity Measurement
The basic conceptual approach recognizes three factors to be used for adjusting the raw engineering productivity. As a number of previous CII studies addressed both the Input Quality Factor and the Deliverables Quality Factor, the RT-156 directed its focus to a better understanding of both the Raw Productivity and the Scope & Complexity Factor. This is pursued for the pre-selected field of piping engineering and design; the rationale behind this selection has been elaborated on in chapter 3. Figure 7.3 illustrates the focus area of the research regarding the conceptual approach for measuring engineering productivity.

[Figure 7.3: Research Focus Regarding the Conceptual Approach - Input Quality Factor x Scope & Complexity Factor x Raw Productivity x Deliverables Quality Factor, with the research focus placed on the Scope & Complexity Factor and the Raw Productivity]
A set of raw productivity metrics for the piping discipline has been introduced in chapter 6. The set was initially developed by members of the RT-156 based on their experience with the piping practices in the industrial sector. As the RT-156 experimented with metrics that are less commonly used for measuring piping engineering productivity, responses to the questionnaire survey revealed that some of these metrics more appropriately represented the piping discipline productivity. Examples include total piping design work hours per number of lines and total piping design work hours per length of lines of piping. The results of the survey supported the original claim that there are other productivity metrics that can better represent the engineering performance than the ones that are commonly used (e.g., total piping design work hours per number of isometrics, and total piping design work hours per number of orthographic drawings). Accordingly, the RT-156 members established raw productivity as piping design work hours divided by the installed length of pipe (Zenge 1999). While the analysis will be conducted for the piping discipline throughout the remainder of this chapter, a similar procedure can be followed for the other disciplines. The RT-156 further identified the various variables defining engineering productivity metrics for the other engineering disciplines; an enumeration of these variables for each discipline is included in the results of the 2nd subteam of the RT-156 (presented in Appendix B).
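In computational terms, the conceptual adjustment is a simple product of factors. The sketch below is illustrative only; every factor value is invented, and the report does not prescribe numeric scales for the three adjustment indices:

```python
# Effective productivity as the product of the four conceptual factors.
# Raw productivity uses the RT-156 piping definition: design work hours
# divided by the installed length of pipe.

raw_productivity = 27.0 / 1000.0   # e.g., 27 hours per 1000 ft of pipe
input_quality = 1.05               # hypothetical PDRI-based index
scope_complexity = 0.90            # hypothetical scope/complexity index
output_quality = 1.10              # hypothetical design-effectiveness index

effective = (input_quality * scope_complexity
             * raw_productivity * output_quality)
print(effective)                   # adjusted hours per foot of pipe
```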

Similar to the metrics defining engineering productivity for a particular discipline, the RT-156 pursued a parallel effort in identifying the lists of variables depicting the scope and complexity of a project. The list of variables that describes a project scope, as developed by the RT-156, includes:
1. Project type (Grass Roots vs. Greenfield)
2. Size of plant (ft2)
3. # of raw materials
4. # of finished products
5. Project size ($TIC)
6. Relative size of project/plant
7. Client/owner design participation
8. Contract type
9. # of equipment pieces
10. Split engineering
The list of variables that describes a project complexity, as developed by the RT-156, includes:
1. Engineering-design schedule
2. Owner previous experience w/ process technology
3. Designer previous experience w/ process technology
4. Site conditions
5. Legal and environmental impact

6. Method of acceptance testing
7. Amount of standard module re-use
8. Owner previous experience w/ designer
9. Design standards used
10. Relative level of complexity
11. Use of 3D CAD modeling
12. Use of Integrated Databases
13. Use of Electronic Data Interchange
Besides, the list of variables depicting design input quality was adopted after Chalabi et al. (1987) and Gibson and Dumont (1996). The list includes:
1. PDRI score
2. Quality/completeness of scope definition
3. Owner profile and participation
4. Pre-project planning
5. Objectives and priorities
6. Basic design data
7. Designer qualifications and capacity
8. Type of contract/contract terms
9. Project Manager qualifications
10. Constructor input and constructability
11. Vendor data

7.5 Data Collection
The questionnaire survey prepared by the RT-156 (Appendix E) addresses all the variables listed above, in addition to the raw productivity metrics. For normalization purposes, the projects collected were limited to the process industrial sector and were required to be fully engineered in the US. The survey was carried out twice.
The first survey was conducted by mailing out and receiving traditional paper-based questionnaires; the first questionnaire was based on the BM&M questionnaire and had some extra questions. With the support of the RT-156 members, data for a total of 15 projects were collected from the RT-156 companies. In addition, the RT-156 pursued a parallel effort for data collection among a selected set of other CII member companies; a total of 30 companies were identified for this process. However, this data source resulted in collecting data for only 7 projects, limiting the data set available for analysis purposes. Therefore, data for a total of 22 projects were collected during the first survey, and 17 of them were identified as usable for building the piping productivity model.
The second survey was made on-line through CII's BM&M on-line questionnaire survey. The CII BM&M survey runs year-round on the Internet; thus, the research team decided to explore the efficiency of an on-line survey. To make the RT-156 questionnaire survey easier, the team first simplified the questionnaire, then took out the questions that were not included in the BM&M questionnaire and put them together as an add-on questionnaire to the BM&M survey (Appendix K).

Data for 18 projects were obtained from the second survey, but unfortunately only 1 of them was usable for model building. In summary, through the two surveys, data for 40 projects were gathered, and 18 of them were used in building the piping productivity model. Such a limitation of the available data sets had an adverse effect on the robustness of the developed model, as indicated in later sections of this chapter.

7.6 Regression Model Development
7.6.1 Project characteristics
The 40 industrial projects that were collected by means of the questionnaire survey cover a wide spectrum of the industry, although only 18 of them were used in the final analysis. Figures 7.4 through 7.9 examine various characteristics of these projects, including project size ($TIC), owner participation, industry types, engineering practices, and so forth. Greenfield projects, where an addition is made to an existing facility, constitute the majority of the collected projects. Projects with $TIC varying between $1 million and more than $100 million are represented. Chemical manufacturing plants represent the main type of process industrial facility covered in this survey. Non-split engineering represents the dominant practice for these projects, and the owner participation in engineering and design activities is mostly limited to the initial scope development.

With respect to engineering contracts, practices such as lump sum, time and material, guaranteed maximum, and cost plus with incentives are fully represented.
[Figure 7.4: Project Type (18 projects) - Greenfield 66%; Grass Roots 28%; N/A 6%]
[Figure 7.5: Project Size - TIC (18 projects) - <$25MM 17%; $25MM-$50MM 22%; $50MM-$75MM 17%; $75MM-$100MM 27%; >$100MM 17%]
[Figure 7.6: Client/Owner Participation (18 projects) - initial scope development 55%; engineering up to detailed design 17%; total engineering design 6%; N/A 22%]
[Figure 7.7: Design Contract Type (18 projects) - incentive w/ performance factors 43%; lump sum fixed price 28%; time and material 17%; guaranteed maximum 6%; alliance 6%; unit prices 0%]
[Figure 7.8: Split Engineering Practices (18 projects) - No 78%; Yes 22%]
[Figure 7.9: Project Industry Type (18 projects) - chemical manufacturing (most frequent), pulp and paper, oil refining, electrical (generating), refined products storage & distribution terminal, consumer products manufacturing, natural gas processing, air separation plant, synthetic fiber]

7.6.2 Modeling Basics
In statistical terms, a regression model for engineering productivity as represented above can be described as an exploratory observational study (Neter et al. 1996). The first-order regression model for engineering productivity would have the general form:

Yi = β0 + β1Xi1 + β2Xi2 + ... + βp-1Xi,p-1 + εi     (7.1)

where
Yi is the observed response (effective productivity)
Xi1, Xi2, ..., Xi,p-1 are the p-1 predictor variables (all variables constituting the scope & complexity factor and the input quality factor)
β0, β1, ..., βp-1 are the model parameters (unknowns)
εi is the error term (observed - predicted), ~N(0, σ²)

The preliminary investigation of the questionnaire responses revealed that two of the identified variables above, plant size (ft2) and PDRI score, had a low response rate (refer to Appendix G); for instance, only 3 projects have a recorded PDRI score. Therefore, the two variables will be removed from the predictor variable list.
Review of the predictor variables shows that some of them are qualitative in nature. In regression models, the qualitative variables are usually converted into numerical values on an appropriate scale. For instance:

Split engineering = 1, if Yes; 0, if No
Following the identification of "a dependable subset" of the X variables, the model can be used for descriptive and/or predictive purposes (Neter et al. 1996).

7.6.3 Reduction of Exploratory Variables
As is common with exploratory observational studies, a significant number of variables are considered in the model. The engineering productivity model has a total of 32 predictor variables (input variables), which could be highly intercorrelated. Any regression model with numerous exploratory variables is difficult to develop, maintain and interpret (Neter et al. 1996). Developing a first-order regression model using all identified variables for measuring the engineering productivity of the piping function would be futile; an experimental regression model with all 32 variables proved this to be true, as none of the variables were found significant. This calls for a refinement of the model so as to keep the very few variables that govern the change in the response (Y), while removing all variables with an insignificant effect.
The literature cites several automatic search procedures for variable reduction. The forward stepwise procedure is probably the most widely used of all (Neter et al. 1996). Basically, this search method develops a sequence of regression models, at each step adding or deleting an X variable based on F-statistic calculations. The general steps of the procedure are as follows (Neter et al. 1996):

1. The stepwise regression routine first fits a simple linear regression model (a regression model that has only one predictor variable) for each of the p-1 variables. For each simple linear regression model, the F-statistic is calculated as follows:

Fk = MSR(Xk) / MSE(Xk)     (7.2)

where k = 1, 2, ..., p-1
MSR(Xk) (regression mean square) = SSR(Xk) (regression sum of squares) / dfreg (regression degrees of freedom), where SSR(Xk) = Σ(Ŷi - Ȳ)² and dfreg = no. of predictor variables in the model
MSE(Xk) (error mean square) = SSE(Xk) (error sum of squares) / dferror (error degrees of freedom), where SSE(Xk) = Σ(Yi - Ŷi)² and dferror = (n-1) - dfreg

The X variable with the largest F-statistic value is the candidate for first addition. If the F-statistic value exceeds a predetermined level, the X variable is added to the model. Otherwise, the routine terminates with no X variable considered sufficient to enter the regression model.

2. The stepwise regression routine next fits all regression models with two X variables, where Xu, the variable entered at step 1, is one of the pair. For each regression model, the partial F-statistic is calculated as follows:

Fk = MSR(Xk | Xu) / MSE(Xk, Xu)     (7.3)

where k = 1, 2, ..., p-2
MSR(Xk|Xu) (regression mean square) = SSR(Xk|Xu) (regression sum of squares) / dfreg (regression degrees of freedom), where SSR(Xk|Xu) = SSE(Xu) - SSE(Xk, Xu)
MSE(Xk, Xu) (error mean square) = SSE(Xk, Xu) (error sum of squares) / dferror (error degrees of freedom), where SSE(Xk, Xu) = Σ(Yi - Ŷi)², dfreg = no. of predictor variables in the model, and dferror = (n-1) - dfreg

The X variable with the largest F-statistic value is the candidate for addition in the second stage. If this F-statistic value exceeds a predetermined level, the X variable is added; otherwise, the routine terminates. Suppose Xv is added at the second stage.

3. Now, the stepwise regression routine examines whether any of the other X variables already in the model should be dropped. At this stage, there is only one other X variable in the model, Xu, for which the partial F-statistic is calculated as follows:

Fu = MSR(Xu | Xv) / MSE(Xu, Xv)     (7.4)

At later stages, there would be a number of these F-statistics, one for each of the variables in the model besides the one last added. The variable for which the F-statistic value is the smallest is the candidate for deletion. If this F-statistic value falls below a predetermined limit, the variable is dropped from the model; otherwise, it is retained.

4. Suppose both Xu and Xv are retained in the model. The stepwise regression routine examines the remaining p-3 variables for the next candidate to enter the model, then examines whether any of the model variables should be dropped, and so forth, until no further X variables can be added to or deleted from the model. At that point, the routine is terminated.

7.6.4 Implementation of variable reduction technique
To facilitate the variable reduction process, the 32 variables will be divided into 3 groups: scope, complexity, and input quality. Each group of variables will be used in developing a separate model with the effective productivity value as the response (Y), and will further undergo the variable reduction procedure.
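A compact sketch of this routine in Python, using plain numpy least squares (the entry/removal threshold of 4.0 and the synthetic data are arbitrary illustrative choices, not RT-156 values):

```python
# Forward stepwise variable selection via partial F-statistics.
import numpy as np

def sse(cols, y):
    """SSE of an OLS fit of y on the given columns (intercept included)."""
    A = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def partial_f(sse_small, sse_big, df_err_big):
    """F for one added variable: drop in SSE over the larger model's MSE."""
    return (sse_small - sse_big) / (sse_big / df_err_big)

def forward_stepwise(X, y, f_enter=4.0, f_remove=4.0):
    n, p = X.shape
    chosen = []
    while True:
        base = sse([X[:, j] for j in chosen], y)
        cand = [(partial_f(base, sse([X[:, j] for j in chosen + [k]], y),
                           n - len(chosen) - 2), k)
                for k in range(p) if k not in chosen]
        if not cand:
            return chosen
        f_best, k_best = max(cand)
        if f_best < f_enter:          # no variable qualifies to enter
            return chosen
        chosen.append(k_best)
        full = sse([X[:, j] for j in chosen], y)
        drops = [(partial_f(sse([X[:, j] for j in chosen if j != k], y),
                            full, n - len(chosen) - 1), k)
                 for k in chosen[:-1]]  # re-test all but the newest entrant
        if drops:
            f_min, k_min = min(drops)
            if f_min < f_remove:      # weakest variable falls below limit
                chosen.remove(k_min)

rng = np.random.default_rng(0)        # synthetic demonstration data
X = rng.normal(size=(18, 5))
y = 0.45 + 1.3 * X[:, 2] + rng.normal(scale=0.3, size=18)
print(forward_stepwise(X, y))         # typically selects column 2
```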

Scope model: The scope model has 9 predictor variables: project type, relative size of project/plant, # raw materials, # finished products, project size ($TIC), client/owner design participation, contract type, # of equipment pieces, and split engineering. Running the forward stepwise procedure for variable reduction results in two variables remaining in the scope model. The two variables are # of equipment pieces and project size ($TIC). A summary of the stepwise procedure is shown below; complete details of the various steps can be found in Appendix H.

Summary of Stepwise Procedure for Scope Model
Step  Variable Entered  Number In  Model R**2
1     EQUIP             1          0.3049
2     TIC               2          0.3786
(The C(p), F, and Prob>F statistics for each step are given in Appendix H.)

Complexity model: The complexity model has 13 predictor variables: engineering-design schedule, owner previous experience w/ process technology, designer previous experience w/ process technology, site conditions, legal and environmental impact, method of acceptance testing, amount of standard module re-use, owner previous experience w/ designer, design standards used, relative level of complexity, use of 3D CAD modeling, use of Integrated Databases (IDB), and use of Electronic Data Interchange (EDI). Running the forward stepwise procedure for variable reduction results in four variables remaining in the complexity model. The four variables are design standards used, legal and environmental impact, amount of standard module re-use, and owner previous experience w/ process technology. A summary of the stepwise procedure is shown below; complete details of the various steps can be found in Appendix H.

Summary of Stepwise Procedure for Complexity Model
Step  Variable Entered  Number In
1     STAND             1
2     ENVIRO            2
3     REUSE             3
4     OPROC             4
(The model R**2, C(p), F, and Prob>F statistics for each step are given in Appendix H.)

Input quality model: The input quality model has 10 predictor variables: owner profile and participation, objectives and priorities, basic design data, designer qualifications and capacity, type of contract/contract terms, Project Manager qualifications, quality/completeness of scope definition, pre-project planning, constructor input and constructability, and vendor data. Running the forward stepwise procedure for variable reduction results in three variables remaining in the input quality model. The three variables are quality/completeness of scope definition, Project Manager qualifications, and constructor input and constructability. A summary of the stepwise procedure is shown below; complete details of the various steps can be found in Appendix H.

Summary of Stepwise Procedure for Dependent Variable PRODVTY (Input Quality Model)
Step  Variable Entered  Number In
1     CONSTR            1
2     PM                2
3     SCOPE             3
(The model R**2, C(p), F, and Prob>F statistics for each step are given in Appendix H.)

The variable reduction procedure for the three models used an α-level of 0.15 or 0.25. A relatively large value was used to guarantee that no important variables were left behind; nevertheless, several of the identified variables would be insignificant at smaller values of α. Using the stepwise procedure thus served as a preliminary refinement of the model, identifying a more concise set of predictor variables.
Now, a total of nine predictor variables remain in the regression model. An examination of the regression model including the nine variables showed that no significant variables could be identified. This calls for a second stage of variable reduction using the stepwise routine; for this stage, a significance level (α) of 0.10 will be used. It is to be noted that using a significance level of 0.05, instead of 0.10, results in the same variable being selected.
Piping productivity model: The piping productivity model has the 9 remaining predictor variables: project size ($TIC), # equipment pieces, design standards used, legal and environmental impact, amount of standard module re-use, owner previous experience with process technology, quality/completeness of scope definition, Project Manager qualifications, and constructor input and constructability. Running the forward stepwise procedure for variable reduction results in only one variable remaining in the piping productivity model. The variable is # equipment pieces. A summary of the stepwise procedure is shown below; complete details of the various steps can be found in Appendix H.

A summary of the stepwise procedure is shown below.

Summary of Stepwise Procedure for Dependent Variable PRODVTY

    Step   Variable Entered   Number In   Model R**2   C(p)      F        Prob>F
    1      EQUIP              1           0.3049       34.6378   7.0185   0.0175

Regression models can be predictive and/or descriptive (Neter et al. 1996). While a model can be used for predicting the response value, its parameters may not describe the actual effect of the predictors on the response. A regression model of piping productivity with the # equipment pieces can thus be constructed as follows. The model has the form:

Y = β0 + β1X1 + εi

where Y is the actual measured piping productivity, Ŷ = β0 + β1X1 is the predicted piping productivity, and εi is the error term. According to the regression model in Appendix H, the predicted value of piping productivity would be:

Predicted piping productivity = 0.44985 + 0.00134 × number of equipment pieces
Model significance level: 98%
β0 = 0.44985 at 90% significance level
β1 = 0.00134 at 98% significance level
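As a worked example of applying the fitted equation, the short sketch below evaluates the model for a hypothetical project; the 500-piece equipment count is illustrative only.

    # Fitted PIPING 1 coefficients from the report
    b0, b1 = 0.44985, 0.00134

    def predicted_piping_productivity(equipment_pieces):
        # engineering hours per unit length of pipe
        return b0 + b1 * equipment_pieces

    # a hypothetical project with 500 equipment pieces:
    print(predicted_piping_productivity(500))   # 0.44985 + 0.00134 * 500 = 1.11985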

The model depicts the hours expended on a certain length of piping as increasing with the increase in the number of equipment pieces. Figure 7.10 illustrates the second piping regression model (PIPING 1). It further shows that an outlier exists in the data sample.

Figure 7.10. Piping Productivity Regression Model (PIPING 1): hours per length of pipe versus # equipment pieces (0 to 1,600 pieces), with the fitted line and the outlier marked.

Removing the outlier, another regression model could be developed.
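The screening-and-refit step can be sketched as follows. The report does not state which outlier rule was applied, so the standardized-residual cutoff and the toy data below are assumptions made for illustration only.

    import numpy as np

    def refit_without_outliers(x, y, z_cut=3.0):
        # Fit hours/length on equipment count, flag points whose
        # standardized residual exceeds z_cut, then refit on the rest.
        A = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        keep = np.abs(resid / resid.std(ddof=2)) < z_cut
        beta2, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
        return beta2, keep

    # toy data: x = equipment pieces, y = hours per length of pipe,
    # with the (500, 3.5) point planted as an outlier
    x = np.array([100.0, 400.0, 500.0, 700.0, 1000.0, 1300.0])
    y = np.array([0.32, 0.68, 3.50, 1.04, 1.40, 1.76])
    (b0, b1), kept = refit_without_outliers(x, y, z_cut=1.5)
    print(b0, b1, kept)   # the refit recovers the underlying 0.2 + 0.0012x line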

Figure 7.11 illustrates the third piping regression model. According to the regression model in Appendix H (model PIPING 2), the predicted value of piping productivity would be:

Predicted piping productivity = 0.16918 + 0.00168 × number of equipment pieces
Model significance level: 99%
β0 = 0.16918 at 85% significance level
β1 = 0.00168 at 99% significance level

In this regression model, the number of equipment pieces is significant at an α-level of 0.0001. In addition, the R² and adjusted R² values reach 0.73 and 0.71, respectively.

Figure 7.11. Piping Productivity Regression Model (PIPING 2): hours per length of pipe versus # equipment pieces, with the outlier removed.

7.7 Remarks on the Implementation of the Conceptual Model

The regression model for estimating piping productivity depicts a basic value for hours per length of piping (corresponding to raw productivity), which needs to be modified according to the number of equipment pieces in the project (corresponding to the scope and complexity factor).

In this respect, the regression model relates to the conceptual model as follows:

Figure 7.12. Implementation of the Conceptual Model: Effective Productivity = Input Quality Factor × Raw Productivity × Scope & Complexity Factor × Output Quality Factor; the research focus relates raw productivity (Y) to the scope & complexity factor (X), reducing Y = β0 + β1X1 + β2X2 + … + βnXn to Y = β0 + β1X1.

The research focus led to the effort of relating raw productivity to the scope & complexity factor, with raw productivity being the dependent variable and the scope & complexity variables being the independent variables. Due to the limited number of data points used for model development, a number of variables depicting input quality were found to be only marginal.

Regarding the regression model developed earlier, it is important to point out that the statistical analysis has been greatly influenced by the limited number of projects used in the analysis. With the relatively large number of variables considered in the model, several of these variables have been found to be statistically "marginal." This term is commonly used to signify variables that are statistically "insignificant" while being within a narrow margin of becoming "significant." In such instances, it is recommended to gather more data points and further investigate the significance of such variables. It is unfortunate that the number of projects available to the RT-156 did not serve the purpose of investigating the true significance of the marginal variables. Had enough data points been made available for the analysis, a robust model could have been attained, with clear significance for the considered variables. Since a single multi-variable linear regression model could not be developed, a variable reduction technique was resorted to, and the final regression model has only one significant variable, the number of equipment pieces. The RT-156 therefore cautions readers about the statistical significance of the piping productivity model represented in Figure 7.11. Moreover, effective engineering productivity cannot simply be measured through a numerical factor such as engineering hours per length of piping; without the output quality factor taken into the model, effective productivity cannot be explained and measured. A set of measures instead needs to be developed to assess the real effective engineering productivity. Chapter 8 will elaborate on this approach, and a generic model will be presented.

A number of the RT-156 members noted that some of the variables found to be statistically "marginal" have a sizable impact on the productivity and performance of their engineering organizations. For instance, a member of the RT-156 noted that the amount of standard module re-use has a very noticeable impact on the productivity of engineers in his organization. The authors would like to point out here that the impact of these marginal variables cannot be ruled out; limitations encountered throughout the analysis process (particularly the number of data points) prevented a further investigation of these variables.

Also, in earlier stages of the analysis, some of the variables, such as the PDRI score, were removed from the process. The response rate for the PDRI score as a representative of the quality of scope definition was quite low; at the time of the data collection process, few companies were familiar with such a concept or collected enough information to calculate it. However, the industry has come a long way since that time, and more companies are using the PDRI score as an important representative of scope definition quality. Thus, PDRI scores can be used as an integral part of a productivity measurement system for an engineering organization (particularly the input quality factor).

7.8 Summary

The current methods for measuring engineering productivity are primarily dependent on the engineering discipline deliverables.

Several industry practitioners have criticized such an approach for its inconvenience. This chapter presented a new and innovative approach for measuring engineering productivity that ties engineering productivity to the installed engineering units, such as length of piping, number of equipment pieces, and so forth. Several parallels have been drawn between this approach and methods used for measuring productivity in the software industry. A productivity measurement model was further introduced that accounts for variability in productivity among projects through input quality, scope and complexity, and output quality factors. Statistical methods for variable reduction, particularly the forward stepwise technique, have been applied to demonstrate the functionality of this model for the piping engineering discipline. However, the limitations of the data collected have prevented a fully robust model from being developed. Nonetheless, the analysis demonstrated the soundness of the model and pinpointed a number of variables for further consideration. As the chapter primarily introduced a framework for productivity measurement, a similar process can be followed by engineering organizations to establish models for measuring engineering productivity for each discipline. Thus, the study provides a framework that engineering organizations can follow to establish robust and reliable systems for measuring the productivity of the various engineering disciplines in a project. Larger data pools are required for all future efforts to develop industry-wide models for engineering productivity measurement.

CHAPTER 8
INTEGRATED SCHEME FOR ASSESSMENT OF ENGINEERING PERFORMANCE

Earlier CII research addressed the concept of design effectiveness as a comprehensive approach for measuring engineering performance in an industrial construction project. Tucker and Scarlett (1986) used an objective matrix method to list, categorize, and weight several criteria that collectively depict design effectiveness in a project, such as cost overrun or underrun and schedule delays. Chalabi et al. (1987) further identified the various project input variables having the largest impact on design effectiveness.

As stated in the summary of Chapter 7, a single numerical measure such as engineering hours per piping length does not tell us the whole story of real engineering effectiveness. A linear regression model was formulated in Chapter 7, and it can predict engineering productivity. However, the regression model only depicts a statistical relationship between raw productivity and the number of equipment pieces. Limited by the number of data sets and the initial research focus, the regression model itself has many limitations due to the inadequate data source, and effective engineering productivity cannot be explained through that model. Effective engineering productivity should instead be measured through a set of measures that can be used to assess the engineering output effectiveness from different perspectives. Nevertheless, the approach used to arrive at the model does provide engineering practitioners a general idea of how to apply the approach to other engineering disciplines.

8.1 Research Approach

8.1.1 Effectiveness Concept

The scope model introduced earlier in this research integrates input variables, scope and complexity factors, raw productivity, and output quality in measuring the actual engineering productivity in a project. To complement this research, this chapter presents a generic model for measuring effective engineering productivity, using neuro-fuzzy networks to explore the correlation of these input, scope, and complexity factors with the resulting measures of output effectiveness. In addition, a linear regression model is developed for each of the output measures, and a comparison between the linear regression models and the neuro-fuzzy model is also carried out. This concept leads to a scheme for measuring effective engineering productivity, as shown in Figure 8.1.

Input Quality Factor × Scope Factor × Complexity Factor = Output Effectiveness

Figure 8.1. Effectiveness Concept

The assessment of engineering performance, as a function of the project input variables having an impact on such performance, requires identifying these cause-effect relationships in an appropriate and valid form. Because of the lack of prior knowledge of the magnitude of change in the measures of engineering performance pertinent to a specific change in the project input variables, there is a need to use analytical schemes that have

data-driven inferential capabilities, that is, schemes which, when provided with pairs of actual inputs and outputs of a certain domain, can formulate the relationships between the inputs and outputs of that domain. A high level of complexity and variable imprecision characterizes the domain between the project input variables and the resulting measures of engineering performance. Therefore, the researchers of the RT-156 utilized neurofuzzy systems as the primary modeling tool for this study. A neurofuzzy intelligent system provides the capabilities of both neural networks and fuzzy logic together in a single integrated system. While neural networks model the complexity of the non-linear relationships, fuzzy logic addresses the imprecision of the input and output variables, which allows further flexibility in the system description. Multiple attribute utility functions are further used to allow the simultaneous assessment of the various measures of engineering performance, presuming these measures collectively depict the total engineering performance in a project. For identifying the relative importance of the various measures of engineering performance, the eigenvector prioritization method of the Analytical Hierarchy Process (AHP) is integrated in the development of the multiple attribute utility functions. Appendix G reviews various theories and illustrations of the functionality of both neurofuzzy systems and multiple attribute utility functions. Figure 8.2 illustrates the research approach pursued by the RT-156 for integrating the inputs and outputs of the engineering performance scheme.

Figure 8.2. General Research Approach: project input variables feed the neurofuzzy systems, which identify the relationships to the engineering performance measures; the multiple attribute utility functions then provide a collective assessment of total engineering performance.

8.2 Generic System Architecture

A total of 25 project input variables were identified for the generic system development. These variables can be classified in seven categories: general project attributes, project information inputs, general owner attributes, general designer attributes, level of automation, project schedule, and project changes. Also, design effectiveness could be measured through a total of 10 measures that signify the engineering contribution to a successful implementation of the detailed design, fabrication and construction, and start-up and commissioning phases in an industrial construction project. Figure 8.3 illustrates a schematic representation of the generic system architecture. Neurofuzzy systems represent the cornerstone of this generic system's functionality. To improve system adaptation and training, a total of 10 fuzzy neural networks (FNN) are used, denoted by FNN1, FNN2, …, FNN10. Each of the 10 connection structures of these fuzzy neural networks is employed to deal with one of the identified engineering performance measures, based on the input variables.

Figure 8.3. Schematic Representation of Generic System Architecture: the inputs x1, x2, …, x25 pass through an input layer, fuzzification layer, intermediate layer, defuzzification layer, and output layer within FNN1, FNN2, …, FNN10 to produce the measures y1, y2, …, y10, whose utilities u1, u2, …, u10 are combined by the utility function with its preference structure into UT.

The major functions performed by the fuzzy neural networks are to fuzzify the system inputs, defuzzify the system outputs, and develop a weight structure that appropriately depicts the non-linear relationships between the system inputs and outputs. Following a proper training of the FNNs using actual industrial project data collected by the RT-156, the neurofuzzy portion of the generic system would be capable of generalizing the relationships between the project input variables, x1, x2, …, x25, and the engineering performance measures, y1, y2, …, y10, to an acceptable level of error.

Thus, the system can be used for predicting engineering performance, given a specified set of project input variables. Individual utility functions that map each engineering performance measure, ym, to its corresponding expected utility, um, can be developed based on RT-156 member feedback. The preference structure among the various measures would then integrate the individual utilities into a multiple attribute utility function, UT, that depicts the total engineering performance in an industrial construction project.

8.3 Neurofuzzy System Development

The development of the neurofuzzy portion of the generic system involves the training and adaptation of the connection structure of the fuzzy neural networks, FNN1, FNN2, …, FNN10, which are associated with the engineering performance measures, y1, y2, …, y10. The questionnaire survey (Part A), included in Appendix E, is used for acquiring industrial project data pertinent to the 25 project input variables and the 10 measures of design effectiveness. For normalization purposes, the survey is specified to the process industrial sector of the construction industry. Also, the projects considered in the survey are fully engineered in the US. Characteristics of the 40 projects used for the system development were illustrated earlier in Chapter 7.

8.3.1 Variable Representation

The inputs and outputs of the neurofuzzy system depict a wide combination of numerical and linguistic (non-numerical) variables. Examples of the numerical variables are project size ($TIC), design-construction overlap, % design rework, % fabrication and construction schedule delay due to design deficiencies, and so forth. Examples of the linguistic variables are relative size of project, completeness of scope definition, use of integrated databases, and so forth. Numerical and linguistic variables generally have different levels of imprecision associated with them. While numerical variables have exact or near-exact values, linguistic variables are commonly described on semantic scales; thus, the linguistic variables depict a higher degree of imprecision compared with numerical variables. According to Zadeh (1977), this degree of imprecision defines the fuzziness in the variable description. Based on the neurofuzzy system architecture of the previous section, system inputs and outputs need to be described in a fuzzy form for appropriate system functionality. Appendix G reviews the process of fuzzification, which transforms numerical/crisp variables into their fuzzy form. A successful and consistent fuzzification process for a numerical variable requires identifying a convenient term set, T(x) = {Tx1, Tx2, …, Txk}, and further determining the membership, µxi, for each linguistic term in this set. Figure 8.4 illustrates the fuzzy representation of the numerical variable "project size ($TIC)," as an example of a numerical variable of the neurofuzzy system inputs. The hedge "about" is used to depict the various linguistic terms of such a variable with respect to specified reference points along the numerical scale. The term set used is:

T(x) = {about $10M or less, about $30M, about $50M, about $70M, about $90M or more}.

Figure 8.4. Fuzzy Representation of Numerical Variable "Project Size ($TIC)": overlapping membership functions µÃ(x), each peaking at 1.0 over its reference point between $10M and $90M.

Figure 8.5 illustrates the fuzzy representation of the variable % design rework, as an example of a numerical variable of the neurofuzzy system outputs. In its fuzzy representation, this numerical variable has a term set T(x) = {about 0%, about 2%, about 4%, about 6%, about 8%, about 10%, about 12% or more}. The term set depicting this variable has seven linguistic terms, in contrast with the 5 linguistic terms in the case of the previously illustrated numerical variable. Fuzzy representations with seven linguistic terms are used for the numerical variables of the neurofuzzy system outputs to provide a better spread of values and improve the network training and adaptability.
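Fuzzification of a crisp input along these lines can be sketched with triangular membership functions centered on the term set's reference points. The triangular shape, the $20M spacing, and the saturating behavior of the two end terms are read off Figure 8.4; the function and variable names are illustrative.

    # Reference points (in $M) for the term set of "project size ($TIC)"
    CENTERS = [10, 30, 50, 70, 90]
    TERMS = ["about $10M or less", "about $30M", "about $50M",
             "about $70M", "about $90M or more"]

    def memberships(tic_millions, centers=CENTERS, width=20.0):
        # Triangular membership in each "about c" term; the end terms
        # saturate at 1.0 beyond their reference points (shoulder terms).
        mu = []
        for i, c in enumerate(centers):
            first, last = i == 0, i == len(centers) - 1
            if (first and tic_millions <= c) or (last and tic_millions >= c):
                mu.append(1.0)
            else:
                mu.append(max(0.0, 1.0 - abs(tic_millions - c) / width))
        return dict(zip(TERMS, mu))

    print(memberships(52))   # mostly "about $50M" (0.9), partly "about $70M" (0.1)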

Figure 8.5. Fuzzy Representation of Numerical Variable "% Design Rework": overlapping membership functions with reference points from 0% to 12% or more.

Linguistic or non-numerical variables are the other type of variables depicted in the neurofuzzy system. These variables are usually defined on a semantic scale; thus, they have a higher degree of imprecision associated with them. To depict this higher degree of imprecision in the variable description, hedges such as about, not, and very, together with the various relationships between hedges, can be used to develop an appropriate fuzzy representation of these linguistic variables. Considering a linguistic variable such as "completeness of scope definition," an appropriate term set would be {very low, low, moderate, high, very high}. For consistency purposes, a similar fuzzy representation is used for the various linguistic variables identified in the system: a differential scale that stretches between the "very low" and "very high" linguistic terms is used, and it would be assumed that the variable could belong to any of the five linguistic terms at the same time. Figure 8.6 illustrates the fuzzy representation of the linguistic variable "completeness of scope definition."

Figure 8.6. Fuzzy Representation of the Linguistic Variable "Completeness of Scope Definition": five overlapping terms, very low through very high, on a 0-10 scale.

8.3.2 System Training and Adaptation

Twelve projects were excluded from the pool of projects used for neurofuzzy system development because of discrepancies or substantial incompleteness of data. The remaining 28 projects were used for the neurofuzzy system development and for further verification and validation purposes, with 26 projects for training, 2 projects for verification, and 2 projects for validation. The training process involves the adjustment of the network weight structure to more accurately produce the desired (target) output of the network, using MATLAB version 5.2 on a Windows 98 platform. Figure 8.7 illustrates the training process for the connection structure of each of the fuzzy neural networks, FNN1, FNN2, …, FNN10.

Figure 8.7. Training Cycles for the Neurofuzzy System: mean squared error (MSE) versus training epochs for each of FNN1 through FNN10, each trained over 400 epochs toward a goal of 0, with final MSE values ranging from roughly 0.0002 to 0.014.
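The training runs summarized in Figure 8.7 drive the mean squared error toward a goal over a fixed budget of epochs. The sketch below shows that general pattern for a single linear layer standing in for one FNN; it is a schematic of the procedure only, not the MATLAB implementation used by the team.

    import numpy as np

    def train_mse(X, t, epochs=400, lr=0.01, goal=1e-3):
        # Adjust the weights W, b by gradient descent on the MSE until the
        # goal or the epoch budget is reached (cf. the 400-epoch runs above).
        rng = np.random.default_rng(0)
        W = rng.normal(scale=0.1, size=(X.shape[1], t.shape[1]))
        b = np.zeros(t.shape[1])
        for epoch in range(epochs):
            err = X @ W + b - t
            mse = float((err ** 2).mean())
            if mse <= goal:
                break
            W -= lr * (2 * X.T @ err / len(X))   # dMSE/dW
            b -= lr * (2 * err.mean(axis=0))     # dMSE/db
        return W, b, mse

    # toy run with 26 samples, echoing the 26 training projects
    rng = np.random.default_rng(1)
    X = rng.normal(size=(26, 3))
    t = X @ np.array([[0.5], [-0.2], [0.1]])
    print(train_mse(X, t)[2])   # final MSE after training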

8.4 Multiple Attribute Utility Function Development

The second portion of the generic system comprises the development of the multiple attribute utility function, UT. This includes the identification of the individual utility functions, u1(y1), u2(y2), …, u10(y10), in addition to developing the preference structure among the various measures of engineering performance. Seven members of the RT-156 with vast experience in the industrial sector of the construction industry participated in the study, with a total of 201 years of experience altogether. The years of experience accumulated by the individual members vary between 20 and 35 years. Also, the participating members represent various types of business entities, including owner organizations, designers/constructors, and construction management service firms, in order to achieve diversity in the acquired knowledge.

8.4.1 Individual Utility Functions

Each measure of engineering performance can extend over a possible range of values, depending on the nature of the measure. For instance, the percentage of design rework might have a possible range of 0%-100%. A decision-maker would presumably have a certain degree of liking for each individual value in this possible range. Utility functions map the possible range of a measure to its degree of liking or worth, commonly denoted by its utility; in other words, how much it is worth to the decision-maker to have a specific value as an outcome of the project, compared to all other values along this range. Two values along this range are of special interest.

The first value, UL, signifies the least favorable outcome for the decision-maker; such a value would have a 0 or 0% utility. The other value, UH, signifies the most favorable outcome for the decision-maker for this specific performance measure; such a value would have a 1.0 or 100% utility. The RT-156 members who participated in developing the multiple attribute utility function provided their assessments of the UL and UH values for each individual measure of engineering performance. Table 8.1 presents the experts' assessments of both UL and UH, along with their average and standard deviation. Although the risk attitude determines the shape of the utility function, it can be assumed that the participating members have a neutral risk attitude, allowing a linear function to depict the utility between UL and UH. This utility function would have the form:

uj(yj) = aj yj + bj,   yj ∈ [UL, UH]   (8.1)

where uj(yj) is the expected utility for measure j that is associated with value yj, and aj, bj are constants. Solving Equation 8.1 requires two points on the linear function. Using the average values of UL and UH, the constants aj and bj can be estimated. The individual utility functions for the various engineering performance measures are given in Table 8.2.
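Under the neutral-risk assumption, the constants in Equation 8.1 follow directly from requiring uj(UL) = 0 and uj(UH) = 1. A minimal sketch is shown below; the endpoint values passed in are illustrative, not the Table 8.1 averages.

    def linear_utility(u_least, u_most):
        # Returns u(y) with u(u_least) = 0 and u(u_most) = 1, clamped to
        # [0, 1]. "Smaller is better" measures work too: u_most < u_least
        # simply yields a negative slope a_j.
        a = 1.0 / (u_most - u_least)          # a_j
        b = -u_least / (u_most - u_least)     # b_j
        return lambda y: min(1.0, max(0.0, a * y + b))

    # e.g. a rework-style measure where 2% is most favorable, 9% least favorable
    u_rework = linear_utility(u_least=0.09, u_most=0.02)
    print(u_rework(0.05))   # about 0.57, partway between the two endpoints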

Table 8.1. Assessment of UL and UH for Each Engineering Performance Measure: the least favorable (UL) and most favorable (UH) values assessed by up to seven experts for each of the 10 measures (%design rework; design document release commitment; %detailed design schedule delay; %detailed design cost overrun; %fabrication and construction schedule delay and cost increase due to design deficiencies; %construction hours for design problem solving; %estimated constructability dollar savings; and %start-up schedule delay and cost increase due to design deficiencies), together with the average and standard deviation of each.

0% 0  u4 ( y4 ) = − 7.2 < y2 < 4 1. 3=moderate.2 Individual Utility Functions for Engineering Performance Measures Engineering Performance Measures Designation Utility Function %Design rework y1 .4% 0  u 9 ( y 9 ) = − 20.2 < y 7 < 3 1.0 .23 y5 + 1. y7 ≥ 3 0  u 7 ( y 7 ) = − y7 + 3 .8% 0  u6 ( y6 ) = − 25.26 y3 + 1.46 .69 y4 + 1. 5=very high %Estimated constructability Dollar savings y8 . y 3 ≤ 5 . 2=unsatisfactory.2.0 .0%  %Fabrication & construction schedule delay due to design deficiencies y5 .4% 0  u10 ( y10 ) = − 23. 4=high. y1 ≥ 9.54 .83 y 9 + 1. y10 ≥ 6. y4 ≤ 6.3.0 . y1 ≤ 2.2.Table 8. 2=low.29 y1 + 1.2% 1.8%  %Fabrication & construction cost increase due to design deficiencies y6 .75 . 4=high.16.31 .8%  %Construction hours for design problem solving y7 .8% 1. 4=satisfactory.2 > y8 > 4 1.4% 1. y7 ≤ 2  where 1=very low.2%  136 .8% > y6 > 5.6% > y1 > 8.52 .2% 0  u1 ( y1 ) = − 14. y1 ≤ 2.0 y6 + 1. y6 ≥ 5.1. y3 ≥ 16.45 . y6 ≤ 1. 5=very high %Start-up schedule delay due to design deficiencies y9 .0% 1. y1 ≤ 3.0 .8% > y1 > 8.0 .0 .4% 1.0 .2% > y1 > 9. y2 ≥ 4  where 1=very unsatisfactory.2% > y10 > 6. y8 ≤ 2 0  u8 ( y8 ) = 0.2% 1.6. y1 ≥ 8.0 .2%  Design document release commitment y2 . 5=very satisfactory %Detailed design schedule delay y3 . y1 ≥ 8. 3=moderate.81 y10 + 1.0 .2. y4 ≥ 19. y2 ≤ 2 0  u2 ( y2 ) = 0.48 .5 y8 − 1 . y8 ≥ 4  where 1=very low.0% 0  u5 ( y5 ) = − 19. 3=moderate.5 y 2 − 1. 2=low.0 .2 %  %Detailed design cost overrun y4 .0% > y4 > 19.6%  %Start-up cost increase due to design deficiencies y10 .0% 1.0% 0  u3 ( y3 ) = − 9.0 . y10 ≤ 2.0% > y3 > 5.

8.4.2 Eigenvector Prioritization Method

The preference structure defines the relative weights among the identified measures of engineering performance, or in other words, the extent to which they proportionately depict the engineering performance in an industrial construction project. The eigenvector prioritization method, which is the main technique for deriving the preference structure in the Analytical Hierarchy Process (AHP), provides one of the more systematic approaches for such a purpose. The resulting preference structure based on the comparative judgment of the participating members is illustrated in Table 8.3. The collective judgment shows that a number of measures exhibit a relatively higher preference compared to others in depicting engineering performance. Among the highly preferred measures are design document release commitment, %fabrication and construction schedule delay due to design deficiencies, %fabrication and construction cost increase due to design deficiencies, %estimated constructability dollar savings, and %start-up schedule delay due to design deficiencies. The preference structure further determines the relative weight of each individual utility function in the formation of the multiple attribute utility function, UT.
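The eigenvector prioritization step can be sketched as follows: given one expert's reciprocal pairwise comparison matrix, the priority weights are the normalized principal eigenvector. The 3x3 matrix below is a toy judgment for illustration, not one of the RT-156 assessments.

    import numpy as np

    def ahp_weights(pairwise):
        # Principal-eigenvector weights for a reciprocal comparison matrix.
        vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
        k = np.argmax(vals.real)            # principal eigenvalue
        w = np.abs(vecs[:, k].real)
        return w / w.sum()                  # normalize the weights to sum to 1

    A = [[1.0, 3.0, 5.0],                   # measure 1 vs. measures 2 and 3
         [1/3, 1.0, 2.0],
         [1/5, 1/2, 1.0]]
    print(ahp_weights(A))                   # roughly [0.65, 0.23, 0.12]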

8.4.3 Development of the Multiple Attribute Utility Function, UT

The function UT is constructed through integrating the individual utility functions previously illustrated in Table 8.2. The function UT would have the form:

UT = w1u1(y1) + w2u2(y2) + … + w10u10(y10)   (8.2a)

Using the average weight vector presented in Table 8.3, the function UT becomes:

UT = 0.067u1(y1) + 0.117u2(y2) + 0.099u3(y3) + 0.033u4(y4) + 0.196u5(y5) + 0.110u6(y6) + 0.062u7(y7) + 0.115u8(y8) + 0.139u9(y9) + 0.060u10(y10)   (8.2b)
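Evaluating Equation 8.2b is then a weighted sum over the individual utilities. A minimal sketch follows; the utility values passed in are placeholders.

    # Average weights from Equation 8.2b, in the order y1 ... y10
    W = [0.067, 0.117, 0.099, 0.033, 0.196, 0.110, 0.062, 0.115, 0.139, 0.060]

    def total_performance(utilities, weights=W):
        # U_T = sum of w_m * u_m(y_m), with each utility in [0, 1]
        return sum(w * u for w, u in zip(weights, utilities))

    print(total_performance([1.0] * 10))   # 0.998 for a best-case project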

Table 8.3. Preference Structure Based on the Comparative Judgment of the Participating Members: the weight assigned to each of the 10 measures by each of the seven experts, together with the average and standard deviation for each measure; the average weights are the ones carried into Equation 8.2b.

8.5 Generic System Verification and Validation

The fuzzy neural networks, FNN1, FNN2, …, FNN10, have been trained for prediction of the engineering performance measures, y1, y2, …, y10. The neurofuzzy system provides a platform for generalizing the non-linear relationships between the project input variables and the resulting measures of engineering performance. While the system is trained using a limited project data set, the prime goal is to have a platform capable of relating engineering performance to the variables that have an impact on such performance. The trained fuzzy neural networks have such knowledge encoded in their connection structures.

To verify that the networks have properly adapted for prediction of the system outputs, two projects, PVER1 and PVER2, were selected. Both projects were previously utilized in the training set for the fuzzy neural networks. The selection of these projects was made to depict varying project inputs and resulting outputs. Table 8.4 illustrates the characteristics of the two projects used for system verification, and Table 8.5 further illustrates both the targeted and calculated outputs for the various engineering performance measures of each project.

Validating the learning process is to confirm that the networks have the proper generalization to calculate the outputs when given a set of project input variables that was not used in system training. Because of the limited number of projects available for system development, two projects, PVAL1 and PVAL2, were retained for system validation while the rest were used in training the networks. The selection of these projects was also made to depict varying project inputs and resulting outputs. Table 8.6 illustrates the characteristics of the two projects used for system validation, and Table 8.7 further illustrates the targeted and calculated outputs for the various measures of engineering performance for the validation projects.

Table 8.4. Characteristics of Projects PVER1 and PVER2: the 25 project input variables for each verification project (project sizes of roughly $53M and $83M TIC, respectively, with differing contract types, complexity levels, scope-definition completeness, and automation use), together with the 10 measured engineering performance values for each project.

Table 8.5. Targeted and Calculated Outputs for the Verification Projects: for both PVER1 and PVER2, the outputs calculated by the trained networks closely reproduce the targeted values across the 10 engineering performance measures.

Table 8.6. Characteristics of Projects PVAL1 and PVAL2: the 25 project input variables for each validation project, together with the 10 measured engineering performance values; the two projects differ widely in size, contract type, newness of the process technology, and the quality of the project information inputs.

Table 8.7. Targeted and Calculated Outputs for the Validation Projects: for both PVAL1 and PVAL2, the calculated outputs reproduce the targeted values of the 10 measures to an acceptable degree of error.

8.6 Comparison with Statistical Approaches

Statistical methods, particularly linear regression models, could have been used as a research approach for assessing engineering performance. The choice of neurofuzzy systems over regression models is justified based on their fault tolerance, their ability to model non-linearity, and their systematic procedure for modeling linguistic variables. Nonetheless, linear regression models can still be used for approximating the relationships between the project input variables and the resulting measures of engineering performance. The developed models can further be used for comparison purposes with the calculated outputs from the neurofuzzy system.

Table 8.8. Linear Regression Models for Estimating the Engineering Performance Measures: for each measure, the significant input variables retained by the stepwise reduction, the fitted linear model, and the significance levels of both (for example, the %engineering rework model retains constructor input, CAD use, and the change communication system at the 95% level; other models retain variables such as project size, site conditions, design-construction overlap, and owner or designer previous experience with the process technology, at levels between 85% and 99%).

The forward stepwise procedure for the reduction of exploratory variables was applied for each measure of engineering performance, y1, y2, …, y10. Table 8.8 illustrates the developed linear regression models along with the level of significance of each. The developed models have significance levels that vary between 85% and 99%. Similar to the neurofuzzy system, the various measures can be predicted using the linear regression models of Table 8.8. For illustration purposes, the developed linear regression models were used for prediction of the individual measures of engineering performance associated with the verification projects, PVER1 and PVER2. The predicted values were also compared to their counterparts from the neurofuzzy system, FNN1, FNN2, …, FNN10. Table 8.9 shows the resulting predicted values for both the linear regression models and the fuzzy neural networks. In Table 8.9, the linear regression models did not give acceptable predictions, while the neuro-fuzzy approach resulted in much better predictions. Had more data sets been collected, more precise linear models could have been developed and better predictions could have been provided by the linear regression models.

Table 8.9. Predicted Performance Measures Based on Neurofuzzy System and Linear Regression Models: the targeted values and the neurofuzzy and regression predictions of the 10 measures for projects PVER1 and PVER2; the neurofuzzy predictions track the targeted values closely, while the regression predictions deviate substantially for several measures.

8.7 Demonstrated Use

Integrating neurofuzzy systems with multiple attribute utility function techniques provides a versatile platform for the assessment of engineering performance in an industrial construction project. This platform can be used in a variety of practical applications. Examples are the prediction of individual engineering performance measures, the assessment of total engineering performance, UT, the assessment of relative engineering performance, UR, and sensitivity analysis for individual and categorical project variables, among others.

8.7.1 Prediction of Individual Engineering Performance Measures

The validation process, presented earlier in the chapter, demonstrated the ability of the neurofuzzy system to predict the engineering performance measures. Both projects, PVAL1 and PVAL2, were not used in system training. Nonetheless, the knowledge encoded in the connection structure of the fuzzy neural networks, FNN1, FNN2, …, FNN10, when introduced with the set of input variables, x1, x2, …, x25, characterizing a project, resulted in predicting the corresponding engineering performance measures, y1, y2, …, y10, to an acceptable degree of error.

8.7.2 Assessment of Total Engineering Performance, UT

The multiple attribute utility function, UT, is developed to provide a collective assessment of engineering performance in an industrial construction project; thus, it reflects the total engineering performance in the project. Using Equation 8.2b along with the individual utility functions of Table 8.2, the value of total engineering performance can be calculated based on the actual engineering performance measures, y1, y2, …, y10.

The resulting values can vary between 0.0 and 1.0. Tables 8.10 and 8.11 illustrate the estimated values of UT for the validation projects, PVAL1 and PVAL2.

Table 8.10. Estimated Total Engineering Performance, UT, for Project PVAL1: each measure's value ym, its utility um(ym), its weight wm, and the weighted contribution wm·um(ym); the resulting UT is 1.000.

Table 8.11. Estimated Total Engineering Performance, UT, for Project PVAL2: the same layout; the resulting UT is 0.586.

8.7.3 Assessment of Relative Engineering Performance, UR

The introduction of the multiple attribute utility function, UT, provides the means of an absolute, rather than relative, assessment of engineering performance. While the estimated values of UT for different projects allow the comparison of engineering performance among these projects, they do not reference such performance to an industry norm. There can be a variety of alternatives for identifying reference norms for comparison purposes. One alternative is to form a reference project, PREF, with the most likely values for the input variables. This project would represent the average of the 22 projects whose data were acquired through the questionnaire survey (Part A) in Appendix E. Table 8.12 presents the characteristics of the reference project. Because such a project is not an actual construction undertaking of an industrial facility, its resulting performance can only be obtained by using the prediction capabilities of the set of fuzzy neural networks, FNN1, FNN2, …, FNN10. The corresponding value of the total engineering performance, UT(PREF), for the reference project would be 0.869, as illustrated in Table 8.13.

The relative engineering performance, UR, for a project PA can be defined as the ratio between the total engineering performance of this project, UT(PA), and the total engineering performance of the reference project, UT(PREF). This can be represented in the following form:

UR(PA) = UT(PA) / UT(PREF)   (8.3)

Whenever the relative engineering performance, UR, equals 1.0, the project under consideration resembles the performance of the industry-average project (reference project, PREF). Values above 1.0 for UR indicate an engineering performance that is above the industry-average project; likewise, values below 1.0 for UR indicate an engineering performance that is below the industry-average project. Using Equation 8.3, the relative engineering performance for the validation projects, PVAL1 and PVAL2, can be calculated as follows:

UR(PVAL1) = 1.000 / 0.869 = 1.15
UR(PVAL2) = 0.586 / 0.869 = 0.67

Table 8.12. Characteristics of the Reference Project PREF: the most likely (industry-average) values of the 25 project input variables, such as a project size of $58,391,682 TIC, a cost-plus contract, moderate complexity, typical site conditions, and moderate use of automation, together with the corresponding engineering performance measures (for example, 2.9% design rework and a satisfactory design document release commitment).

Table 8.13. Estimated Total Engineering Performance, UT, for Project PREF: the utilities and weighted contributions of the 10 predicted measures; the weighted sum yields UT = 0.869.

8.7.4 Sensitivity Analysis for Individual and Categorical Project Input Variables

The establishment of the neurofuzzy generic system provides a universal platform for assessing the influence of the project input variables on the resulting engineering performance. Based on the knowledge encoded in the network architecture, conducting a sensitivity analysis allows the assessment of the performance improvements that are associated with specific changes in an individual input variable or in the set of input variables of a certain category. The sensitivity analysis requires identifying a base project for which changes in the project attributes and conditions can be made and the corresponding changes in the resulting performance can be measured. The reference project, PREF, would be a convenient alternative for the base project, as it represents the industry-average attributes and conditions for an industrial construction project. For illustration purposes of the functionality of the generic system in assessing the influence of a particular input variable on the resulting engineering performance, consider the variable "completeness of scope definition."

The reference project PREF has a value of 7 for completeness of scope definition (on a 0-10 scale, where 0 represents very low and 10 represents very high). Changing the value of the variable "completeness of scope definition" while fixing the values of the remaining input variables, a sensitivity analysis can be conducted regarding this variable's influence on the resulting engineering performance. Table 8.14 illustrates the calculated engineering performance measures, y1, y2, …, y10, associated with the values 1, 3, 5, 7, and 9 for the variable "completeness of scope definition." Using the individual utility functions of Table 8.2 along with Equation 8.2, the total engineering performance, UT, can be estimated for each of these cases. Figure 8.8 shows the influence chart for the input variable "completeness of scope definition."

Table 8.14. Calculated Measures of Engineering Performance for Varying Conditions of the Input Variable "Completeness of Scope Definition": the 10 measures calculated at completeness values of 1, 3, 5, 7, and 9; the measures improve steadily as completeness increases (for example, %design rework falls from roughly 3% at a value of 1 to below 2% at a value of 9, while the design document release commitment remains satisfactory throughout).
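The single-variable sensitivity analysis can be sketched as a loop over the trained predictor. In the sketch below, the predict and score arguments are stand-ins: the real predict would be the trained FNN ensemble, and the real score the utility-weighted UT of Equation 8.2b.

    def sensitivity(base_inputs, variable, predict, score, levels=(1, 3, 5, 7, 9)):
        # Vary one input over the given levels, holding all other inputs
        # at their base-project values, and score each predicted outcome.
        results = {}
        for level in levels:
            trial = dict(base_inputs)
            trial[variable] = level
            results[level] = score(predict(trial))
        return results

    # dummy stand-ins purely to make the sketch executable:
    demo = sensitivity(
        {"completeness of scope definition": 7, "project size ($TIC)": 58.4},
        "completeness of scope definition",
        predict=lambda inputs: [inputs["completeness of scope definition"] / 10.0],
        score=lambda y: y[0])
    print(demo)   # the score rises with scope-definition completeness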

Project input variables are organized into seven categories, where the variables of each category collectively depict it. For instance, the category of project information inputs includes the variables completeness of scope definition, completeness of basic design data, completeness of objectives and priorities, quality of constructor input and constructability, and quality of vendor data. The simultaneous change of the input variables comprising a certain category allows examining the influence of the entire category on the resulting performance. In this analysis, the five variables comprising the "Project Information Inputs" category change simultaneously to take the values 1, 3, 5, 7, and 9. Table 8.15 illustrates the sensitivity analysis conducted on the "Project Information Inputs" category. Influence charts can be developed for each of the seven input variable categories through proper sensitivity analysis.

[Table 8.15. Calculated Engineering Performance Measures for Various Cases of Project Information Inputs: lists the ten performance measures for the cases where all five project information input variables equal 1, 3, 5, 7, and 9. The quantitative measures improve markedly as the information inputs improve (e.g., %design rework falls from roughly 7% at the lowest level to about 1% at the highest), design document release commitment moves from Unsatisfactory to Satisfactory, and several linguistic measures shift from Low to Moderate.]

Figure 8.9 illustrates the influence chart for the "Project Information Inputs" category.

[Figure 8.9. Influence Chart for the Category "Project Information Inputs": plots UT and UR against the 0-10 scale of project information inputs; UT rises from 0.239 (UR = 0.275) at the lowest input level to 0.942 (UR = 1.084) at the highest.]

The complete sensitivity analysis for the seven categories of project input variables is given in Appendix H. Based on the knowledge encoded in the neurofuzzy system, several findings have been identified. A summary of the primary findings of the sensitivity analysis is given below:

• The "Project Information Inputs" category has the greatest impact on the resulting engineering performance. It affects each measure of engineering performance throughout the detailed design, fabrication and construction, and start-up and commissioning phases. Improving the information inputs results in a consistent and gradual reduction in design rework, in the schedule delays and cost overruns of the detailed design, fabrication and construction, and start-up phases, and in the construction hours spent for design problem solving. Also, it contributes to increasing design document release commitment and constructability savings. The maximum improvement in total engineering performance, UT, associated with the "Project Information Inputs" category reaches a high value of 0.703, as illustrated in Table 8.16 (a minimal computational sketch of this ΔUT calculation follows the findings list).

• Both the "General Designer Attributes" and "Level of Automation" categories have a sizable impact on engineering performance. The increase in designer qualification and capacity, reduction in split engineering practices, and increase in designer experience with process technology result in reducing the amount of design rework and also the delays and cost overruns of the detailed design phase. This impact of designer attributes further extends to

the construction phase. Improving the attributes as previously indicated leads to reducing fabrication and construction delays due to design deficiencies, reducing the construction hours spent on design problem solving, and further increasing the constructability savings, thus resulting in an overall improvement in the total engineering performance. The "Level of Automation" category widely affects the schedule delays and cost overruns of the detailed design phase. Increasing the level of automation causes a reduction in both schedule delays and cost overruns of the detailed design phase up to a certain level. Afterwards, the use of these automation tools on the extensive side causes an increase in schedule delays and cost overruns. However, this increase in delays and overruns is counterbalanced by a reduction in construction hours for design problem solving.

[Table 8.16. Maximum Improvement in Total Engineering Performance, UT, for the Different Project Input Variable Categories: lists ΔUT, the "maximum improvement in UT," for each of the seven categories (General Project Attributes, General Owner Attributes, General Designer Attributes, Project Schedule, Project Information Inputs, Level of Automation, and Project Changes). The tabulated values are 0.045, 0.053, 0.068, 0.094, 0.201, 0.218, and 0.703, with the largest, 0.703, belonging to the "Project Information Inputs" category.]

• The "Project Changes" category has a noticeable impact on the various measures of engineering performance related to the detailed design phase.

Improving the "Project Changes" category, through reducing the percentage of changes and improving change communication, leads to a gradual reduction in design rework and in the detailed design schedule delays and cost overruns. Also, this improvement in the "Project Changes" category reduces the construction hours spent on design problem solving.

• The "General Owner Attributes" category also has an impact on the detailed design schedule delays and cost overruns. The increase in the owner's participation in the design process, experience with process technology, and previous experience with the designer results in reducing the delays and cost overruns of the detailed design phase.

• The remaining "General Project Attributes" and "Project Schedule" categories have less noticeable impact, as no particular change trends in the engineering performance measures are detectable.
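As noted above, the ΔUT values of Table 8.16 follow the same recipe at the category level: move all of a category's variables together to their most favorable settings and record the gain in UT. A minimal sketch, with all names and numbers hypothetical:

    # Illustrative category-level improvement, Delta U_T: set every
    # variable in one category to its most favorable level and compare
    # the resulting total utility against the base case.

    def category_improvement(base_project, category_vars, evaluate,
                             best_level=10):
        trial = dict(base_project)
        for name in category_vars:      # move the whole category together
            trial[name] = best_level
        return evaluate(trial) - evaluate(base_project)

    # Toy demo: a stand-in evaluator whose utility grows with the inputs.
    def toy_evaluate(project):
        return sum(project.values()) / (10.0 * len(project))

    base = {"completeness of scope definition": 7,
            "completeness of basic design data": 5,
            "quality of vendor data": 6}
    print(round(category_improvement(base, list(base), toy_evaluate), 3))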

CHAPTER 9: SUMMARY AND CONCLUSIONS

9.1 Research Summary

Engineering productivity/performance is a prime determinant of the successful implementation of industrial construction projects. An industry-wide survey conducted by RT-156 on CII BM&M data demonstrated that engineering is a critical link in constructing an industrial facility. Outcomes of engineering activities have far-reaching impact on the entire life cycle of the project and the quality of the constructed facility. If not properly managed, engineering activities become a source of schedule delays, cost overruns, project changes, and field rework.

The review of the current practices revealed that the common approach is to track engineering productivity based on the production of engineering deliverables such as isometrics, specification sheets, material requisitions, and so forth. This approach has shown its insufficiency in many engineering organizations, especially with the introduction of the data-driven engineering tools on an industry-wide basis.

The research addresses the current need for a valid, reliable, and easy-to-use system for measuring engineering productivity. Such a system would act as a much-needed component in the entire project control and management scheme. The CII RT-156 developed a new and innovative approach for measuring productivity in engineering organizations. This approach draws several parallels with the methods used for productivity measurement in the software industry.

It primarily addresses the engineered units, input quality, and output deliverable quality, for the purpose of tracking the engineering progress in a project. It further recognizes that the actual (effective) engineering productivity broadly varies from one project to another. This variability is found to depend on parameters pertinent to project scope and complexity, such as number of equipment pieces, length of piping, and so forth.

The piping discipline has been selected to demonstrate the functionality of the new approach. Through questionnaire surveys, the RT-156 collected data for 22 industrial construction projects that were mostly executed by RT-156 companies. The data collection process outside the RT-156 experienced extended difficulties; other data-collection avenues, such as the BM&M questionnaire survey, were sought and found unsuccessful.

Statistical approaches were used for analyzing the collected data based on their simplicity and ability to define the model relationships in an explicit-function form. Due to the substantial number of variables used in the model, statistical variable reduction techniques were used to screen the variables and identify those with significant impact on piping engineering productivity. A regression model that relates piping productivity to the number of equipment pieces (as representative of project scope and complexity) was further developed. Although the data limitations prevented the development of a fully robust model for piping productivity measurement, it showed a great deal of consistency and potential for using this approach in industry-wide practices. Additionally, a number of project variables showed critical levels of significance that require further future investigation.
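To illustrate the kind of single-variable regression described here (this is not the RT-156 model itself; the six data points below are fabricated placeholders, not project data), an ordinary least-squares fit of piping hours per foot against equipment count can be set up as follows:

    # Illustrative least-squares fit of piping engineering productivity
    # against equipment count. The six (pieces, hours-per-foot) points
    # are fabricated placeholders, not RT-156 project data.
    import numpy as np

    pieces = np.array([120, 210, 300, 455, 620, 800], dtype=float)
    hrs_per_ft = np.array([0.55, 0.62, 0.71, 0.86, 0.97, 1.10])

    b, a = np.polyfit(pieces, hrs_per_ft, deg=1)   # slope, intercept
    predicted = a + b * pieces
    ss_res = np.sum((hrs_per_ft - predicted) ** 2)
    ss_tot = np.sum((hrs_per_ft - hrs_per_ft.mean()) ** 2)
    print(f"hrs/ft = {a:.3f} + {b:.5f} * pieces, "
          f"R^2 = {1 - ss_res / ss_tot:.3f}")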

Although piping was selected as the experimental field for the newly developed model, the RT-156 extended its effort to the other engineering disciplines. This was done through specifying the set of variables critical to each discipline that can be used in defining the engineering metrics for each of these disciplines. The experimentation with the piping discipline acts as the road map that can be followed by engineering organizations in developing similar engineering productivity models.

In a parallel effort, the RT-156 also addressed the broader scheme of engineering performance. This encompasses developing analytical schemes that relate engineering performance to the set of project attributes and conditions that have an impact on such performance, and also methods to quantify the resulting performance. A utility-based neurofuzzy approach was used for this purpose. Neurofuzzy systems were chosen because of their robustness and fault-tolerance. Multiple attribute utility functions were further used to integrate the various measures of engineering performance and allow a collective assessment of such performance. The developed integrated platform was used for several practical purposes, including prediction of engineering performance measures, assessment of total and relative engineering performance, sensitivity analysis of individual project input variables, and so forth.

9.2 Research Conclusions

The construction industry lacks a unified approach for an easy and reliable measurement of engineering productivity. Although it is quite common to track engineering productivity through the production of drawings, specification sheets, and the like,

the advancing technologies that drive the use of several engineering tools, such as 3D CAD, EDI, and so forth, cause such conventional approaches to become out of date. The software industry has made several parallel advancements in this area that were noted by the RT-156. In particular, the software industry has been active in developing what is called "function points" to account for complexity levels in programming tasks on an industry-wide basis. Future efforts similar to those pursued by the software industry are seemingly needed to accomplish the task of developing industry-wide robust models for measuring productivity for each engineering discipline.

The RT-156 research lays the foundation for measuring productivity in engineering organizations. The RT-156 introduces a new approach for the measurement of engineering productivity that ties it to the engineered units instead of the produced engineering documents; it relates the engineering effort (depicted by engineering workhours) to the actually designed engineering units of the project. Several interviews with industry experts strengthened this conception. This extends the definition of productivity beyond only selected metrics (such as hours/length of piping) to the concept of effective productivity. This concept has indices to account for project scope and complexity, input quality, and output deliverable quality. In addition, the conceptual model for measuring engineering productivity accounts for variability in this productivity among various projects. This has significant potential for creating a universal platform for project control throughout the major stages of a project (detailed design, procurement, construction, start-up and commissioning).
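In its simplest form, the unit-based metric is workhours divided by engineered units, optionally normalized by indices for scope/complexity, input quality, and output quality. The sketch below is one plausible formalization; the index arguments and their values are illustrative assumptions, not the report's calibrated indices (the hours and footage reuse the piping example figures from Appendix B):

    # One plausible formalization of "effective productivity": raw hours
    # per engineered unit, normalized by illustrative adjustment indices.

    def effective_productivity(workhours, engineered_units,
                               complexity_index=1.0,
                               input_quality_index=1.0,
                               output_quality_index=1.0):
        raw = workhours / engineered_units   # e.g. hrs per ft of pipe
        return raw / (complexity_index
                      * input_quality_index
                      * output_quality_index)

    # Example: 154,332 piping hours over 178,900 ft of small bore pipe,
    # on a project assumed to be 10% more complex than the reference.
    print(round(effective_productivity(154332, 178900,
                                       complexity_index=1.1), 3))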

To complement this research, the RT-156 used an integrated scheme for the assessment of total engineering performance in an industrial project that uses both neurofuzzy systems and multiple attribute utility functions. This effort showed neurofuzzy systems as capable platforms worth future consideration by the industry. The integrated scheme demonstrated notable flexibility in variable description due to the fuzzy-based modeling of the system variables, in which numerical data is replaced with linguistic data. In addition, the learning capabilities of neural networks allowed the modeling of complex non-linear relationships. Based on the knowledge encoded in this integrated scheme, several areas have been highlighted as influential in driving engineering performance in a project. In particular, variables depicting quality of information inputs, designer attributes, and level of automation were found through sensitivity analysis to have the broadest impact on the resulting engineering performance.
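To illustrate the linguistic, If-Then style of description that the fuzzy side of such a system relies on, consider the following toy example; the membership breakpoints and the single rule are invented for illustration only:

    # Toy illustration of fuzzy, linguistic variable description.
    # The membership breakpoints and the rule are invented placeholders.

    def triangular(x, left, peak, right):
        # Degree of membership in a triangular fuzzy set, in [0, 1].
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    scope = 7.0   # "completeness of scope definition" on the 0-10 scale
    high_scope = triangular(scope, 5, 10, 15)   # linguistic term "high"
    # Rule: IF scope definition is high THEN design rework is low;
    # the rule fires to the degree its antecedent is satisfied.
    print(f"'scope is high' holds to degree {high_scope:.2f}")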

9.3 Recommendations for Future Work

Although the study results are not entirely conclusive, due to the limited data set and the particular application in the piping engineering area, the study showed the potential of using the new approach to measure engineering productivity on a broader basis. However, such a task requires future effort in collecting project data for the various engineering disciplines, thus allowing the achievement of an acceptable level of confidence in the developed models. Furthermore, these models can be customized to depict weighting factors for varying scope and complexity, input quality, and output quality, similar to those employed in the software industry, to address the variability in engineering performance among projects industry-wide.

Neurofuzzy systems represent a powerful tool that can be used for implementation of the conceptual model for measuring engineering productivity. A strength of these platforms is that the system relationships can be depicted in explicit function-like form (If-Then rules). This effort, however, requires a sufficient pool of data to handle the substantial number of variables used in the system description. The availability of actual project data would enhance the overall capabilities of the system and reduce its errors. Also, the system can be customized to handle a specific category or categories of project variables. Developing refined workable models that include only the variables which have substantial impact on engineering performance, relevant to the selected productivity metrics and the set of variables that influence them, would improve future model implementation practices.

The use of the utility-based neurofuzzy system in assessing engineering performance showed that some project input variables have more significant impact on engineering performance than others. This scheme can be used as a powerful decision-making tool to improve the general practices in engineering organizations.

The integrated model introduced the concept of relative engineering performance, UR. This concept is based on identifying a reference project, PREF, whose attributes represent the industry-average values. The definition presented in this research needs more refinement such that UR can be used for more reliable company-wide and industry-wide benchmarking; also, the definition of UR should be linked to some practical industry norms.
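One plausible formalization of the relative measure, consistent with the UT and UR pairs reported for the influence charts, is the ratio of a project's total utility to that of PREF; the function below is an illustrative assumption, not necessarily the report's exact definition:

    # Illustrative assumption: relative engineering performance as the
    # ratio of a project's total utility to that of the reference PREF.

    def relative_performance(u_t_project, u_t_reference=0.869):
        # 0.869 is the U_T estimated for PREF in Table 8.13.
        return u_t_project / u_t_reference

    print(round(relative_performance(0.894), 3))   # > 1.0 beats PREF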

The research employed a generic system for the assessment of engineering performance. This generic system can be used for performance assessment in various applications in the construction industry. For industrial projects in particular, it can be customized to assess the performance of other functions involved in the project such as procurement, construction, and so forth.

The strength of neurofuzzy systems has been notable in this research. They represent powerful analytical tools that can be used for different fields in the construction industry. Analysis of questionnaire surveys is a particular field that can seemingly benefit from the capabilities of these systems. It is worthwhile that the industry further explores these analytical tools and the various areas of interest where they can be used.

APPENDIX A
RESEARCH PROPOSAL

RESEARCH SUBJECT PROPOSAL

I. Short Title: Engineering Productivity Measurement

II. Problem Statement:

A. Background: Reliable engineering productivity measurement is a critical mechanism for project performance evaluation and improvement. Owner/operators need a means to evaluate performance of engineering suppliers. Engineering suppliers need a means to drive improvement as engineering costs as a percentage of total project cost continue to rise.

B. Statement of Problem: Presently applied engineering productivity measures concentrate on hours expended on "deliverables" such as drawings and specifications, with degrees of difficulty assigned to each deliverable. Several apparent drawbacks to these approaches exist:
• They depend on subjective weighting for each deliverable.
• They depend on accurate documentation of all hours expended on each deliverable.
• They focus on intermediate project products which in themselves may have little value to project outcome; e.g., drawing scale changes can increase the number of deliverables and "apparent" productivity with no actual benefit to project outcome.
• The data-driven engineering tools that are rapidly becoming the industry norm do not readily permit the capture of hours expended on discrete deliverables.
This inability to reliably measure engineering productivity results in difficulties with engineering performance evaluation and consistent benchmarking.

C. Benefits: This research will provide a system for engineering productivity measurement. Reliable engineering productivity measurement can be used to:
• Drive, quantify and monitor internal work process improvement
• Facilitate external benchmarking activities
• Eliminate wasted engineering

III. Purpose, Scope, and Objectives

A. Purpose: The purpose of this research is to develop a valid, easy to use system for engineering productivity measurement that leads to real reduction in engineering cost and total installed cost while achieving project schedule, quality and operating cost objectives.

B. Scope-Limitation: This research will focus on domestic industrial projects. The engineering activities to be covered include all project phases, with concentration in the detail design phase. This research will not focus on previously CII researched areas such as "Input Variables Impacting Design Effectiveness" (8-2) or "Evaluation of Design Effectiveness" (8-1), but will include assumptions as to input conditions and standards of effectiveness in our research.

C. Specific Objectives:
1. To understand past and current practices through literature search, survey, and interviews.
2. To establish measurements that presently are considered valid.
3. To prioritize those measurements with most potential for project impact and collect project data related to those prioritized measures.
4. To analyze collected data and to construct a framework that allows uniform application of measurements of engineering productivity.
5. To develop implementation procedures for measurement of engineering productivity.

IV. Expected Products of Research

Deliverables will include:
• Definition of engineering and productivity
• Definitions of input and output variables
• Productivity data gathering tool
• Validated productivity measurement system
• Research Report
• Implementation Resource Manual

V. Recommended Research

The research is intended to verify the following hypotheses:
1. There are common methods of measuring engineering productivity.
2. It is possible to standardize the methods to reliably measure engineering productivity.

In order to verify the hypotheses, the research will advance in accordance with the following major stages.

A. Review of Past Research and Literature: Relevant past research and literature conducted by CII task forces or others will be reviewed for a better understanding of the subject.

B. Design Data Gathering Methodology: This step involves prioritization of focus areas, determination of the types and format of collected data, and establishment of data gathering tools such as questionnaires, surveys, interviews, and site visits.

C. Gather Data: After the data gathering methods are validated, data will be acquired from the CII member companies, CII Benchmarking Committee, IPA and/or any other potential data sources with CII's endorsement. The method(s) will be tested within the research team member companies.

D. Analyze Data and Test Potential Productivity Measures: The data will be analyzed using appropriate methods. Some possible candidates for the analysis are analysis of variance, regression analysis, sensitivity analysis, factor analysis, etc.

E. Develop Productivity Measurement System: This step involves the establishment of the engineering productivity measurement and hence achieving the primary goal of the study.

F. Integrate with Design Effectiveness Studies: The Design Effectiveness studies previously performed by other CII task forces will be explored for integration with the work provided from this study.

G. Finalize Research Products: Those products previously described in Section IV will be prepared, reviewed and submitted to the CII, and the work will be presented at the CII Annual Conference in August 2000.

VI. Schedule

A. Initiation Date: July 1, 1998
B. Final Report Submittal: December 2000
C. Annual Conference Presentation: August 2000

VII. Budget

A. Funding amount requested: $115,035.80
B. Anticipated future expenses for update, maintenance, etc.: None
C. Possibility of fund leveraging: None anticipated

VIII. Proposed Research Organization and Responsibility Assignment

A. Principal Investigator for the project will be Dr. Luh-Maan Chang, Associate Professor of Civil Engineering at Purdue University. The investigator will have one or two research assistants and has the benefits of the institution's research infrastructure.

B. The CII Engineering Productivity Measurement Research Team will be divided into subcommittees reflecting member expertise in the several areas of emphasis: to serve as a resource to investigate and to liaison/coordinate with other relevant CII ongoing research, to monitor the project, to provide technical guidance, to help gather data, etc.

APPENDIX B
RESULTS OF RT-156 SUB-TEAMS

SUB-TEAM 1: Industry Driver Analysis
Leader: Thomas L. Zenge

This is to provide further definition for DISCIPLINES and INDUSTRY groups as used on the "Engineering Productivity Drivers" matrix.

BACKGROUND

1. Disciplines. The earlier RT-156 lists and IPA lists were very similar. CII definitions only included Control Systems and Process (Chemical). We did try to err on the side of fewer rather than more as a way to focus the research. Issues included:
a. CSA: separate or combined? Split the difference, combining civil and structural but leaving architectural separate. Driven more by how various firms designed their organization than by skill set.
b. Piping: do we separate routing, iso's and detailing from material selection and stress analysis? If so, how? We left them combined.
c. Include or exclude project management and procurement? We didn't think they would ever be key drivers, but went ahead and included them to have a complete picture.
d. "Mechanical" had more spread than most. We chose to include HVAC, Utility Systems and Vessel Design, and of course there is some overlap with Process (Chemical).

2. Industry Groupings. The sub-team looked at CII membership groupings, the CII Benchmarking survey and SIC codes. CII really doesn't have member groupings by industry type, and SIC codes didn't aggregate at summary levels very well. We chose the CII Benchmarking survey groupings. The Benchmarking survey 1996 report breaks into 4 groups—building, light industry, heavy industry and infrastructure, with further split of heavy and light industry into chemical and mechanical, believing the nature of the primary process to be as important as scale.

3. The primary drivers (X) were chosen based on the judgement of the sub-committee. We tried to consider the "normal" project rather than the exception. For projects where this could be a key driver, data collection will need to be very clear. This could be significant on building projects.

DISCIPLINES

1. Civil/Structural includes site development such as grading, drainage, roads, parking lots, railroads, underground utilities, security, etc., and structural elements such as foundations, building frames, major equipment supports, geotechnical studies and reports, analysis for special loading (reciprocating equipment, seismic, blast resistance).

2. Architectural includes building exterior presentation, interior design, site and building master plans, floor plans, material/people flow, landscaping, etc.

3. Project Management & Controls includes project execution planning, cost management, schedule management, design coordination, etc.

4. Procurement & Materials Management includes inquiry, bid tabulation, contract negotiation, expediting, inventory, delivery/transportation, etc.

5. Mechanical includes building services (HVAC, plumbing, refrigeration), utility systems (steam, compressed air, potable water), special vessel design (tanks, pressure vessels), etc.

6. Piping includes routing, supports, isometrics, stress analysis, insulation, detailing from P&IDs, 3D presentation, etc.

7. Manufacturing Process (Chemical) includes design and integration of the equipment that produces the intermediate or finished product of the facility where those processes are chemical conversion, reaction, mixing, etc. Also known as chemical or wet processes. (CII—Engineering discipline responsible for process engineering. Prepares PFDs, P&IDs, heat and mass balance sheets and process calculations.)

8. Manufacturing Process (Mechanical) includes design and integration of the equipment that provides the intermediate or finished product of the facility, where those processes are assembly, machining, stamping, packing, material movement, physical conversion, etc. The processes may also be known as dry or non-chemical.

9. Electrical includes design of power distribution systems, lighting, communications, substations, motor control centers, power distribution panels, etc.

10. Instrument/Controls/Automation includes design of process control systems, machine control systems, machine monitoring, measurement, quality monitoring, inventory control, first and second level computer systems, etc. (CII—Control Systems—Engineering discipline responsible for instrumentation. Prepares instrument logic diagrams and data sheets among other things.)

INDUSTRY GROUPS

1. Industrial
a. Light versus Heavy. There is no clear boundary between the two, and judgments will be made based on one's individual background and experiences. Heavy connotes big, bulk production, large energy use, physical separation from residential and commercial zones, etc. But in general, light would include production of items of smaller physical size that use less energy to produce, be perceived as "clean" industry, be built in I-1 and M-1 zoning, etc.
b. Process versus Mechanical. Process tends to be wet, chemical property transformation, thermodynamic type production. Mechanical tends to be dry, physical property transformation type production. Pulp and paper making would be heavy process, while converting would be light mechanical.
c. Some facilities may include all four elements, e.g., a site with a pulp mill, paper machines, and converting systems to produce facial tissue. A mine mouth electrical generating station would have major elements of both heavy process and heavy mechanical. For purposes of this research, some projects may need to be split into two or more subsets for proper analysis.

2. Buildings. Non-manufacturing structures whose primary function is human or storage (warehouse) occupancy. Broad spectrum of private, public, commercial, institutional and recreational facilities.

3. Infrastructure. A broad spectrum of private and public works facilities ranging from bridges and highways, dams, pipelines, marine terminals, sewer systems, etc.

[Engineering Productivity Drivers matrix: a table mapping the engineering disciplines (Civil/Structural; Architectural; Project Mgmt & Controls; Procurement & Materials Mgmt; Mechanical (HVAC, Utilities, Vessels); Piping; Manufacturing Process (Chemical); Manufacturing Process (Mechanical); Electrical; Instrument/Controls/Automation) against the industry groups, with an "X" marking each discipline judged a primary productivity driver for that group. The industry groups and their example facilities are: Light Industrial-Process, including pharmaceuticals, natural gas processing, oil processing, foods, beverage, consumer products, silicon wafers; Light Industrial-Mechanical, including electronics, medical equipment, precision machinery, automotive, machine tool, construction equip., shipbuilding, consumer products and general manufacturing; Heavy Industrial-Process, including oil refining, petrochemical, chemical mfg., oil exploration, oil production, thermal & nuclear power generation, pulp & paper making, metals refining & processing, mining; Heavy Industrial-Mechanical, including aircraft, metal rolling & fabrication, heavy electricals, forest products converting; Buildings, to include offices, hotels, hospitals, schools, prisons, labs, museum, stadiums, theatre, warehouses, parking garages and retail, government, military, maintenance facilities, transportation terminals; Infrastructure, to include electrical trans & dist, electrical generation, hydro power, rail, airport, marine, telecom, water distribution, waste water collection & treatment, flood control, navigation, tunnelling, streets & lighting, environmental; and Other.]

SUB-TEAM 2: Input-Output Matrix & Definition of "Engineering"
Leader: J. Gary Whatley

Attachment A
CII RESEARCH TEAM 156: ENGINEERING PRODUCTIVITY
DISCIPLINE DEFINITIONS

Because each industry and corporation operates differently and defines work in a specific manner, we have created standard descriptions and discipline definitions to allow CII to gather data and prepare productivity reports that will be useful to all businesses that provide or utilize engineering services. CII is attempting to develop a system that will be useful to all of you. Please carefully review the definitions and detailed task descriptions as given. Please use the definitions provided when assigning hours to each discipline. This will allow us to gather consistent data. We hope you will be able to use this data to analyze how your hours/dollars are spent. Although the purpose of this study is primarily to analyze engineering productivity, information about all areas is being gathered. Please complete one set of input sheets for each project.

Definitions:

Home Office Hours: Home office hours include hours for all work performed in the Home Office from the receipt of the preliminary process design (black book) until home office engineering completion. This does not include construction, field procurement, field materials management, construction management, field cost and scheduling, start-up assistance, or initial operations. The hours for these areas are collected separately in the "other" category.

Home Office Engineering: Engineering hours include hours for all engineering work performed in the Home Office from the receipt of the preliminary process design (black book) until home office engineering completion. Engineering hours include the hours of engineering discipline team members (this does include designers and other support personnel). Engineering hours do not include the hours of non-engineering disciplines such as secretarial support and accounting. This does not include construction, start-up assistance, or initial operations.

DISCIPLINES

1. Civil/Structural includes site development such as grading, drainage, roads, parking lots, railroads, underground utilities, security, etc., and structural elements such as foundations, building frames, major equipment supports, geotechnical studies and reports, analysis for special loading (reciprocating equipment, seismic, blast resistance), etc.

Detailed Tasks:
Assist in establishing the soil survey and the property survey which includes the topography survey.
Prepare preliminary cut and fill and site development analysis required for a preliminary plot plan.
Establish specific job requirements for all civil work and add to General Specifications.

Adapt standard drawings to specific job requirements.
Prepare cut and fill plan for site development.
Prepare Site Preparation Drawings showing roads, railroad spurs, parking area, fences and gates (including cut and fill balance).
Prepare paving and grading drawings and layout of sumps, basins, sewers, etc.
Design and prepare drawings for underground storm water sewer and electrical trenches, manholes, catch basins, culverts, etc.
Prepare specifications and design details for all roads and fencing within the plant battery limits.
Prepare design calculations for all equipment foundations, dikes, basins, sumps, elevated concrete structures, pipe and cable racks and trenches, including size and location of anchor bolts and locations and details of embedded items.
Prepare all equipment foundation drawings, including elevated concrete structures, showing size and location of reinforcing for detailing by the rebar fabricator.
Design piles, prepare Piling Plan, and review pile drawing records, if necessary.
Prepare general arrangement drawings, specific details and design computations for all steel pipe racks, steel equipment structures and platforms in detail for purchase of fabricated steel and for preparation of shop details by steel fabricators.
Develop and design special items such as monorails, trolleys, bridges, and large pipe supports.
Design special lifting frames and spreaders.
Prepare requisitions suitable for purchasing the supply and installation for items as indicated in the Material Management Plan.
Provide technical bid review, evaluate civil materials and subcontracts, and review Vendor drawings.
Provide assistance for the resolution of construction problems.

2. Architectural includes building exterior presentation, interior design, site and building master plans, floor plans, material/people flow, landscaping, etc.

Detailed Tasks:
Establish number and type of buildings and locate on plot plan.
Prepare specifications for all buildings in terms of area required for each housed activity.
Establish industrial hygiene requirements for buildings, including air conditioning, heating, sewers and ventilation specifications.
Prepare schematic design of all buildings to include all the client's requirements.
Prepare cost estimates, etc.

etc. Limited to those job hours spent during the engineering phase of the project through final issuance of all “issue for construction” documents. Estimate schedule impact of Contract Change Orders. Estimate cost of alternative designs. Landscape architecture.Prepare design drawings for all buildings including Architectural which will be used for construction. Develop schedule trends and forecasts. Report deviations from schedule and recommend remedial action. Prepare periodic cost summaries and detail reports. Prepare CPM network for project as well as detail logic ties for each activity. Report cost deviations. design coordination. Prepare layout and specifications for control house with control panels. schedule management. cost management. Space planning Field Inspection Presentation work. Prepare design specifications for all buildings including prefabricated buildings. Schedule Control Prepare Early Work Project Schedule. Develop cost trends and forecasts. Detailed Tasks: Cost Control Monitor project costs and take appropriate action for cost control. Prepare documents for bidding phase. Interior design. 3. 182 . Project Management & Controls includes project execution planning. Organize and prepare Master Project Schedule. Monitor actual project activity progress and compare with schedule requirements. Engineering Schedules and Construction Worksheets. Perspective and renderings Graphics. Prepare Contract Change Orders.

Prepare periodic progress summaries and detail reports highlighting problems and corrective actions taken.

4. Procurement & Materials Management includes inquiry, bid tabulation, contract negotiation, expediting, inventory, delivery/transportation, etc. Does not include on-site procurement or materials management job hours.

Detailed Tasks:
Develop the Project Procurement Plan.
Prepare, monitor and control the Material Management Plan.
Prepare Project Procurement Instructions.
Perform procurement coordination activities; act as control liaison of all contacts in project procurement matters.
Review terms and conditions for project applicability.
Review project schedule with regard to procurement activities and deliveries.
Monitor, control and follow up on all procurement activities.
Review and present all bid tabulations to Project Management and the Client.
Issue the Requisition and Purchase Order Status Report and Subcontract Status Report.

Purchasing
Establish country/countries of origin for supply of materials.
Identify major sources of supply of special materials to be purchased.
Prepare vendor list in conjunction with project management, engineering and the client.
Prepare and release inquiries for all equipment and materials.
Analyze and tabulate commercial portion of bids and make recommendations in conjunction with engineering to purchase.
Issue purchase orders for equipment and material to approved vendors.

Subcontracts
Prepare subcontract coordination plan.
Prepare subcontract control index.
Prepare project subcontract procurement instructions.
Prepare project terms and conditions of subcontract.
Prepare project subcontract bidders list.
Prepare subcontract request for quotation.
Prepare analysis and tabulation of bids.
Prepare and issue subcontract purchase orders to selected bidders.
Monitor issued subcontracts for scope changes and supplement requirements.

materials. fabrication. motor control centers. water treatment units. Initiate action to resolve technical problems identified.Inspection Develop inspection plan including detailed inspection checklist Qualify the shop facilities and workmanship Establish need for Inspection of Subsuppliers Coordinate engineering and client participation as required for witness and hold points Check that vendors follow the required specifications. Items normally subject to inspection include: • • • • • • • • • • • • • • • • • • • • Towers and internals Pressure vessels Shell and tube heat exchangers and double pipe exchanger of special alloy or high pressure Air-cooled exchangers and surface condensers Steam generation equipment Special pumps and compressors. testing and shipping in accordance with the project schedule and purchase order. Check conformance of a realistic shop schedule for engineering. switchgear. Check that the correct manufacturing procedure is used Check the equipment against the certified drawings Witness hydrostatic and certain pneumatic and performance tests Issue partial inspection reports. Check the progress of work in such a way that corrective action for anticipated delays may be taken. pumps. drivers and other equipment Expediting Insure the purchase order confirmation is received. and clarifiers Special panel boards. 184 . including the recommendations in case of rejections Evaluate test reports Issue Final Inspection Reports and Release Notes for inspected materials and equipment. quality control and testing procedures. by size Centrifugal compressors with motors and/or turbine drives Heater coils and piping Cast alloy and stainless heater fittings Packaged heaters Stacks Packaged water treatment units Sub-contractors vendor supply Expansion joints over 24 inches or custom design Special valves or fittings Structural steel when special paint or galvanizing is required Lubricating and seal oil consoles Refrigeration units Packaged boilers. transformers. drawings.

Expedite vendor drawings from Vendors.
Insure receipt of all Vendor data, when it is Company's responsibility to do so.
Make shop expediting visits to vendor and subvendor shops and issue expediting reports.
Issue Material Status/Tracking Reports.

Traffic/Logistics
Prepare Logistics Plan.
Establish the need for and select Marshalling Yard.
Arrange for shipping agency and export licenses, to include import licensor, freight forwarder and export packer.
Arrange for importing direct or through agency, customs clearance, etc.
Arrange for the routing and handling of all shipments with special attention to critical, heavy-lift or over-size loads.
Check the freight rates on large shipments to make sure the best rates possible are being obtained.
Make necessary arrangements to ship by air freight items on emergency shipments.
Trace shipments of equipment and/or materials which require special attention or are lost in transit.
File claims when equipment and/or materials have been lost or damaged in transit.
Supervise Service for Packing and Shipping as follows: establish with the vendors adequate inner and outer packing that is consistent with the materials to be shipped and the markings.

Material Control
Prepare a Material Control Plan identifying responsible parties for the purchase and inspection of all materials by code of account.
Monitor status and control movement of all equipment, instrumentation and materials from design through field erection.
Monitor detailed take-offs for all commodity materials.
Establish fabrication and delivery priorities for all shop fabricated pipe and valving.
Establish priorities for field fabricated material.

5. Mechanical includes building services (HVAC, plumbing, refrigeration), utility systems (steam, compressed air, potable water), process systems (rotating pumps, compressors, agitators, filters, etc.), special vessel design (tanks, pressure vessels), etc. Does not include on-site follow up, field run testing or start-up of equipment.

Detailed Tasks:

. Check Vendors’ welding procedure specifications. Complete all drawings of towers. Review Vendors’ data sheets and drawings for conformance with specifications and detailed connecting piping design. Issue Purchase Order Requisitions. Review supporting steel and platform drawings. baffles. Prepare drawings showing arrangement of removable internals including number and locations of caps. external appurtenances. Review and evaluate Vendor bids and prepare technical recommendations. Prepare technical evaluation of bids. Prepare requisitions suitable for purchase of design and supply of equipment. Review vessel foundation drawings. 186 . etc. ladders and pipe supports to permit preparation of shop drawings. Prepare drawings showing overall dimensions and nozzle schedule for all storage tanks. Design each vessel and prepare drawings showing wall thickness of heads and shells. Prepare inquiries suitable for quotation of the design and supply of each exchanger. details for nozzles. in sufficient detail to permit purchase of vessels and preparation of Vendors’ shop drawings. Issue Purchase Order Requisitions. risers. • Heat Exchangers Prepare thermal and mechanical design including preparation of mechanical data sheet and setting plan to permit purchase of equipment on a hardware basis. Heat Transfer • Fired Heaters Including Stacks and Boilers Prepare thermal and mechanical design including preparation of detailed drawings to permit purchase of equipment on a hardware basis. tanks and vessels for nozzle orientation to conform to detailed piping drawings or model. Review and make technical evaluations and recommendations of Vendor bids. weirs and downflow sections in sufficient detail to permit Vendors to prepare shop details. with recommendation. Prepare inquiries suitable for quotation of vessels. Check Vendors’ drawings for conformance with design drawings and specifications. internals. supports.Vessels Prepare drawings showing clips for platforms.

Review Vendors' data sheets, setting plans and detailed drawings for conformance with specifications, orientation of nozzles and location of supports.

Insulation and Noise Abatement
Establish mechanical and material specifications for hot, cold and acoustical insulation methods and add them to General Specifications.
Establish type, thickness and covering of insulation for each equipment item.
Establish type, thickness and covering of insulation for piping.
Prepare insulation schedules for equipment.
Establish specific job requirements for insulation items.
Prepare requisitions suitable for purchase of insulation materials and installation.
Establish noise criteria for the plant and equipment.

Fireproofing
Establish fireproofing mechanical and material specification.
Develop fireproofing requirements for all buildings and equipment.
Prepare drawings covering specific details for fireproofing.
Prepare fireproofing material and installation specifications.

Painting
Prepare material and application specifications for all paint work.
Prepare painting materials and application specification.
Prepare painting schedule.
Prepare requisition suitable for purchase of painting materials and application.

HVAC
Prepare design specification and drawings of HVAC for all buildings.
Prepare requisition (where no direct hire basis exists) for all subcontracts.

Rotating, Packaged and Special Equipment
Prepare project specifications.
Prepare inquiries suitable for quotation of rotating, packaged and special equipment.
Prepare requisitions suitable for purchase of design and supply of equipment.
Review and technically evaluate Vendor bids.
Review and make technical evaluations and recommendations of Vendor bids.
Issue Purchase Order Requisitions.
Review Vendor's data sheet and drawings for conformance with specifications.
Witness testing of rotating, packaged and special equipment in Vendor shops or field, as specified.

188 . P&IDs. docks. Prepare Utility Balance and Flow Diagrams (AFD). thermodynamic. Prepare AFC (approved for construction)orthographic piping drawings. Prepare specialty items requisitions. Establish mechanical and material specifications for all piping services classifications. Manufacturing Process (Chemical) includes design and integration of the equipment that produces the intermediate or finished product of the facility where those processes are chemical conversion. roads. insulation. valve and fittings.) Detailed Tasks: Prepare Process Flow Diagrams. Piping includes routing. Extract piping isometric drawings suitable for fabrication and erection. heat and mass balance sheets and process calculations. Prepare block model and general arrangement drawings. Prepare Process Piping and Instrument Flow Diagrams (AFD). fire water. Prepare piping 3D design model. railways. reaction. 7. Provide technical bid review of all piping quotations where needed. jetties. Prepare Utility and Offsite Piping and Instrument Flow Diagrams (AFD). etc. (CII—Engineering discipline responsible for process engineering. oily water sewers. Prepare site location map showing connections to major pipelines. Prepare overall plot plan and area plot plans. sanitary sewers and chemical sewers. detailing from Piping and Instrument Drawings (P&ID’s). Design underground piping systems and prepare all necessary arrangement and detail drawings. Design and detail special piping equipment and fittings. Prepare bill of material sheets for each isometric. Prepares PFDs (Process Flow Diagrams). Detailed Tasks: Develop specifications for specialty items. stress analysis. material balances (AFD (approved for design)).6. Prepare preliminary and final bulk requisitions to permit early ordering of pipe. Also known as chemical or wet processes. These systems may include cooling water. Review piping stress and pipe support design criteria. 3D presentation. electrical transmission lines and waste effluent routes. Prepare stress calculations and pipe support details as required. isometrics. supports. storm sewers. specifications and piping materials.

Prepare Line Schedule (AFD) for P&IDs.
Develop process P&IDs through to AFC (Approved for Construction) status.
Develop utility and offsite P&IDs through AFC status.
Where required (AFC), finalize plot plan and prepare general arrangement drawings showing location of all equipment and extent and type of structures.
Finalize utility balance and flow diagrams (AFC).
Establish all piping, valving and safety system requirements and show on Piping and Instrument Flow Diagrams.
Identify specialty items on the Piping and Instrument Flow Diagrams.
Establish safety relief and Flare System Design Basis.
Perform all dispersion and emission calculations.
Define quantity and quality of effluents and wastewater treatments.
Provide technical data and engineering documents for the project in order to obtain approvals and permits from regulatory agencies.
Establish industrial hygiene requirements for buildings and semi-enclosed areas.
Prepare process data sheets.
Prepare Catalysts and Chemical Specifications and Requirements.
Prepare a supervisory type operating manual.
Prepare "as built" drawings of process plant equipment, piping, building, electrical and instrumentation.

Vessels
Establish required dimensions, nozzle locations and design conditions, as well as materials of construction and corrosion allowance, for each vessel and include details for special internals.
Prepare vessel and tank data sheets for all pressure and vacuum services.
Prepare process vessel and tank sketches.

Heat Transfer
• Fired Heaters, Boilers, Water Treating
Establish process requirements and operating conditions for each heater, including fueling plan.
Establish capacity and operating conditions for the boiler, including fueling plan.
Establish capacity and processing scheme for boiler feedwater treating unit.
Prepare technical data sheets for fired heaters, boilers and water treating units.
• Heat Exchangers

Establish performance requirements, operating conditions and materials of construction for heat exchangers.

Rotating, Packaged and Special Equipment
• Flare System
Establish flare loads, operating and design conditions, etc., to meet environmental and safety requirements.
Establish design conditions, such as back pressure, etc., for flare header.
Perform all process calculations such as dispersion, radiation, etc.
Prepare flare balance.
Establish process duty specifications and finalize data sheets for flare package, including Knock Out drums.
• Cooling Tower and Raw Water Treating
Establish process duty specifications and finalize data sheets for cooling tower.
Establish treatment requirements for, and finalize process data sheets for, water treating and dosing of cooling tower.
• Other Rotating and Packaged Equipment
Establish performance requirements, operating conditions and materials of construction for rotating, packaged and special equipment.
Define packaged equipment scope.
Prepare specification sheets.
Prepare technical equipment data sheets.

8. Manufacturing Process (Mechanical) includes design and integration of the equipment that provides the intermediate or finished product of the facility, where those processes are assembly, machining, stamping, packing, material movement, physical conversion, etc. The processes may also be known as dry or non-chemical.

9. Electrical includes design of power distribution systems, lighting, communications, substations, motor control centers, power distribution panels, etc.

Detailed Tasks:
Establish mechanical and material specifications for all electrical equipment and wiring.
Prepare preliminary single line diagrams.
Prepare hazardous area classification.
Indicate electrical areas (transformer pads, switch yards) on plot plan.

Prepare requisitions suitable for purchase of major electrical transformers and switchgear, motor control centers and long delivery cable.
Prepare electric motor specification.
Design electrical distribution system.
Complete single line wiring diagrams.
Prepare layout of substations including arrangement of switchgear, Motor Control Centers (MCC) and outdoor transformers.
Develop layout and detailed drawings of power cables, conduit and electrical ducts, with specific requirements for switchgear and motor control center.
Prepare electrical and instrument cable tray location drawings using Plot Plan as a basis.
Prepare electrical and instrument underground duct bank location drawings using Plot Plan as a basis.
Develop lighting system and prepare drawing showing the arrangement of lighting panels and lighting requirement at grade and on platforms and structures, with specific details as required.
Establish and design an emergency lighting and electrical power scheme.
Prepare requisitions for uninterruptible power supply.
Prepare grounding drawings and details.
Prepare drawings for lightning protection.
Layout and develop electrical power supply system for instrumentation.
Layout and development of electrical system for instrumentation.
Prepare inquiry requisitions for communication systems.
Prepare requisitions for selected engineered commodity materials.
Prepare requisitions suitable for subcontracting the installation of all electrical items, including the supply of all commodity materials.
Prepare technical review of electrical equipment, material and subcontracts.
Check electrical equipment vendor's drawings and data for conformance with specifications and drawings.
Assist Mechanical Group in the evaluation of electric drivers.
Assist mechanical group in review of specifications, Requests for Quotations (RFQs), Vendor Drawings (VD), and packages.
Provide field construction and start-up assistance.

10. Instrument/Controls/Automation includes design of process control systems, machine control systems, machine monitoring, measurement, quality monitoring, inventory control, first and second level computer systems, etc. (CII—Control Systems—Engineering discipline responsible for instrumentation. Prepares instrument logic diagrams and data sheets among other things.)

Detailed Tasks:
Establish instrumentation scheme for process control and safety systems and depict on Piping and Instrument Diagrams.
Develop process (P&I) Piping and Instrument and Utility Diagrams in cooperation with process and project groups, including numbering, symbols and identification.
Specify control and operating philosophy desired and integration expected to any related control panels and control systems (Distributed Control System (DCS), PLC, etc.).
Identify redundancy philosophy for all control systems utilized.
Prepare Instrument Index which comprises all loop components.
Prepare instrument data sheets (ISA format).
Size all in-line instruments, control valves, relief valves, and flow devices.
Provide dimensional data for all in-line instruments to Plant Design.
Establish Instrument Engineering Design Package database.
Prepare control room layout drawing.
Prepare Instrumentation Installation Details.
Prepare inquiry requisitions for control systems (DCS, PLC, TMR, ESD, etc.).
Prepare inquiry requisitions for analyzers and shelters.
Prepare inquiry requisitions for all tagged instruments.
Prepare inquiry requisitions with supporting documents for instrument installation subcontracts.
Prepare requisitions for materials, including piping, valves and fittings, per the Material Management Plan.
Prepare bid tabs; review instrument equipment, recommend Vendors and prepare purchase order requisitions.
Prepare technical bid tabulations for bids received on instrument installation and subcontracts.
Review all mechanical package unit inquiries and purchase order requisitions where instrumentation is to be furnished by a Vendor as part of the package unit.
Assist mechanical in preparation of mechanical package equipment; review Vendor Drawings, Vendors' loop and wiring drawings and schematic drawings by package unit Vendors.
Check instrumentation Vendor documentation and drawings.
Configure regulatory and sequential systems.
Configure graphic displays.
Configure event and history logs.

Provide field start-up assistance. They do not include prorated overhead. Prepare Logic and/or Cause and Effect Diagrams for sequence and program control. instrumentation sub-systems and interface equipment. Prepare Logic and/or Cause and Effect Diagrams for interlock and alarm systems. Provide specific requirements and specifications for all instruments. Initial Operations Start-Up Construction 193 . Check-out control system at jobsite.Prepare Instrument Location Drawings. 11. etc. Provide DCS configuration/assistance. OTHER . control systems. Secretarial Quality Control Engineering Aides IT Legal Accounting Hours Vendor Document Control NOTE: ACTIVITIES LISTED BELOW ARE NOT INCLUDED IN THIS STUDY.These activities include only specific job related charges. using piping drawings and vessel elevations back grounds.

Attachment B

CII RESEARCH TEAM 156: ENGINEERING PRODUCTIVITY MEASUREMENT BY DISCIPLINE

CII RESEARCH TEAM 156: ENGINEERING MEASUREMENT BY DISCIPLINE

1) Civil/Structural
   a) Volume of concrete
   b) Volume of dirt, cut, and fill
   c) Weight of steel: Light (< 21.9 lbs./ft.), Medium (22-40 lbs./ft.), Heavy (> 41 lbs./ft.)
   d) Total plot area (improved, developed)
   e) Discipline hours

2) Architectural
   a) Area of buildings
   b) Discipline hours

3) Project Management and Controls
   a) Discipline hours

4) Procurement and Materials Management
   a) Pieces of mechanical equipment
   b) Weight of steel: Light (< 21.9 lbs./ft.), Medium (22-40 lbs./ft.), Heavy (> 41 lbs./ft.)
   c) Length of pipe: Small Bore ≤ 3", Large Bore > 3"
   d) Manual valves
   e) Length of wire: Power Wire, Control Wire
   f) Light fixtures
   g) Area of cable tray
   h) Motors: ≤ 200 hp, > 200 hp
   i) Number of control systems I/O's
   j) Balance of field instruments
   k) Length of control wire
   l) Discipline hours

5) Mechanical
   a) Pieces of mechanical equipment
   b) Discipline hours

6) Piping
   a) Length of pipe: Small Bore ≤ 3", Large Bore > 3"
   b) Manual valves
   c) Discipline hours

7) Manufacturing Process (Chemical)
   a) Pieces of mechanical equipment
   b) Discipline hours

8) Manufacturing Process (Mechanical)
   a) Pieces of mechanical equipment
   b) Discipline hours

9) Electrical
   a) Length of power wire
   b) Light fixtures
   c) Area of cable tray
   d) Motors: ≤ 200 hp, > 200 hp
   e) Discipline hours

10) Instrument/Controls/Automation
   a) Number of control systems I/O's
   b) Balance of field instruments
   c) Length of control wire
   d) Discipline hours

11) Other
   a) Discipline hours

Examples for Measuring Productivity

Inputs                                    Units of Measure   Quantity   Comments
Mechanical Equipment, Number              pieces
Volume of Concrete                        cu. yds.
Volume of Dirt, Cut, & Fill               cu. yds.
Area of Buildings                         sq. ft.
Weight of Steel
   Light (< 21.9 lbs/ft.)                 tons
   Medium (22-40 lbs/ft.)                 tons
   Heavy (> 41 lbs/ft.)                   tons
Length of Pipe
   Small Bore < 3"                        ft.
   Large Bore > 3"                        ft.
Manual Valves, Number                     count
Length of Wire
   Power wire                             ft.
   Control wire                           ft.
Light Fixtures, Number                    count
Area of Cable Tray                        sq. ft.
Motors, Number/Horsepower
   < 200 hp                               count/hp
   > 200 hp                               count/hp
Number of Control Systems I/O's           count
Balance of Field Instruments, Number      count
Total Plot Area
   Improved                               sq. ft.
   Developed                              sq. ft.
Other Inputs

(The Quantity and Comments columns are data-entry fields, left blank here.)

Examples:

1 Civil/Structural
   Civil/Structural Hours / Volume of Concrete = 44804 / 21998 = 2 hrs. per cu. yd. of concrete
   Civil/Structural Hours / Weight of Steel (Light) = 44804 / 425 = 105 hrs. per ton
   etc.

2 Architectural
   Architectural Hours / Area of Building = 2250 / 9000 = 0.25 hrs. per sq. ft. of building
   etc.

4 Procurement and Materials Management
   Proc & Mtl. Mgt Hours / Pieces of Equipment = 52031 / 455 = 114 hrs. per piece of equipment
   Proc & Mtl. Mgt Hours / Balance of Field Instruments = 52031 / 2787 = 18.67 hrs. per balance of field instrument
   etc.

5 Mechanical
   Mechanical Hours / Pieces of Equipment = 51010 / 455 = 112 hrs. per piece of equipment
   etc.

6 Piping
   Piping Hours / Length of Pipe (small bore) = 154332 / 178900 = 0.86 hrs. per ft. of small bore pipe
   Piping Hours / Length of Pipe (large bore) = 154332 / 198751 = 0.78 hrs. per ft. of large bore pipe
   etc.
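The ratios above are simple quotients of discipline hours by installed quantity. A minimal script recomputing the same sample values follows; the dictionary labels are illustrative only and are not RT 156 form fields:

    # Discipline productivity expressed as hours per unit of installed quantity.
    # Hours and quantities are the sample values worked out above.
    examples = {
        "Civil/Structural hrs per cu. yd. of concrete": (44804, 21998),
        "Civil/Structural hrs per ton of light steel":  (44804, 425),
        "Architectural hrs per sq. ft. of building":    (2250, 9000),
        "Proc & Mtl. Mgt hrs per piece of equipment":   (52031, 455),
        "Mechanical hrs per piece of equipment":        (51010, 455),
        "Piping hrs per ft. of small bore pipe":        (154332, 178900),
        "Piping hrs per ft. of large bore pipe":        (154332, 198751),
    }

    for label, (hours, quantity) in examples.items():
        print(f"{label}: {hours / quantity:.2f}")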

SUB-TEAM 3: Other Project Data
Leader: E. William Buss

STATUS/RECOMMENDATIONS: The team reviewed the CII Benchmarking questionnaire and broke the CII Benchmarking questions into the following "Project Characterization Categories" (drawing on CII 1998 Benchmarking questions 3, 7, 8, 9, 10, 13, 14, 15, 16, 17b, 36, 37, 39 (PDRI Checklists), 40, and 41):

• Industry (differentiates)
• Project Type (differentiates)
• Contract Type
• Location
• Engineering Tools Used
• Input Quality
• Output Effectiveness

The team also reviewed other data questionnaires, looked for gaps, and developed the following additional questions:

1) How new was the "Process Technology" for this project?
   0 = Duplicates existing technology
   1 = Scale-up/down of existing technology
   2 = 25% new technology
   3 = 50% new technology
   4 = 75% new technology
   5 = 100% new technology

2) How many times has Owner Company worked with Detailed Design Organization before this project?
   • > 20 projects
   • 10-20 projects
   • 1-10 projects
   • 0 projects (first time with Detailed Design Organization)

3) What Engineering Design Standards were used on this project?
   • Industry standards alone were used
   • Design contractor standards alone were used
   • Owner company design standards alone were used
   • Project-specific design standards were used

4) How do you currently measure/monitor Engineering productivity?

5) How do you currently manage/track Engineering progress?

SUBTEAM RECOMMENDATIONS

In reviewing the CII Benchmarking questionnaire and the PDRI checklists, the subteam would like to propose a change in data gathering methodology that would minimize our data collection requirements, expedite the data normalization process, and reduce the size/complexity of our data requests to the companies. The proposed methodology would be:

1) Screen the current CII Benchmarking Database and determine how many projects:
   • were completed in 1997 and 1998 (when the CII Benchmarking questionnaire added PDRI questions)
   • are grass roots projects
   • have PDRI scores (filled out while the project was in progress, or "after the fact" for the CII Benchmarking questionnaire)

   NOTE: Feedback from the CII Benchmarking committee is that there are 3 projects in the 1997 data collection with completed PDRI scores, but 1998 data is being gathered as we speak and there should be a larger screening match after the 1998 data is collected.

2) Take the screened projects list and sort by industry. (Determine how many of these are in our proposed Industry Group AND whether any belong to companies with members on the Engineering Productivity Team.)

3) Target an industry with enough projects in the CII database. (Validate enough projects in Heavy Industrial Process.)

4) Combine our "additional questions" with Subteam 2 Input/Output questions. (Note: Subteam 2 should look at CII Benchmarking question #17 for additional equipment definitions.) The combination of Subteam 2 and 3 questions would constitute the first-pass "Engineering Productivity Questionnaire".

5) Have our team member companies fill out Engineering Productivity Questionnaires FOR THE PROJECTS FOR WHICH THEIR COMPANY HAS ALREADY COMPLETED THE CII BENCHMARKING QUESTIONNAIRE. (Test questionnaire / gather data for all disciplines where it is available.)

6) Analyze only Piping discipline data to validate the methodology.

7) Based on piping results, continue to the next discipline OR issue the questionnaire to the rest of the CII Benchmarking project Owners that meet the screening criteria outlined in Step #1. (Decide which path to take based on results of step 6.)
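A minimal sketch of the screening and sorting in Steps 1 through 3; the record fields (year_completed, grass_roots, has_pdri, industry) are hypothetical stand-ins for the actual CII Benchmarking database layout:

    from collections import Counter

    # Hypothetical project records; field names are illustrative only.
    projects = [
        {"id": "P1", "year_completed": 1997, "grass_roots": True,
         "has_pdri": True, "industry": "Heavy Industrial Process"},
        {"id": "P2", "year_completed": 1998, "grass_roots": False,
         "has_pdri": True, "industry": "Buildings"},
    ]

    # Step 1: keep 1997-98 grass roots projects with PDRI scores.
    screened = [p for p in projects
                if p["year_completed"] in (1997, 1998)
                and p["grass_roots"] and p["has_pdri"]]

    # Steps 2-3: sort by industry and count projects per industry to see
    # whether the target industry group has enough projects.
    counts = Counter(p["industry"] for p in screened)
    print(counts.most_common())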

APPENDIX C

REVIEW OF OTHER INDUSTRY PRACTICES

By: Kenneth D. Walsh

Summary of Productivity Literature: Engineering and Other Fields
Kenneth D. Walsh, Associate Professor, Arizona State University

Curiously, there is surprisingly little attention in the technical literature to the question of measurement of engineering productivity and performance. The most common place where one finds reference to the general topic is in literature with a bias for a particular tool or technique. In such cases, one frequently finds references to the idea that, by making use of the technique under discussion, one can improve the productivity, quality, and/or performance of engineers. Examples of such claims are many, and can be found in all branches of engineering. A recent review uncovered discussion of potential performance improvements resulting from the use of concurrent engineering schemes (Isbell, 1992; Benayoune and McGreavy, 1994), ergonomic office design (Hacker, 1997), certain specialized software engineering techniques (Bidanda and Hosni, 1994; Probert and Lew, 1996), particular software for integrated circuit or civil engineering design (Girczyc and Carlson, 1993; Graham, 1990; Jundt, 1996; Winter, 1992), or particular approaches to engineering management (Repic, 1985; Sackett and Evans, 1984; Thamhain, 1992). In these studies there is an abundance of conclusions and a lack of data. Fenton (1993) found that the claims of dramatic productivity improvements in the software engineering industry are usually not defensible when performance data are actually collected.

Productivity and performance measures are certainly common in other industries. In the manufacturing environment, productivity can be simply expressed as the amount of production or output (perhaps measured in completed pieces) per some unit of input resources (perhaps time or quantity of input resource). Measures in the econometric literature tend to focus on the measurement of manufacturing productivity, and a rich econometric literature on the subject exists (see, for example, the excellent summary collection by Christopher and Thor, 1993). Performance or quality measures are somewhat more complex, but are generally statistical interpretations of the product: testing failure rate, the product return rate from the ultimate consumer, or some kind of survey-based customer satisfaction rating.

Construction is often likened to manufacturing, as there are many similarities; the primary differences relate to the relatively unique nature of a given project and the lack of control of the environment at the work space. In addition, many of the processes which take place on a construction site are quite repetitive. As a result, construction productivity and performance measurement builds heavily on manufacturing methodologies. In construction, productivity is generally defined as the output of work put in place for some unit of input (for example, yards of concrete finished per labor hour, or pipe hangers installed per labor hour). When multiple steps are required for a portion of the work, methods for allocating partial completion, usually based on a weighted earned-value analysis, are used. Thomas and Kramer (1988) provide an excellent and frequently used summary of construction productivity measurement, although they define productivity as input/output instead of the more common output/input. Performance measures are also commonly made, using an index such as the ratio of actual productivity to planned productivity, or the work completed to date compared to the scheduled completion as of that date. In addition, the quality of the performance itself can be measured, the most well-known example being materials testing for QA/QC activities during construction.

With construction productivity measurement a clearly defined process, it has been possible to complete detailed evaluations of the factors which might affect it on a particular job. For example, Sanders and Thomas (1991) consider, using a statistical regression process, a number of factors which influence masonry productivity. Other studies have considered the impact of overtime (Thomas, 1992), changes during construction (Hanna, et al. 1998; Thomas and Napolitan, 1995), wait times between consecutive trade processes (Howell, et al. 1993), and even geographic location (Proverbs, et al. 1998). This is by no means an exhaustive list, but it serves to point out how the many complex factors which might affect construction productivity can be evaluated given an accepted measurement. The increase in understanding leads, in turn, to an improved ability to estimate, plan, and control construction projects (Thomas, et al. 1990; Sanders and Thomas, 1993).
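These index measures reduce to simple ratios. A minimal sketch with illustrative numbers (not taken from any cited study):

    # Construction productivity and performance indices as described above.
    installed_quantity = 500.0    # e.g., cu. yds. of concrete finished
    labor_hours = 1000.0
    productivity = installed_quantity / labor_hours      # output per unit input
    planned_productivity = 0.60                          # planned rate

    # Performance index: ratio of actual to planned productivity.
    performance_index = productivity / planned_productivity

    # Weighted earned-value allocation for partially complete multi-step work:
    # (weight of step in total effort, fraction of step complete)
    steps = [(0.2, 1.00), (0.5, 0.50), (0.3, 0.00)]
    earned_fraction = sum(w * done for w, done in steps)

    print(f"index = {performance_index:.2f}, earned = {earned_fraction:.0%}")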

This kind of measurement is so important that it is a common field of study in construction management curricula in the United States and is commonly used in the industry, which is ironic given that it was not that long ago that the problem was considered too complex by practitioners (Thomas and Kramer, 1988).

The measurement of productivity and performance is considerably more complicated outside the manufacturing environment, where in many cases the product is more difficult to define, and the inputs to that product more nebulous. The measurement of productivity by measurement of bottom-line performance is fairly common in service and technical industry sectors. In many professions, complex processes are boiled down to a bottom-line measure of an individual. The clearest example comes from a number of studies of professional baseball managers (Porter and Scully, 1982; Ohtake and Ohkusa, 1994), in which the winning percentage of the team over the season is taken as a measure of managerial productivity or performance. The use of bottom-line performance also appears to be common in the professions frequently used in comparison with engineering, as well. The productivity of doctors is often reported in terms of patients seen (or related measures such as billings or bookings) per given time period (Lee, 1990; Reuben, et al. 1993; Wimo, et al. 1993). Attorney productivity may be measured using client billings per year (Rebitzer and Taylor, 1995). The productivity of professors is very complex, because the right output to measure is quite unclear (students, research projects, papers, service activities, etc.), but journal publications per year appears to be a fairly common measure (Pelz and Andrews, 1966; Goodwin and Sauer, 1995; Stephan and Levin, 1997).

As applied to engineering, however, measures such as those outlined above, taken by themselves, have significant limitations. In many professions, measurement of the productivity or performance of individual doctors or attorneys is sensible, as they often are in essentially sole charge of a given case or "project". Individual engineers, by contrast, handle by themselves only the very smallest projects, on which they perform or directly supervise nearly all activities. Most engineering projects require the activities of dozens of engineers, usually employed in several different companies (Anderson and Tucker, 1994). As a result, the productivity of the individual engineer is of interest only to their direct supervisor in charge of continuing employment review, but certainly not to the owner of the project on which that individual toils. There is a long tradition of measurement of individual productivity in service industries.

Stephan and Levin (1997) point out that individual productivity may have been studied so much because it is relatively easy to measure, but that this measure is inappropriate for projects requiring significant degrees of collaboration. Furthermore, profit-based measures of productivity (such as billings/year) are very useful for making employment decisions about individuals, but less so for use in identifying avenues of possible improvement and understanding the quality of the performance. Wimo, et al (1993), for example, point out this shortcoming for the measure of bookings/period for health care workers, and propose additional consideration of the patient mortality/period as a measure of the quality of physician practice. The measure of billings per year for attorneys, for example, describes profitability but gives no indication of the number of cases for which hours were billed that the attorney ultimately won for their client.

As has been described previously, references to productivity and/or performance of engineers in the literature are typically qualitative in nature, describing the factors which influence performance, and usually made in the context of a recommendation for a given product. For the most part, these references chiefly relate to the productivity of individual engineers and their ability to do more work in a given time, or better work in the same time. However, a few attempts to make quantitative measurements of group performance have been made. Mairesse and Kremp (1992) measured company productivity on average in France, using the measure of average value-added (in essence, total corporate profit reported by engineering firms in the nation) per employee. Fisher (1991) proposes a similar method for engineering company owners to use to identify strongly and weakly performing branch offices. This measure is interesting to owners of engineering companies, but does not provide a management tool for identifying good performance, estimating the cost or schedule for new projects, or differentiating better performing engineers. Profit-based measures, then, are not particularly useful to the owner. They do not provide a measure of quality, and in fact most engineering companies would likely be unwilling to report them.

Engineering quality is a field which has generated a great deal of literature, but again it is largely qualitative. For the most part, discussions of engineering quality can be described as study of the efficacy of certain management schemes for improving qualitative performance.

Some research has tied project success to use of a number of management techniques during the design process or to the ability of the engineering manager (Anderson and Tucker, 1994). There is also a significant body of literature which carries the Total Quality Management banner to engineers (Brown and Beaton, 1990; Burati and Oswald, 1993; Packer, 1993; Russell, et al. 1994). These techniques are undoubtedly important, but this literature also provides no means of evaluation or quantification. Partnering has received a considerable amount of attention as a means to improve the quality of the engineering activity (Featham and Sensenig, 1995; Thompson, et al. 1996; Warne, 1994), but again it is largely qualitative.

Certainly the most rigorous evaluation of engineering performance to date arises from work sponsored by the Construction Industry Institute (CII) in the middle 1980's (Chalabi, et al. 1987; Stull and Tucker, 1986). Chalabi, et al (1987) identified seven general parameters which described the quality of the design work, also using a qualitative survey method. These authors pointed out the distinction between productivity and effectiveness: productivity relating to the use of resources to complete a task, and effectiveness relating to the adequacy of work done on the task after it is completed. The parameters they identified were: final project schedule, final project cost, quality of design, constructability, performance, plant start-up, and safety. They then set about identifying what owner inputs from the planning process most directly influenced which of these parameters. In this way, they pointed out the importance of the input of the owner and the work of previous phases in the procurement process to the performance of the engineer.

It was previously pointed out that the construction phase, which utilizes the output from the design phase in traditional delivery methods, is reasonably well quantified in terms of performance. Furthermore, there are quantitative methods for evaluating the scope definition and planning phase which creates the input to the design phase. Dumont, et al (1997) report the development of a Project Definition Rating Index (PDRI), which built on the work of Chalabi, et al (1987). The PDRI consists of 70 elements which describe the state of the project definition and scoping process, combined into a single index with a standardized weighting scheme based on an extensive survey of owners, engineers, and designers. The index has been shown to be correlated to project success (with R2=0.4), and is now frequently used by a number of large industrial owners to guide their pre-project planning process.

The correlation to project success was performed by comparing the PDRI to an index function of ratios of planned to actual schedule, budget, plant capacity, and plant utilization. Other similar indices exist, but they are proprietary methods developed by corporations.

Stull and Tucker (1986) attempted to make a quantitative evaluation of the effectiveness of the design process itself. In recognition of the complexity of the design effort for an entire project, they considered only the piping design process and the subsequent construction of the piping described by the resulting design documents. Piping was selected because it is a demonstrably important part of the overall project. They were aware of the ongoing work of Chalabi, et al. (1987), but because they were attempting to develop a quantitative measure, they selected slightly different parameters to describe the effectiveness of the design, namely: accuracy of the design documents, usability of the design documents, cost of the design, economy of the design, performance against design schedule, constructability, and ease of start-up. They then attempted to evaluate these parameters. Some were evaluated quantitatively (e.g., "accuracy" = number of drawings requiring revision/total number of drawings), while others were evaluated using a 1-10 scale (e.g., constructability and usability). All of these factors were then entered into an objectives matrix, in which the value of each parameter was mapped to a standard for that parameter. Then all of the mapped parameter scores were combined into a single score ranging from 0-1000 by weighting the individual scores. Through this process, the performance of the design effort could be evaluated, at least semi-quantitatively. However, this procedure suffers from the subjectivity of the scoring of individual parameters and the weighting of those parameters into a combined score. For example, on a small project, revision of 20% of the design documents might be acceptable, but a different threshold might be appropriate on a larger project, and certainly the complexity of the project and the readiness of the owner would be expected to influence this parameter regardless of the ability of the engineer (Glavan and Tucker, 1991). However, this work does validate the idea of evaluating design effectiveness.
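The objectives-matrix arithmetic can be sketched in a few lines. The parameters, mappings, and weights below are illustrative only; the standards and weights in the actual Stull and Tucker matrix differ:

    # Map each effectiveness parameter onto a 0-10 score against its
    # standard, then weight the scores (weights sum to 100) into 0-1000.
    def accuracy_score(revised, total):
        # Fewer revised drawings -> higher score; 50%+ revisions -> 0.
        return max(0.0, 10.0 * (1.0 - (revised / total) / 0.5))

    parameters = {  # parameter: (mapped 0-10 score, weight)
        "accuracy":         (accuracy_score(20, 100), 20),
        "usability":        (7.0, 15),   # from a 1-10 survey scale
        "cost":             (6.0, 10),
        "economy":          (6.5, 10),
        "schedule":         (8.0, 20),
        "constructability": (7.5, 15),
        "ease of start-up": (6.0, 10),
    }

    score = sum(s * w for s, w in parameters.values())
    print(f"combined design effectiveness score: {score:.0f} / 1000")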

An interesting parallel to the assessment of engineering performance and productivity can be drawn to the assessment of the performance and productivity of software development activities. Software application development is a complex business involving, in many cases, thousands of worker hours and multiple project managers. Software productivity cannot satisfactorily be measured in a direct measure such as lines of code per day, largely because the effect of those lines of code is a function of the environment in which the programming is done; a single statement in a high-level database may duplicate the effect of 50 to 100 lines of assembly code (McConnell, 1996). A similar debate can be had about the efficacy of a three-dimensional database to hold the design of a complex industrial facility. Because of these parallels, RT 156 studied the software productivity measurement literature. The parallels in the two problems are many, and can be outlined as shown below:

Software                                           Engineering
Lines of code/day an unsatisfying early idea       Drawings/day an unsatisfying early idea
Measurement complicated by new tools and           Measurement complicated by automated design
higher order languages, reused libraries           tools and technologies, vendor design,
                                                   subcontracted design
Different quality of final product is possible     Same
Different quality of design spec is possible       Different quality of project scope
                                                   completion (PDRI) exists
Measure of size of project is complex              Same

A further parallel is the tolerance for late/overbudget performance in both the EPC and software industries. A recent issue of ENR (Post, 1998) reported a survey of owners of facilities constructed over the preceding two years. One-third of the projects were over budget and nearly half were delivered late, yet almost 80% of the owners were satisfied with the quality of their new facility. Twenty-five percent rated design as the weak link in the process of developing a new facility. Anderson and Tucker (1994) report a similar percentage of late/overbudget performance on 465 projects nearly a decade earlier. While these measures are fairly dramatic, software development performance measures are even more impressive. A 1995 survey reported that better than half of all software projects are challenged (Nelson, 1999). Challenged projects experienced, on average, 189% cost overrun and 222% schedule overrun, and were delivered with only 61% of the specified features. The Department of Defense estimates that half of the software projects completed for the D.O.D. ultimately experience at least 100% cost overruns (Nelson, 1999). The parallel is not exact, of course, but more importantly it points out that the first job in assessing productivity in software development is to develop a means of quantifying the output of the software development process.

The function point method is the most commonly used means of performing such an assessment. Function points are a synthetic measure of program size, which can be easily calculated from a requirements specification and updated periodically in the development process if the scope of the program changes (McConnell, 1996). Function point analysis (FPA) relies on a standardized scoring system similar in some respects to the PDRI scheme. While it incorporates some subjectivity in the assessment of complexity, it has been shown to give reasonably consistent answers for different operators (Abran and Robillard, 1993; Low and Jeffrey, 1990). The function point analysis is a commonly used measure of program size, and has been successfully used as a part of the evaluation of productivity, assessment of product quality, cost estimating, development of schedules, and many other common uses (Jones, 1995; Matson and Barrett, 1994; Nelson, 1999; Tichenor, 1997; Zincke, 1997).

The function point methodology has a fairly long history of development, and has required substantial investment of time and resources. It was first developed by a team within IBM headed by Allan Albrecht in the early 1970's. IBM at the time was the world's largest developer of software, and was experiencing significant difficulties in scheduling and estimating software development projects using source lines of code as a measure of size. The IBM method was developed and used internally within IBM for achieving a more independent measurement of project size, and was shared with outside software developers in 1979. The idea caught on rapidly, and major development activities followed, with standard university course materials and a classic text (Dreger, 1989). However, a number of deficiencies were identified in the original IBM system as PC environments and client/server development became more important in the 1980's, and object-oriented programming made the basis for the function point count more ambiguous. To answer these problems, a standardizing organization (the International Function Point Users Group, IFPUG) was set up in the middle 1980's. The function point method is now well established in the industry, and has been adopted by many major software users, including the Department of Defense and the Internal Revenue Service (Jones, 1999a; Tichenor, 1997). IFPUG has continued to keep up a standardized system (Garmus, 1994) for counting, and to offer training and certification for function point specialists.

Over the same time period, other groups have also broken off and developed their own systems, the most important being the Mark II system commonly used in Europe and Canada and the Object Oriented metric MOOSE. Altogether, about 25 systems are in use at the present time. Currently, international efforts are underway to consolidate the function point methodology under the ISO umbrella, but this process has proven to be slow and contentious (Jones, 1999b). Because of its prevalence in the United States, the following discussions will relate directly to the IFPUG system, but other systems operate similarly.

The function point score is calculated from a count of the number of five major items in the software project: inputs (screens, forms, dialog boxes, controls, or messages), outputs (screens, reports, graphs, or messages), inquiries (queries), data files, and interfaces (connections to other systems or databases). A different amount of effort is associated with satisfactory completion of these items, which is captured in the function point analysis by use of different weighting factors. The complexity of each item is also captured in the weighting system: a given item (say, for example, an input) can require a different level of effort depending on its complexity (for example, a simple box in which the user chooses "continue" or "cancel" is less effort than a complex input screen with several embedded subforms). The complexity factors are developed from a standard counting system; in the IFPUG system, the complexity factors are obtained from Table 1. The calculation of the function point score for a given project can be done at an early stage in the design and development process, because it requires only a listing of items found in the functional specification for the software project. The overall calculation is thus completed using the equation below:

   FPRaw = NI×cI + NO×cO + NIn×cIn + NDF×cDF + NIf×cIf

where:
   FPRaw = raw function point score for the software application
   N_ = number of inputs (I), outputs (O), inquiries (In), data files (DF), and interfaces (If)
   c_ = complexity factor for each item counted

A complete description of the process of assessing complexity from low to medium to high for each item would be very lengthy and is beyond the scope of this discussion, but it is important for the reader to note that the determination of complexity is partly based on subjective evaluation, though largely controlled by the rules contained within the standard.

Table 1: IFPUG Complexity Factors for FPA (after Garmus, 1994)

   Item Counted    Complexity Factor, c_
                   Low    Medium    High
   Inputs           3       4        5
   Outputs          4       5        7
   Inquiries        3       4        6
   Data Files       7      10       15
   Interfaces       5       7       10

Once the raw function point count is established, it is then factored to account for a number of potential complications. A processing complexity factor is applied to the raw function point score, as shown below:

   FPCorrected = FPRaw × Ac

where:
   FPCorrected = function point score corrected for overall complexity of the application
   Ac = correction factor for overall complexity of the application

The Ac-value can range from 0.65 to 1.35, and is selected as described in Table 2.

Table 2: Development of Overall Application Complexity Factor (after Jones, 1999a)

Complexity factors (14 elements): Data Communications, Distributed Data/Processing, Performance Objective, Heavily Used Configuration, Transaction Rate, On-Line Data Entry, End User Efficiency, On-Line Update, Complex Processing, Reusable Code, Installation Ease, Operational Ease, Multiple-Site Use, Facilitate Change.

Scoring system (for each of the factors above, choose the score based on the degree of influence on the project):

   Degree of Influence   % Application Affected   Score
   None                  0%                       0
   Minor                 1-20%                    1
   Moderate              21-40%                   2
   Average               41-60%                   3
   Significant           61-80%                   4
   Strong                81-100%                  5

The score is totaled for all 14 elements, and then mapped using a standard chart to Ac.
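The two equations reduce to a weighted count. A minimal sketch using the Table 1 weights, with illustrative counts; the mapping from the 14-factor total to Ac is simplified here to the commonly published IFPUG form Ac = 0.65 + 0.01 × (total degree of influence), which reproduces the 0.65-1.35 range but stands in for the standard chart mentioned above:

    # Raw function point count weighted by Table 1 complexity factors.
    WEIGHTS = {  # (low, medium, high) weights per item type
        "inputs": (3, 4, 5), "outputs": (4, 5, 7), "inquiries": (3, 4, 6),
        "data_files": (7, 10, 15), "interfaces": (5, 7, 10),
    }
    LEVEL = {"low": 0, "medium": 1, "high": 2}

    counts = {  # illustrative project, not from any cited source
        "inputs":     {"low": 10, "medium": 5, "high": 2},
        "outputs":    {"low": 6, "medium": 4, "high": 1},
        "inquiries":  {"low": 8, "medium": 2, "high": 0},
        "data_files": {"low": 3, "medium": 2, "high": 1},
        "interfaces": {"low": 2, "medium": 1, "high": 0},
    }

    fp_raw = sum(n * WEIGHTS[item][LEVEL[lvl]]
                 for item, by_level in counts.items()
                 for lvl, n in by_level.items())

    # Fourteen Table 2 characteristics, each scored 0-5; assumed mapping
    # Ac = 0.65 + 0.01 * total, in place of the standard chart.
    influence = [3, 2, 4, 1, 3, 2, 3, 2, 4, 1, 2, 3, 1, 2]
    ac = 0.65 + 0.01 * sum(influence)

    fp_corrected = fp_raw * ac
    print(f"FP_raw={fp_raw}, Ac={ac:.2f}, FP_corrected={fp_corrected:.1f}")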

Once the function point count has been performed, the resulting value can then be used in calculation of productivity, measurements of quality, cost estimates, and any other measure that is best expressed as a ratio of the project size. Examples of uses in the software industry are shown in Table 3.

Table 3: Typical Uses of the Function Point Scale (after Jones, 1999a)

   Measure        Calculation                     Typical Values
   Productivity   FPCorrected / Staff Month       4.5 - 9
   Unit Rate      Staff Months / FPCorrected      0.22 - 0.25
   Quality        Defects / FPCorrected           0.05

Other potential uses of the function point score include its use in a number of important rules of thumb, such as estimating the number of pages of documentation which will be required, the size of the programming staff required, or the size of the maintenance staff which should be available. Also, the function point can be used as a cost estimating tool: the unit rate based on historical information on projects, together with the hourly rate for the staff to be involved, determines the expected cost of an application for which a function point score can be estimated at the beginning.

Jones (1999a) lists 12 criteria that software metrics should meet. These are:

1. The metric should be standardized.
2. The metric should be unambiguous.
3. There should be a users group.
4. There should be adequate published data to assess accuracy and to benchmark.
5. The metric should be useful as an estimating tool for new projects.
6. The metric should be useful for assessing resources for and estimating costs for changes to legacy projects.
7. There should be conversion rules for related metrics.
8. The metric should give some input to all deliverables.
9. The metric should support all kinds of software.
10. The metric should support all programming environments.
11. The metric should support all sizes of applications.
12. The metric should support new and reusable code.

Jones indicates that function points meet outright all but items 1, 2, and 11 above. The function point system fails on item 1 because competing standards exist, and on item 2 because there are some ambiguities in the selection of complexity factors and the counting rules. Furthermore, the function point system is reportedly inaccurate for very small projects (less than 15 FP), which explains the failure on item 11.

RT 156 studied the function point methodology because it was believed to be very instructive for the EPC industry. RT 156 also studied measures of productivity in a number of other industries. The conclusions drawn by RT 156 based on its assessment of the function point methodology and productivity measures in other industries include the following:

• The drawbacks to using the lines of code measurement, which are widely understood in the software industry, are similar in many ways to those which can be described for drawings in the EPC industry. The measurement of drawings is ambiguous in the database design environment, and can be artificially inflated in much the same way that one can inflate lines of code with comment statements or inefficient coding. Furthermore, the drawings required can vary from owner to owner even for similar projects. As a result of this ambiguity, RT 156 sought a better measure of size than the number of drawings.

• Measures of quality are typically not built directly into the productivity measurement in other industries, but may be determined using related calculations (e.g., cases won/billing, patient mortality/total patients, delivered defects/corrected function point). Families of measures are needed to assess the overall performance of the engineering effort. These related measures must include, as a minimum, the quality of the input to the engineering phase and the quality of the output from the engineering phase.

• Assessment of project complexity must be made. In the function point method, this assessment is hard-wired into the system through the overall complexity adjustment factor. It would also be possible to consider complexity by producing different ranges of productivity for projects of differing complexity.

This observation led RT 156 to conclude that the development of "engineering design points" is not likely to occur in the near future, if ever, and to seek instead a simpler measure of project size. Development of the function point methodology for assessment of project size has required over 20 years and thousands of staff hours so far, and the standardization effort is not yet completed. However, RT 156 recognized that the installed quantities at the job site are a measure of the size of the project. This measure is already available as part of the construction and as-built assessment process, but has not to date really been thought of as a measure of project size for engineering. RT 156 looked at the Jones (1999a) requirements, and recognized that installed quantities served many of the criteria listed (modified to suit EPC):

Jones (1999a) Criteria         EPC Criteria                     Met by Installed Quantities?
Standardized method for        Standardized method for          Quantity take-off is a relatively
measuring size                 measuring quantity, and for      standardized process
                               which quantities to measure
Unambiguous                    Unambiguous rules for take-off   Some ambiguities depending on
                                                                which quantities are selected
Formal users group             Formal users group               None exists
Adequate published data        Adequate published data          None exists
using metric                   using metric
Workable for new projects      Workable for projects at end     Can estimate installed quantities at
                               of scope definition              this stage and modify as design proceeds
Workable for legacy systems    Workable for retrofit/remodel    Same as above
Have conversion rules for      Have conversion rules for        None exist
related measures               related measures
Deal with all deliverables     Not directly translatable
Support all kinds of software  Support all kinds of projects    Probably different benchmark values
                               (e.g., process industry,         can be developed over time
                               manufacturing, etc.)
Support all programming        Support different design         Installed quantities should be
languages                      methods (e.g., 3-D database      independent of design method
                               vs. traditional)
Support all sizes of           Support all sizes of project     Quantities will increase with project
application                                                     size; probably different benchmark
                                                                values can be developed over time
Support new or recycled code   Support design-from-scratch      Measure should show the effectiveness
                               or reusable designs              of designs using reusable code

The software industry parallel was a key component in the selection of final installed quantities as part of the engineering productivity calculation. As important as the analogy to the software industry is, there are also a number of important differences between software development and the engineering process which must be kept in mind. First, functional specifications or requirements documents for software are generally much more detailed than the scoping documents produced for designers. The functional specification usually includes a very detailed data model, screen shots which show the content of every screen, and very detailed descriptions of the underlying algorithms. This means that counting and attribution of the function points is more straightforward. Second, software firms usually deliver a final product (the application), while design is an intermediate step. Third, there are usually fewer discipline-specific players (such as the engineering subcontractors or vendors in the EPC industry) in the software arena, and fewer companies involved.

References:

Abran, A., and Robillard, P.N. (1993). "Reliability of Function Points Productivity Model for Enhancement Projects (A Field Study)." Proceedings, Conference on Software Maintenance, IEEE, pp. 80-87.

Anderson, S.D., and Tucker, R.L. (1994). "Improving Project Management of Design." Journal of Management in Engineering, 10 (4), ASCE, pp. 35-44.

Benayoune, M., and McGreavy, C. (1994). "A Concurrent Engineering System for Process Design and Operation." Proceedings, 4th European Symposium on Computer Aided Process Engineering, Institute of Chemical Engineers, Rugby, UK, Dublin, Ireland, Mar. 28-30, pp. 342-349.

Bidanda, B., and Hosni, Y.A. (1994). "Reverse Engineering and Its Relevance to Industrial Engineering: A Critical Review." Computers in Industrial Engineering, 26 (2), pp. 343-348.

Brown and Beaton (1990). "Looking Back at Design, Looking Forward to Construction." Journal of Management in Engineering, 6 (3), ASCE, pp. 367-374.

Burati, J.L., Jr., and Oswald, T.H. (1993). "Implementing TQM in Engineering and Construction." Journal of Management in Engineering, 9 (4), ASCE, pp. 456-470.

Chalabi, A.F., Beaudin, B.J., and Salazar, G.F. (1987). "Input Variables Impacting Design Effectiveness." Source Document 26, Construction Industry Institute, The University of Texas at Austin, Austin, TX.

D. IEEE. 63-70. Glavan. Hacker.B. (1993). Institute of Engineers Australia. pp. Mar 11-16. and Carlson. T. 119 (4). R. Westerville. “Managing Programmer Resources in a Maintenance Environment with Function Points. (1993). J. Fenton. J. and Sauer. Apr. Jr.” High Speed Ground Transportation Systems I: Proceedings of the First International Conference on High Speed Ground Transportation Systems. Russell..Contributions Of Cognitive Ergonomics. (1997). 18-20. “How Effective Are Software Engineering Methods?” Journal of Systems and Software. “Increasing Design Quality And Engineering Productivity Through Design Reuse. FL. G.” Proceedings. 57th American Power Conference. 54-60.G. Girczyc. (1993). (1994). OR.” Journal of Construction Engineering and Management. Piscataway. and Thor. 1995. pp.S. (1995). W. A. 30th ACM/IEEE Design Automation Conference. 1088-1096. B. TX. 61 (3). (1991).” Proceedings. Featham. “Scope Management Using Project Definition Rating Index. 457-467. (1990). USA. R. Barton. 47-65.” Proceedings. 98 (1-2).. Dreger. Goodwin. “Applying Software Tools To Enhance Engineering Group Productivity. and Sensenig. CA. P. A. N. 714-728. (1993). pp.APEC '90. M. Dallas. Chicago. IL.. R. Fisher.” Cost Engineering. ASCE. 22 (1). Release 4..D. 40 (10). Oct 25-28 1992 Orlando. 117 (1). Jun 14-18 1993. (1989). Hanna. (1998). (1998). editors. Sydney. B. Illinois Institute of Technology. NJ. Gibson.” Southern Economic Journal. “Forecasting Design-Related Problems: Case Study...S..E. Part 2 (of 3). and Fish. Howell. 40 (2).S. “Business Aspects and Cost Advantages of Partnering for Fossil Engineering Services. “Improving Engineering Design . Australia. 141-146. NJ. Fleck. 1663-1668. 36-41. T.A.” Ergonomics. (1991). 17-21.R. J. “Quantifying the Effect of Change Orders on Electrical Construction Labor Efficiency. R.R. New York. ASCE.. IFPUG Counting Practices Manual. W.D. “Life-Cycle Productivity in Academic Research: Evidence from Cumulative Publication Histories of Academic Economists.Christopher. NY. 1991. Piscataway. E. Prentice Hall. Ohio.J. Portland. Dumont. 612-618. “Productivity Issues in Engineering. (1997). Australia. Englewood Cliffs. International Function Point Users Group. and Tucker. July 8-12. pp. Productivity Press. 1990.” Proceedings. pp. Garmus... and Thomack. pp.” Journal of Construction Engineering and Management. 48-53. pp. 13 (5). IL. Laufer. Isbell. 218 . ASCE. International Mechanical Engineering Congress MECH 91. pp. pp. “Concurrent Engineering Planning in HGST Systems. Chicago. USA.. New Jersey. USA. and Ballard.. pp. Handbook for Productivity Measurement and Improvement. 728-743. Graham.” Journal of Management in Engineering.H. IEEE. pp. ed.G. C. T.R. USA pp. D. (1993). USA. G.L. Los Angeles.F.” Industrial Management and Data Systems.. Fifth Annual IEEE Applied Power Electronics Conference and Exposition . “Interaction Between Subcycles: One key to Improved Methods.0. Function Point Analysis. S. G.

Jones, C. (1995). "Determining Software Schedules." Computer, 28 (2), IEEE, pp. 73-75.

Jones, C. (1999a). "Function Points: 1979 to 1999 and Beyond." Presentation at the 1999 IFPUG Conference; available from Artemis, Burlington, Massachusetts.

Jones, C. (1999b). "The Value of Function Point Metrics." 1999 IFPUG Conference.

Jundt, J. (1996). "Object Oriented Approach to Control Strategies." Proceedings, ISA/96, 1996 International Conference on Advances in Instrumentation and Control, Chicago, IL, Oct 6-11, Instrument Society of America, Research Triangle Park, NC.

Lee, R.H. (1990). "Monitoring Physicians: A Bargaining Model of Medical Group Practice." Journal of Health Economics, 9 (4), pp. 463-481.

Low, G.C., and Jeffrey, D.R. (1990). "Function Points in the Estimation and Evaluation of the Software Process." IEEE Transactions on Software Engineering, SE-16 (1), pp. 64-71.

Mairesse, J., and Kremp, E. (1992). "A Look at Productivity at the Firm Level in Eight French Service Industries." Working Paper No. 9303, CREST, Institut National de la Statistique et des Etudes Economiques, France.

Matson, J.E., and Barrett, B.E. (1994). "Software Development Cost Estimation Using Function Points." IEEE Transactions on Software Engineering, 20 (4), pp. 275-287.

McConnell, S. (1996). Rapid Development. Microsoft Press, Redmond, Washington.

Nelson, R. (1999). "Curing the Software Requirements and Cost Estimating Blues." PM, November-December, pp. 54-60.

Ohtake, F., and Ohkusa, Y. (1994). "Testing the Matching Hypothesis: The Case of Professional Baseball in Japan with Comparisons to the United States." Journal of the Japanese and International Economies, 8 (2), pp. 204-219.

Packer, M. (1993). "Measuring Productivity and Organizational Effectiveness in Technical Organizations." In Handbook for Productivity Measurement and Improvement, Christopher and Thor, eds., Productivity Press, Portland, OR.

Pelz, D.C., and Andrews, F.M. (1966). Scientists in Organizations: Productive Climates for Research and Development. John Wiley and Sons, New York, NY.

Porter, P., and Scully, G.W. (1982). "Measuring Managerial Efficiency: The Case of Baseball." Southern Economics Journal, 48 (5), pp. 642-650.

Post, N.M. (1998). "Building Teams Get High Marks." Engineering News Record, 240 (19), pp. 32-39.

Probert, R.L., and Lew (1996). "Protocol Quality Engineering: Addressing Industry Concerns About Formal Methods." Computer Communications, 19 (14), pp. 1258-1267.

Proverbs, D.G., Holt, G.D., and Olomolaiye, P.O. (1998). "A Comparative Evaluation of Planning Engineers' Formwork Productivity Rates in European Construction." Building and Environment, 33 (4), pp. 181-187.

Rebitzer, J.B., and Taylor, L.J. (1995). "Efficiency Wages and Employment Rents: The Employer-Size Wage Relationship in the Job Market for Lawyers." Journal of Labor Economics, 13 (4), pp. 678-708.

Repic (1985). "Engineering Manager's Role in Improving Productivity." Proceedings, 6th Annual Meeting of the American Society for Engineering Management, Rolla, MO, pp. 35-38.

Reuben, D.B., Zwanziger, J., Bradley, T.B., et al. (1993). "Projecting the Need for Physicians to Care for Older Persons: Effect of Changes in Demography, Utilization Patterns, and Physician Productivity." Journal of the American Geriatrics Society, 41 (10), pp. 1033-1038.

Russell, J.S., Swiggum, K.E., Shapiro, J.M., and Alaydrus, A.F. (1994). "Constructability Related to TQM, Value Engineering, and Cost/Benefits." Journal of Performance of Constructed Facilities, 8 (1), ASCE, pp. 31-45.

Sackett, P.J., and Evans (1984). "Computer Aided Engineering, Productivity and Quality of Working Life." Proceedings, 1st International Conference on Human Factors in Manufacturing, London, UK, IFS Ltd., Kempston, pp. 99-106.

Sanders, S.R., and Thomas, H.R. (1991). "Factors Affecting Masonry-Labor Productivity." Journal of Construction Engineering and Management, 117 (4), ASCE, pp. 626-644.

Sanders, S.R., and Thomas, H.R. (1993). "Masonry Productivity Forecasting Model." Journal of Construction Engineering and Management, 119 (1), ASCE, pp. 163-169.

Stephan, P., and Levin, S. (1997). "The Critical Importance of Careers in Collaborative Scientific Research." Revue D'Economie Industrielle, 79 (1), pp. 45-61.

Stull, J.O., and Tucker, R.L. (1986). "Objectives Matrix Values for Evaluation of Design Effectiveness." Source Document 22, Construction Industry Institute, Austin, TX.

Thamhain, H.J. (1992). Engineering Management: Managing Effectively in Technology-Based Organizations. John Wiley & Sons, New York, NY.

Thomas, H.R. (1992). "Effects of Scheduled Overtime on Labor Productivity." Journal of Construction Engineering and Management, 118 (1), ASCE, pp. 60-76.

Thomas, H.R., and Kramer, D.F. (1988). The Manual of Construction Productivity Measurement and Performance Evaluation. Source Document 35, Construction Industry Institute, Austin, TX.

Thomas, H.R., Maloney, W.F., Horner, R.M.W., Smith, G.R., Handa, V.K., and Sanders, S.R. (1990). "Modeling Construction Labor Productivity." Journal of Construction Engineering and Management, 116 (4), ASCE, pp. 705-726.

Thomas, H.R., and Napolitan, C.L. (1995). "Quantitative Effects of Construction Changes on Labor Productivity." Journal of Construction Engineering and Management, 121 (3), ASCE, pp. 290-296.

Thompson, P.J., Crane, T.G., and Sanders, S.R. (1996). "The Partnering Process - Its Benefits, Implementation, and Measurement." Research Report 102-11, Construction Industry Institute, Austin, TX.

Tichenor, C. (1997). "Internal Revenue Service Function Point Analysis Program: A Brief." Proceedings, International Computer Software and Applications Conference, IEEE Computer Society, pp. 21-26.

Warne, T.R. (1994). Partnering for Success. ASCE, New York, NY.

Wimo, A., Mattson, B., Ineichen, B., and Kallioinen, M. (1993). "Assessment of Productivity of Psychogeriatric Day Care." International Journal of Geriatric Psychiatry, 8 (8), pp. 675-679.

Winter, P. (1992). "Computer-Aided Process Engineering: The Evolution Continues." Chemical Engineering Progress, 88 (2), pp. 76-83.

Zincke, G. (1997). "How to Achieve 7.52 Function Points per Person-Day with Object Technology." Proceedings, Conference on Object-Oriented Programming Systems, Languages, and Applications, ACM, pp. 591-592.

APPENDIX D

PIPING ENGINEERING AND DESIGN

PIPING ENGINEERING/DESIGN PROCESS

In today's atmosphere of complex projects and consequently complex piping activities, the importance of communicating and monitoring all design-related outputs is unquestionable. Contract documents, design documents, procedures, and specifications constitute primary tools for precisely communicating, monitoring, and documenting the design, fabrication, and erection activities for a piping system. However, this goal cannot be achieved without a closer look at the piping design process itself and its various outcomes. In this report, the piping design process and the principal design documents developed throughout this process are described (Crocker, 1992; Smith and Van Laan, 1987).

Stages of Project Development

Stage I

According to Crocker (1992), a typical project evolution can generally be divided into three principal stages. The first stage consists of inception, preliminary design, and estimating. In terms of the piping function, this stage starts with the development of the contract specifications and ends with the development of the piping and instrumentation diagrams/drawings (P&IDs) and the initial piping physical sketches and composite drawings. This stage closely resembles the first stage of project development as defined by the Construction Industry Institute (1997) under the name pre-project planning.

The preparation of the contract specifications is partially an engineering task. Those specifications are normally prepared under the sponsorship of the owner organization to define the owner's basic requirements for the piping system(s). The contract specification establishes the applicable codes and standards, the owner's requirements, and the obligations of the parties involved in the project. Before proceeding with piping engineering activities, it is not uncommon to prepare a comprehensive set of documents defining the system design criteria. These criteria may be part of an overall project design criteria or may be a separate document prepared solely for the piping design.

The design criteria should reiterate the design requirements delineated in the contract specifications, and should define the applicable codes and standards, design parameters, environmental conditions, and other design bases which will govern the work. The design criteria can be updated down the road as the design progresses to reflect any change in the design basis.

After developing the design criteria, two sets of design documents are prepared somewhat in parallel. These are system descriptions (SDs) and process/system flow diagrams (PFDs). Customarily, the mechanical discipline assumes the lead for preparing system descriptions for piping systems. Since some systems need mechanical, electrical, and control system discipline inputs, each discipline may prepare its own system description, or alternatively, one document containing all discipline criteria may be prepared. In either case, the purpose of the system description is to set forth, specifically in writing, the functions, intent of the design, and major features of the piping system. The major topics covered in a typical SD are:

1. Function
2. Design bases (safety, codes and standards)
3. Operating modes (start up, normal, shutdown, emergency, or as appropriate)
4. Description and performance ratings of the major equipment
5. Control concepts

From the very early stages of design, calculations come into existence. Calculations are always regarded as the foundations for the piping system design. They are documents prepared to support the establishment of flow rates, system pressures, temperatures, heat transfer rates, pipe and vessel wall thicknesses, and other design parameters. For later stages, calculations are also prepared for the pipe stress and flexibility analysis and the pipe support design. Because the design process is an iterative process, it is usual to issue calculations at various stages of completeness based on firmness of the design input. Calculations may be issued "preliminary status" for in-house review, contract, estimating, and bidding.
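As one concrete instance of the wall-thickness calculations mentioned above, a minimum-thickness check can be sketched with Barlow's formula, t = P·D/(2·S). This simplified form omits code allowances (joint efficiency, corrosion, mill tolerance) and uses made-up numbers; real designs follow the governing piping code:

    # Illustrative only: minimum pipe wall thickness by Barlow's formula.
    # Real designs apply the governing code's allowances, omitted here.
    def min_wall_thickness(pressure_psi, outside_dia_in, allowable_stress_psi):
        return pressure_psi * outside_dia_in / (2.0 * allowable_stress_psi)

    t = min_wall_thickness(pressure_psi=600.0,       # design pressure
                           outside_dia_in=6.625,     # 6" NPS pipe OD
                           allowable_stress_psi=20000.0)
    print(f"minimum wall thickness: {t:.3f} in")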

The design bases stated in the SD are used to develop the process/system flow diagram, which shows the features necessary to accomplish the design bases. The process flow diagram extends the purpose of the SD by schematically showing the operational relationships among the major system components and by stating the design and expected process variables for selected modes of operation. It can be said that the process flow diagram graphically illustrates the system description. To realize the full benefit of the process flow diagram, it should be issued with the system description.

Based on the process flow diagram, the piping and instrumentation diagrams/drawings (P&IDs) are developed to provide a schematic representation of the piping, process control, and instrumentation. P&IDs are sometimes referred to as engineering flow diagrams (EFDs), engineering line diagrams (ELDs), flow sheets, or mechanical flow diagrams (MFDs) (O'Connor and Liao, 1996). Piping and instrumentation diagrams/drawings tie together the system description, the process flow diagram, the electrical control schematic, and the control logic diagram (the last two items are beyond the scope of piping engineering and design). P&IDs basically show the functional relationships among the system components, and they provide direct input to the field for the physical design and installation of field-run piping. P&IDs contain a minimum amount of text in the form of notes; for information defined somewhere else, it may be appropriate only to reference the source(s). The first P&ID in the set for a job should contain a legend defining all symbols used. Later, the finalized P&IDs are used by the start-up organization(s) for preparing flushing, testing, and blow-out procedures for the piping system and by the plant operators to operate the system.

Each line of piping indicated on the P&IDs is usually identified by a unique line number (Smith and Van Laan, 1987). The line number is used to reference the subject line's entry in the piping roster, which is known as the line list. The line number remains constant as long as the piping design parameters remain constant; when those parameters change, the line number changes as well. This change in line number thus serves as a flag to the engineer that a change has occurred. In addition to a line list, most projects maintain a valve list of all valves used in the piping system. The valve number, unique for each valve, identifies the system, class, and possibly type of valve.
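A minimal sketch of the bookkeeping implied by the line list; the record fields and the line-number format are hypothetical, not a standard convention:

    from dataclasses import dataclass

    # Hypothetical line-list entry keyed by line number.
    @dataclass(frozen=True)
    class LineListEntry:
        line_number: str   # unique per line; renumbered when parameters change
        size_in: float
        piping_class: str
        design_pressure_psi: float
        design_temperature_f: float

    old = LineListEntry('2"-P-1001-A1', 2.0, "A1", 285.0, 100.0)
    new = LineListEntry('2"-P-1002-A1', 2.0, "A1", 350.0, 100.0)

    # A changed line number flags the engineer that a change has occurred.
    if old.line_number != new.line_number:
        print("design basis changed; review downstream documents")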

Preparation of P&IDs is reported to be the most problematic area in piping design. This stems primarily from the fact that their impact on all subsequent engineering processes is immense. Finalizing the P&IDs is considered the first activity among those composing phase II, i.e., design, according to the CII project development stages (1997).

Stage II

With the completion of the initial drafts of the calculations and P&IDs, the piping design progresses to the second stage. Crocker (1992), in his classification of project stages, extends the scope of stage I slightly beyond the preparation of P&IDs to include the development of the isometric and orthographic piping sketches. The CII classification (1997), however, considers finalizing the P&IDs and initiating the piping layout process as part of stage II.

The isometric and orthographic piping sketches initiate the piping layout effort. Portions of other piping systems and structural, electrical, control, and HVAC information may be included in these sketches, in which case they are called composite drawings. The sketches and composites are not used for construction or manufacture; rather, they represent the pre-computer version of the present database in a CAD system or one of its derivative systems, which in many instances today can support the design study more efficiently than manual methods.

Design specifications are prepared to define the performance requirements for services and materials. They primarily dictate requirements concerning applicable codes, design, fabrication techniques, and installation, as well as piping materials and preferred components and supports (Smith and Van Laan 1987; O'Connor and Liao 1996). The design specifications may be prepared for in-house design work to control the quality of the work performed, or issued to govern contracted work (Crocker 1992). Based on the design specifications, the piping layout proceeds to produce the physical design drawings, i.e., the orthographic and isometric piping drawings, which are the backbone of piping design. As mentioned earlier, the piping and instrumentation diagrams/drawings constitute the basic foundation for the design isometrics and orthographic physical layout drawings. A general practice among engineering organizations is to prepare

orthographic piping drawings showing the overall piping layout and, for complex piping systems, to develop separate piping isometric drawings for each piping run. Piping isometrics are three-dimensional representations, shown in two dimensions, of the piping system depicted on the piping orthographic drawings. In comparison, piping isometric drawings are easier to use than orthographic drawings because they are more easily visualized and all the information on an isometric drawing pertains to the piping of interest. Isometrics need not be drawn to scale, and the piping segments may be drawn as long as necessary for clarity. With the development of three-dimensional CAD systems, the designer can easily check interferences and produce a higher-quality piping layout.

The physical design drawings cannot be finalized without supporting stress analysis reports. This term is applied to calculations that address the static and dynamic loading resulting from the effects of gravity, temperature changes, internal and external pressures, changes in fluid flow rates, seismic activity, fire, and other environmental conditions. Codes and standards establish the minimum scope of the stress analyses, and some codes prescribe loading combinations with not-to-exceed limits. Stress isometrics can be developed in parallel with the stress analyses by using the piping orthographic drawings and ordinary isometrics as reference. These primarily serve as models for the stress analysis work and show all relevant information and supports/restraints.

Once the orthographic drawings are completed, they may be issued for piping fabrication and construction. By the end of piping drawing production, the procurement specifications and bills of materials for piping fabrication are prepared to define the scope of work. The specifications should reference the applicable design documents and delineate the applicable codes and standards, material requirements, fabrication and examination requirements, testing requirements, and documentation requirements, which govern the work. A copy of these documents should be made available to the piping fabricator.

According to the procurement specifications, the piping fabricator prepares the piping spool drawings, a.k.a. spool sheets. Certified spool sheets are the shop detail drawings developed from the piping isometrics or the piping orthographic drawings for each prefabricated section of piping and approved by the engineer. Generally, a spool sheet covers only one spool, whether the spool consists of only one fitting or many piping

elements (limited in size only by shipping or handling capabilities). The spool sheet
should specify all dimensions, materials, fabrication procedures, examination and testing
requirements, and code stamping requirements, as applicable. It is customary for the
engineering organization to review the spool sheets to ensure that the code,
specifications, and design documents were properly interpreted by the fabricator.
Another set of specifications is the erection specifications, which specify the
applicable codes and standards governing the erection work. The specification should
again delineate any fabrication, quality, examination, testing, and documentation
requirements imposed beyond what is required by the code. Also, all design and
fabrication documents should be incorporated in the erection specifications. Erection
isometrics are commonly developed to assist in erecting the designed lines of piping.
Regarding the design and procurement of pipe supports, they are usually
accomplished in one of two ways: the supports can be pre-engineered by the engineer, or
the design can be contracted to a pipe support manufacturer. In the first case, the engineer
prepares detailed pipe support drawings and specifications containing the necessary
fabrication details and bills of materials. The pipe support fabricator only needs to
fabricate the supports in accordance with the drawings. In the second case, the engineer
specifies the direction of restraint, the type of support (e.g., variable, constant, or rigid),
and the support load. The support manufacturer prepares detailed fabrication drawings for
the shop’s use. As with spool sheets, the engineer reviews the pipe support drawings for
compliance with specifications.
An essential input to the finalized facility design is the certified equipment
drawings. Certified equipment drawings are generally submitted to the engineer by the
equipment supplier for review and approval. These drawings are used to finalize the
facility design—that is, the physical and operating characteristics of the actual procured
equipment are incorporated into the design.
The previously enumerated documents mostly define the engineering activity
throughout phases II and III, i.e., design and material management (procurement), as
defined by the Construction Industry Institute (CII 1997).


Stage III

As the piping system design reaches completion and materials begin to arrive at
the job site, the project enters the third stage (stages IV and V, i.e., construction and
start-up & commissioning, as defined by the Construction Industry Institute). The major
deliverables encountered at this stage are as-built drawings, field change requests,
nonconformance reports, and start-up field reports. These deliverables are not explained
here, as they are beyond the scope of engineering as specified by the engineering
productivity measurement research team.


Figure B.1
Piping Engineering & Design Process

(Flowchart in three parts, a through c. Part a: the owner's requirements, contract requirements, and organization structure and responsibilities feed the design criteria and design specifications; the system description leads to the PFD and the AFD P&IDs; together with the equipment general arrangement, equipment list, line list, and governing codes and standards, the initial calculations and calculations support the piping 3-D model, interference check, orthographic drawings, isometric drawings, stress analysis, and pipe support drawings, spanning the pre-project planning and detailed design phases. Parts b and c: the finalized/AFC P&IDs lead to the procurement specifications, certified spool sheets, and erection specifications during procurement & construction, along with field change requests, nonconformance reports, and as-built drawings; start-up and commissioning produces the start-up field reports.)

APPENDIX E
QUESTIONNAIRE SURVEY


CII Research Team 156
Questionnaire: Part A
Overall Project Data
Project Input Variables and Design Effectiveness

A-1 Company Information
A-2 Project Information
A-3 Project Input Variables
A-4 Overall Project Results
A-5 Overall Project Design Effectiveness

A-1. COMPANY INFORMATION:

The purpose of this section is to provide the research team with the basic information needed to
identify the various responses received. It will be further used if one or more of the responses
need to be clarified by the person filling out the form. Both company and person identification
will remain anonymous. All data will be held in strict confidence.

1.1 Your company’s name:
1.2 Engineering Company’s name:
1.3 Contact person (name of person filling out the questionnaire form)

Name:
Position:
Phone number: (    )
Fax number: (    )
E-mail address:
1.4 Please specify how many years of experience you (person filling out the questionnaire) have
in the industrial sector of the construction industry: ____ years
1.5 How do you classify your company?
[ ] Owner w/ in-house design capabilities
[ ] Owner w/o in-house design capabilities
[ ] Full-service design firm
[ ] Constructor
[ ] Designer/Constructor
[ ] Others, specify:


A-2. PROJECT INFORMATION

The purpose of this section is to investigate several project parameters that affect the performance of design and engineering functions and the means to measure this performance. Personnel filling out the questionnaire are requested at this stage to identify one of the industrial projects they have recently completed, and further to provide information regarding project-specific data. The project selected SHOULD satisfy ALL of the following criteria:
• The project is classified as primarily engineered in North America.
• The project is classified as grass roots OR a new unit within an existing facility (greenfield). Modification projects should NOT be considered.
• The project must have detailed piping engineering and design work. The piping function will be considered the primary experimental area for Part B of this survey.

2.1 Please provide the following basic project data as outlined in the table (data item / value-information):
• Project identifier (if confidential, please give a code name to allow discussion)
• Grass roots or greenfield
• Size of plant (SF)
• # of raw materials
• # of finished products
• Project size ($TIC)
• Size of plant compared to other projects from the same industry sector (very small, small, medium, large, very large)
• Country of construction
• Client/owner participation: owner did total engineering design / owner did initial scope development / owner did work up to detail design
• Design contract type (lump-sum/fixed price, unit price, guaranteed maximum, time & material, or other)
• Number of equipment pieces (count); refer to Attachment I for a description of the counting procedure
• Was the detail engineering effort split between multiple design sites (yes, no)? If so, how many?

2.2 What is the principal type of project you identified?
[ ] Electrical (generating)
[ ] Oil exploration/production
[ ] Oil refining
[ ] Pulp and paper
[ ] Chemical manufacturing
[ ] Environmental
[ ] Pharmaceutical manufacturing
[ ] Metals refining/processing
[ ] Microelectronics manufacturing
[ ] Consumer products manufacturing
[ ] Natural gas processing
[ ] Automotive manufacturing
[ ] Foods
[ ] Others, specify:

2.3 Please specify how many years of experience you (person filling out the questionnaire) have in this specific type of industrial project, as of your answer to question 2.2: ____ years

Project / Engineering Costs

2.4 In the following table, please record the breakdown of project costs among the various areas indicated ($ and % of TIC):
• Equipment
• Engineering (including engineering subcontracts)
• Construction (including bulk/construction materials and construction subcontracts)
• Other
• TIC ($): 100%

Engineering Schedule

2.5 Use the scale below to describe the nature of the detailed design schedule compared to other projects from the same industry sector:
1 Crash/fast track | 2 | 3 Normal | 4 | 5 Non-schedule driven

Project Complexity Characteristics

2.6a How new was the "Process or Facility Technology" for this project to the owner?
0 Duplicate of existing technology | 1 Scale up/down of existing technology | 2 25% new technology | 3 50% new technology | 4 75% new technology | 5 100% new technology

2.6b How new was the "Process or Facility Technology" for this project to the design organization?
0 Duplicate of existing technology | 1 Scale up/down of existing technology | 2 25% new technology | 3 50% new technology | 4 75% new technology | 5 100% new technology

2.6c What kind of impact did legal or environmental regulations have on this project scope?
0 No impact | 1 Insignificant impact | 2 Little impact | 3 Typical impact | 4 Very stiff requirements had to be met | 5 Regulatory requirements were the reason for the project

2.6d How large an impact did site conditions have on the design of this project?
0 No impact | 1 Insignificant impact | 2 Little impact (new unit in existing plant) | 3 Typical impact for a new site | 4 Slightly more than normal impact | 5 Significant impact

2.6e Please indicate the method of acceptance testing used on this project:
[ ] No assessment
[ ] Demonstrated operations at achieved level
[ ] Formal documented acceptance test over a meaningful period of time

2.6f To what extent did detail design re-use previously designed standard modules?
0 Essentially no re-use | 1 Minor amounts of re-use | 2 Re-use in existing design standards | 3 Typical design module re-use | 4 More than usual module design re-use | 5 Significant module design re-use

2.6g How many times has the owner company worked with the detailed design organization before this project?
1 0 projects (first time working together) | 2 1-2 projects | 3 2-10 projects | 4 11-20 projects | 5 > 20 projects

2.6h What engineering design standards were used on the project? (Check all that apply)
[ ] Industry standards were used
[ ] Design contractor standards were used
[ ] Owner company design standards were used
[ ] Project-specific design standards were used

2.6i On an overall basis, please indicate the level of complexity for this project compared to other projects from the same industry sector:
1 Low complexity | 2 | 3 Medium complexity | 4 | 5 High complexity
Low complexity: characterized by the use of no unproven technology, small facility size or process capacity, a small number of process steps, previously used facility configuration or geometry, proven construction methods, etc.
High complexity: characterized by the use of unproven technology, large facility size or process capacity, an unusually large number of process steps, new facility configuration or geometry, new construction methods, etc.

2.7 PROJECT CHANGE CONTROL METHODOLOGY
Please answer Yes, Sometimes, or No to the following questions:
a) Was a formal documented change management process, familiar to the principal project participants, used to actively manage changes on this project?
b) Was a baseline project scope established early in the project and frozen, with changes managed against this base?
c) Were design "freezes" established and communicated once designs were completed?
d) Were areas susceptible to change identified and evaluated for risk during review of the project design basis?
e) Were changes on this project evaluated against business drivers and success criteria for the project before being accepted?
f) Were all changes required to go through a formal change justification procedure?
g) Was authorization for change mandatory before implementation?
h) Was a system in place to ensure timely communication of change information to the proper design disciplines and project participants?
i) Did project personnel take proactive measures to promptly settle, authorize, and execute change orders on this project?
j) Did the project contract address criteria for classifying change, personnel authorized to request and approve changes, and the basis for adjusting the contract?
k) Was a tolerance level for changes established and communicated to all project participants?
l) Were all changes processed through one owner representative?
m) At project close-out, was an evaluation made of changes and their impact on the project cost and schedule performance for future use of lessons learned?

Design/Information Technology Practices

2.8 For the various design/information technologies listed below, please identify the extent to which each technology was used in the identified project (0 no use | 1-2 limited use | 3-4 moderate use | 5 extensive use):

2.8a 3D CAD modeling
2.8b Integrated database systems
2.8c Electronic data interchange (EDI)

A-3. PROJECT INPUT VARIABLES

3.1 If available, what was your PDRI score at the start of detail design? ____

3.2 A study conducted in 1987 by an earlier CII research team identified a set of input variables that have a direct influence on design effectiveness. These input variables, as listed below, and their quality govern the various design/engineering activities and their outcome quality. (CII Reference Document # 8.2)

Considering this project, use the following scale to assess the quality of each variable at the start of the detailed design, where 0 represents very BAD / very LOW and 10 represents very GOOD / very HIGH. Place an (X) mark in the appropriate cell of the 0-10 scale. (The list below each topic gives examples of things to consider.)

Completeness of Scope Definition (0-10 scale)
• Project type and facility description
• Reference on process fluids, capacity and quality
• Special alternates needed
• Description of new process equipment
• Data availability
• Automated system requirements
• Objectives and priorities incl. cost, schedule, materials of construction and type of instrumentation

Owner Profile and Participation (0-10 scale)
• Owner management capabilities
• Construction and design experience
• Decision timing
• Previous experience with designer
• Management of changes
• Participation in detailed design activities
• Project personnel continuity
• Practices of review and approval of design documents

Pre-Project Planning (0-10 scale)
• Pre-project planning studies
• Procurement strategy
• Selection and experience of designer with the facility industrial process
• Required operation and maintenance
• Project organization
• Selection of location
• Construction strategy

Project Objectives and Priorities (0-10 scale)
• Safety priority
• Utility, continuity and spare equipment guidelines
• Environmental requirements
• Investment returns
• Attainment of capacity
• Investment vs. risks
• Start-up guidelines and costs
• Project timing, transmittal dates and critical schedule
• Level of technology
• Expandability, aesthetics and expected life

Basic Design Data (0-10 scale)
• General plant identification
• Cost considerations
• General project summary
• General service and yard facilities
• Process description
• PFD
• Special requirements of design, site, startup, shut-downs, and material handling
• P&IDs and supporting data
• Environmental data

Selection and Qualification of Designer (0-10 scale)
• Designer organization
• Project team continuity
• Experience with similar projects
• Contract agreement
• Knowledge of the technology
• Design approach
• Management of changes
• Capacity, readiness and computer capabilities
• Project team qualifications

Project Manager Qualifications (0-10 scale)
• Previous experience with similar industrial process

Construction Input (0-10 scale)
• Labor and material availability
• New concepts or materials or systems
• Appropriate construction methods and technology
• Practical advice on field conditions, safety and labor conditions
• Construction sequence
• Design approach
• Constructability team activity during detailed design

Type of Contract / Contract Terms (0-10 scale)
• Extent of designer input
• Owner's expectations
• Acceptance of designer proposal
• Methods and procedures to handle scope changes
• Designer's responsibilities and authorities

Equipment Sources / Vendor Data (0-10 scale)
• Preliminary vendor data
• Schedule commitment (timeliness of data)
• Participation in detailed design
• Frequency of changes during detailed design
• Quality and completeness of data

Use of Advanced Information/Design Tools (0-10 scale)
• CADD
• IDB (integrated database systems)
• EDI (electronic data interchange)
• Information flow tools between project participants

3.3 Considering the same set of input variables identified in question 3.2, use the following scale to identify, based on your personal experience, the relative influence of each variable on design effectiveness / design outcome quality, where 0 represents very insignificant influence and 10 represents very significant influence. Place an (X) mark in the appropriate cell of the 0-10 scale for each of:
• Completeness of Scope Definition
• Owner Profile and Participation
• Pre-Project Planning
• Project Objectives and Priorities
• Basic Design Data
• Selection and Qualification of Designer
• Project Manager Qualifications
• Construction Input
• Type of Contract / Contract Terms
• Equipment Sources / Vendor Data
• Use of Advanced Information/Design Tools

A-4. PROJECT RESULTS

4.1 Please indicate in the following table the product or function of the completed facility, the unit of measure which best relates to that product or function, and the percent capacity achieved by the completed facility. For process facilities, the measure is either one of input or output, as appropriate (example: crude oil refining unit, barrels per day throughput). If the product produced or function provided by this facility is confidential, please write "confidential" in the first column and provide the other data.
Product or Function | Unit of Measure | Percent Capacity Achieved

4.2 What % of the total engineering work-hours for design were completed prior to total project budget authorization? ____

4.3 What % of the total engineering work-hours for design were completed prior to the start of the construction phase? ____

4.4 If available, please indicate the budget and actual costs by project phase:
• Phase budget amounts should correspond to the estimates at the start of detailed design.
• Include the cost of bulk materials in construction and the cost of engineered equipment in procurement.
• Refer to Attachment II, which defines the various phases, their start and finish milestones, and the typical activities and products associated with each.
For each project phase (Pre-Project Planning, Detail Design, Procurement, Demolition/Abatement, Construction, Start-up, TOTALS), record: phase budget (including contingency) $ | amount of contingency in budget $ | actual phase costs $

4.5 Please indicate the planned and actual project schedule durations by phase, in weeks, for each project phase (Pre-Project Planning, Detail Design, Procurement, Demolition/Abatement, Construction, Start-up, TOTALS).

4.6 Owner scope changes: please record the total number, the net cost impact, and the net schedule impact resulting from owner scope changes during detailed design (scope changes during design, before IFC / field change orders, after IFC):
• Total number of scope changes: ____ / ____
• Net cost impact of scope changes: $ ____ / $ ____
• Net schedule impact of scope changes: ____ wks / ____ wks

4.7 Where data is available, what was the % of total engineering rework, due to all reasons, on this project? ____ If data is not available, please indicate the relative % of total engineering rework on this project:
0 No rework | 1 Less than normal amount of rework | 2 Normal amount | 3 More than normal amount | 4 Significantly more than normal | 5 Largest amount of rework ever experienced

A-5. PROJECT DESIGN EFFECTIVENESS

5.1 The following list represents a tentative set of measures of engineering and design effectiveness. This set is intended to measure various aspects of the design and engineering performance throughout the project. (CII Reference Document # 8-1)

Considering this project, use the following scale to identify the degree of satisfaction with each measure, which is of crucial value to customers, where 0 represents very unsatisfactory / very bad and 10 represents very satisfactory / very good. Place an (X) mark in the appropriate cell that indicates where you were at the end of the project.

Accuracy of the Design Documents (0-10 scale)
• Number of drawing revisions / revised drawings
• Number of spec revisions / revised specs
• Reworked spools
• Reworked hours

Usability of the Design Documents (0-10 scale)
• Drawing size
• Number of drawings
• Cross-references between drawings
• Field engineering workhours
• Clarity
• Completeness

Cost of the Design Effort (0-10 scale)
• Piping design cost
• Piping design workhours
• % engineering $ as a % of TIC
• % piping engineering $ as a % of engineering $

Constructability (0-10 scale)
• Tolerances
• Specialized equipment
• Different crafts
• Materials and technology
• Pre-assembly
• Estimated $ savings as a % of TIC

Economy of the Design (0-10 scale)
• Facility layout
• Overspecified materials
• Oversized members

Performance Against Schedule (0-10 scale)
• Document release
• Intermediate release
• Project scheduled duration vs. actual duration
• Engineering scheduled duration vs. actual duration

Ease of Start-Up (0-10 scale)
• Start-up days
• Budget vs. actual start-up time
• Operator workhours
• % rework during start-up
• Maintenance workhours

5.2 For the following list of selected measures of design effectiveness, please provide the actual result of each measure for this selected project (actual results / linguistic description). Several of these measures are simple overrun or rework ratios; a short sketch of the arithmetic follows the list.
• Design document-release versus commitments: very unsatisfactory, unsatisfactory, moderate, satisfactory, very satisfactory (example: satisfactory)
• % rework due to design errors, design omissions, design changes, etc. (example: 5%)
• % rework due to vendor changes (example: 1%)
• % fabrication cost increase due to design-related rework (example: 1%)
• % fabrication schedule delay due to design-related rework (example: 2%)
• % construction engineering-hours for design problem solving: very low, low, moderate, high, very high (example: moderate)
• % estimated constructability dollar-savings as a percentage of TIC: very low, low, moderate, high, very high (example: moderate)
• % rework during start-up due to design deficiencies
• % startup schedule delay due to design deficiencies
• % startup cost increase due to design deficiencies, including rework costs
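As a hypothetical illustration only (the function name and figures below are not from the questionnaire), the percentage measures above can be computed from project records along these lines:

    def percent_overrun(actual, planned):
        # % overrun = (actual - planned) / planned, expressed as a percentage;
        # a negative value indicates an underrun.
        return 100.0 * (actual - planned) / planned

    # Hypothetical: 44 actual weeks of detailed design against 40 planned weeks.
    print(round(percent_overrun(44.0, 40.0), 1))  # 10.0 (% schedule overrun)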

Attachment I: Equipment Counting Definition

Equipment definition (provided by an owner organization): Equipment is hardware that is used in manufacturing, including manufacturing equipment and manufacturing support. Equipment includes the following:
• Motors, gear reducers, size-change parts, operating spares, and stand-by equipment
• Capitalizable spare parts
• Chutes for product flow, hoppers, and bins
• Ventilating, heating, and air conditioning equipment, and ductwork for air, gases, and dust

Major equipment count procedure (provided by a constructor organization):
1. Pumps (including motors) count one each. If a pump is turbine driven, count the turbine as a separate piece of equipment.
2. Process and utility compressors, electric driven, single stage, count one each. If multi-stage, add 1/2 per stage for the extra stages. If a compressor is turbine or gas engine driven, count the turbine or gas engine as a separate piece of equipment.
3. Blowers (including motors) count one each. If turbine driven, count the turbine as a separate piece of equipment.
4. Shell and tube exchangers count one each, whether stacked or not stacked.
5. Fintube exchangers count one each service.
6. Aerial coolers count each fan structure. If the fans are turbine driven, count the turbines as separate pieces of equipment.
7. Cooling towers count one per cell.
8. Heaters or boilers count one each.
9. Flare stacks count one each.
10. Storage tanks count one each.
11. Drums, towers, reactors, strippers, etc., count one each.
12. Mixers or agitators not attached to equipment count one each. If attached, do not count.
13. Ejectors or reducers (jets) and barometric condensers requiring foundations or supports count one each.
14. Filters requiring foundations or supports count one each. Do not count line filters; if connected to equipment or in-line, do not count.
15. Conveyors count one each.
16. Package units, wired and skid mounted, count as one unit. If not skid mounted, or if components of the unit require individual supports and piping, count each component as one piece of equipment.
17. Lube and seal oil skid mounted units count one each.
18. Auxiliary compressor equipment not attached to the compressor but requiring individual supports and piping counts one each.
19. Incinerators and thermal reactors (i.e., muffle furnaces, etc.) count one each.
20. Ventilating and air conditioning equipment: classify by the normal count.
21. Other major equipment, such as materials handling, reduction and classification equipment, pit scales, etc., counts one each.
22. For existing equipment, use the procedure listed above if a revision to piping or instrumentation, or a relocation, is involved.
23. For customer furnished equipment, use the procedure listed above for the particular type of equipment.
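As an illustration only, the sketch below mechanizes a few of the counting rules above (the base count, turbine/gas-engine drivers, and extra compressor stages); the record fields are hypothetical and most of the 23 rules are omitted.

    def equipment_count(item):
        # Base rule: count one piece of equipment per item.
        count = 1.0
        # Turbine or gas-engine drivers count as separate pieces of equipment.
        if item.get("driver") in ("turbine", "gas engine"):
            count += 1.0
        # Multi-stage compressors: add 1/2 per stage for the extra stages.
        if item.get("type") == "compressor" and item.get("stages", 1) > 1:
            count += 0.5 * (item["stages"] - 1)
        return count

    items = [
        {"type": "pump", "driver": "motor"},              # counts 1.0
        {"type": "pump", "driver": "turbine"},            # counts 2.0
        {"type": "compressor", "stages": 3,
         "driver": "gas engine"},                         # counts 1 + 1 + 1 = 3.0
    ]
    print(sum(equipment_count(i) for i in items))         # 6.0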

Attachment II: Industrial Project Phases (CII BM&M Committee)

Pre-Project Planning
• Typical participants: owner personnel; planning consultants; constructability consultant; alliance/partner
• Start: defined business need that requires facilities. Stop: total project budget authorized
• Typical activities & products: options analysis; life-cycle cost analysis; project execution plan; appropriation submittal package; P&IDs and site layout; project scoping; procurement plan; architectural rendering

Detail Design
• Typical participants: owner personnel; design contractor; constructability expert; alliance/partner
• Start: design basis. Stop: release of all approved drawings and specs for construction (or the last package for fast-track)
• Typical activities & products: drawing & spec preparation; bill of material preparation; procurement status; sequence of operations; technical review; definitive cost estimate

Demolition / Abatement
• Typical participants: owner personnel; general contractor; demolition contractor; remediation/abatement contractor
• Start: mobilization for demolition. Stop: completion of demolition
• Typical activities & products: remove the existing facility or a portion of the facility to allow construction or renovation to proceed; perform cleanup or abatement/remediation

Procurement
• Typical participants: owner personnel; design contractor; alliance/partner
• Start: procurement plan for engineered equipment. Stop: all engineered equipment has been delivered to site
• Typical activities & products: vendor qualification; vendor inquiries; bid analysis; purchasing; expediting engineered equipment; transportation; vendor QA/QC

Construction
• Typical participants: owner personnel; design contractor (inspection); construction contractor and its subcontractors
• Start: beginning of continuous substantial construction activity. Stop: mechanical completion
• Typical activities & products: set up trailers; site preparation; procurement of bulks; issue subcontracts; construction plan for methods/sequencing; build the facility & install engineered equipment; complete punchlist; demobilize construction equipment; warehousing

Start-up / Commissioning
• Typical participants: owner personnel; design contractor; construction contractor; training consultant; equipment vendors
• Start: mechanical completion. Stop: custody transfer to the user/operator (steady state operation)
• Typical activities & products: testing systems; training operators; documenting results; introduce feedstocks and obtain first product; hand-off to user/operator; operating system; functional facility; warranty work

CII Research Team 156
Questionnaire: Part B
Project Piping Engineering Productivity Systems and Piping Project Data
(To be filled out by the project piping team or the Piping Discipline Lead)

B-6 Company Information (duplicate of A-1)
B-7 Productivity Measurement System (overall engineering / piping engineering)
B-8 Project Piping Results
B-9 Piping Design Effectiveness Evaluation

VI. COMPANY INFORMATION:

The purpose of this section is to provide the research team with the basic information needed to identify the various responses received. It will be further used if one or more of the responses need to be clarified by the person filling out the form. Both company and person identification will remain anonymous. All data will be held in strict confidence.

6.1 Your company's name:
6.2 Engineering company's name:
6.3 Contact person (name of person filling out the questionnaire form):
Name / Position / Phone number (    ) / Fax number (    ) / E-mail address
6.4 Please specify how many years of experience you (person filling out the questionnaire) have in the industrial sector of the construction industry: ____ years
6.5 How do you classify your company?
[ ] Owner w/ in-house design capabilities
[ ] Owner w/o in-house design capabilities
[ ] Full-service design firm
[ ] Constructor
[ ] Designer/Constructor
[ ] Others, specify:

Project identifier used in Part A:

VII. PRODUCTIVITY MEASUREMENT SYSTEM

The purpose of this section is to examine the engineering productivity measurement system(s) in use in your organization. In addition to screening the current practices, respondents are encouraged to give recommendations, where appropriate, that may improve these practices. It is to be noted that the piping function has been chosen by the research team as the primary field of investigation for this survey.

TOTAL ENGINEERING PRODUCTIVITY MEASUREMENT

7.1 Does your company track productivity in the areas shown below?

TOTAL ENGINEERING ON PROJECT
[ ] No
[ ] Yes, measured as the ratio between total billable hours for engineering and the installed cost they designed on the project
[ ] Yes, but use other measures for tracking project engineering productivity. Specify:

ANNUAL COMPANY-WIDE ENGINEERING PRODUCTIVITY
[ ] No
[ ] Yes, measured as the ratio between total annual billable hours for engineering and the installed cost they designed each year
[ ] Yes, but use other measures for tracking project engineering productivity. Specify:

FOR EACH ENGINEERING DISCIPLINE
[ ] No
[ ] Yes, measured as the ratio between total billable hours for a discipline and the installed cost they designed on the project
[ ] Yes, but use other measures for tracking engineering discipline productivity. Specify:

7.2 What is the estimated cost to track engineering progress and performance, as a percentage of the total design and engineering costs in your company? ____%

7.3 At what level do you track the performance of engineering functions (check all that apply)?
[ ] A) Total engineering
[ ] B) Engineering disciplines, e.g., piping
[ ] C) Group of work tasks / deliverables, e.g., production of orthographic drawings
[ ] D) Specific activities (activity codes), e.g., all listed isometrics for a specific sub-model
[ ] E) Individual deliverables, e.g., isometrics for a specific 3D sub-model
[ ] F) Individual personnel
[ ] G) Others, specify:

PIPING DISCIPLINE PRODUCTIVITY MEASUREMENT

7.4 How are piping engineering/design work-hours estimated (check all that apply)?

AT PROJECT AUTHORIZATION OR CONTRACT AWARD
[ ] A) The Project Manager / Project Engineering Manager breaks down the total budgeted work-hours among the engineering disciplines, including piping
[ ] B) The Piping Chief Engineer, with his/her principal engineers and design supervisors, estimates the work-hours needed for piping work
[ ] C) Mutual effort between the Project Manager / Engineering Manager and the Piping Chief Engineer
[ ] D) Design leaders provide the basic estimate, which is reviewed and approved by higher levels of the hierarchy
[ ] E) A feedback process that starts from the piping designers and piping engineers up through the hierarchy, then is reviewed and accumulated with other disciplines by the PM/EM
[ ] F) Others, specify:

AT START OF DETAILED DESIGN
[ ] A) The Project Manager / Project Engineering Manager breaks down the total budgeted work-hours among the engineering disciplines, including piping
[ ] B) The Piping Chief Engineer, with his/her principal engineers and design supervisors, estimates the work-hours needed for piping work
[ ] C) Mutual effort between the Project Manager / Engineering Manager and the Piping Chief Engineer
[ ] D) Design leaders provide the basic estimate, which is reviewed and approved by higher levels of the hierarchy
[ ] E) A feedback process that starts from the piping designers and piping engineers up through the hierarchy, then is reviewed and accumulated with other disciplines by the PM/EM
[ ] F) Others, specify:

7.5 The following is an example of the components of an engineering performance control system used by an engineering firm: for each deliverable and budget cost code (e.g., isometrics, revised), the system tracks the budget, period, to-date, and forecast work-hours; the budget, period, to-date, and forecast quantities; the corresponding unit rates*; and the period, to-date, and forecast performance indexes**. Please provide a sample of your engineering performance control system in the piping design area.
* The unit rate is measured as the ratio between work-hours and quantity.
** The performance index is measured as the ratio between the actual unit rate and the budget unit rate.

7.6 What is being tracked in your engineering performance reporting system regarding piping design work-hours? (Check all that apply)
[ ] Budget work-hours: [ ] original; [ ] revised
[ ] Actual work-hours: [ ] expended work-hours this period; [ ] expended work-hours to date
[ ] Earned work-hours: [ ] earned work-hours this period; [ ] earned work-hours to date
[ ] Forecast work-hours: [ ] remaining work-hours (to be completed); [ ] projected % completion, at completion

7.7 What is being tracked in your engineering performance reporting system regarding the quantity of piping work (engineering deliverables or field quantities, i.e., # isos, # lines, etc.)? (Check all that apply)
[ ] Budget quantity: [ ] original; [ ] revised; [ ] variance from revised budget
[ ] Actual quantity: [ ] finished this period; [ ] % complete this period; [ ] finished to date; [ ] % complete to date
[ ] Forecast quantity: [ ] remaining quantity (to be completed); [ ] projected % complete, at completion

7.8 If engineering productivity (defined as the ratio between input work-hours and output quantity, commonly known as the unit rate) is being tracked/calculated in your performance control system, what formula is being used (check all that apply)? (A short sketch of this arithmetic follows the metrics list below.)
[ ] A) Engineering productivity = actual number of work-hours spent / actual quantity of engineering completed (e.g., drawings, isometrics, etc.)
[ ] B) Engineering productivity = actual number of work-hours spent / actual quantity of materials to be installed (e.g., linear feet of pipe)
[ ] C) Engineering productivity = actual number of work-hours spent / production unit (e.g., barrels of crude oil refined)
[ ] D) Others (specify): engineering productivity = ____

7.9 A tentative list of metrics that are used or can be used for measuring piping function engineering productivity is given below. First, identify for each metric whether it is currently collected or could be collected in your organization. Secondly, add to the list any metrics you find appropriate for measuring the engineering productivity of the piping discipline. Finally, identify ONLY the top three most important / appropriate metrics for measuring piping engineering productivity, and rank them from most important (1) to third in importance (3) in the last column.

Engineering productivity metrics, piping function (currently collected? / could be collected? / rank):
• Total piping work-hours / drawing
• Total piping work-hours / ISO
• Total piping work-hours / line
• Total piping work-hours / line (2D dwg)
• Total piping work-hours / line (3D dwg)
• Total piping work-hours / equipment piece
• Piping MTO work-hours / line (where MTO stands for material take-off)
• Piping MTO work-hours / requisition
• Total piping work-hours / LF of pipe
• Total piping work-hours / valve count
• Others (please specify)
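For illustration, the following minimal sketch shows the unit-rate arithmetic behind formulas A through C, together with the performance-index ratio defined in the footnote to question 7.5 (actual unit rate over budget unit rate). All figures and function names are hypothetical, not part of the questionnaire.

    def unit_rate(workhours, quantity):
        # Engineering productivity (unit rate) = input work-hours / output quantity,
        # where quantity may be deliverables (A), installed materials (B), or
        # production units (C).
        return workhours / quantity

    budget_rate = unit_rate(190.0, 250.0)   # hypothetical budget: 190 hrs for 250 isos
    actual_rate = unit_rate(144.0, 160.0)   # hypothetical to-date: 144 hrs for 160 isos

    # Performance index = actual unit rate / budget unit rate; > 1 means more
    # hours per deliverable than budgeted, i.e., unfavorable (cf. question 7.11).
    print(round(actual_rate / budget_rate, 2))  # 1.18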

7.10 What is being calculated in your engineering performance reporting system regarding productivity/unit rate? (Check all that apply)
[ ] Baseline productivity
[ ] Productivity this period
[ ] Productivity to-date
[ ] Forecasted productivity "to be completed"
[ ] Forecasted productivity "at completion"

7.11 Does your company calculate a performance index for individual engineering tasks? If so, what ratio do you use?
[ ] Planned or estimated unit rate / actual unit rate (> 1 is GOOD)
[ ] Actual unit rate / planned or estimated unit rate (> 1 is BAD)
[ ] Others, specify:

7.12 Does your company use earned value analysis to calculate the performance index for a group of tasks with different unit rates? If yes, what definition do you use? (Check all that apply; a sketch of definition B follows.)
[ ] A) Earned work-hours / budget work-hours (> 1 is GOOD)
[ ] B) Earned work-hours / actual work-hours (> 1 is GOOD)
[ ] C) Budget work-hours / earned work-hours (> 1 is BAD)
[ ] D) Actual work-hours / earned work-hours (> 1 is BAD)
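A brief sketch of definition B above, earned work-hours over actual work-hours, which allows tasks with different unit rates to be rolled up on a common basis. The task data are hypothetical.

    def earned_value_index(tasks):
        # Earned hours for a task = budget work-hours x percent complete;
        # composite index = total earned hours / total actual hours (> 1 is good).
        earned = sum(t["budget_hours"] * t["pct_complete"] for t in tasks)
        actual = sum(t["actual_hours"] for t in tasks)
        return earned / actual

    tasks = [  # hypothetical piping tasks
        {"budget_hours": 400.0, "pct_complete": 0.50, "actual_hours": 180.0},
        {"budget_hours": 250.0, "pct_complete": 0.80, "actual_hours": 230.0},
    ]
    print(round(earned_value_index(tasks), 2))  # (200 + 200) / 410 = 0.98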

7.14 What is being tracked/reported/calculated in your engineering performance control system and not listed above. if any?     258 .13 What is being reported in your engineering performance reporting system regarding performance deviations from the original planned/estimated? (Check all that apply)  % of overrun or underrun based on:       $ Hours Quantity Completed Log of deviation causes Corrective action plans. Budget revisions   7.

7.15 Earlier CII studies of construction productivity measurement identified five methods of recognizing the completed units of work for tracking progress in construction activities. The list includes:
• Units completed: an actual counting of the units of work completed. Examples of the work tasks that use this method are piling, flooring, and piping erection.
• Incremental milestones: similar to the level of effort method, except that it is used when only a few non-measurable subtasks exist. Examples of the work tasks that use this method are equipment installation and control panels.
• Level of effort: a process of assigning a pre-determined percent complete to a task on the basis of the completion of various measurable subtasks. Usually the effort of tracking those individual subtasks is burdensome and costly. Examples of the work tasks that use this method are structural steel erection and cable trays.
• Percent complete: a subjective approximation of the percent complete for relatively minor tasks, where the development of a more complicated intermediate milestone or level of effort formula is not justified. Examples of the work tasks that use this method are painting, pipe hanger erection, and lighting fixtures.
• Start / finish percentages: a process of assigning either 0% or 100% to a task with no intermediate milestones. Examples of the work tasks that use this method are flushing, cleaning, and testing the system.

In light of the definitions given above, how do you currently track progress in piping engineering and design tasks/deliverables (check all that apply)? (A short sketch of the milestone-weighting idea follows this question.)
[ ] Units completed
[ ] Incremental milestones
[ ] Level of effort
[ ] Percent complete
[ ] Start / finish percentages
[ ] Others, specify:
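As referenced above, the following sketch illustrates the milestone-weighting idea behind the incremental milestones and level of effort methods. The weights are hypothetical, patterned after the IFR/IFA/IFC issue stages listed in Table F.1 of Appendix F.

    # Hypothetical milestone weights for one isometric drawing (weights sum to 1.0).
    ISO_MILESTONES = {
        "issued for review (IFR)": 0.40,
        "issued for approval (IFA)": 0.30,
        "issued for construction (IFC)": 0.30,
    }

    def percent_complete(milestones_reached):
        # Credit the pre-determined weight of every milestone already reached.
        return sum(w for m, w in ISO_MILESTONES.items() if m in milestones_reached)

    print(percent_complete({"issued for review (IFR)"}))           # 0.4
    print(percent_complete({"issued for review (IFR)",
                            "issued for approval (IFA)"}))         # 0.7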

7.16 For each of the major piping deliverables listed below, identify YOUR CURRENT PRIMARY method for measuring its development progress: % complete, milestones, level of effort, actual counting, start/finish, or do not track. (Put a check in one box per row.) Please add any additional piping deliverables not listed below at the end of the list.

Piping deliverables:
• Facility arrangement
• Process flow diagram (PFD)
• Piping and instrumentation diagrams (P&IDs)
• Equipment list
• System description
• Initial draft of calculations
• Line list
• Valve list
• Specialty item list
• Piping design criteria
• Piping design specifications
• Piping sketches
• Composite drawings
• Plant layout model / equipment general arrangement model
• Plot plans (equipment arrangement)
• Piping 3D model
• Orthographic drawings
• Piping isometrics
• System isometrics
• Underground piping drawings
• Stress analysis reports
• Pipe support drawings and specs
• Procurement specs
• Bill of materials
• Material requisitions / purchase orders
• Certified spool sheets
• Erection specs
• Erection isometrics

7.17 How satisfied are you with your current company's engineering productivity measurement system(s) for the following? (1 very unsatisfied | 2 unsatisfied | 3 satisfied | 4 very satisfied | 5 exceeds my requirements)
• INTERNAL PROJECT CONTROL (within one project)
• INTERNAL BENCHMARKING (across projects)
• EXTERNAL BENCHMARKING (across companies)

VIII- PIPING RESULTS
For the project identified, you are required in this section to provide basic piping-specific data
(e.g., budgeted/actual work-hours and certain outputs of the piping design and engineering
process). This data will be further used in estimating various engineering productivity metrics.
8.1 The deliverables considered to be in the piping function for this study are shown below.
Please indicate whether or not you consider the work shown to normally be part of the piping
discipline work and then adjust your input to “standardize” as necessary to match the work scope
described as closely as possible.
USUALLY INCLUDED IN PIPING GROUP WORK IN MY COMPANY (Yes or No)

PIPING GROUP DELIVERABLES
(Include Engineering Activities in Piping Analysis)
Facility Arrangement (Layout)
Initial Piping Size Calculations
Line List
Valve List
Specialty Item list
Piping Design Criteria
Piping Design Specifications
Piping sketches (initial piping layout activity)
Composite drawings (combined piping, mechanical, electrical, structural and
control)
Plant layout model/ Equipment general arrangement model
Plot plans- equipment arrangement
Piping 3D model
Orthographic drawings
Piping isometrics
System isometrics (integrating piping with attached equipment)
Underground piping drawings
Stress analysis reports
Pipe support drawings and specs
Procurement specs- piping materials
Bill of materials
Piping material requisitions/ purchase orders
Certified spool sheets (shop detail drawings for prefab piping)
Erection specs
Erection isometrics
Others (Specify)

PIPING SUPPORT DELIVERABLES
(Do not include these Activities in Piping Analysis)
Process Flow Diagram (PFD’s)
Piping and Instrumentation Diagrams (P&ID’s)
Equipment List
System Description


8.2 As envisioned by one of our owner research team members, the value to customers of
engineering functions can be recognized through five measurable parameters, i.e., engineering
costs (%TIC), schedule (detailed design duration / document releases), product quality (zero
defects), operational expenses ($/unit of production), and safety (zero accidents). Refer to the
following chart for a breakdown of the value to customers of engineering functions.

(Chart: total value for the customer, broken down into TIC in $/unit of capacity (equipment $, construction $, and engineering $ as %TIC, the latter built up from labor and machine hours times $/hour across A/E FEL support, A/E design, specialty design, field support, and overhead & profit), schedule (faster document releases), product quality (zero defects), operational expenses ($/unit of production), and safety (zero accidents).)

Based on your experience, identify the top three piping deliverables that have a major impact on
each of the five parameters. After identifying the three work tasks corresponding to each of the
five parameters, please rank them from most important (1) to third in importance (3). Mark the
appropriate cell of the table on the next page.


Value to Customers: for each deliverable below, mark its impact under the five columns Total Installed Cost (TIC), Schedule, Product Quality, Operational Expenses, and Safety.

Piping engineering and design deliverables and piping design support deliverables:
• Facility Arrangement (Layout)
• Initial Piping Size Calculations
• Line List
• Valve List
• Specialty Item List
• Piping Design Criteria
• Piping Design Specifications
• Piping sketches (initial piping layout activity)
• Composite drawings (combined piping, mechanical, electrical, structural and control)
• Plant layout model / Equipment general arrangement model
• Plot plans (equipment arrangement)
• Piping 3D model
• Orthographic drawings
• Piping isometrics
• System isometrics (integrating piping with attached equipment)
• Underground piping drawings
• Stress analysis reports
• Pipe support drawings and specs
• Procurement specs (piping materials)
• Bill of materials
• Piping material requisitions / purchase orders
• Certified spool sheets (shop detail drawings for prefab piping)
• Erection specs
• Erection isometrics
• Others (specify)

PIPING SUPPORT DELIVERABLES (do not include these activities in the piping analysis):
• Process Flow Diagrams (PFDs)
• Piping and Instrumentation Diagrams (P&IDs)
• Equipment List
• System Description

8.3 For the identified project, please provide the budgeted and actual engineering work-hours
associated with the project, and also the percentage that piping represented of the TOTAL
engineering work-hours.

Engineering work-hours: Budget ____ / Actual ____
Piping as % of TOTAL ENGINEERING work-hours: Budget ____ / Actual ____

8.4 For this identified project, please provide the quantity of each installed material, measured in the specified unit (quantity / comments).

OUTPUTS (units of measure):
• Mechanical equipment number (pieces)
• Length of pipe, small bore < 3" (feet)
• Length of pipe, large bore > 3" (feet)
• Manual valves, number (count)
• # pipe classes (count)
• # lines (count)
• # 2D drawings (count)
• # 3D drawings (count)
• Piping construction material take-off manhours (MH's)
• # of control valves (count)
• # of relief valves (count)
• % of lines that required stress analysis (%)
• # of field instruments (count)

8.5 For the following list of selected measures of piping design effectiveness, please provide the actual result of each measure for this selected project for the PIPING DESIGN EFFORT (actual results / linguistic description).
• % piping engineering schedule overrun (OR actual duration vs. planned duration) (example: 5%)
• Piping design-document-release versus commitments: very unsatisfactory, unsatisfactory, moderate, satisfactory, very satisfactory (example: satisfactory)
• % piping engineering cost overrun (OR actual cost vs. budget) (example: 2%)
• % piping rework due to design errors, design omissions, design changes, etc. (example: 5%)
• % piping rework due to vendor changes (example: 1%)
• % piping fabrication cost overrun due to design-related rework (example: 1%)
• % fabrication schedule change due to design-related rework (example: 2%)
• % construction engineering-hours for piping design problem solving: very low, low, moderate, high, very high (example: moderate)
• % estimated piping constructability dollar-savings as a percentage of TIC: very low, low, moderate, high, very high (example: moderate)
• % piping rework during start-up due to design deficiencies
• % startup schedule overrun due to piping design deficiencies
• % startup cost overrun due to piping design deficiencies, including rework costs

IX- PIPING PROJECT DESIGN EFFECTIVENESS

9.1 The following list represents a tentative set of measures of piping engineering and design effectiveness. This set is intended to measure various aspects of the piping design and engineering performance throughout the project. (CII Reference Document # 8-1)

Considering this project, use the following scale to identify the degree of satisfaction with each measure, which is of crucial value to customers, where 0 represents very unsatisfactory / very bad and 10 represents very satisfactory / very good. Place an (X) mark in the appropriate cell that indicates where you were at the end of the project.

Accuracy of the Design Documents (0-10 scale)
• Number of drawing revisions / revised drawings
• Number of spec revisions / revised specs
• Reworked spools
• Reworked hours

Usability of the Design Documents (0-10 scale)
• Drawing size
• Number of drawings
• Cross-references between drawings
• Field engineering work-hours
• Clarity
• Completeness

Cost of the Design Effort (0-10 scale)
• Piping design cost
• Piping design work-hours
• % engineering $ as a % of TIC
• % piping engineering $ as a % of engineering $

Constructability (0-10 scale)
• Tolerances
• Specialized equipment
• Different crafts
• Materials and technology
• Pre-assembly
• Estimated $ savings as a % of TIC

Economy of the Design (0-10 scale)
• Facility layout
• Overspecified materials
• Oversized members

Performance Against Schedule (0-10 scale)
• Document release
• Intermediate release
• Project scheduled duration vs. actual duration
• Engineering scheduled duration vs. actual duration

Ease of Start-Up (0-10 scale)
• Start-up days
• Budget vs. actual start-up time
• Operator work-hours
• % rework during start-up
• Maintenance work-hours

APPENDIX F
ILLUSTRATIONS FOR ENGINEERING CONTROL AND PRODUCTIVITY MEASUREMENT

Figure F.1 Sample control accounts

Figure F.2 Sample target unit rates

Figure F.3 Sample level of effort / intermediate milestone structure

Table F.1 Guiding level of effort / milestone structure for various piping deliverables and work tasks

1) Facility Arrangement: preliminary overall plot plan; AFC
2) Process Flow Diagram (PFD) (process engineering): material balance calculations; heat balance calculations; preliminary IFD; MOC review; equipment list; DP/DT review; IFD
3) Piping and Instrumentation Diagrams (P&IDs): preliminary P&IDs; AFD P&IDs; AFC P&IDs
4) System Description (w/ P&IDs): draft; line sizing; line specifications; equipment/instrument tagging; process issue; project issue
5) Initial Draft of Calculations (w/ P&IDs): equipment calculations; line sizing calculations; instrumentation calculations; utility and chemical calculations
6) Line List: preliminary line input; AFD line input; AFC line input
7) Piping Design Criteria: issued for approval (IFA); issued for design (IFD); issued for construction (IFC)
8) Design Specifications: complete draft; write specifications; check specifications; issue for approval; issue for construction; issue revisions as required
9) Piping Sketches: preliminary GAs; IFR; IFD
10) Plant Layout Model / Equipment General Arrangement Model: model setup; modeling/locating equipment; interference check; integrity check; design check; complete
11) Layout/Plot Plans: engineering/layout/design sketches; preliminary issue; AFD issue; AFC issue w/ holds; AFC issue w/o holds
12) 3-D Piping Model: model setup; piping quantity routed; interference check; integrity check; design check; complete
13) Orthographic Drawings: start; issue for review (IFR); issue for approval (IFA); issue for construction (IFC); finish
14) Isometrics: issue for review (IFR); issue for approval (IFA); issue for construction (IFC)
15) Underground Piping: calculations complete; drafting; complete interdisciplinary review; issue for client review; issue AFC with holds; issue AFC without holds
16) Bill of Materials (bulks): bulk take-off (AFD P&IDs); bulk update (AFC P&IDs); final take-off
17) Material Requisitions / Purchase Orders: inquiry request; bid; purchase order (PO) issued; vendor drawings review
18) Stress Analysis Reports: stress priority line list; critical line stress analysis; major support; PSV thrust load calculations; tie-in analysis; thrust block load calculations; pipe rack lines stress analysis; stress calculations update; quality control
19) Piping Support Drawings: design; drafting; issue for review (IFR); issue for approval (IFA); issue for construction (IFC)

Figure F.5 Sample calculations for the engineering performance in an industrial project

Figure F.6 Sample engineering performance curve in an industrial project

7a Earned value concept in tracking piping/total engineering progress (company A) 277 .Figure F.

Figure F.7b Earned value concept in tracking piping/total engineering progress (company A) 278 .

8a Earned value concept in tracking piping/total engineering progress (company B) 279 .Figure F.

8b Earned value concept in tracking piping/total engineering progress (company B) 280 .Figure F.

APPENDIX G
PROJECT DATA

[Data tables for Projects 1 through 40. For each project, the following variables are recorded:

Scope variables: project type (greenfield, grass roots, addition, modernization, other); size of plant (sf); number of raw materials; number of finished products; project size ($TIC); relative size of plant; client/owner participation; contract type; number of equipment pieces; split engineering; engineering-design schedule.

Complexity variables: owner previous experience with process technology; designer previous experience with process technology; site conditions; legal and environmental impact; method of acceptance testing; amount of standard module re-use; owner previous experience with designer; design standards used; relative level of complexity; project change control methodology; use of 3D CAD modeling; use of Integrated Databases; use of Electronic Data Interchange; design-construction overlap; percent TIC owner scope changes; PDRI score.

Input quality variables: quality/completeness of scope definition; owner profile and participation; pre-project planning; objectives and priorities; basic design data; designer qualifications and capacity; type of contract/contract terms; project manager qualifications; constructor input and constructability; vendor data.

Raw productivity: piping work-hours; length of pipes; Raw Productivity = Work-hours / length of piping.]

APPENDIX H
REGRESSION ANALYSIS RESULTS

Scope Model
Stepwise Procedure for Dependent Variable PRODVTY

Step 1: Variable EQUIP entered (R-square = 0.3049, C(p) = -4.1288).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   1        5.07887259     5.07887259    7.02    0.0175
Error       16       11.57824652     0.72364041
Total       17       16.65711911

Variable     Parameter Estimate    Standard Error       F    Prob>F
INTERCEP             0.44985439        0.25757965    3.05    0.0999
EQUIP                0.00134308        0.00050697    7.02    0.0175

Bounds on condition number: 1, 1

Step 2: Variable TIC entered (R-square = 0.3786, C(p) = -3.2022).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   2        6.30639562     3.15319781    4.57    0.0282
Error       15       10.35072349     0.69004823
Total       17       16.65711911

Bounds on condition number: 1.149572, 4.59829

All variables left in the model are significant at the 0.2500 level. No other variable met the 0.2500 significance level for entry into the model.

Summary of Stepwise Procedure for Dependent Variable PRODVTY

Step    Variable Entered    Number In    Partial R**2    Model R**2    C(p)       F         Prob>F
1       EQUIP               1            0.3049          0.3049        -4.1288    7.0185    0.0175
2       TIC                 2            0.0737          0.3786        -3.2022    1.78      0.2022

Complexity Model
Stepwise Procedure for Dependent Variable PRODVTY

Step 1: Variable STAND entered (R-square = 0.1739).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   1        2.89708150     2.89708150    3.37    0.0851
Error       16       13.76003761     0.86000235
Total       17       16.65711911

Step 2: Variable ENVIRO entered (R-square = 0.2787).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   2        4.64232275     2.32116137    2.90    0.0863
Error       15       12.01479636     0.80098642
Total       17       16.65711911

Step 3: Variable REUSE entered (R-square = 0.3916).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   3        6.52286282     2.17428761    3.00    0.0661
Error       14       10.13425629     0.72387545
Total       17       16.65711911

Step 4: Variable OPROC entered (R-square = 0.5025, C(p) = 37.7863).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   4        8.37015102     2.09253776    3.28    0.0456
Error       13        8.28696809     0.63745908
Total       17       16.65711911

All variables left in the model are significant at the 0.2000 level. No other variable met the 0.2000 significance level for entry into the model.

Summary of Stepwise Procedure for Dependent Variable PRODVTY

Step    Variable Entered    Number In    Partial R**2    Model R**2    F       Prob>F
1       STAND               1            0.1739          0.1739        3.37    0.0851
2       ENVIRO              2            0.1048          0.2787        2.18    0.1606
3       REUSE               3            0.1129          0.3916        2.60    0.1293
4       OPROC               4            0.1109          0.5025        2.90    0.1125

Input Quality Model
Stepwise Procedure for Dependent Variable PRODVTY

Step 1: Variable CONSTR entered (R-square = 0.1702, C(p) = 15.2401).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   1        2.83500467     2.83500467    3.28    0.0889
Error       16       13.82211444     0.86388215
Total       17       16.65711911

Step 2: Variable PM entered (R-square = 0.6602, C(p) = -0.0248).

            DF    Sum of Squares    Mean Square        F    Prob>F
Regression   2       10.99631411     5.49815705    14.57    0.0001
Error       15        5.66080500     0.37738700
Total       17       16.65711911

Step 3: Variable SCOPE entered (R-square = 0.7330, C(p) = -0.5911).

            DF    Sum of Squares    Mean Square        F    Prob>F
Regression   3       12.20942394     4.06980798    12.81    0.0003
Error       14        4.44769517     0.31769251
Total       17       16.65711911

Bounds on condition number: 2.401448, 20.50506

All variables left in the model are significant at the 0.2500 level. No other variable met the 0.2500 significance level for entry into the model.

Summary of Stepwise Procedure for Dependent Variable PRODVTY

Step    Variable Entered    Number In    Partial R**2    Model R**2    C(p)       F          Prob>F
1       CONSTR              1            0.1702          0.1702        15.2401    3.2817     0.0889
2       PM                  2            0.4900          0.6602        -0.0248    21.6258    0.0003
3       SCOPE               3            0.0728          0.7330        -0.5911    3.8185     0.0710

Piping Productivity Model
Stepwise Procedure for Dependent Variable PRODVTY

Step 1: Variable EQUIP entered (R-square = 0.30490702, C(p) = 34.63776470).

            DF    Sum of Squares    Mean Square       F    Prob>F
Regression   1        5.07887259     5.07887259    7.02    0.0175
Error       16       11.57824652     0.72364041
Total       17       16.65711911

Variable     Parameter Estimate    Standard Error    Type II Sum of Squares       F    Prob>F
INTERCEP             0.44985439        0.25757965                2.20720984    3.05    0.0999
EQUIP                0.00134308        0.00050697                5.07887259    7.02    0.0175

Bounds on condition number: 1, 1

All variables left in the model are significant at the 0.0500 level. No other variable met the 0.0500 significance level for entry into the model.

Summary of Stepwise Procedure for Dependent Variable PRODVTY

Step    Variable Entered    Number In    Partial R**2    Model R**2    C(p)       F         Prob>F
1       EQUIP               1            0.3049          0.3049        34.6378    7.0185    0.0175
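The listings in this appendix come from SAS's stepwise variable selection. The sketch below is a minimal forward-selection simplification of that procedure, not the report's SAS code (which can also drop variables that fall below the stay level); the data arrays, variable labels, and the statsmodels dependency are assumptions for illustration. Run on the Appendix G project data with SLENTRY = 0.05, it would be expected to admit EQUIP alone, as in the piping model above.

import statsmodels.api as sm

def forward_stepwise(y, X, names, slentry=0.25):
    """Forward selection: y is the response vector, X an (n x p) NumPy
    array of candidate variables, names their labels. At each step the
    candidate with the smallest entry p-value is added; selection stops
    once no candidate meets the SLENTRY level."""
    selected = []
    while True:
        best = None
        for name in (n for n in names if n not in selected):
            cols = [names.index(n) for n in selected + [name]]
            fit = sm.OLS(y, sm.add_constant(X[:, cols])).fit()
            p = fit.pvalues[-1]          # p-value of the newest variable
            if best is None or p < best[0]:
                best = (p, name)
        if best is None or best[0] > slentry:
            return selected
        selected.append(best[1])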

Model: MODEL1
Dependent Variable: PRODVTY

Analysis of Variance

Source     DF    Sum of Squares    Mean Square    F Value    Prob>F
Model       1           5.07887        5.07887      7.019    0.0175
Error      16          11.57825        0.72364
C Total    17          16.65712

Root MSE    0.85067    R-square    0.3049
Dep Mean    0.87822    Adj R-sq    0.2615
C.V.       96.86279

Parameter Estimates

Variable    DF    Parameter Estimate    Standard Error    T for H0: Parameter=0    Prob > |T|
INTERCEP     1              0.449854        0.25757965                    1.746        0.0999
EQUIP        1              0.001343        0.00050697                    2.649        0.0175

Model: MODEL2
Dependent Variable: PRODVTY

Analysis of Variance

Source     DF    Sum of Squares    Mean Square    F Value    Prob>F
Model       1           5.07887        5.07887      7.019    0.0175
Error      16          11.57825        0.72364
C Total    17          16.65712

Root MSE    0.85067    R-square    0.3049
Dep Mean    0.87822    Adj R-sq    0.2615
C.V.       96.86279

Parameter Estimates

Variable    DF    Parameter Estimate    Standard Error    T for H0: Parameter=0    Prob > |T|
INTERCEP     1              0.449854        0.25757965                    1.746        0.0999
EQUIP        1              0.001343        0.00050697                    2.649        0.0175

APPENDIX I
BASICS AND ILLUSTRATIONS FOR NEUROFUZZY SYSTEMS AND
MULTIPLE ATTRIBUTE UTILITY FUNCTIONS


1. Neurofuzzy Systems

Neurofuzzy systems are structured schemes that represent synergistic integration
between ANNs and fuzzy control systems. That is, the neural networks provide
connectionist structure and learning abilities to the fuzzy control systems, while fuzzy
control systems provide the neural networks with high-level fuzzy rule thinking and
reasoning abilities (Lin and Lee 1996). Efforts at merging ANNs and fuzzy control
systems fall into three categories (Lin and Lee 1996):

a. Neural fuzzy systems: This category aims at providing fuzzy control systems with
the automatic tuning methods typical of neural networks but without altering the
functionality of the fuzzy control system.
b. Fuzzy neural networks: This category retains the basic properties and architecture
of neural networks and fuzzifies some of the network elements.
c. Fuzzy-neural hybrid systems: This category integrates both techniques in series or
in parallel to gain certain benefits specific to the problem under consideration.
1.1 Fuzzy Neural Networks

Applying fuzzy methods to the workings of neural networks constitutes a major
thrust in neurofuzzy computing (Gupta 1994). By incorporating fuzzy principles into the
neural network architecture, more flexibility and system robustness become attainable.
Fuzziness in this context means more flexibility in the system definition, i.e., the
boundaries of the system are described more vaguely while the system maintains the
ability to reconfigure itself for best performance (Lin and Lee 1996). Fuzziness may be
introduced in all aspects of the network, including its inputs, weights, aggregation
operators, transfer functions, and outputs (Tsoukalas and Uhrig 1997).


The synergistic integration of fuzzy methods and neural networks transfers the
burden of developing a fuzzy control system to the learning capabilities of neural
networks. Fuzzy control primarily refers to the control of processes or systems through
fuzzy linguistic description (Tsoukalas and Uhrig 1997). The basic architecture of a fuzzy
control system is outlined in Figure I.1. In this architecture, three major elements are
recognized, i.e., input interface (fuzzifier), processing module, and output interface
(defuzzifier) (Pedrycz 1995, Tsoukalas and Uhrig 1997).

Input → Input Interface → Processing Module → Output Interface → Output

Figure I.1. Basic Architecture of Fuzzy Control System
The processing module is the core element of the fuzzy control system. It contains
a set of If-Then rules that encode the control knowledge of the system (Pedrycz 1995).
The input interface is a module that fuzzifies the system inputs, i.e., transforms crisp
inputs into their fuzzy form. The output interface is a module that defuzzifies the system
outputs, i.e., transforms fuzzy outputs into their crisp form.
Developing the knowledge base of the processing module primarily depends on
capturing the human expertise of the process or system under consideration in the form
of If-Then rules. This task can be quite challenging for complex and ill-defined
processes or systems. A promising approach suggests the novel idea of transferring the
burden of designing fuzzy control and decision-making systems to the training and
learning of connectionist neural networks. Figure I.2 illustrates a basic fuzzy neural
network architecture (Escoda et al. 1997, Lin and Lee 1996).
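To make the three-module pipeline of Figure I.1 concrete, the following minimal sketch fuzzifies one crisp input, fires two illustrative If-Then rules, and defuzzifies with a weighted-centroid shortcut. The rule base and term shapes are illustrative assumptions, not the report's control system:

RULES = [("low", "high"),      # IF input is low  THEN output is high
         ("high", "low")]      # IF input is high THEN output is low
OUT_CENTROIDS = {"low": 0.0, "high": 1.0}

def fuzzify(x):                # input interface: two triangular terms on [0, 1]
    return {"low": max(0.0, 1.0 - x), "high": max(0.0, x)}

def control(x):
    mu = fuzzify(x)                                                  # fuzzifier
    fired = [(mu[cond], OUT_CENTROIDS[act]) for cond, act in RULES]  # rule firing
    return (sum(w * c for w, c in fired) /
            sum(w for w, _ in fired))                                # defuzzifier

print(control(0.3))            # 0.7: a mostly-low input yields a mostly-high output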


Figure I.2. Basic Architecture of Fuzzy Neural Network (input layer x1 ... xn; fuzzification layer; intermediate layer; defuzzification layer; output layer y1 ... ym)

1.2 Fuzzification

Fuzzification is the process of mapping from an observed crisp input space into labels of fuzzy sets in a specified universe of discourse (Arafeh 1996, Lin and Lee 1996, Yan et al. 1994). In other words, it concerns determining the degrees of membership µ(x) = {µx1, µx2, ..., µxk} for each label of the pre-specified term set T(x) = {Tx1, Tx2, ..., Txk} corresponding to a crisp variable x.

Consider an illustrative project having a total installed cost ($TIC) of $53,000,000. Assume a fuzzy variable having a term set T(x) = {Tx1, Tx2, Tx3} = {about $30,000,000, about $50,000,000, about $70,000,000} is used to represent the project cost. The degrees of membership for the project cost ($TIC) with regard to this term set can be calculated from Equation 4.2 to be 0.0, 0.85, and 0.15, respectively, as shown in Figure I.3. Accordingly, the project cost can be said to be "about $30M" with a possibility of 0.0, "about $50M" with a possibility of 0.85, and "about $70M" with a possibility of 0.15.

Figure I.3. Illustrated Example for Input Fuzzification (triangular terms over the $10M-$90M range; $TIC = $53M gives memberships of 0.85 and 0.15)

1.3 Defuzzification

Defuzzification is the process of mapping from a fuzzy output defined over a specified universe of discourse into a space of crisp output (Lin and Lee 1996, Yan et al. 1994). In other words, it concerns determining the crisp value of variable x given the degrees of membership µ(x) = {µx1, µx2, ..., µxk} for each label of the pre-specified term set T(x) = {Tx1, Tx2, ..., Txk}. Two commonly used methods of defuzzification are the Center of Area (COA) method and the Moment of Maxima (MOM) method (Lin and Lee 1996).
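A minimal sketch of this fuzzification step, assuming the triangular membership shapes drawn in Figure I.3 (each term peaking at its nominal cost with feet $20M to either side; the shapes are read off the figure, not stated as formulas in the report):

def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

terms = {"about $30M": (10, 30, 50),     # feet and peak, in $M
         "about $50M": (30, 50, 70),
         "about $70M": (50, 70, 90)}

tic = 53                                  # project cost, $M
print({label: round(tri(tic, *feet), 2) for label, feet in terms.items()})
# {'about $30M': 0.0, 'about $50M': 0.85, 'about $70M': 0.15}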

The widely used COA method generates the geometrical center of the resulting distribution for the output variable. This is conducted through estimating the first moment of area around the Y-axis and dividing this value by the total area:

u_COA = [ Σ(i=1 to k) µC(ui) × ui ] / [ Σ(i=1 to k) µC(ui) ]    (I.1)

where the summation is carried over discrete values of the universe of discourse ui sampled at k points (Tsoukalas and Uhrig 1997). The MOM method generates the mean value of all components of the resulting fuzzy variable that reach its possible maximum. This can be expressed as:

u_MOM = [ Σ(i=1 to m) ui ] / m    (I.2)

where ui is the ith element in the universe of discourse at which the membership function µC(ui) is at its maximum value, and m is the total number of such elements considered (Tsoukalas and Uhrig 1997). Of the two methods, the COA has been shown to yield superior results (Lin and Lee 1996).

For illustration purposes, assume the output variable "%design rework" is described through the linguistic term set {about 0%, about 2%, about 4%}. Assume also that the trained network calculated the membership values of 0.7, 0.4, and 0.1 for these three linguistic labels, respectively, as shown in Figure I.4a.

Figure I.4a. Membership values calculated for the three linguistic labels (0.7, 0.4, and 0.1)
Figure I.4b. Illustrated example for the COA method of defuzzification

Applying the COA method, with the area of each clipped triangular term weighted by its center value:

Σ µC(xi) = 0.5(4)(0.7) + 0.5(2)(0.4) + 0.5(4)(0.1) = 2.0

x_COA = Σ µC(xi) × xi / Σ µC(xi) = 0.036 / 2.0 = 0.018

i.e., a %design rework of about 1.8%.
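Equations I.1 and I.2 can also be checked numerically. The sketch below samples the 0-6% universe, clips the assumed triangular terms of Figure I.4 at their calculated memberships, and applies both methods; because it integrates the actual clipped shapes rather than the simplified area terms of the hand calculation, its COA result lands near, not exactly at, 0.018:

import numpy as np

def tri(x, a, b, c):                         # triangular term, feet a and c, peak b
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

u = np.linspace(0.0, 0.06, 601)              # universe of discourse, 0% to 6%
terms = {(-0.02, 0.00, 0.02): 0.7,           # "about 0%" clipped at 0.7
         ( 0.00, 0.02, 0.04): 0.4,           # "about 2%" clipped at 0.4
         ( 0.02, 0.04, 0.06): 0.1}           # "about 4%" clipped at 0.1

mu = np.zeros_like(u)                        # aggregate the clipped terms (max)
for (a, b, c), h in terms.items():
    mu = np.maximum(mu, np.minimum(tri(u, a, b, c), h))

u_coa = np.sum(mu * u) / np.sum(mu)          # Equation I.1, discrete COA
u_mom = u[mu == mu.max()].mean()             # Equation I.2, MOM
print(round(u_coa, 3), round(u_mom, 3))      # ~0.019 and 0.003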

2. Multiple Attribute Utility Functions

2.1 Basics

Multiple attribute utility (MAUT) is a methodology for selecting from among a set of alternatives in the presence of uncertainty. The degree of "liking" of the decision outcomes is described by utility functions over the attribute space, and preferences among alternatives are defined in terms of their expected utilities (EU). A decision-maker's preference and indifference between a pair of alternatives can be described mathematically using utility functions: a decision-maker is indifferent between two alternatives x′ = (x′1, x′2, ..., x′m) and x″ = (x″1, x″2, ..., x″m) if and only if E[u(x′)] = E[u(x″)] (Evans 1984). The assessment of a utility function is usually accomplished by decomposing the multiple attribute function into m single-attribute functions. The component (single-attribute) utility functions, ui(xi), are constructed through identification of the decision-maker's risk attitude, i.e., risk-neutral, risk-seeking, or risk-averse, as shown in Figure I.5.

Figure I.5. Utility Risk Attitudes (u(x) rising from UL to UH, with risk-averse, risk-neutral, and risk-seeking shapes)

For utility-independent attributes, the additive utility function can be represented as:

u(x1, x2, ..., xm) = k1 u1(x1) + k2 u2(x2) + ... + ki ui(xi) + ... + km um(xm)    (I.3)

where ui(xi), the component utility function for attribute i, ranges from 0 to 1; xi, the range of values taken by attribute i, varies between UL and UH, which correspond to ui(xi) values of 0 and 1, respectively; and ki corresponds to the relative importance of attribute i. Relative importances are m positive numbers that sum to unity, i.e., Σ ki = 1. A key condition for the validity of Equation I.3 is that mutual independence exists between the identified attributes. In cases where mutual independence is not satisfied, the additive utility function should not be used as is. The multiplicative form, which considers the mutual dependence, is given as:

1 + K u(x1, x2, ..., xm) = Π(i=1 to m) [1 + K ki ui(xi)]    (I.4)

where K is chosen to satisfy the equation:

1 + K = Π(i=1 to m) [1 + K ki]    (I.5)
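The role of Equation I.5 can be illustrated numerically. The sketch below uses purely illustrative weights that deliberately do not sum to one (the report's own weights are derived later via AHP), finds the nonzero K by bisection, and confirms that the multiplicative form returns a utility of 1.0 when every component utility is at its best:

import math

k = (0.4, 0.3, 0.2)             # illustrative ki that sum to 0.9 (< 1)

def f(K):                        # residual of Equation I.5
    return math.prod(1 + K * ki for ki in k) - (1 + K)

lo, hi = 1e-9, 100.0             # f(lo) < 0 and f(hi) > 0 for these ki
for _ in range(100):             # bisection on the nonzero root
    mid = (lo + hi) / 2
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
K = (lo + hi) / 2

def u_mult(utils):               # multiplicative form, Equation I.4
    prod = math.prod(1 + K * ki * ui for ki, ui in zip(k, utils))
    return (prod - 1) / K

print(round(K, 3))                          # ~0.372
print(round(u_mult((1.0, 1.0, 1.0)), 3))    # 1.0, as required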

Once the multiple attribute utility function u(x1, x2, ..., xm) has been assessed, the alternatives can be ranked. The alternative with the highest expected utility is the highest-ranking alternative.

2.2 Eigenvector Prioritization Method and the Analytical Hierarchy Process (AHP)

The analytical hierarchy process (AHP) is a multiple criteria decision-making technique that allows the consideration of both objective and subjective factors in selecting the best alternative. Since its introduction by Saaty (1980), AHP has been used in a wide variety of practical applications; in fact, it is among the most popular methodologies available for multiple criteria decision-making. AHP is based on three principles: decomposition, comparative judgment, and synthesis of priorities. The decomposition principle requires that the problem be decomposed into a hierarchy that captures the important elements of the problem. Comparative judgment requires assessment of pairwise comparisons of the elements within a given level, with respect to their parent in the next-higher level. Synthesis of priorities takes each of the derived ratio-scale local priorities and constructs a composite (global) set of priorities for the elements at the lowest level of the hierarchy.

Saaty (1980) developed a standard approach for the pairwise comparison within the comparative judgment step of the process, as shown in Table I.1. The comparison would simply take the form: "How important is element 1 when compared to element 2?" The decision-maker would then provide one of the pre-specified responses in either numeric or linguistic fashion.

Table I.1. Decision Aids for Pairwise Comparison of the AHP (Saaty 1980)

Importance                 Numerical Rating*
Equally Important                 1
Moderately Important              3
Strongly Important                5
Very Strongly Important           7
Extremely Important               9
* 2, 4, 6, and 8 are intermediate values

The responses are further placed into a comparison matrix. Using this matrix, the local priorities are determined by calculating the normalized principal eigenvector. The number of comparison matrices needed to solve a problem depends on both the criteria and the alternatives considered. For instance, a problem that has four criteria and five alternatives requires five separate comparison matrices: one for the pairwise comparison of the criteria, and four for the pairwise comparisons of the alternatives, one with respect to each of the four criteria.

One of the important features of AHP is its ability to provide a measure of the consistency of the decision-maker's judgment; in fact, none of the other commonly used approaches to solving multiple attribute decision-making problems provides the ability for such consistency checks. AHP measures this consistency/inconsistency through the consistency ratio, C.R., which is a function of the comparison matrix dimensions (N x N) and the dominant eigenvalue (λmax):

C.R. = (λmax - N) / [(N - 1)(R.I.)]    (I.6)

The random index (R.I.) for various matrix sizes (N) has been approximated by Saaty (1980) based on simulation runs, as shown in Table I.2. Generally, a consistency ratio of less than 0.10 is considered acceptable.

Table I.2. Approximated Random Indices (R.I.) (Saaty 1980)

N       1      2      3      4      5      6      7      8      9      10
R.I.    0.00   0.00   0.58   0.90   1.12   1.24   1.32   1.41   1.45   1.49

3. Illustrative Example of System Functionality

Consider the illustrative project whose data are given in Table I.3. Two project input variables (project cost and completeness of scope definition) and one measure of design effectiveness (design rework) are demonstrated in this example.

Table I.3. Illustrative Project

Project Variable                      Value
Project Cost ($TIC)                   $53,000,000
Completeness of scope definition      6 (on a 0-10 scale)
Design rework                         5%

Assuming a linguistic term set of {about $10M or less, about $30M, about $50M, about $70M, about $90M or more} is used for depicting the variable "project cost", as shown in Figure I.6, the corresponding membership values for this term set would be {0.0, 0.0, 0.85, 0.15, 0.0}. The fuzzy representations for the variables "completeness of scope definition" and "design rework" are illustrated in Figures I.7 and I.8, and the membership values for these two variables are read from their term sets in the same manner.

Figure I.6. Fuzzy Representation of the Input Variable "Project Cost" (terms "about $10M or less" through "about $90M or more"; $TIC = $53M gives memberships of 0.85 and 0.15)

Figure I.7. Fuzzy Representation of the Input Variable "Completeness of Scope Definition" (terms very low, low, moderate, high, and very high over the 0-10 scale)

Figure I.8. Fuzzy Representation of the Output Variable "Design Rework" (terms "about 0%" through "about 12% or more")

The architecture of the fuzzy neural network used for relating the identified inputs with the resulting output of the illustrative project is shown in Figure I.9. Adaptation of the connection weight structure is based on calculating an error term ε, which is further used in adjusting the weights of the network. The process is repeated until an acceptable error term is reached.

Figure I.9. Fuzzy Neural Network Architecture (Illustrative Example) ($TIC = $53M and Scope = 6 are fuzzified and propagated through the network, and the calculated %rework is compared against the 5.0% target to adjust the network weights)

Assume that the trained network is used for calculating the design rework based on the input of $53M (for project cost) and 6 (for completeness of scope definition). Using the adjusted weight structure of the fuzzy neural network, the calculated output would be {0.0, 0.0, 0.51, 0.49, 0.0, 0.0, 0.0}. Figure I.10 illustrates the membership values corresponding to the linguistic terms depicting the variable "design rework". Applying the COA method of defuzzification, the calculated numerical output would be 4.98%.

Figure I.10. Calculated Output of the Network (Illustrative Example) (memberships of 0.51 for "about 4%" and 0.49 for "about 6%")

The illustrative project of Table I.3 depicts an individual output variable as a measure of design effectiveness. However, a number of design effectiveness measures can be considered simultaneously for a comprehensive assessment of engineering performance in an industrial construction project. For illustrative purposes, assume three measures of design effectiveness (design rework, design document release commitment, and detailed design schedule delay) are considered. Also, assume an industry expert was asked to provide his judgment of the relative importance of these measures. A tentative assessment of the relative importance could be as given below:

• Design document release commitment is strongly important over design rework.
• Detailed design schedule delay is moderately important over design rework.
• Design document release commitment is equally/moderately important over detailed design schedule delay.

Based on the expert assessment and the decision aids presented in Table I.1, the priority matrix, A, with R = design rework, D = design document release commitment, and S = detailed design schedule delay, could be developed as follows:

        R      D      S
R       1      0.2    0.333
D       5      1      2
S       3      0.5    1

Mollaghasemi and Pet-Edwards (1997) describe the process of calculating the preference structure between the three measures. First, the matrix is normalized by dividing the values of each column by the sum of that column (column sums: R = 9, D = 1.7, S = 3.333). Afterwards, the weights are calculated as the average of each individual row of the normalized matrix:

        R        D        S        Weights
R       0.1111   0.1176   0.0999   0.1096
D       0.5556   0.5882   0.6001   0.5813
S       0.3333   0.2941   0.3000   0.3092

It is important to check the consistency of the expert response by calculating the consistency index (C.I.) as defined in Equation 4.22. First, λmax, the dominant eigenvalue of the matrix, needs to be estimated. This can be done by multiplying the original matrix A with the calculated weight vector W:

AW = (0.329, 1.748, 0.929)

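The prioritization above, together with the consistency check derived next, can be verified with a short script; NumPy is the only assumed dependency:

import numpy as np

A = np.array([[1.0, 0.2, 1/3],     # R compared with (R, D, S)
              [5.0, 1.0, 2.0],     # D compared with (R, D, S)
              [3.0, 0.5, 1.0]])    # S compared with (R, D, S)

W = (A / A.sum(axis=0)).mean(axis=1)     # normalize columns, average rows
lam_max = float(np.mean(A @ W / W))      # dominant eigenvalue estimate
CI = (lam_max - 3) / (3 - 1)             # consistency index
CR = CI / 0.58                           # R.I. = 0.58 for N = 3 (Table I.2)

print(W.round(4))          # [0.1096 0.5813 0.3092]
print(round(lam_max, 3))   # 3.004
print(round(CR, 3))        # 0.003 (< 0.10, so the judgments are consistent)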
λmax = (1/N) Σ (AW)i / Wi = 3.004

and, from Equation I.6:

C.R. = (λmax - N) / [(N - 1)(R.I.)] = (3.004 - 3) / [(3 - 1)(0.58)] = 0.003

According to Saaty (1980), the expert responses are considered consistent whenever C.R. is less than 0.10, which is satisfied in this example.

Developing the utility function u(x1, x2, x3) requires identifying the component utility functions ui(xi), where i = 1, 2, 3. Further assume that the expert specified the UH and UL values for the three measures of design effectiveness as follows:

Measure of Design Effectiveness           UL                UH
%design rework                            15%               2%
Design document release commitment        Unsatisfactory    Very satisfactory
  (scale: very unsatisfactory, unsatisfactory, moderate, satisfactory, very satisfactory)
%detailed design schedule delay           10%               3%

Figure I.11 illustrates the component utility functions ui(xi) for the design effectiveness measures considered above.

Figure I.11. Individual Utility Functions (Illustrative Example): u(x1) for "%design rework" (1.0 at UH = 2%, 0.0 at UL = 15%); u(x2) for "document release commitment" (rising from very unsatisfactory to very satisfactory); u(x3) for "%schedule delay" (1.0 at UH = 3%, 0.0 at UL = 10%)

Based on the component utility functions and the calculated preference structure (the weight matrix from AHP), the utility function u(x1, x2, x3) is fully identified. For illustration purposes, consider a sample project that has the following data; the corresponding utility of each measure is also shown:

Measure of Engineering Performance        Actual Result    ui(xi)
%design rework                            5%               0.769
Design document release commitment        Moderate         0.333
%detailed design schedule delay           20%              0.0

Collective design effectiveness: u(x1, x2, x3) = 0.1096(0.769) + 0.5813(0.333) + 0.3092(0.0) = 0.28
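This last step can be reproduced in a few lines. The sketch below assumes the linear component utilities of Figure I.11 and an even mapping of the five-point release-commitment scale onto [0, 1] (so that "moderate" scores 1/3); both are readings of the figures, not formulas stated in the report:

def linear_u(x, UL, UH):
    """Linear component utility: 1.0 at UH, 0.0 at UL, clamped to [0, 1]."""
    u = (UL - x) / (UL - UH)
    return max(0.0, min(1.0, u))

w = {"rework": 0.1096, "release": 0.5813, "schedule": 0.3092}   # AHP weights
u = {"rework":   linear_u(5.0, UL=15.0, UH=2.0),    # 5% rework   -> 0.769
     "release":  1.0 / 3.0,                         # "moderate"  -> 0.333
     "schedule": linear_u(20.0, UL=10.0, UH=3.0)}   # 20% delay   -> 0.0

collective = sum(w[m] * u[m] for m in w)            # additive form, Equation I.3
print(round(collective, 2))                         # 0.28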

APPENDIX J
SENSITIVITY ANALYSIS FOR THE PROJECT INPUT VARIABLE CATEGORIES

"typical impact" insig. "typical impact" low insig. & construction schedule delay due to design def. relative level of complexity.0% Satsifactory %Detailed design schedule delay Satisfactory 4. sign.9% 2.8% 1. and legal and environmental impact. v." "reguatory req.7% 1. insign. "little impact" v. high About $90M Cost plus About $70M Cost plus w/ guaranteed max.7% % Fab & construction cost increase due to design def. "stiff req.4% 4.1% 0.4% 4. 0. insign.1 illustrates the sensitivity analysis for the “General Project Attributes” category. low low moderate high v. relative size of project.1 further represents the influence chart of this category. sign.7% 1. small Relative level of complexity Site conditions v.2% 1.1.6% 0.1% 0.4% 1. 1.2% 3. UT. "little impact" v. 324 . The influence chart shows a maximum improvement in the total engineering performance. sign. Table J. Table J.7% 1. Sensitivity Analysis for the “General Project Attributes” Category Project Input Variables General Project Attributes v.045.3% Design document release commitment 3. large large medium small v.8% The sensitivity analysis shows no particular change trends in measures of engineering performance corresponding to the varying conditions for the “General Project Attributes” category.9% 0.2% % Detailed design cost overrun 1." mod.2% % Construction hours for design problem solving Moderate Moderate Moderate Moderate Moderate % Estimated Dollar savings due to constructability Moderate Moderate 0. site conditions.9% 8. "more than normal" moderate mod. About $50M Cost plus w/ incentives About $30M Unit Price About $10M Lump sum Relative size of project v.9% % Fab.8% Moderate 0. Figure J. contract type.General Project Attributes The project variables included in this category are: project size. high sign.7% 3.8% Moderate % Start-up schedule delay due to design deficiences % Start-up cost increase due to design deficiences 1. Project size ($TIC) Contract type Legal and environmental conditions Engineering Performance Measures %Design rework 4.2% 1.1% 0.5% 0. high v.4% Satsifactory 3.4% 5.1% Satisfactory Satisfactory 9. in the amount of 0. low v.5% 0.8% Moderate 0.9% 0.8% 0.

Figure J.1. Influence Chart for the "General Project Attributes" Category
[The chart plots the total engineering performance utility, UT, against the category rating, UR.]

General Owner Attributes

The project variables included in this category are: owner profile and participation, newness of process technology to owner, and owner previous experience with designer. Table J.2 illustrates the sensitivity analysis for the "General Owner Attributes" category, and Figure J.2 represents the influence chart of this category.

The sensitivity analysis shows a noticeable impact of the owner attributes on the detailed design schedule delays and cost overruns. Increases in owner participation in the design process, in experience with the process technology, and in previous experience with the designer result in reducing the delays and cost overruns of the detailed design phase. However, no particular trends are shown for the impact of this category on engineering performance in the subsequent project phases. The maximum corresponding improvement in total engineering performance, UT, reaches 0.053.

Table J.2. Sensitivity Analysis for the "General Owner Attributes" Category
[Same ten engineering performance measures, tabulated as the owner attributes are varied from very low to very high.]

Figure J.2. Influence Chart for the "General Owner Attributes" Category

General Designer Attributes

The project variables included in this category are: split engineering practices, designer qualifications and capacity, and newness of process technology to designer. Table J.3 illustrates the sensitivity analysis for the "General Designer Attributes" category, and Figure J.3 represents the influence chart of this category.

The sensitivity analysis shows a sizable impact of designer attributes on various measures of engineering performance during both the detailed design and construction phases. Increasing designer qualifications and capacity, reducing split engineering practices, and increasing designer experience with the process technology reduce the amount of design rework as well as the delays and cost overruns of the detailed design phase. This impact of designer attributes further extends to the construction phase: improving these attributes leads to reduced fabrication and construction delays due to design deficiencies, a reduction in the construction hours committed to design problem solving, and a further increase in constructability savings. The maximum improvement in total engineering performance, UT, associated with the "General Designer Attributes" category is 0.218.

Table J.3. Sensitivity Analysis for the "General Designer Attributes" Category
[Same ten engineering performance measures, tabulated as the designer attributes are varied from very low to very high.]

Figure J.3. Influence Chart for the "General Designer Attributes" Category

Project Schedule

The project variables included in this category are: design schedule and design-construction overlap. Table J.4 illustrates the sensitivity analysis for the "Project Schedule" category, and Figure J.4 represents the influence chart of this category.

Table J.4. Sensitivity Analysis for the "Project Schedule" Category
[Same ten engineering performance measures, tabulated as the design schedule is varied from fast track to non-schedule driven and the design-construction overlap from about 90% to about 10%.]

Figure J.4. Influence Chart for the "Project Schedule" Category

The sensitivity analysis shows no particular change trends in the engineering performance measures except for the detailed design cost overruns. Reducing the design-construction overlap while keeping a non-schedule-driven design schedule results in a sizable reduction in the detailed design cost overrun. The maximum improvement in total engineering performance, UT, associated with the "Project Schedule" category is 0.068.

Project Information Inputs

The project variables included in this category are: completeness of scope definition, completeness of objectives and priorities, completeness of basic design data, quality of constructor input and constructability, and quality of vendor data. Table J.5 illustrates the sensitivity analysis for the "Project Information Inputs" category, and Figure J.5 represents the influence chart of this category.

Table J.5. Sensitivity Analysis for the "Project Information Inputs" Category
[Same ten engineering performance measures, tabulated as each information input is varied from very low ("1") to very high ("9").]

The sensitivity analysis shows that the "Project Information Inputs" category has the greatest impact on the resulting engineering performance. It affects every measure of engineering performance throughout the detailed design, construction, and start-up phases. Improving the information inputs results in a consistent and gradual reduction in design rework; in schedule delays in detailed design, fabrication and construction, and start-up; in cost overruns in detailed design, construction, and start-up; and in the construction hours spent on design problem solving. It also contributes to increasing design document release commitment and constructability savings. The maximum improvement in total engineering performance, UT, associated with the "Project Information Inputs" category reaches a high value of 0.703.

Figure J.5. Influence Chart for the "Project Information Inputs" Category

Level of Automation

The project variables included in this category are: use of 3D CAD modeling, use of Integrated Databases (IDB), and use of Electronic Data Interchange (EDI). Table J.6 illustrates the sensitivity analysis for the "Level of Automation" category, and Figure J.6 represents the influence chart of this category.

Table J.6. Sensitivity Analysis for the "Level of Automation" Category
[Same ten engineering performance measures, tabulated as each automation variable is varied from limited use to extensive use.]

Figure J.6. Influence Chart for the "Level of Automation" Category

The sensitivity analysis shows that the "Level of Automation" category widely affects the schedule delays and cost overruns of the detailed design phase. Increasing the level of automation reduces both the schedule delays and the cost overruns of the detailed design phase, up to a certain level.

Beyond that level, use of these automation tools on the extensive side causes an increase in schedule delays and cost overruns. However, this increase in delays and overruns is counterbalanced by the reduction in construction hours for design problem solving. The maximum improvement in total engineering performance, UT, associated with the "Level of Automation" category is 0.094.

Project Changes

The project variables included in this category are: percent TIC scope changes, change management procedure, and change communication system. Table J.7 illustrates the sensitivity analysis for the "Project Changes" category, and Figure J.7 represents the influence chart of this category.

The sensitivity analysis shows that the "Project Changes" category has a noticeable impact on the various measures of engineering performance related to the detailed design phase. Improving the "Project Changes" category, through reducing the percentage of changes and improving change communication, leads to a gradual reduction in design rework, detailed design schedule delays, and cost overruns. This improvement also reduces the construction hours spent on design problem solving, resulting in an overall improvement in the total engineering performance. The maximum improvement in total engineering performance, UT, associated with the "Project Changes" category is 0.201.

Table J.7. Sensitivity Analysis for the "Project Changes" Category
[Same ten engineering performance measures, tabulated as percent TIC scope changes are varied from about 9% to about 1% under varying change management and change communication conditions.]

Figure J.7. Influence Chart for the "Project Changes" Category

APPENDIX K
ADDITIONS TO THE BENCHMARKING AND METRICS SURVEY

PROPOSED ADDITIONS TO THE BENCHMARKING AND METRICS SURVEY
(add to the end of the survey)

ENGINEERING PRODUCTIVITY SPECIAL FOCUS AREA: PIPING DESIGN

1. Please provide the following basic project data as outlined in the table.

Data Item                                                    Value/Information
Size of plant compared to other projects from the same
industry sector                                              ( ) Very Small ( ) Small ( ) Medium ( ) Large ( ) Very Large
Was Detail Engineering effort split between multiple
design sites (yes, no)?                                      ______

2. For the identified project, please provide the quantity of each installed material, measured in the specified unit.

OUTPUTS                              Units of Measure    QUANTITY    Comments
Mechanical Equipment Number          pieces              ______      NOTE: See appendix for equipment counting instructions
Length of Pipe: Small Bore < 3"      feet                ______
Length of Pipe: Large Bore > 3"      feet                ______

3. In this identified project, please provide the budgeted and actual engineering work-hours associated with the project, and also the percentage that piping design was of the TOTAL engineering work-hours.

                                                      Budget      Actual
Engineering Work-hours                                ______      ______
Piping Design as % of TOTAL ENGINEERING work-hours    ______      ______

Piping design includes routing, detailing from P&IDs, supports, stress analysis, isometrics, 3D presentation, insulation, specifications, and piping material.

4. For the following list of selected measures of piping design effectiveness, please provide the actual result of each measure (or a linguistic description) for this selected project for the PIPING DESIGN EFFORT.

• % piping engineering schedule overrun, or actual duration vs. planned duration (example: 5%)
• Piping design-document release versus commitments: very unsatisfactory, unsatisfactory, moderate, satisfactory, or very satisfactory (example: satisfactory)
• % piping engineering cost overrun, or actual cost vs. budget (example: 2%)
• % piping rework due to design errors, design omissions, design changes, etc. (example: 5%)
• % piping rework due to vendor changes (example: 1%)
• % piping fabrication cost overrun due to design-related rework (example: 1%)
• % fabrication schedule change due to design-related rework (example: 2%)
• % construction engineering-hours for piping design problem solving: very low, low, moderate, high, or very high (example: moderate)
• Estimated piping constructability dollar savings as a percentage of TIC: very low, low, moderate, high, or very high (example: moderate)
• % piping rework during start-up due to design deficiencies (example: 10%)
• % start-up schedule overrun due to piping design deficiencies (example: 3%)
• % start-up cost overrun due to piping design deficiencies, including rework costs (example: 2%)

5. The following list represents a set of measures of piping engineering and design effectiveness. This set is intended to measure various aspects of the piping design and engineering performance throughout the project. Considering this project, use the following scale to identify the degree of satisfaction with each measure, where 0 represents very unsatisfactory / very bad and 10 represents very satisfactory / very good. Place an (X) mark in the appropriate cell that indicates where you were at the end of the project.

SCALE: 0 1 2 3 4 5 6 7 8 9 10

Accuracy of the Design Documents
• Number of drawing revisions / revised drawings
• Number of spec revisions / revised specs
• Reworked spools
• Reworked hours

Usability of the Design Documents
• Drawing Size
• Number of Drawings
• Cross-References between dwgs.
• Field Engineering work-hours
• Clarity
• Completeness

Cost of the Design Effort
• Piping design cost
• % engineering $ as a % of TIC
• Piping design work-hours
• % piping engineering $ as a % of engineering $

Constructability
• Tolerances
• Specialized Equipment
• Different Crafts
• Materials and Technology
• Pre-assembly
• Estimated $ savings as a % of TIC

Economy of the Design
• Facility Layout
• Overspecified Materials
• Oversized Members

Performance Against Schedule
• Document Release
• Project scheduled duration vs. actual duration
• Intermediate Release
• Engineering scheduled duration vs. actual duration

Ease of Start-Up
• Start-Up Days
• Budget vs. actual start-up time
• Operator work-hours
• % Rework during start-up
• Maintenance work-hours

6. PROJECT PIPING COMPLEXITY

a. How new was the "Process or Facility Technology" for this project to the owner?
   0 Duplicate of existing technology | 1 Scale up/down of existing technology | 2 25% new technology | 3 50% new technology | 4 75% new technology | 5 100% new technology

b. How new was the "Process or Facility Technology" for this project to the Design organization?
   0 Duplicate of existing technology | 1 Scale up/down of existing technology | 2 25% new technology | 3 50% new technology | 4 75% new technology | 5 100% new technology

c. What kind of impact did legal or environmental regulations have on this project scope?
   0 No impact | 1 Insignificant impact | 2 Little impact | 3 Typical impact | 4 Very stiff requirements had to be met | 5 Regulatory requirements were reason for project

d. How large an impact did site conditions have on the design of this project?
   0 No impact | 1 Insignificant impact | 2 Little impact (new unit in existing plant) | 3 Typical impact for new site | 4 Slightly more than normal impact | 5 Significant impact

e. Please indicate the method of acceptance testing used on this project.
   No assessment | Demonstrated operations at achieved level | Formal documented acceptance test over a meaningful period of time

f. To what extent did Detail Design re-use previously designed standard modules?
   0 Essentially no re-use | 1 Minor amounts of re-use | 2 Re-use in existing design standards | 3 Typical design module re-use | 4 More than usual module design re-use | 5 Significant module design re-use

g. How many times has the owner company worked with the Detailed Design Organization before this project?
   1 0 projects (first time working together) | 2 1-2 projects | 3 2-10 projects | 4 11-20 projects | 5 >20 projects

h. What Engineering Design Standards were used on the project? (Check all that apply)
   [ ] Industry Standards were used
   [ ] Design Contractor Standards were used
   [ ] Owner Company Design Standards were used
   [ ] Project Specific Design Standards were used

i. On an overall basis, please indicate the level of complexity for this project compared to other projects from the same industry sector.
   1  2  3  4  5  (1 = Low Complexity, 3 = Medium Complexity, 5 = High Complexity)

   Low Complexity: characterized by the use of no unproven technology, previously used facility configuration or geometry, small facility size or process capacity, small number of process steps, proven construction methods, etc.

   High Complexity: characterized by the use of unproven technology, new facility configuration or geometry, large facility size or process capacity, an unusually large number of process steps, new construction methods, etc.

Appendix I
Equipment Counting

Major equipment count procedure:

1. Pumps (including motors) count one each. If pump is turbine driven, count turbine as separate piece of equipment.
2. Blowers (including motors) count one each. If turbine-driven, count turbine as separate piece of equipment.
3. Process and utility compressors, electric driven, single stage, count one each. If multi-stage, add 1/2 per stage for extra stages. If compressor is turbine or gas engine driven, count turbine or gas engine as a separate piece of equipment.
4. Auxiliary compressor equipment not attached to compressor but requiring individual supports and piping, count each as one piece of equipment.
5. Drums, towers, reactors, strippers, etc. count one each.
6. Shell and tube exchangers, not stacked, count one each. If stacked, count one each.
7. Fintube exchangers count one each service.
8. Aerial coolers count each fan structure. If fans are turbine-driven, count turbine as separate piece of equipment.
9. Cooling towers count one per cell.
10. Heaters or boilers count one each.
11. Incinerators and thermal reactors (i.e., muffle furnace, etc.) count one each.
12. Flare stacks count one each.
13. Storage tanks, bins, hoppers, etc. count one each.
14. Mixers or agitators not attached to equipment count one each. If attached, do not count.
15. Conveyors count one each.
16. Filters requiring foundation or supports count one each. Do not count line filters.
17. Ejectors or reducers (jets), barometric condensers, etc., requiring foundation or supports, count one each. If connected to equipment or in-line, do not count.
18. Package units, wired and skid mounted, count unit as one. If not skid mounted, count each component as one piece of equipment. If components of unit require individual supports and piping, count each component as one piece of equipment.
19. Lube and seal oil skid mounted units count one each.
20. Ventilating equipment, heating, air conditioning equipment, etc. count one each. If fans are turbine-driven, count turbine as separate piece of equipment.
21. Other major equipment such as materials handling, reduction and classification, pit scales, etc., classify by normal count.
22. For existing equipment, use procedure listed above if revision to piping, instrumentation or relocation is involved. For customer furnished equipment, use procedure listed above for this particular type of equipment.
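Because the counting rules are mechanical, they lend themselves to automation when equipment lists are kept electronically. The Python sketch below encodes three of the rules above (pumps, compressors, package units) for illustration only; the record format, field names, and the restriction to three rules are simplifying assumptions, not part of the counting procedure itself.

    # Illustrative sketch of a major-equipment count following a few of the
    # rules above. Field names and input format are assumptions for demonstration.

    def count_equipment(items):
        """Return the major-equipment count for a list of equipment records.

        Each record is a dict with keys such as:
          'type'         - e.g. 'pump', 'compressor', 'package_unit'
          'driver'       - 'motor', 'turbine', or 'gas_engine'
          'stages'       - integer, for compressors
          'skid_mounted' - bool, for package units
          'components'   - components needing their own supports and piping
        """
        total = 0.0
        for item in items:
            kind = item.get('type')
            if kind == 'pump':
                total += 1                      # pump (including motor) counts one
                if item.get('driver') == 'turbine':
                    total += 1                  # turbine counts as a separate piece
            elif kind == 'compressor':
                total += 1                      # single stage counts one
                total += 0.5 * max(item.get('stages', 1) - 1, 0)  # add 1/2 per extra stage
                if item.get('driver') in ('turbine', 'gas_engine'):
                    total += 1                  # driver counts as a separate piece
            elif kind == 'package_unit':
                if item.get('skid_mounted', False):
                    total += 1                  # wired, skid-mounted unit counts one
                else:
                    total += item.get('components', 1)  # each component counts one
        return total

    # Example: motor-driven pump, 3-stage turbine-driven compressor,
    # and a non-skid package unit with 4 individually supported components.
    equipment = [
        {'type': 'pump', 'driver': 'motor'},
        {'type': 'compressor', 'driver': 'turbine', 'stages': 3},
        {'type': 'package_unit', 'skid_mounted': False, 'components': 4},
    ]
    print(count_equipment(equipment))  # 1 + (1 + 1.0 + 1) + 4 = 8.0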

APPENDIX L
PRESENTATION MATERIALS FOR THE PLENARY SESSION AND THE IMPLEMENTATION SESSION AT THE 2001 CII ANNUAL MEETING

Plenary Presentation Slides: Engineering Productivity Measurement

Slide 1: Engineering Productivity Measurement
Bob Shoemaker, BE&K
CII Annual Conference 2001

Slide 2: Engineering Productivity Measurement Research Team
Bob Shoemaker, BE&K (Chair); John Atwell, Bechtel; Bill Buss, Air Products; Luh-Maan Chang, Purdue University; Glen Hoglund, Ontario Hydro; Duane McCloud, FPL Energy; Deb McNeil, Dow; Navin Patel, Chemtex; John Rotroff, U.S. Steel; Ken Walsh, Arizona State University; Denny Weber, Black & Veatch; Tom Zenge, Procter & Gamble

Slide 3: Problem Statement
• Engineering productivity measurement is a critical element of project performance
• Present practices do not work well in driving the improvement that today's design tools offer
• Surprisingly little effort has been expended in the engineering productivity arena

Slide 4: Research Objectives
• Determine present practices and why they do not work well
• Find productivity improvement success stories in other industries and learn from them
• Develop an Engineering Productivity Model that addresses shortcomings of present methods
• Test new model with pilot study
• Develop implementation plan

Slide 5: Productivity Literature
• Focuses on manufacturing, construction
• Little on engineering profession
• Biased toward tools or techniques
• Abundance of conclusions, lack of data
• Service professions focus on profit-based measures

Slide 6: Software Industry
• Lines of Code/hour did not work well
• Defined clear starting point
• Adjusted for complexity
• Adjusted for defects
• Developed standardized scoring system
• This proven methodology has driven significant improvement in the software delivery process
• The software industry approach has applicability to engineering

Slide 7: Present Practices
Most companies:
• Track production of drawings and specifications versus budget
• Use % TIC as target engineering budget
• Use earned value concept in some form

Slide 8: Problems with Present Practices
• Lack of standards for format and content
• Difficulty in tracking actual effort dedicated to each deliverable
• No correlation between number of deliverables and installed quantities or effectiveness
• Computer-based tools have no uniform system of measurement
  - Schematics and specs from database
  - Physical drawings replaced by models

Slides 9-12: Levels of Productivity
• Company
• EPC Work Process
• Project
• Overall Engineering
• Discipline
• Deliverable
• Individual
(Slides 9 through 12 repeat this hierarchy, with different levels highlighted on each slide.)

Slide 13: Disciplines
1. Civil/Structural
2. Architectural
3. Chemical Process
4. Mechanical Process
5. Mechanical
6. Piping (focus of the piping pilot)
7. Electrical
8. Instrument/Controls
9. Procurement
10. Project Management

Slides 14-16: Engineering Productivity Model
Raw Productivity (Hours / Installed Qty.)
X Input Quality Factor (Project Definition Rating Index)
X Scope & Complexity Factor (Project Characteristics)
X Effectiveness Factor (% Field Rework)
(Slides 14 through 16 repeat the model diagram, with a different factor highlighted on each slide.)
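Read literally, the slide diagram combines the four boxes multiplicatively: an adjusted productivity is the raw productivity (hours per installed quantity) scaled by the input quality, scope and complexity, and effectiveness factors. A minimal Python sketch of that reading follows; the numeric values and the unit-normalized form of the factors are assumptions for illustration, not values from the pilot study.

    # Minimal sketch of the multiplicative Engineering Productivity Model as
    # drawn on the slides. All numeric values below are hypothetical.

    def adjusted_productivity(hours, installed_qty,
                              input_quality_factor,     # from Project Definition Rating Index
                              scope_complexity_factor,  # from project characteristics
                              effectiveness_factor):    # from % field rework
        raw = hours / installed_qty                     # raw productivity, e.g. hrs/ft of pipe
        return raw * input_quality_factor * scope_complexity_factor * effectiveness_factor

    # Hypothetical piping example: 12,000 design hours for 60,000 ft of pipe
    # (raw = 0.20 hrs/ft), with adjustment factors normalized around 1.0.
    print(adjusted_productivity(12_000, 60_000, 1.05, 0.90, 1.02))  # ~0.193 hrs/ft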

Dow. Effectiveness Effectiveness Factor % Field Rework Rework 18 Testing the Model for Piping Discipline Summary Projects analyzed: 40 This quantity-based model: model: Objectives •• Addresses Addresses shortcomings shortcomings of of present present methods methods -.Established Established good correlation between hrs/ft hrs/ft and dominant dominant variable variable •• Focuses Focuses engineering engineering effort effort on on capital capital investment investment •• Uses Uses data data already already collected collected for for construction construction productivity productivity Learning •• Is Is applicable applicable to to all all industries industries and project types.Screen for for dominant influence factors -.Established Established number of of equipment pieces pieces as as aa dominant scope/complexity scope/complexity variable variable -.Engineering Productivity Model Input Input Quality Factor Project Definition Definition Rating Index Index X X Scope & & Complexity Complexity Factor Project Characteristics Characteristics X X Raw Raw Productivity Productivity X X Hours Hours Installed Qty. -.Valuable data is being ignored in in detail detail design design phase phase of projects projects •• Will Will continuously continuously improve improve with with use use 19 20 What’s Next Implementation Session Panel •• Call to companies with expertise and interest interest in in this this previously neglected neglected arena arena Deb McNeil Dow. Engineering Productivity Model Input Input Quality Factor Effectiveness Effectiveness Factor % Field Rework Rework Project Definition Definition Rating Index Index 17 X X Scope & & Complexity Complexity Factor Raw Raw Productivity Productivity X X Project Characteristics Characteristics X X Hours Hours Installed Qty.Verify Verify input/output correlation for for hrs/ft hrs/ft •• Allows Allows progress progress tracking tracking with with present present engineering engineering tools tools Results Results •• Engineering Engineering and and Construction Construction on on same same project project control control basis basis -. Moderator •• Develop Develop detailed models for for each discipline John Atwell Bechtel Bechtel •• Implement on on projects projects Ken Walsh Walsh Arizona State State •• Industry use of of standardized system system for internal improvement improvement and external external benchmarking Tom Zenge Procter Procter & Gamble Stake goes well well beyond beyond engineering engineering cost 21 22 Implementation Session • Learn how the software industries’ experience validates the approach • See what benefits to effective project delivery the future holds • Learn the many different ways you can contribute to a significant improvement step in the EPC industry 23 346 .
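The "good correlation between hrs/ft and dominant variable" reported on slide 19 is a straight-line relationship, which is easy to reproduce once raw productivity and equipment counts are tabulated. The sketch below fits hrs/ft against the number of equipment pieces with numpy; the six data points are fabricated placeholders standing in for the 40 pilot projects, whose data are not reproduced here.

    import numpy as np

    # Hypothetical data standing in for the pilot projects: number of major
    # equipment pieces (the dominant scope/complexity variable) vs. piping
    # design raw productivity in hours per foot of pipe.
    equipment_pieces = np.array([40, 75, 120, 160, 210, 260], dtype=float)
    hrs_per_ft       = np.array([0.12, 0.15, 0.19, 0.22, 0.27, 0.31])

    # Ordinary least-squares line: hrs/ft = slope * pieces + intercept
    slope, intercept = np.polyfit(equipment_pieces, hrs_per_ft, 1)

    # Correlation coefficient as a quick screen for a "dominant" variable.
    r = np.corrcoef(equipment_pieces, hrs_per_ft)[0, 1]

    print(f"hrs/ft ~= {slope:.5f} * pieces + {intercept:.3f}, r = {r:.3f}")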

Implementation Session Handout
(reproduced as page images in the original report)
