
Six Sigma

DMAIC Quick Guide

Six Sigma Team


3M Germany
All rights, including translation into foreign languages, reserved.

No part of this work may be reproduced in any form, even for teaching
purposes, or processed and duplicated by the use of electronic media,
without prior written permission.

Copyright © 2003 3M Deutschland GmbH


Introduction
The systematic, targeted implementation of the Six Sigma corporate initiative depends largely on mastery and project-related application of the tools and quality techniques of the DMAIC methodology.

For this, the appropriate training for Champions, MBBs, BBs and GBs provides the necessary platform of knowledge.

In response to numerous requests, and on the basis of knowledge acquired by the trainer teams, we have condensed the elements of the extensive training material down to their essential content and, through the use of examples, created understandable aids to interpretation.

We hope that this tool is of assistance to you in the scope of your project responsibility.

D. Kuncar, Dr. Ertürk, E. Spenhoff

DEFINE
1. Project Charter

MEASURE
2. Process Map
3. Cause & Effect Matrix
4. Measurement System Analysis
5. Process Capability Analysis

ANALYZE
6. Failure Mode and Effect Analysis
7. Multi-Vari Analysis

IMPROVE
8. Design of Experiments

CONTROL
9. Control Plan

User Instructions
How do I make use of this brief DMAIC manual?
As a clearly arranged reference work, it provides swift access to terms and parameters used in the context of the DMAIC methodology. In discussions and project reviews it can be very helpful for recalling or checking concepts and evaluation limits in the course of assessing test results.
The brief DMAIC manual is arranged in line with the structure and chronology of DMAIC. The stages of the DMAIC process and its tools, as identified in the margins of the pages, quickly show where the explanatory notes tie in. In addition, each stage of DMAIC is colour-coded.
The parameters and evaluation criteria serve to evaluate the results from each successive phase of a project. They are supplemented with explanations of terms where necessary.
Concrete examples that clarify the subject in detail, or a graphical presentation providing orientation in cases of complex relationships (e.g. multi-vari analysis), are given under the heading Example.
Practical knowledge that can be used to facilitate work is listed under Useful Hints.
Under Lessons Learned you can make your own notes.
The Pitfalls are cautionary notes based on difficulties and problems encountered during the course of projects.
The Review Check Questions enable the reader to conduct an independent check in advance of a review and to ensure that a list of questions is readily available during the review.
A glossary of terms frequently used in Six Sigma is given at the end of the book.

DEFINE
Project Charter

Purpose
The purpose of a Project Charter is to initiate a Six Sigma project by defining its scope and the project variables. The Project Charter is the basis for Black Belts and Green Belts to work from, in the sense that it clearly defines the project.

Measurements and Evaluation Criteria

Baseline:
The Baseline is the actual performance of the process that is to be improved.

Entitlement:
The Entitlement is the best possible performance that has been observed recently or was observed in a benchmarking study. It can also be the performance defined during development or by the equipment manufacturer.

Metrics:
COPQ (Cost of Poor Quality): COPQ is a measurement of the documented costs of mistakes or errors, such as complaint costs, rework costs, waste, etc.

Cpk (Process Capability Index): The Cpk measures a process variable's short-term performance. It requires specification limits to be defined; target values alone are not sufficient to calculate a Cpk. A Cpk less than 1 indicates room for improvement, whereas a Cpk greater than 1.33 indicates only limited room for improvement.

Ppk (Process Performance Index): The Ppk measures a process variable's long-term performance. It requires specification limits to be defined; target values alone are not sufficient to calculate a Ppk. A Ppk less than 1 indicates room for improvement, whereas a Ppk greater than 1.33 indicates only limited room for improvement. A Ppk greater than 1.5 fulfils 6σ requirements.

DPU (Defects per Unit): The DPU is a general measurement of the number of errors per unit. Here, "error" is a synonym for unwanted events (e.g. unanswered telephone calls per day).

DPMO (Defects per Million Opportunities): As opposed to the DPU, the DPMO provides comparable values for attributes of differing complexity. The differing complexities are weighted through the number of Opportunities (types of errors).

RTY (Rolled Throughput Yield): RTY provides a process's actual yield based on the individual process steps.
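As a sketch of how these metrics are computed (the defect counts, step yields and specification limits below are illustrative, not from the guide): note that a strict Cpk uses the short-term (within-subgroup) sigma; the simplified calculation here uses the overall sample sigma, which strictly corresponds to Ppk.

```python
import statistics

def cpk(values, lsl, usl):
    # min(USL - mean, mean - LSL) / (3 * sigma); sigma here is the overall
    # sample standard deviation, so this strictly corresponds to Ppk.
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return min(usl - mean, mean - lsl) / (3 * sigma)

def dpmo(defects, units, opportunities_per_unit):
    # Defects per million opportunities.
    return defects / (units * opportunities_per_unit) * 1_000_000

def rty(step_yields):
    # Rolled throughput yield: product of the individual step yields.
    result = 1.0
    for y in step_yields:
        result *= y
    return result

# 57 defects over 1000 units with 5 opportunities per unit
print(round(dpmo(57, 1000, 5), 1))        # 11400.0
# Three process steps with 95%, 98% and 90% yield
print(round(rty([0.95, 0.98, 0.90]), 4))  # 0.8379
```

Note how quickly RTY decays: even three fairly capable steps together deliver only about 84% first-pass yield.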
Conditions
The Project Ys should, wherever possible, be measured with one of these metrics.
The Entitlement must be verifiable. Wishful thinking or estimates are not helpful.
Project goals should be defined so that the Gap (Entitlement - Baseline) is closed by 75%.


Example: Goal Tree

Corporate Ys -> Business Ys -> Project Ys -> Process Ys

Growth -> Market Penetration, New Market Development -> Market Penetration of Product A, Customer Satisfaction -> ???
Cash -> Inventory level, Storage time, Days Sales Outstanding -> Number of Dead SKUs -> ???
Manufacturing Cost -> Yield (RTY), Downtime, Energy Cost -> ??? -> ???
Multi-Vari

Hopper (project list with priorities)

Is the solution clear and simple?
Yes -> Quick Hits: Is something broken? Insufficiently defined? Untrained staff?
Neither clear nor simple -> Other initiatives: driven by an initiative, techniques known, low engineering effort, change of cultures or rules.
Not easy and not clear -> Six Sigma projects: complex relationships, process-oriented, reduce dispersion (mean error), remove errors, requires analysis.

Tips
Measurements can be primary, secondary or contrary.

Examples:    Inventory                  Manufacturing
Primary:     inventory level            RTY
Secondary:   rented warehouse space     waste disposal cost
Contrary:    ability to deliver         productivity

Include project goals for all measurements. These goals can be revised during the project.
The Entitlement for complaints should always be set to zero.
A Stakeholder Analysis helps to find the right Champion.
The Scope should be large enough to encompass points that have a direct input/influence on the project.

Pitfalls
Projects with too many measurements become problematic.
Projects take too long; they should be able to be completed in 4 to 6 months.
Project complexity is too great because too many points need to be clarified (scope/boundary definition).
Even when the solution seems apparent, the DMAIC method should still be used.
A project is only feasible if a measurable Project Y exists.
A goal should be expressed as "change the Project Y from __ to __"; general expressions such as "a reduction of __%" are not sufficiently accurate.
The boundary to "just do it" projects is unclear (e.g. reduction of manufacturing cost through replacement of raw materials; better: reduction of raw material costs).
The process to be improved is unknown or cannot be influenced.
Projects which plan to incorporate new methods are unsuitable (e.g. introduction of new software).


Review Checklist
1. What is the expected business impact?
2. Has a Goal Tree been developed linking the Project Y to the Business Unit Y and the Corporate CTX?
3. Has the potential savings benefit been identified and validated?
4. What is the business impact to date?
5. What is the problem and where is it focused?
6. Has this been worked on before, and how are you leveraging that effort?
7. What are the boundaries of the process?
8. Is the scope meaningful and manageable?
9. Is the defect definition clear and concise?
10. How do we currently measure the Project Y?
11. Do you have Baselines established?
12. How was Entitlement determined?
13. What percent of the Entitlement Gap is being targeted by this project?
14. Who is on the team and who is championing the effort?
15. How effective are the team and the facilitator (Project Leader)?
16. What actions were taken to correct any issues uncovered?
17. What barriers to success exist?
18. Has a Stakeholder Analysis been done?
19. Have Influence Strategies been developed?

Lessons learned
MEASURE
Process Map
Cause & Effect Matrix
Measurement System Analysis
Process Capability Analysis
MEASURE
Process Map

Purpose
The Process Map depicts a simple process diagram to identify the Input Xs (influences) and Output Ys (Process Ys, result variables). The process diagram serves as the guiding framework.

Measurements and Evaluation Criteria

The measurements are the process steps, Input Xs and Output Ys.

Process Steps are separated into value-creating and non-value-creating steps (potential for re-engineering). The Process Steps are activities and therefore should be defined by verbs. The process should be broken down into 4 to 8 process steps.

Controlled Inputs are controlled variables that can be used to change the Output Ys.
For example: temperature (X) changes the drying time (Y), or the number of service personnel (X) changes the response time (Y) for customer questions.

Uncontrolled Inputs are uncontrolled variables that have an effect on the Output Ys. They can be defined as Common Cause or Special Cause effects. These variables are measurable but difficult to set.
For example: room temperature (X) influences adhesion (Y), or illness (X) influences company results (Y).

Critical Inputs are uncontrolled Input Xs that have a proven special-cause effect on the Output Ys. These inputs can also be non-quantifiable. They must be marked boldly on the Process Map.

Outputs are responses or results of the process. Differentiate between the Project Ys (targets) and Process Ys (responses that influence the Project Ys).

Conditions
The Input Xs and Output Ys should be named with nouns.
Input Xs can be Output Ys of previous Process Steps.
Attributes should be based on specific features (e.g. availability, speed) rather than the source of the feature (e.g. "Maker G6") to qualify as Input Xs or Output Ys.
The Input Xs and Output Ys should be quantifiable wherever possible.
Include Output Ys for process, products and service.
Avoid monetary Output Ys wherever possible.

Example: Process Map

Inputs / Controls              Process step       Outputs
Product name (U)               Accept order       Part number
Order amount (U)                                  Inventory amount
Delivery date (U)                                 Production amount
Payment method (U)                                Order time to delivery
Order date (U)                                    Confirmed delivery time
Size (C)

Production amount (U)          Plan order         Production date
Inventory amount (C)                              Delivery date
Production rate (C)                               Production time
Set-up time (C)
Availability (U)
Production date (C)

Process parameters (C)         Set up machine     Production date attainment
Materials (C)
Availability

Production amount (U)          Produce products   Run time
Delivery date (C)                                 Product attributes
Settings (C)
Lot number (U)

(U = uncontrolled input, C = controlled input)

Tips
A cause-and-effect diagram (Fishbone, Ishikawa diagram) can be helpful to define Inputs by separating them into the categories Person, Machine, Material, Measurement, Method, Management and Environment.
The Process Steps are activities, so use verbs.
Define Process Steps first, followed by Outputs, then define the necessary Inputs.
Use features (e.g. speed) rather than feature sources (e.g. "Maker G6") for the Input Xs or Output Ys.

Feature source        Feature               Feature attribute
(variable X or Y)                           (attainment x or y)
Person                Weight                Measurement (e.g. 80 kg)
                      Height                Measurement (e.g. 183 cm)
                      Eye colour            Category (e.g. blue)
                      Sex                   Category (e.g. female)
                      Intelligence          Rating (e.g. IQ 120)
Coater                Speed                 Measurement (e.g. 2 m/s)
                      Availability          Measurement (e.g. 50%)
                      Downtime              Value (e.g. 10 breakdowns/month)
Forecast              Deviation             Value (e.g. 10000 EUR)
Packing list          Availability          Category (e.g. no)
                      Completeness          Category (e.g. no)
                      Machine readability   Category (e.g. yes)
Sales                 Sales method          Category (e.g. distributor)
                      Customer type         Category (e.g. end user)

Pitfalls
The process has been defined in too much or too little detail.
Inputs are feature sources and not measurable features.
The Process Map depicts wishful thinking rather than reality.
Experienced team members are not available.

Review Check List

1. Who helped to develop the map and what organizations do they represent?
2. Does the Process Map reflect the current state or the desired process?
3. Are all non-value-added steps included?
4. What did you learn from the Process Map?
5. What Quick Hits did you find from this effort?
6. Which Process Steps does the team feel can be eliminated or combined to reduce opportunities for scrap and to increase rate?
7. How will you measure the Key Inputs and Key Outputs?
8. What characterizes an Uncontrolled and a Controlled variable?
Lessons learned
MEASURE
Cause & Effect Matrix
Purpose
The C&E Matrix helps to select the important Input Xs (controlled and uncontrolled inputs) from all of the Input Xs in the Process Map. Selection is based on the strength of the effect that the Input Xs are thought to have on the Output Ys.

Measurements and Evaluation Criteria

The Weighting of the Output Ys (Rating of Importance to Project) should be in the range of 1 to 10. The outputs of the C&E Matrix are used as selection criteria. A slight difference in weighting will not lead to a useful selection of the Inputs.

The Evaluation of the Effect of Input Xs on the Output Ys is carried out according to the following table:

Evaluation   Comment
0            No effect
1            Weak effect on dispersion
3            Medium effect on dispersion and position
9            Strong effect on dispersion and position

The weighted sum of the evaluations serves as a filter based on the Pareto principle. A rule of thumb: the 5 to 20 largest weighted sums are used to choose the Input Xs which will be carried over into the FMEA.
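The weighted-sum filter can be sketched as follows. The weights and effect ratings below are illustrative (loosely modelled on the C&E example in this chapter), not prescribed by the guide:

```python
# Illustrative weights (Rating of Importance, 1-10) per Output Y and
# effect ratings (0/1/3/9) per Input X; the names are made up for this sketch.
weights = {"Particle size": 7, "AI content": 7, "Hardness": 5}

effects = {
    "AI content (Melting)":   {"Particle size": 9, "AI content": 9, "Hardness": 3},
    "Drying time (Grinding)": {"Particle size": 3, "AI content": 1, "Hardness": 9},
    "Viscosity (Grinding)":   {"Particle size": 3, "AI content": 3, "Hardness": 3},
}

def weighted_sums(effects, weights):
    # Weighted sum per Input X: sum over Output Ys of (weight x effect rating).
    return {x: sum(weights[y] * r for y, r in ratings.items())
            for x, ratings in effects.items()}

# Rank descending; the 5 to 20 largest sums are carried over into the FMEA.
ranked = sorted(weighted_sums(effects, weights).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # ('AI content (Melting)', 141)
```

With real data the same ranking drives the Pareto diagram used to visualize the priorities.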
Conditions
All Input Xs from the Process Map are evaluated in the C&E Matrix.
More than 10 Output Ys should never be used.
The Output Ys should be Process Ys and Project Ys.

PROCESS MAP (Inputs - Process Steps - Outputs)
            |
            v
C&E MATRIX (Inputs - Outputs - Sum)

Example: C&E Matrix

Output Ys (Rating of Importance): Particle size (7), AI content (7), Hardness (5), Density (3), Torsion (9)

Process Step  Input X            Ratings      Sum
Melting       AI content         9 9 3 9 1    177
Grinding      Viscosity          3 3 3 1 9    141
Grinding      Solvent content    3 3 3 - 9    138
Melting       Duration           9 9 1 1 -    134
Grinding      Formulation        3 3 1 1 9    131
Grinding      Solvent/particles  3 3 1 1 9    123
Grinding      Granulate size     1 1 3 3 9    119
Grinding      Drying time        3 1 9 - 3    100
Grinding      Recycling content  3 3 3 1 1    99
Grinding      Vortex efficiency  3 3 9 1 1    90
Grinding      Air content        3 3 9 1 0    87

Pareto Diagram

[Pareto chart of the weighted sums: AI content 177, Viscosity 141, Solvent content 138, Duration 134, Formulation 131, Solvent/particles 123, Granulate size 119, Drying time 100, Recycling content 99, Vortex efficiency 90, Air content 87]
Tips
The ideal number of Output Ys is between 3 and 7.
In the selection process for Output Ys it may be helpful to look at Process Ys and other helpful values (e.g. a counterbalance: if you want to reduce the changeover time in manufacturing, you must ensure that warehouse inventory does not increase).
If the C&E Matrix is used properly, the FMEA will be easier.
A Pareto diagram helps in visualizing the priorities.
When there are more than 150 Input Xs, start by evaluating the Process Steps and then evaluate the Input Xs in more detail.

Pitfalls
It is wrong to ask whether the Input Xs and Output Ys correlate. It is better to ask how strong the effect of X is on Y. Correlation does not imply causality.
Only Process Ys are treated as a customer requirement for selection purposes in the C&E Matrix.
If several similar Output Ys are listed, together they can carry a higher weighting than a single more important Output Y.
Do not force customer-related Output Ys into the C&E Matrix if they are not helpful to the project.
Review Check List

1. Who provided inputs to the Customer Requirements for this C&E Matrix?
2. What groups determined the relationship ratings for the Inputs and Outputs?
3. What method was used to determine the final relationship score?
4. What surfaced as the top Input Variables from the C&E Matrix? Do these make sense?
5. What actions are being taken on the top-ranked Input Variables?
6. Are there any Quick Hits that can be assigned to lower-ranking Input Variables?
7. Do the current Control Plans reflect the need to monitor these top Process Input Variables?
Lessons learned
MEASURE
Measurement System Analysis
Purpose
The MSA determines the ability of a system to set up a process correctly (Input Xs) and to evaluate the process (Output Ys). The MSA is therefore an important requirement for setting up and evaluating processes.

Measurements and Evaluation Criteria

Validity is a check on whether the correct aspect of a process is being measured. The data can come from a trustworthy method or source and yet not serve the project's goals.

Accuracy is based on the trueness and precision of the data. In the transactional processes of business data, accuracy is evaluated by carrying out audits. If the data are generated by measurement systems, a Gage R&R (Repeatability and Reproducibility) Analysis is carried out. If the data are generated by a qualitative evaluation, an Attributive MSA is used. The Gage R&R Analysis and the Attributive MSA use values according to the definitions in the following table:

Variable MSA    Classification   Attributive MSA
%R&R < 10%      good             Kappa > 0.90
%R&R < 30%      acceptable       Kappa > 0.70
%R&R > 30%      unacceptable     Kappa < 0.70
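For the Attributive MSA, Kappa measures how far two appraisers agree beyond what chance alone would produce. A minimal sketch using Cohen's kappa for two appraisers (the ok/nok ratings are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    # and p_e the agreement expected by chance from the marginal counts.
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(count_a[c] * count_b[c] for c in set(rater_a) | set(rater_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

a = ["ok", "ok", "nok", "ok", "nok", "ok", "ok", "nok", "ok", "ok"]
b = ["ok", "ok", "nok", "ok", "ok",  "ok", "ok", "nok", "ok", "ok"]
print(round(cohens_kappa(a, b), 2))  # 0.74 -> acceptable (> 0.70)
```

With more than two appraisers or repeated trials, statistics packages typically report a Fleiss-style kappa instead; the interpretation against the table above is the same.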

Conditions
To audit business data, audit documents must be produced to show relevance and correctness. Relevance must always be checked, especially for relative values such as square metres per time unit or per material, etc.
In the Gage R&R Analysis, repeated measurements on the same part must be possible.
When destructive test methods are used, homogeneous material must be used for the Gage R&R Analysis. In this case the %R&R value may be up to 50%.
The Attributive MSA can be used on rated data only, not for the measurement of defective parts or the number of defects.

Example: Measurement System Analysis

Gage R&R Analysis Diagram

The observed process dispersion splits into:
- Process dispersion: short-term process dispersion and long-term process dispersion.
- Measurement dispersion (R&R):
  - Systematic measurement error: trueness, linearity over the measurement range, stability over time.
  - Random measurement error: repeatability (measurement equipment influence), homogeneity of the dispersion, reproducibility (operator influence).

Source             StdDev (SD)   Study Var (5.15*SD)   %Study Var (%SV)
Total Gage R&R     0.10129       0.52163                 9.21
  Repeatability    0.03944       0.20312                 3.59
  Reproducibility  0.09329       0.48046                 8.49
    Tester         0.00000       0.00000                 0.00
    Tester*Parts   0.09329       0.48046                 8.49
Part-to-part       1.09463       5.63736                99.57
Total Variation    1.09931       5.66144               100.00

Trueness, linearity and stability form part of the regular calibration and can therefore be left out of the analysis. Homogeneity is covered by the different parts (different measurement points) and is part of the reproducibility study.

Difference between %P/T and %R&R
The %P/T quotient relates the measurement dispersion to the tolerance, whereas the %R&R quotient relates the measurement dispersion to the total observed dispersion.
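As a sketch (the standard deviations are taken from the Gage R&R table above; the tolerance limits of 8 to 14 are invented for illustration):

```python
def pct_rr(sd_rr, sd_total):
    # %R&R: measurement dispersion as a percentage of the total observed dispersion.
    return 100 * sd_rr / sd_total

def pct_pt(sd_rr, lsl, usl, k=5.15):
    # %P/T: measurement spread (k * sigma; 5.15 sigma covers 99% of a normal
    # distribution) as a percentage of the tolerance width.
    return 100 * k * sd_rr / (usl - lsl)

sd_rr, sd_total = 0.10129, 1.09931  # from the Gage R&R table
print(round(pct_rr(sd_rr, sd_total), 2))           # 9.21 -> good (< 10%)
print(round(pct_pt(sd_rr, lsl=8.0, usl=14.0), 2))  # hypothetical tolerance 8-14
```

A system can look acceptable on %R&R (wide process dispersion hides the gage) and still fail %P/T against a tight tolerance, which is why both quotients are reported.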

Tips
To validate data from databases (e.g. EUROMS), use known test inputs or compare with original data (e.g. a delivery note).
Validate data from databases through the audit documents of continuously conducted audits.
A plausibility check is useful for large amounts of data, whereby the data for each variable are sorted by size. It is important to return the data to their original order after the check.
The Gage R&R Analysis and the Attributive MSA need precisely defined test conditions.

Pitfalls
It is difficult, but usually not impossible, to carry out a Gage R&R analysis in transactional projects.
The Attributive MSA is used for any attributive tester data.
The chosen units in the Gage R&R Analysis may not be representative of the true process variability because the choice of units was not random (e.g. 10 units from one jumbo rather than from several jumbos, 10 units from one run rather than from several runs).
The measuring equipment does not have a high enough resolution.
Review Check List

1. What are the major sources of measurement error?
2. Did you conduct an MSA on the Project Ys and all critical Input Xs?
3. How much measurement error exists in comparison to the process variation (%R&R)?
4. How much measurement error exists in comparison to the specifications (%P/T)?
5. Is the measurement system acceptable for the process improvement efforts?
6. If not, what actions do you suggest?
7. How were the samples chosen?
8. Was the team sensitive to sub-grouping for sample selection?
MEASURE
Process Capability Analysis
Purpose
The PCA evaluates the capability of the process to meet the customer requirements (Output Ys, CTQs).

Measurements and Evaluation Criteria
The process capability is described by six values
(variable: Cp, Cpk, Pp, Ppk; attributive: DPU, DPMO).

Short term      Long term       Sigma    DPU         DPMO (ppm)
Cp     Cpk      Pp     Ppk      Level
2.00   1.50     2.00   1.50     6.00     0.0000034         3.4
1.83   1.33     1.83   1.33     5.50     0.0000317        31.7
1.67   1.17     1.67   1.17     5.00     0.0002327       232.7
1.50   1.00     1.50   1.00     4.50     0.0013500      1350.0
1.33   0.83     1.33   0.83     4.00     0.0062097      6209.7
1.17   0.67     1.17   0.67     3.50     0.0227501     22750.1
1.00   0.50     1.00   0.50     3.00     0.0668072     66807.2
0.83   0.33     0.83   0.33     2.50     0.1586553    158655.3
0.67   0.17     0.67   0.17     2.00     0.3085375    308537.5
0.50   0.00     0.50   0.00     1.50     0.5000000    500000.0

The capability indices marked in red can be improved with the tools of the DMAIC process. For the capability indices marked in yellow, the DFSS toolset may be necessary. No improvement is necessary for the capability indices marked in green.

If the Cp or Pp of a variable feature is higher than the Cpk or Ppk, then the process is not centred (target deviation). If the Cp, Cpk, Pp and Ppk are approximately the same but less than 1.33, the process is not mastered (Common Causes). If the Cp and Pp, or the Cpk and Ppk, differ greatly from each other, then process failures will result (Special Causes).
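A sketch of the four variable indices (the data and specification limits are invented): short-term sigma is estimated here from the average moving range, long-term sigma from the overall standard deviation.

```python
import statistics

def capability(values, lsl, usl):
    # Short-term sigma from the average moving range (MRbar / d2, d2 = 1.128
    # for subgroups of 2); long-term sigma as the overall sample stdev.
    mean = statistics.mean(values)
    mr_bar = statistics.mean([abs(b - a) for a, b in zip(values, values[1:])])
    sigma_st = mr_bar / 1.128
    sigma_lt = statistics.stdev(values)

    def indices(sigma):
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mean, mean - lsl) / (3 * sigma)
        return cp, cpk

    cp, cpk = indices(sigma_st)
    pp, ppk = indices(sigma_lt)
    return {"Cp": cp, "Cpk": cpk, "Pp": pp, "Ppk": ppk}

r = capability([10, 11, 10, 12, 11, 10, 11, 12, 10, 11], lsl=4, usl=16)
print({k: round(v, 2) for k, v in r.items()})
```

Comparing the pairs reproduces the diagnostics above: Cp > Cpk means off-centre, a large gap between Cp/Pp or Cpk/Ppk points to special causes between subgroups.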

Conditions
Data must be independent to carry out a capability analysis.
The residuals of measurable variables must have an approximately normal distribution.
If the data are not normally distributed because of instabilities such as exceptions (outliers), then the lack of stability is the problem and the non-normality is an aspect of this problem.
If the data are stable and not normally distributed, they need to be transformed (e.g. with a logarithmic transformation).

Tips
To calculate the capability indices (Cp, Cpk, Pp, Ppk), the specification limits are required. These can be calculated from standard deviations and a target value.
MINITAB contains a Six-Pack analysis package.
For I-MR charts: if all measurement values are within the control limits (UCL and LCL), then only common causes exist (the process is stable); otherwise special causes are present (the process is not mastered).
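The I-MR control limits mentioned above can be computed directly from the data. A minimal sketch (illustrative data; the standard Shewhart constants for subgroups of size 2 are used):

```python
import statistics

def imr_limits(values):
    # Individuals chart: mean +/- 2.66 * MRbar (2.66 = 3 / d2, d2 = 1.128).
    # Moving-range chart: UCL = 3.267 * MRbar (D4), LCL = 0 (D3 = 0 for n = 2).
    mean = statistics.mean(values)
    mr_bar = statistics.mean([abs(b - a) for a, b in zip(values, values[1:])])
    return {
        "I":  {"UCL": mean + 2.66 * mr_bar, "CL": mean, "LCL": mean - 2.66 * mr_bar},
        "MR": {"UCL": 3.267 * mr_bar, "CL": mr_bar, "LCL": 0.0},
    }

data = [10, 12, 11, 13, 12, 11, 12, 10]
limits = imr_limits(data)
out_of_control = [v for v in data
                  if not limits["I"]["LCL"] <= v <= limits["I"]["UCL"]]
print(limits["I"], "special causes" if out_of_control else "stable")
```

Points outside the I-chart limits signal special causes; points inside them indicate a stable process with only common-cause variation.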
Pitfalls
The capability indices come out too small if the data are not independent (autocorrelation, autoregression). This can be checked with MINITAB run charts.
The order of the data has been changed or sorted, in which case the calculation of dispersion is nothing more than guesswork.
The analysis does not provide plausible control limits because the data are bounded on one or both sides. Use the p-chart for error proportions, the c-chart for the number of errors, and the logarithmic normal distribution for measurable data.
Example: PCA Six Pack for Delivery Time

Individuals and MR Chart: check that the process is stable (no special causes, i.e. no points outside the UCL and LCL limits). Individuals chart: UCL = 14.39, mean = 11.66, LCL = 8.937; moving-range chart: UCL = 3.349, Rbar = 1.025, LCL = 0. This process is stable.

Normal Probability Plot: check whether the process is normally distributed (points lie on the red curve). This process tends to a normal distribution.

Capability Plot (last 25 observations): graphical comparison of the short- and long-term dispersion relative to the specification.
Within:  StDev = 0.908687, Cp = 0.55, Cpk = 0.31
Overall: StDev = 0.905299, Pp = 0.55, Ppk = 0.31

Fact 1: The process is not centred, since Cp (0.55) > Cpk (0.31). Target: Cp = Cpk.
Fact 2: The dispersion is double the specified amount, since Cp = 0.55. Target: Cp = 2.
Fact 3: There are no process breakdowns, since Cp = Pp and Cpk = Ppk.
Review Check List

Initial Capability
1. What are the long- and short-term process capability values?
2. How does this compare to the customer's perspective of performance?
3. Is there a significant opportunity to improve beyond current levels (i.e., how large is the gap between Cp and Ppk)?
4. What are the definitions of defects and opportunities?
5. Are the capabilities established for the Process Input Variables and Process Output Variables from the C&E Matrix?

Baseline Data
6. How much data or what time period was used to determine the baseline capability?
7. What variables were evaluated?
8. Are they stable?
9. Are there explanations for out-of-control signals?
10. How long was the process monitored to determine stability?
11. Have you captured samples of data that truly reflect the normal process?
12. What are the largest types of variation over time: shift-to-shift, day-to-day, week-to-week, etc.?
Lessons learned
ANALYSE
Failure Mode & Effect Analysis
Multi-Vari Analysis
ANALYSE
Failure Mode and Effect Analysis

Purpose
Possible errors in the Input Xs which would create a process breakdown in the Process Ys are to be identified and evaluated. The FMEA risk analysis takes the important Input Xs identified in the C&E Matrix and evaluates the critical and non-critical values of these points.

Measurements and Evaluation Criteria

The following three parts of the FMEA help to evaluate the risk of the Input Xs to the process:

Severity evaluates the severity of the error (deviation of the Input Xs). It is rated from 1 (meaningless) to 10 (life-endangering or high costs).

Occurrence evaluates the frequency of the error. It is rated from 1 (corresponding to a Ppk of 1.5, the 6σ level) to 10 (a Ppk less than or equal to zero).

Detection evaluates where and how the error will be recognised. It is rated from 1 (the error is always immediately detected) to 10 (the error is detected by the customer or is only detectable with great effort).

The Risk Priority Number (RPN) is the product of the three values and rates the risk of the error from 1 to 1000. Risks with an RPN < 125 can be ignored; all other RPNs must be taken into account.
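The RPN calculation and the RPN < 125 screening rule can be sketched as follows (the failure modes and ratings are invented for illustration):

```python
def rpn(severity, occurrence, detection):
    # Risk Priority Number: Severity x Occurrence x Detection, each rated 1-10.
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be on a 1-10 scale")
    return severity * occurrence * detection

# Hypothetical failure modes: (severity, occurrence, detection)
failure_modes = {
    "AI content too low":        (8, 5, 4),
    "Drying time too short":     (5, 3, 2),
    "Granulate size too coarse": (7, 2, 9),
}

# Risks with an RPN < 125 can be ignored; all others need corrective measures.
actionable = {name: rpn(*sod) for name, sod in failure_modes.items()
              if rpn(*sod) >= 125}
print(actionable)
```

Because the RPN is a product, a failure mode with a mediocre rating in every column can outrank one with a single extreme rating, which is why the individual Severity values are also reviewed, not just the product.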

C&E MATRIX (Inputs - Outputs - Sum)
            |
            v
FMEA (Inputs - Failure Modes - Effects - RPN - Measures)

Conditions
The FMEA must be carried out by people with expertise in the process.
The effects of the error must relate back to the defined Process Ys.
All Input Xs carried over from the C&E Matrix are included in the FMEA table.
If there is a high RPN, define corrective measures and keep Quick Hits in mind.

The FMEA worksheet is built column by column:

Process Step / Input: What is the process step/input under investigation?
Failure Mode: In what ways does the Key Input go wrong?
Potential Effects: What is the impact on the Key Output Variables (Customer Requirements) or internal requirements?
Severity (1-10): How severe is the effect on the customer?
Potential Causes: What causes the Key Input to go wrong?
Occurrence (1-10): How often does the cause or failure mode occur?
Current Controls: What existing controls and procedures (inspection and test) prevent either the cause or the failure mode? Should include an SOP number.
Detection (1-10): How well can you detect the cause or failure mode?
RPN = Severity x Occurrence x Detection.

A high/low grid of Severity, Occurrence and Detection classifies the result and the recommended actions: from the ideal situation and assured mastery (no action, or actions only on high RPNs or easy fixes), through frequent but detectable and costly failures (process improvement, address controls), up to frequent failures reaching the user with major impact (big trouble: all hands on deck; improve detection first, then reduce the occurrence of the cause).

Useful Hints
For each Input X, several potential failures can be determined.
Potential failures should be defined in such a way that the measurable input value may be described as too small/low, wrong, incomplete or too early/late.
Make sure to list no more than 2 potential failures for each Input X; otherwise the Input X is not correctly defined. No additional Input Xs may result.
A failure tree is helpful to structure and identify cause-effect relationships.
Equivalent failure sequences have the same severity rating.
Corrective actions to minimize risks should be prioritized.
Establish rating scales, as they help to assess Severity, Occurrence and Detection.
The FMEA analysis initiates further projects (e.g. GB projects).
The FMEA provides potential Input Xs for the Control Plan.

Pitfalls

Wrongly defined Input Xs, e.g. a non-quantified noun without features, lead to many useless potential failure modes. Example:
Wrong Input: Operator
Pot. failure: Operator is too small, too tall, demotivated etc.
Correct Input: Operator's qualification
Pot. failure: insufficiently qualified
Causality (observation window) between failure, cause of failure and failure effect must remain on one level and must not be changed.

Review Check List


1. Who helped develop the FMEA; what organizations do they represent?
2. Which items from your C&E Matrix did you evaluate in the FMEA tool?
3. Does the tool reflect the current state or the desired process?
4. What type of ranking system did you use?
5. What quick hits did you find from the FMEA?
6. Did you complete the actions recommended section of the tool?
7. Do all actions in your FMEA have responsibilities assigned and a completion date identified?
8. Have you updated your Control Plan with what you know so far?

Lessons learned

Multi-Vari Analysis

Purpose
For the purpose of screening the most important inputs, the correlations of the critical Input Xs derived from the FMEA with the project and process Ys are ascertained. Any correlation identified does not necessarily signify a cause-effect relationship (dependency). Special attention is paid to possible uncontrolled Input Xs (noise variables).

Parameters and Evaluation Criteria

For the purpose of analysis, use is made of existing historic data if available (passive approach).

The simple correlation coefficient (r) is a statistical value for quantifying the dependency between two variables. Negative correlation signifies that increasing the input variable reduces the level of the output value. Conversely, positive correlation causes the output value to rise.

|r|               Remarks
0.00 to 0.10      No or negligible correlation
0.10 to 0.30      Weak correlation
0.30 to 0.70      Medium correlation
0.70 to 0.90      Strong correlation
0.90 to 0.99      Very strong correlation
0.99 to 1.00      Functional correlation
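As a sketch (standard library only, with made-up data), r can be computed and |r| classified according to the bands above; treating each upper bound as inclusive is our assumption, since the original table does not state the boundary handling:

```python
# Sketch: compute the simple correlation coefficient r and classify |r|
# per the table above. The sample data are illustrative.
from math import sqrt
from statistics import mean

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def classify_r(r):
    bands = [(0.10, "no or negligible"), (0.30, "weak"), (0.70, "medium"),
             (0.90, "strong"), (0.99, "very strong"), (1.00, "functional")]
    for limit, label in bands:
        if abs(r) <= limit:
            return label + " correlation"
    return "functional correlation"

temp = [150, 155, 160, 165, 170]          # Input X (hypothetical)
yield_ = [80.1, 82.0, 83.8, 85.9, 88.2]   # Output Y (hypothetical)
r = pearson_r(temp, yield_)
print(round(r, 3), classify_r(r))
```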

A null hypothesis is confirmed or rejected on the basis of the p-value. The p-value is the probability of obtaining the observed result, or a more extreme one, if the null hypothesis is true. Rejection of the null hypothesis indicates a significant finding.

p-value           Remarks
p > 0.10          No significance
0.10 > p > 0.05   Weak significance
0.05 > p > 0.01   Significant finding
0.01 > p          Highly significant finding
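A brief sketch maps a p-value to the wording of the table (how values that fall exactly on a boundary are handled is an assumption, since the table uses strict inequalities):

```python
# Sketch: classify a p-value according to the table above.
def significance(p):
    if p > 0.10:
        return "no significance"
    if p > 0.05:
        return "weak significance"
    if p > 0.01:
        return "significant finding"
    return "highly significant finding"

print(significance(0.003))  # prints "highly significant finding"
```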

Requirements
For analysis, the data must be independent.
For measurable variables, the residuals must be more or less normally distributed and spreads must be homogeneous.
The error probability α must be established before testing is carried out.
The data set should include no outliers.

Example: Multi-Vari Analysis


Depending on the data types of X and Y, the following hypotheses and tests apply:

Discrete X, discrete Y (e.g. a 2×2 table with columns A1/A2 and rows B1: 5, 10 / B2: 80, 50): χ²-Test of dependence (H0: pij = pi. × p.j) or of homogeneity (H0: p1 = p2).
Continuous Y, test of normality (H0: F(x) = F0(x)): Anderson-Darling Test, 1 sample.
Discrete X (one level), continuous Y, comparison with a target (H0: µ = µ0): t-Test (confidence interval).
Discrete X (2 levels), continuous Y: F-Test, 2-sample, for homogeneity of variances (H0: σ1² = σ2²); t-Test for the means (H0: µ1 = µ2).
Discrete X (2+ levels), continuous Y: Levene-Test for homogeneity of variances (H0: σ1² = ... = σk²); one-way ANOVA for the means (H0: µ1 = ... = µk).
Continuous X, continuous Y: correlation and regression analysis; H0: r = 0 (correlation), H0: b0 = 0 and H0: b1 = 0 (regression).
The tools of multi-vari analysis are presented in brief tabular form. In addition to the description and presentation of the data, the null hypotheses to be tested, and the planned tests, are stated. Apart from these univariate tests, multiple or multivariate tests can be conducted. The methods of the general linear models (GLM) are very largely applicable.

Useful Hints

Time effects (e.g. delayed effects, such as turns versus net manufacturing costs) must be taken into account and the data must be synchronized.
Overly large samples (N > 1000) should be avoided because otherwise even the slightest differences or correlations become significant although not practically relevant. A sample size of n = 200 to 500 is adequate and still readily manageable.
The data should always be left in their original sequence; otherwise, testing for independence is not possible.
The variables should be sorted into Input Xs and Output Ys.

Pitfalls

Correlations might also be due to feature correlations (body measurements; storks' nests versus births), inhomogeneous correlations (shoe size versus income), and formal correlations (solids content versus solvent content). These do not help in eliminating the causes of spread. On causal correlations, expert decisions have to be taken. Pure correlations are over-interpreted as dependencies (cause-effect relationships).
Historic data are unreliable by reason of unknown background factors (e.g. changes in material/process).
Multi-collinearity (i.e. dependencies between the Input Xs) may seriously impair the results. If the correlations between the Input Xs are |r| < 0.3, they are deemed negligible.

Review Check List


1. How did you determine what Process Input Variables to investigate?
2. How did you statistically determine the contribution of each Process Input Variable to the overall variability observed?
3. What sample size did you use for each Process Input Variable to ensure statistical validity?
4. What was your null hypothesis?
5. What risk levels did you assume?
6. Is the project addressing a mean shift or variation reduction?
7. What type of data is the Process Output Variable? The Process Input Variables?
8. What statistical tools were used for evaluation of the relationship between the Process Input Variables and the Process Output Variable?
9. What Process Input Variables contribute the most to the Process Output Variable variability?
10. Are the statistically significant variables practically significant?
11. Does your data confirm that your critical Xs from the C&E Matrix are causing the variation?

Lessons learned
IMPROVE

Design of Experiment

Purpose
The optimum setting of the selected process Input Xs will be determined by experiment so that the Process and Project Ys can achieve their target values. Moreover, the setting is done in such a way as to ensure that the process is robust against noise variables. In contrast to the multi-vari analysis, the inputs are varied in accordance with an experiment plan (active approach) and the results are recorded.

Parameters and Assessment Criteria


The parameter R² describes what percentage of the variation is explained by the function Y = f(X1, X2, ..., Xk). The rest of the variation is the residual error. A null hypothesis is confirmed or rejected using the p-value. An outcome is significant if the null hypothesis is rejected, i.e. the impact of the Input Xs is ascertained.

Estimated Effects and Coefficients for Yield

Term          Effect    Coef   SE Coef        T      P
Constant               85.507  0.05648  1514.04  0.000
Temp           3.919    1.960  0.05648    34.70  0.000
Time           4.885    2.442  0.05648    43.24  0.000
Pressure       2.975    1.488  0.05648    26.34  0.000
Temp*Time     13.840    6.920  0.05648   122.53  0.000
Temp*Press.    0.238    0.119  0.05648     2.10  0.065
Time*Press.   -7.219   -3.609  0.05648   -63.91  0.000

The Lack of Fit shows whether the regression model is adequate. It is the most important measure for testing how well the model fits the data. Rejection of the null hypothesis requires a higher-order model.
Analysis of Variance for Yield

Source          DF  Seq SS   Adj. SS  Adj. MS  F      P
Main Effects     3  192.29   192.286   64.095  1E+03  0.000
Interactions     3  974.85   974.853  324.951  6E+03  0.000
Residual Error   9    0.46     0.459    0.051
  Lack of Fit    1    0.00     0.004    0.004   0.07  0.803
  Pure Error     8    0.46     0.456    0.057
Total           15 1167.60

Conditions
It is important for the analysis that the data are independent. For measurable parameters, the residuals must be normally distributed. The experiment plan should be orthogonal, rotatable or D-optimal. Moreover, the trials must be performed in a randomised and repeated manner.

Example: Analysis & Optimization

After designing an orthogonal experiment plan, the trial is performed. The data will then be analysed via regression. Contour line graphics are used for setting the Input Xs in order to yield an ideal value for the Output Ys.

(Cube plot: the eight corner points 1 to 8 of the 2³ design, plotted on the axes X1, X2 and X3.)


No. X1 X2 X3 X1X2 X1X3 X2X3 X1X2X3


1 - - - + + + -
2 + - - - - + +
3 - + - - + - +
4 + + - + - - -
5 - - + + - - +
6 + - + - + - -
7 - + + - - + -
8 + + + + + + +
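The sign columns need not be typed by hand; a sketch generating the 2³ plan, with the interaction columns as products of the main-effect signs (the ordering, X1 alternating fastest, matches the table above):

```python
# Sketch: generate the 2^3 full-factorial plan and its interaction columns.
# Interaction signs are the products of the corresponding main-effect signs.
from itertools import product

def full_factorial_2k(k):
    # Reverse each tuple so that X1 varies fastest, as in the table above.
    return [levels[::-1] for levels in product((-1, 1), repeat=k)]

for no, (x1, x2, x3) in enumerate(full_factorial_2k(3), start=1):
    row = (x1, x2, x3, x1 * x2, x1 * x3, x2 * x3, x1 * x2 * x3)
    print(no, *("+" if v > 0 else "-" for v in row))
```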

(Contour plot for the variable Yield, 2**(3-0) plan, MS residuals = 0.051032: contour lines for yields of 75 to 100 plotted over Temperature and Time.)
Tips
Limit the experiment plan to a maximum of five Input Xs and three process Ys.
If there is no possibility of adding trial factors later, you should consider the use of a Central Composite or Box-Behnken Design.
Follow the rule of maximum comparability, i.e. completely homogeneous test conditions have to be created, e.g. through block formation.
Use the Derringer & Suich procedure to optimise several process Ys.
Do not use experiment plans of resolution III for optimization. Use D-optimal experiment plans if the use of standard experiment plans is not possible for some reason.

Pitfalls
The defined scope of the experiment is wrong, too small, or too large.
The settings of the Input Xs (factors) were incorrectly performed or do not remain constant during the trials.
The trials were performed in standard and not randomised order.
There were not enough repetitions. A number of degrees of freedom exceeding 30 has to be aimed at in every case.
The wrong model has been defined for analysis, e.g. factors constituting percentages (which sum to 100 %) cannot be analysed by means of regression constants.

Review Check List


1. What are your critical Ys?
2. What are your critical Xs?
3. How did you select the factors (Process Input Variables) to include?
4. How were the levels for each factor determined?
5. Did you complete a screening DOE?
6. Does the knowledge to be gained justify the expense of the experiments?
7. How did you determine the sample size requirements for each experimental run?
8. What risk levels did you assume when you designed your experiment?
9. What actions would you recommend based on the experimental results?
10. What did you learn from your Pilot Run?
11. Who was on your team and what were their roles?
12. How much variation in the response (Process Output Variable) was due to the factors (Process Input Variables) you investigated?
13. What significant factors and interactions exist?
14. How many full factorial designs were completed and why?
15. What methods have been utilized to optimize the process?
16. What actions have been taken to verify the results of the experimentation?
17. What changes have you made to the process?
18. How have the results been used to impact the Control Plan?
19. What is the underlying model/equation that relates the Process Input Variables to the Process Output Variable(s)?
20. Can you statistically demonstrate real change as a result of improvements?
21. What percent of the Entitlement Gap has been closed with these improvements?
22. What next steps should be taken?

Lessons learned
CONTROL

Control Plan

Purpose
The documentation of the project results in the Control Plan is used for mastering the identified critical Input Xs and thus for controlling the processes affected, in order to maintain the improvement gained in the project (the Project and Process Ys) over the long term.

Parameters and Assessment Criteria



The columns of the Control Plan contain the following data/information:

Process;
Process Step;
Output; Input;
Process Specifications (LSL, USL, Target);
Capability/Date;
Measurement Technique;
%R&R and/or P/T;
Sample Size;
Sample Frequency;
Control Method;
Reaction Plan.

These data are derived from the outcome of the Process Map, of the Process Capability Analysis (Initial Capability), the C&E Matrix, and the FMEA.
Documentation Package
The Control Plan is part of a process documentation package which may be rather extensive. This project documentation should be available as an information base from the Process Owner. Established process regulations (operating or manufacturing procedures, machine procedures, ...) will be updated with the project outcome and serve as a daily work resource.

(Diagram: the Control Plan at the centre of the documentation package, surrounded by Audit Plans, DOEs, Reaction Plans, Parameter Cards, Mistake Proofing, Training, Capability Studies, SOPs, MSA, Preventive Maintenance, Process Maps, FMEAs and Customer Requirements.)

Process Owner
The person with major responsibility for the process outcome must have the authority and capability to control the process and thus influence the result. If the improved process is controlled by several persons in their respective partial areas, each of these persons is the process owner responsible for his/her partial area.

The established management systems (e.g. Controlling, Administration, HR Management, Environmental Management System, Quality Management System, etc.) must be considered during process restructuring or development in order to avoid any build-up of parallel systems and thus confusion of authorities. The new process regulations must become a part of established systems in order to ensure sustainable improvement through signature regulations, internal audits etc.

Example: Control Plan


(Diagram: how the Control Plan is compiled from the preceding tools. The Process Map lists Inputs, Process Steps and Outputs. The C&E Matrix scores Inputs against Outputs. The Key Input Xs are analyzed in the FMEA (Inputs, Fault, Effects, RPN, Actions); the Key Output Ys are analyzed in a Capability Summary Sheet (Outputs, M-Tech., R&R, Cpk, LSL, USL), giving an overview of the process capability. The Control Plan itself carries Process, Inputs, Outputs, M-Sys., %R&R, Cpk, LSL, USL and Reaction Plan.)
Tips
The established Quality Management System can/must be used for developing and updating the documentation.
Develop a reaction plan.
Make use of mistake proofing systems.
The sample, audit and test plans indicate how often, where and to whom results must be reported.
Develop a training plan. The training describes all aspects of the process and responsibilities; process improvement activities must be fully documented.
The Control Plan is regularly reviewed and updated and is located at the process site.

Pitfalls
There is not much time at the end of the project to consolidate data and information and to develop documentation.
The established documentation system is not known.
Additional regulations are created instead of existing ones being adapted.


Review Check List


Control Methods:

1. What control methods are in place for the Critical Process Input Variables and Process Output Variables?
2. How were the control methods determined? Are they statistically valid?
3. How are the control methods identified and documented?
4. What methods exist for updating and controlling the Control Plan?
5. How was the Reaction Plan for each Process Input Variable/Process Output Variable developed?
6. Who is the Process Owner?
7. Does the Process Owner accept the results of your project and support the implementation of proposed process changes?
8. Who is responsible for maintaining the control plans and methods?
9. Who is responsible for implementing the control scheme?
10. What training is needed and for whom?
11. Who will conduct necessary training?
12. What confidence do you have that the changes implemented will be effective once the project is completed?
13. Has an audit plan been developed, and documented in the business unit's quality management system?

Long-Term Capability

14. How long will data be taken to verify that a successful process improvement has been accomplished before closing the project?
15. What data will be taken?
16. Who will take the data?
17. Who will analyze and track the data?
18. What statistical tools will be used to verify the improvement?

SIX SIGMA GLOSSARY (August 2002)

A
Analyze
3rd phase in the DMAIC process: process data have been identified
and the process is now analysed with regard to causes for mis-
takes (see Root Cause Analysis) and non-value-adding activities.

B
Baseline
The performance of a process at the start of the improvement pro-
cesses: How has the process performed up to now?

Black Belt
Six Sigma expert (full time) leading Six Sigma projects/teams and
trained in leadership, project management, statistical methods and
problem solving strategies.

Breakthrough performance
Improvement signifying a jump to a higher performance level.
Significant reduction of the gap between baseline and entitlement
(real possible maximum performance). Leads to substantial improve-
ment in terms of the corporate objectives growth, cost and cash.

Business Critical Y
A business unit's concrete improvement target. It is related to the
corporate objectives (Corporate Ys) Growth, Cost and Cash.

C
Champion
An upper level business leader who facilitates the implementation
of Six Sigma projects. The Champion identifies improvement possi-
bilities in his/her responsibility area, initiates projects, makes the
necessary resources available and supports the Six Sigma teams.

Control
Last phase of the DMAIC process.
Development of standard measures for regular performance review:
How can we sustain the new, higher performance level?

Corporate Y
Target parameter for corporate success, e.g. growth, cost, cash.

Cost of Poor Quality


Costs associated with the delivery of poor quality products or ser-
vices. Examples: complaint handling, replacements, inspections
and other activities that do not generate added value.

Critical to Quality (CTQ)


Part of a process that has direct impact on the perceived quality.

Critical Y
Parameter to be improved in the Six Sigma process: What exactly
do we want to improve?

D
Defect
Any product characteristic that deviates from customer require-
ments.

Defects per Million Opportunities


Quality measurement parameter that is often used in the Six Sigma
process: number of observed mistakes per million mistake oppor-
tunities.
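The definition translates directly into a formula; a sketch with made-up counts:

```python
# Sketch: Defects per Million Opportunities. The counts are illustrative.
def dpmo(defects, units, opportunities_per_unit):
    return defects / (units * opportunities_per_unit) * 1_000_000

# e.g. 17 defects found in 500 invoices with 10 error opportunities each:
print(round(dpmo(17, 500, 10)))  # 3400
```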

Define
First phase in the DMAIC process: definition of the problem or the
process improvement possibilities or of customer requirements. Not
static, but reviewed again and again in the course of the DMAIC
process and adjusted according to the findings.

DMAIC Process
Key Six Sigma process: Define, Measure, Analyse, Improve and
Control. Its methodology is systematic, scientific and fact-based.

Design for Six Sigma (DFSS)


Methodology similar to DMAIC with the aim of deriving new prod-
ucts and services. Unlike DMAIC, the DFSS methodology consists
of a variety of techniques to be combined.

Design of Experiment (DOE)


Statistical experiment planning which identifies the factors and their
optimum settings that affect the process outcome.

E
Entitlement
Actual possible maximum performance (benchmark).

F
Failure Mode and Effect Analysis (FMEA)
Technique enabling systematic analysis of possible mistakes and
assessment of risks through products and process mistakes. Helps
to select high-risk inputs in the context of Six Sigma.

Funnel Effect
Filters the key input variables out of all the possible impact variables on the project Y.

G
Gap
Gap between entitlement and baseline.

Green Belt
Has a similar function to the Black Belt but is less comprehen-
sively trained. Dedicates part of his/her working time to coaching
on smaller or partial projects.

H
Hopper
Collection of potential Six Sigma projects.

I
Improve
Fourth phase of the DMAIC process: problem solutions are
developed, tested and implemented and roots of mistakes are
eliminated.

M
Master Black Belt
Full-time Six Sigma expert. Is responsible for implementing the Six
Sigma strategy in a region (e.g. Germany) or in a business unit. He
leads and assists several Black Belts, manages a project, and is
responsible for training and communication.

Measure
Second phase in the DMAIC process: measurement, capture and
collection of all relevant process data.

Multi-Vari Analysis
Statistical method for analysing suspected process input variables
in relation to correlations and effects on the process outputs.

P
Process Capability Index
Parameter for identifying whether a process is able to meet the
process output requirements (e.g. customer requirements).

Process Map
A step-by-step pictorial sequence of a process showing process
inputs, outputs, cycle time, decision points and activities involved
at a glance.

Process Owner
The person responsible for the process output. This person has the
authority and ability to influence and control the process.

Project Y
Parameter to be improved in a Six Sigma project (e.g. inventory
reduction, output increase of a production line, market share in-
crease in a certain segment). Helps to achieve higher corporate
objectives.

R
Rolled Throughput Yield (RTY)
Yield of a process expressed in percentage. It is calculated by multi-
plying the yields of each process step. Example: The yield of the
first step is 98% of input, in the second, 90%, and in the third, 95%.
The Rolled Throughput Yield (RTY) thus amounts to 84% (0.98 x
0.90 x 0.95 = 0.84).
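The worked example can be reproduced directly:

```python
# Sketch: Rolled Throughput Yield as the product of the step yields,
# reproducing the example above (0.98 x 0.90 x 0.95).
from math import prod

def rolled_throughput_yield(step_yields):
    return prod(step_yields)

print(f"{rolled_throughput_yield([0.98, 0.90, 0.95]):.0%}")  # 84%
```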

Root Cause Analysis


Analysis of the exact cause for a process deviating from the de-
sired result.

S
Six Sigma
Statistical measure for the extent to which a process deviates from
perfection. Six Sigma stands for 3.4 mistakes per million mistake
possibilities (see Defects per Million).
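The 3.4 figure can be reproduced from the normal distribution if one adopts the conventional 1.5-sigma long-term shift, an assumption of the Six Sigma literature rather than something stated in this entry:

```python
# Sketch: defects per million for a given sigma level, assuming the
# conventional 1.5-sigma long-term shift; Six Sigma then corresponds to
# the one-sided normal tail P(Z > 4.5), per million opportunities.
from math import erfc, sqrt

def dpmo_for_sigma(sigma_level, shift=1.5):
    z = sigma_level - shift
    return 0.5 * erfc(z / sqrt(2)) * 1_000_000  # P(Z > z) * 1e6

print(round(dpmo_for_sigma(6.0), 1))  # 3.4
```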

Super Y
Clustering similar Six Sigma projects into one topic with general
significance for the company (e.g. production, inventories) with the
aim of increasing transparency and efficiency by sharing informa-
tion and experience.

V
Variation
Spread (result differences) within a process. Variation is a
parameter for the stability or predictability of a process. Any pro-
cess improvement (in the context of Six Sigma) aims at reducing
variation as far as possible.

Analyze-Roadmap: χ²-Test
Independence test of two variables at two or more levels

Action: Define Hypothesis
H0: The variables are independent (pij = pi. × p.j). HA: The variables are dependent (pij ≠ pi. × p.j).
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Action: Collect Data
Randomized data collection. Direct analysis of implausible results.

Action: Examine independence of the variables (Minitab: Chi-Square Test)
Use the χ² independence test to review the null hypothesis and reject it if p-value < α. The variables are dependent in that case. The expected values must be ≥ 5 for 2×2 tables, ≥ 2 for r×2 tables and ≥ 1 for r×c tables. If these conditions are not met, columns (c) or rows (r) can be combined.

Action: Evaluate Table
The differences between the observed values and the expected values show by which cells the deviations are caused. The χ² values of the cells are the standard sizes of the deviations.
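This roadmap can be sketched for a 2×2 table with the standard library; the observed counts reuse the 2×2 example from the Multi-Vari section (B1: 5, 10; B2: 80, 50), and for the single degree of freedom of a 2×2 table the p-value follows from the normal tail, since a chi-square variable with one degree of freedom is a squared standard normal:

```python
# Sketch: chi-square independence test for a 2x2 table.
# For df = 1 the p-value is P(chi2 > x) = erfc(sqrt(x / 2)).
from math import erfc, sqrt

def chi_square_2x2(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = sum((obs - row_totals[i] * col_totals[j] / n) ** 2
               / (row_totals[i] * col_totals[j] / n)
               for i, row in enumerate(table)
               for j, obs in enumerate(row))
    return chi2, erfc(sqrt(chi2 / 2))

chi2, p = chi_square_2x2([[5, 10], [80, 50]])
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p < 0.05: reject H0
```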

Analyze-Roadmap: One-Way ANOVA
Comparison of mean values of two or more samples

Action: Define Hypotheses
H0: All mean values are equal. HA: At least one mean value is different.
H0: All variances are equal. HA: At least one variance is different.
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Action: Collect Data
Randomized data collection. Direct analysis of implausible results. Secure the time sequence of the data. Do not round off the measured values.

Action: Study homogeneity of the variances (Minitab: Test for Equal Variances)
Use the Levene test to study the null hypothesis and reject it if p-value < α. The samples then come from different populations. If the variances and the sample sizes differ widely, the mean value test is probably not correct.

Action: Study equality of the mean values (Minitab: One-Way ANOVA)
Use the One-Way ANOVA to check the null hypothesis and reject it if p-value < α. The samples then come from different populations. If the differences between the mean values are not significant, yet substantial, increase the sample size. Store the residuals.

Action: Check ND fit (Minitab: Normality Test)
Use the Normality Test to check the normality and reject the null hypothesis if p-value < α. This means that the residuals are not normally distributed. Minor deviations in the tails are negligible. In the event of major deviations, the data must be transformed.

Action: Study autocorrelation (Minitab: I-MR Chart)
Use the I-MR Chart to study autocorrelation and stability. The process is disturbed if special causes or patterns exist. This result may distort the hypothesis test.
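The F statistic at the heart of this roadmap can be computed by hand; a sketch on illustrative samples, with the p-value lookup left to a stats package such as Minitab:

```python
# Sketch: one-way ANOVA F statistic for k independent samples.
# F = (SS_between / (k - 1)) / (SS_within / (N - k)); compare the
# corresponding p-value with alpha to decide on H0 (all means equal).
from statistics import mean

def one_way_anova_f(groups):
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Three hypothetical machine settings, three measurements each:
f = one_way_anova_f([[5.1, 5.3, 5.2], [5.6, 5.8, 5.7], [5.0, 5.2, 5.1]])
print(round(f, 1))
```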
Analyze-Roadmap: Regression
Simple and multiple regression and correlation

Action: Define Model and Hypotheses
Define the model Y = f(X), keeping in mind that only causal inputs may be used.
H0: All regression coefficients equal zero. HA: At least one regression coefficient does not equal zero.
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Action: Collect Data
Randomized data collection. Direct analysis of implausible results. Secure the time sequence of the data. Do not round off the measured values.

Action: Check correlation and regression of the variables (Minitab: Regression and Fitted Line Plot)
Check the null hypothesis and reject it if p-value < α. This means that the variables impact the targeted value: there is a correlation between the variables X and Y. The measures for the correlation are the correlation coefficient r and its square r². High values for r² signify high dependence but not an adequate regression model; therefore, the lack of fit must also be checked. For this check, activate Pure Error and evaluate the ANOVA. Use the Fitted Line Plot for a first graphical analysis. Reduce the model to the significant variables and store the residuals.

Action: Check ND fit (Minitab: Normality Test)
Use the Normality Test to check the normality and reject the null hypothesis if p-value < α. This means that the residuals are not normally distributed. Minor deviations in the tails are negligible.

Action: Study model and autocorrelation (Minitab: I-MR Chart, Residual Plot)
Use the I-MR Chart and the Residual Plot to study autocorrelation and stability. The process is disturbed, or the model not adequate, if special causes or patterns exist.
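The core of the roadmap, fitting y = b0 + b1*x and judging r², can be sketched with the standard library on illustrative data; the residuals are returned so the normality and I-MR checks described above can follow:

```python
# Sketch: simple least-squares regression y = b0 + b1*x with r^2.
# The data are illustrative; residuals are returned for the follow-up checks.
from statistics import mean

def simple_regression(x, y):
    mx, my = mean(x), mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    fitted = [b0 + b1 * a for a in x]
    ss_res = sum((b - f) ** 2 for b, f in zip(y, fitted))
    ss_tot = sum((b - my) ** 2 for b in y)
    r2 = 1 - ss_res / ss_tot
    residuals = [b - f for b, f in zip(y, fitted)]
    return b0, b1, r2, residuals

b0, b1, r2, res = simple_regression([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
print(round(b0, 2), round(b1, 2), round(r2, 3))
```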
Analyze-Roadmap: DoE
Factorial or central composite experiments

Action: Define Model and Hypotheses (Minitab: Create Factorial or Response Surface Design)
Define the factors, the experiment scope and the model type (first or second order).
H0: All regression coefficients equal zero. HA: At least one regression coefficient does not equal zero.
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Action: Collect Data
Randomized trials. Direct analysis of implausible results. Secure the time sequence of the data. Do not round off the measured values.

Action: Determine effects of the inputs (Minitab: Analyse Factorial or Response Surface Design)
Check the null hypothesis and reject it if p-value < α. This means that the inputs impact the targeted value. Use the Lack of Fit to check the model. For this check, activate Pure Error and evaluate the ANOVA. Reduce the model to the significant variables and store the residuals.

Action: Graphical presentation of the model (Minitab: Factorial or Contour Plot)
Choose the factors for graphical presentation. Choose optimum, robust factor settings in accordance with the graphics. Use modified settings to find optimum, robust settings for the determined factors also.

Action: Check ND fit (Minitab: Normality Test)
Use the Normality Test to check the normality and reject the null hypothesis if p-value < α. This means that the residuals are not normally distributed. Minor deviations in the tails are negligible.

Action: Study model and autocorrelation (Minitab: I-MR Chart, Residual Plot)
Use the I-MR Chart and the Residual Plot to check autocorrelation and stability. The process is disturbed, or the model not adequate, if special causes or patterns exist.
3M Deutschland GmbH
Carl-Schurz-Strasse 1
41453 Neuss
Phone 0 21 31/14 0
Fax 0 21 31/14 32 00 DW-0001-1187-5
