This tutorial is about gleaning value from the Six Sigma world to raise the caliber of engineering, regardless of the corporate stance on Six Sigma.
Process Improvement – DMAIC
Define Measure Analyze Improve Control

Define:
• Refine problem & goal statements.
• Define project scope & boundaries.

An improvement journey to achieve goals and resolve problems by discovering and understanding relationships between process inputs and outputs, such as

Y (e.g., defect profile, yield) = f(review rate, method, complexity, …)
7 Basic Tools (Histogram, Scatter Plot, Run Chart, Flow Chart, Brainstorming, Pareto Chart), Control
charts (for diagnostic purposes), Baseline, Process Flow Map, Project Management,
“Management by Fact”, Sampling Techniques, Survey Methods, Defect Metrics
Process Improvement – Design for Six Sigma (e.g., DMADV)
Define Measure Analyze Design Verify
7 Basic Tools (Histogram, Scatter Plot, Run Chart, Flow Chart, Brainstorming, Pareto Chart), Control
charts (for diagnostic purposes), Baseline, Process Flow Map, Project Management,
“Management by Fact”, Sampling Techniques, Survey Methods, Defect Metrics
Measure Guidance Questions
Define Measure Analyze Improve Control
[MPDI]
Controlled and Uncontrolled Factors
Controlled factors are within the project team’s scope of
authority and are assessed during the course of the project.
Studying their influence may inform:
• cause-and-effect work during Analyze
• solution work during Improve
• monitoring and control work during Control
Procedure
• Consider what factors, groupings, segments, and situations
may be driving the mean performance and the variation in Y.
• Draw a vertical tree diagram, continually reconsidering this
question to a degree of detail that makes sense.
• Calculate basic descriptive statistics, where available and
appropriate, to identify areas worthy of real focus (see the
sketch below).
[MPDI]
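A minimal sketch of that descriptive-statistics step, assuming the segmented data lives in a pandas DataFrame; the column names (case_type, effort_hours) are hypothetical:

```python
import pandas as pd

# Hypothetical case-level data: one row per support case.
cases = pd.DataFrame({
    "case_type":    ["install", "install", "config", "config", "defect", "defect"],
    "effort_hours": [2.5, 4.0, 1.0, 1.5, 12.0, 30.0],
})

# Basic descriptive statistics per segment: count, mean, std, min,
# quartiles, max. Large means flag segments driving Y's average;
# large standard deviations flag segments driving its variation.
print(cases.groupby("case_type")["effort_hours"].describe())
```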
Segmentation Example

[Chart: Call Center Support Costs, support effort segmented across 99,906 cases and 275,580 hours; the largest segments account for 80% of case effort; NB = non-billable]
Segmentation vs. Stratification
Segmentation—
• grouping the data according to one of the data elements (e.g.,
day of week, call type, region, etc.)
• gives discrete categories
• in general, we focus on the largest, most expensive, or
best/worst categories – guides “where to look”
Stratification—
• grouping the data according to the value range of one of the
data elements (e.g., all records for “high”-volume days vs. all
records for “low”-volume days)
• choice of ranges is a matter of judgment
• enables comparison of attributes associated with “high” and
“low” groups—what’s different about these groups?
• guides diagnosis
[MPDI]
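To make the distinction concrete, here is a small sketch with hypothetical columns (region, calls, cost): segmentation groups on a categorical element directly, while stratification first bins a numeric element into judgment-based “high”/“low” ranges.

```python
import pandas as pd

# Hypothetical daily call-center records.
days = pd.DataFrame({
    "region": ["east", "west", "east", "west", "east", "west"],
    "calls":  [120, 450, 90, 510, 300, 60],
    "cost":   [800, 3100, 650, 3600, 1900, 500],
})

# Segmentation: discrete categories of an existing data element.
print(days.groupby("region")["cost"].sum())

# Stratification: bin a numeric element into value ranges (the cut
# points are a judgment call, as noted above), then compare strata.
days["volume"] = pd.cut(days["calls"], bins=[0, 250, float("inf")],
                        labels=["low", "high"])
print(days.groupby("volume", observed=True)["cost"].agg(["count", "mean"]))
```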
Process Mapping
Process map—a representation of major activities/tasks,
subprocesses, process boundaries, key process inputs, and
outputs
INPUTS (Sources of Variation): People, Material, Equipment, Policies, Procedures, Methods, Environment, Information

PROCESS STEP: a blending of inputs to achieve the desired outputs

OUTPUTS (Measures of Performance): Perform a service, Produce a product, Complete a task
[MPDI]
Example: Development Process Map
[Process map: Design → Code → Compile → Unit Test, with measured inputs/outputs at each step: requirements and concept design; resources, estimate, and interruptions; code, coding standards, and LOC counter; executable code, test plan/technique, and operational profiles]
[MPDI]
Input / Output Analysis

[Diagram: Supplier → Inputs → Process (Step 1 → Step 2, with each step’s outputs becoming the next step’s inputs) → Outputs → Customer]

[DMAIC roadmap graphic: Define (define project scope; establish formal project), Measure (explore needed data; obtain data set; evaluate data quality; summarize & baseline data), Analyze (characterize process & problem; update improvement project scope & scale), Improve (identify possible solutions; select solution; implement, piloting as needed; evaluate), Control (define control method; implement; document)]

Guidance questions:
• What does the data look like upon initial assessment? Is it what we expected?
• What is the overall performance of the process?
• Do we have measures for all significant factors, as best we know them?
• Are there data to be added to the process map?
• Are any urgently needed improvements revealed?
• What assumptions have been made about the process and data?
[MPDI]
Analyze Guidance Questions
Define Measure Analyze Improve Control
[Cause-and-effect (fishbone) diagram: category branches such as Environment and Methods carry causes (Cause A, B, C, D) and subcauses (e.g., Subcause A1.1, Subcause C2, Subcause D2.1, “No formal defect tracking mechanism”), converging on the effect: Variation]
[MPDI; original source: Westfall]
7 Basic Tools: Chart Examples

[Scatter plot: Number of Days vs. Size, in thousands of executable source lines. Histogram: frequency of Product-Service Staff Hours]
[MPDI]
7 Basic Tools: Chart Examples
[Pareto chart: Defects Removed by Type, quantity by defect type code, with series for defects injected in design and removed in test. Run chart: Mean Time To Repair in minutes across products 1 to 10]
[MPDI]
7 Basic Tools: Chart Examples
[Box-and-whisker plots of CMMI benchmark assessment data (SEI Level 3), one per process area (e.g., Requirements Development, Technical Solution, Product Integration, Verification, Validation, Organizational Process Focus, Organizational Process Definition, Organizational Training, Integrated Project Management, Risk Management, Decision Analysis & Resolution); each box shows the 10th, 25th, 50th (median), 75th, and 90th percentiles and the mean (x)]
[MPDI]
7 Basic Tools: Chart Examples
[SPC chart (individuals, moving range): individual values by subgroup number, with center line (CL), upper and lower control limits (UCL, LCL), and zones A, B, and C between the center line and each limit]
[MPDI]
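A sketch of how the limits for an individuals (XmR) chart like this are typically computed; the data values here are hypothetical:

```python
import numpy as np

def xmr_limits(x):
    """Center line and 3-sigma limits for an individuals (XmR) chart.

    Sigma is estimated from the average moving range: the usual constant
    2.66 equals 3 / d2, with d2 = 1.128 for moving ranges of size 2.
    """
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()      # average moving range
    cl = x.mean()                           # center line
    return cl, cl + 2.66 * mr_bar, cl - 2.66 * mr_bar

# Hypothetical individual measurements, one per subgroup
cl, ucl, lcl = xmr_limits([22, 30, 18, 25, 27, 21, 35, 24])
print(f"CL={cl:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}")
```

The A/B/C zone boundaries then sit at one, two, and three estimated sigmas from the center line.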
Example: Cost/Schedule Monthly Performance Baseline

[Run charts: %Cost Variance and %Schedule Variance by month, Oct 2001 to Mar 2002, for all org units and all projects; schedule variance spans a range of roughly 60%]

• Reminder: this is (current actual – most recent estimate)
• Averages are within spec, and close to 0
• Need to examine extreme values, especially for cost
• Even if extreme values are outliers, it looks like we need to investigate variability
[MPDI]
Descriptive Statistics

Measures of central tendency: location, middle, the balance point.

Approaches for hearing the voice of the customer:
• Customer specifications
• Interviews, surveys, focus groups
• Prototypes
• Bug reports, complaint logs, etc.
• House of Quality

[Control chart examples: an individuals chart with mean 7.8 and LCL 4.001 over time-ordered observations; a u-chart of defects per line of code with mean 0.07825 and UCL 0.09633; an individuals chart with UCL 13.36]

Common cause variation:
• caused by the natural variation in the process
• predictable (stable) within a range

Special cause variation:
• assignable variation attributable to specific, identifiable causes
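The defects-per-line-of-code chart above is a u-chart; here is a hedged sketch of its limit calculation (the defect counts and module sizes are hypothetical, not the slide’s data):

```python
import numpy as np

def u_chart_limits(defects, opportunities):
    """Center line and limits for a u-chart (defects per opportunity).

    u_bar is total defects over total opportunities; the limits vary
    with each sample's size n_i: u_bar +/- 3*sqrt(u_bar / n_i).
    """
    defects = np.asarray(defects, dtype=float)
    n = np.asarray(opportunities, dtype=float)
    u_bar = defects.sum() / n.sum()
    ucl = u_bar + 3 * np.sqrt(u_bar / n)
    lcl = np.maximum(0.0, u_bar - 3 * np.sqrt(u_bar / n))
    return u_bar, ucl, lcl

# Hypothetical: defects found per module, sized in lines of code
u_bar, ucl, lcl = u_chart_limits([150, 180, 130], [2000, 2400, 1700])
print(u_bar, ucl, lcl)
```

Points beyond the limits signal special cause variation; points within them are consistent with common cause variation.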
Voice of Business
The "voice of the business" is the term used to describe
the stated and unstated needs or requirements of the
business/shareholders.
[MPDI]
Discussion: What if I Skip This Step?
What if…
• All 0’s in the inspection database are really missing data?
• “Unhappy” customers are not surveyed?
• Delphi estimates are done only by experienced engineers?
• A program adjusts the definition of “line of code” and doesn’t
mention it?
• Inspection data doesn’t include time and defects prior to the
inspection meeting?
• Most effort data are tagged to the first work breakdown
structure item on the system dropdown menu?
• The data logger goes down for system maintenance in the
first month of every fiscal year?
• A “logic error” to one engineer is a “___” to another
[MPDI]
Practical Tips
Map the data collection process.
• Know the assumptions associated with the data
Look at indicators as well as raw measures.
• Ratios of bad data still equal bad data
Data systems to focus on include the following:
• Manually collected or transferred data
• Categorical data
• Startup of automated systems
[MPDI]
Formal MSE Provides Answers…
• How big is the measurement error?
[MPDI]
Sources of Variation in a Formal MSE
Total Variability = Process Variability + Measurement System Variability

σ²Total = σ²Process + σ²MS

[Figure: repeated measurements M1, M2, M3 illustrating the measurement system’s contribution to spread]
[MPDI]
Characteristics of a Formal MSE
• Precision (reproducibility and repeatability – R&R)
• Accuracy (bias)
[MPDI]
Accuracy (Bias)
[MPDI]
Precision vs. Accuracy
[Figure: precision relates to spread (σ); accuracy relates to location (µ)]
[MPDI]
Precision
Spread refers to the standard deviation of a distribution.
The standard deviation of the measurement system distribution is
called the precision, σMS.
Precision:
σ²MS = σ²rpd (reproducibility) + σ²rpt (repeatability)
[MPDI]
Repeatability
Repeatability is the inherent variability of the measurement
system.
Measured by σrpt, the standard deviation of the distribution of repeated measurements.
The variation that results when repeated measurements are
made under identical conditions:
• same inspector, analyst
• same set up and measurement procedure
• same software or document or dataset
• same environmental conditions
• during a short interval of time
[MPDI]
Simple MSE for Continuous Data—1
MSE Metrics-Precision

%Gauge Repeatability & Reproducibility (%GR&R): the fraction of total variation consumed by measurement system variation.

%GR&R = (σMS / σTotal) × 100%

<10% Acceptable
>30% Unacceptable

Reference: Automotive Industry Action Group (AIAG) MSA Reference Manual, 3rd edition
[MPDI]
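A rough, self-contained sketch of estimating %GR&R from a small study in which several reviewers measure the same artifacts twice. This is a simple plug-in estimate, not the AIAG average-and-range or ANOVA method, and all the numbers are hypothetical:

```python
import numpy as np

# Hypothetical MSE study: 3 reviewers measure each of 5 artifacts twice.
# Array shape: (artifacts, reviewers, repeats).
m = np.array([
    [[12, 13], [14, 14], [12, 12]],
    [[20, 21], [22, 21], [20, 19]],
    [[ 7,  7], [ 9,  8], [ 7,  8]],
    [[15, 16], [17, 17], [15, 14]],
    [[10, 10], [12, 11], [10,  9]],
], dtype=float)

# Repeatability: variation between repeats under identical conditions.
var_rpt = m.var(axis=2, ddof=1).mean()
# Reproducibility (roughly): variation between reviewer averages per artifact.
var_rpd = m.mean(axis=2).var(axis=1, ddof=1).mean()
# sigma^2_MS = sigma^2_rpd + sigma^2_rpt; total variation pools everything.
var_ms = var_rpd + var_rpt
var_total = m.var(ddof=1)

print(f"%GR&R = {100 * np.sqrt(var_ms / var_total):.1f}%")  # sigma_MS / sigma_Total
```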
MSE Calculations for Attribute Data 2
Quick Rule of Thumb Approach
[MPDI]
Interquartile Range
Procedure
1. Determine 1st and 3rd quartiles of data set: Q1, Q3
2. Calculate the difference: interquartile range or IQR
3. Lower outlier boundary = Q1 – 1.5*IQR
4. Upper outlier boundary = Q3 + 1.5*IQR
[MPDI]
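A minimal sketch of this procedure; note that quartile conventions differ between tools, so Q1/Q3 (and hence the boundaries) may differ slightly from hand-computed values such as those on the next slide:

```python
import numpy as np

def iqr_outlier_bounds(data):
    """Return (lower, upper) outlier boundaries using the 1.5*IQR rule."""
    q1, q3 = np.percentile(data, [25, 75])   # 1st and 3rd quartiles
    iqr = q3 - q1                            # interquartile range
    return q1 - 1.5 * iqr, q3 + 1.5 * iqr

# Data from the example on the following slide
print(iqr_outlier_bounds([3, 4, 13, 16, 16, 18, 20, 22, 25, 27, 30, 33, 40]))
```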
Interquartile Range: Example

Data values (sorted): 40, 33, 30, 27, 25, 22, 20, 18, 16, 16, 13, 4, 3
Q3 = 30, Q1 = 16
Interquartile range: 30 – 16 = 14
Upper outlier boundary: 30 + 1.5*14 = 51
Lower outlier boundary: 16 – 1.5*14 = -5

Procedure
1. Determine 1st and 3rd quartiles of data set: Q1, Q3
2. Calculate the difference: interquartile range or IQR
3. Lower outlier boundary = Q1 – 1.5*IQR
4. Upper outlier boundary = Q3 + 1.5*IQR
Example adapted from “Metrics, Measurements, & Mathematical Mayhem,” Alison Frost, Raytheon, SEPG 2003
Tips About Outliers
Outliers can be a clue to process understanding:
learn from them.
If outliers lead you to measurement system problems,
• repair the erroneous data if possible
• if it cannot be repaired, delete it
Charts that are particularly effective to flag possible outliers
include:
• box plots
• distributions
• scatter plots
• control charts (if you meet the assumptions)
Rescale charts when an outlier reduces visibility into variation.
Be wary of influence of outliers on linear relationships.
[MPDI]
When Not to Remove Outliers
When you don’t understand the process.
Because you “don’t like the data points” or they make your
analysis more complicated.
Because IQR or Grubbs method “says so.”
When they indicate a “second population.”
• Identify the distinguishing factor and separate the data.
When you have very few data points.
[Chart: Defects/KSLOC across the lifecycle, all projects vs. new process]

Example performance measures:
• Effectiveness of peer reviews
• Cost performance (Cost Performance Index – CPI)
• Schedule achieved/actual (Schedule Performance Index – SPI)
Peer Reviews

Can we predict the number of errors found in a peer review?

Could generate a control chart for peer reviews.

Must assume:
• product errors are normally and uniformly distributed

[Individuals chart: errors found per review, with mean 7.8, UCL 11.60, LCL 4.001, showing expected variation around the average]

Reduce the variation:
• Create procedures/checklists
• Strengthen process audits

Increase the effectiveness (increase the mean):
• Train people
• Create checklists
• Reduce waste and re-work
• Replicate best practices

[Second individuals chart: mean 7.268, UCL 11.17, LCL 3.363]
Two improvement paths compared:

Six Sigma path:
• Clarify what your customer wants (Voice of Customer): Critical to Quality (CTQs)
• Determine what your processes can do (Voice of Process): Statistical Process Control
• Identify and prioritize improvement opportunities: causal analysis of data
• Determine where your customers/competitors are going (Voice of Business): Design for Six Sigma

Benchmarking/model path:
• Determine the industry best practice: benchmarking, models
• Compare your current practices to the model: appraisal, education
• Identify and prioritize improvement opportunities: implementation, institutionalization, tailoring
• Look for ways to optimize the processes
Agenda
Six Sigma Benefits for Early Adopters – What the Books Don’t
Tell You
The 7 Six Sigma Tools Everyone Can Use
I Hear Voices
Dirty Data (and How to Fix It)
Statistical Process Control – Where Does It Apply to
Engineering?
Convincing Sr. Management: The Value Proposition
Summary
Cheap Sources of Information and Tools
MSE Addendum
[Siviy 05-2]
Change Management
D x V x F > R
D = Dissatisfaction with the present
V = Vision for the Future
F = First (or next) Steps
R = Resistance
[Beckhard]
What Drives Process Improvement?
Performance issues: product, project—
• and, eventually, process issues
Regulations and mandates
• Sarbanes-Oxley
• “Level 3” requirements to win contracts
Business issues and “burning platforms”
• lost market share or contracts
• continuous cost and cycle time improvement
• capitalizing on new opportunities
[IR&D 04]
Selected Supporting Findings 1
Six Sigma helps integrate multiple improvement approaches to
create a seamless, single solution.
Rollouts of process improvement by Six Sigma adopters are
mission-focused, flexible, and adaptive to changing
organizational and technical situations.
Six Sigma is frequently used as a mechanism to help
sustain—and sometimes improve—performance in the
midst of reorganizations and organizational acquisitions.
Six Sigma adopters have a high comfort level with
a variety of measurement and analysis methods.
[IR&D 04]
Selected Supporting Findings 2
Six Sigma can accelerate the transition of CMMI.
• moving from CMMI Maturity Level 3 to 5 in 9 months, or from
SW-CMM Level 1 to 5 in 3 years (the typical move taking 12-
18 months per level)
• underlying reasons are strategic and tactical
When Six Sigma is used in an enabling, accelerating, or
integrating capacity for improvement technologies, adopters
report quantitative performance benefits using measures they
know are meaningful for their organizations and clients. For
instance,
• ROI of 3:1 and higher, reduced security risk, and better cost
containment
[IR&D 04]
Why does this work?
Let’s decompose
• Arsenal of tools, and people trained to use them
• Methodical problem-solving approaches
• Common philosophies and paradigms
• Fanatical focus on mission
[Hefner 04]
How Does this Work? 2
Specific DMAIC-CMMI Relationships
Overall
• DMAIC: a problem solving approach
• CMMI: a process & measurement deployment approach
PAs that align with DMAIC include the following:
• MA, GPs
• QPM, CAR, OID (either “continuous” or high-maturity view)
A DMAIC project may leverage these existing processes:
• PP, PMC, IPM
• OPP for organization-level execution, management, and oversight
PAs through which DMAIC may be incorporated into
organizational process definition include the following:
• OPF, OPD
[Siviy 05-1]
How Does this Work? 3
Specific DMAIC-CMMI Relationships
[Siviy 05-1]
Example M&A Process
[Flowchart: an example measurement & analysis process annotated with DMAIC phases and CMMI process areas.
OPP: establish capability, baselines/models, specs.
D: business goal, leading to a draft improvement goal (SMART) or an identified focus area (performance); goal refinement, 1st iteration → final goal.
M (MA, QPM): gather data, analyze data, prioritize issues; select subprocess; set performance thresholds.
A (CAR): identify possible causes (brainstorm), perform causal analysis, prioritize actual causes, identify potential solutions.
I: develop action plan, implement improvement.
C: monitor with no “issues”: project performance, measures quality, SPI implementation thresholds, snapshot (1st iteration → baseline), issues (validity of data, quality of data, variance), identified improvements.]
[Siviy 05-1]
Determining YOUR Approach
First Remember: Everything is a Process!
[Diagram: the SEI (or other institution) develops technology and transitions it to the project team]
Define Guidance Questions

[DMAIC roadmap graphic, as shown earlier]

• What is the current problem to be solved?
• What are the goals, improvement targets, & success criteria?
• What is the business case, potential savings, or benefit that will be realized when the problem is solved?
• Who are the stakeholders? The customers?
• What are the relevant processes and who owns them?
• Have stakeholders agreed to the project charter or contract?
• What is the project plan, including the resource plan and progress tracking?
• How will the project progress be communicated?