
UNCLASSIFIED / FOUO

Black Belt
Improve
Tollgate Briefing

Project Name
DEPMS Project Number

Name of Belt
Organization
Date
UNCLASSIFIED / FOUO

Tollgate Requirements - Improve

NG CPI BLACK BELT TOLLGATE REQUIREMENTS


IMPROVE
PROJECT DELIVERABLES | REQUIREMENT | NGB COMMENTS
Solution Prioritization / Selection | Mandatory | Brainstorm/prioritize
Future State Process Map | Mandatory |
Implementation / Improvement Plan | Mandatory | Action plan
Pilot Plan / Pilot Results | Recommended |
New Process Capability / Sigma Level / DPMO | Mandatory | Based on data type
New Control Charts | Recommended | Not applicable to all projects
Storyboard / A3 | Mandatory | 1-page project summary
Barriers/Issues/Risks | Mandatory |
Quick Wins | Recommended |
Lessons Learned | Optional |

UNCLASSIFIED / FOUO

Improve Tollgate Templates

NOTE: THIS IS A TEMPLATE FOR ALL NG CPI BELTS

NG has developed this template as a basic format with standard deliverables to help guide NG CPI belts through the NG tollgate requirements for certification. It is recognized that each project is unique, with unique deliverables and unique flows. Therefore, this format does not have to be followed to the letter for your project.

(DELETE THIS SLIDE FOR YOUR PROJECT)

UNCLASSIFIED / FOUO

Define Charter and Timeline


Take away message goes here

Team Members (Name | Role | Affiliation | DACI)
| Black Belt | | Driver
| Master Black Belt | | Driver
| Project Sponsor | | Approver
| Process Owner | | Approver

Project Charter
Problem Statement: (impact of problem)
Business Case:
Goal Statement:
Unit:
Defect:
Customer Specifications:
Process Start:
Process Stop:
Scope:

Project Timeline
Phase | Start | Stop | Status
Define | mm/dd/yy | mm/dd/yy |
Measure | mm/dd/yy | mm/dd/yy |
Analyze | mm/dd/yy | mm/dd/yy |
Improve | mm/dd/yy | mm/dd/yy |
Control | mm/dd/yy | mm/dd/yy |

4
Required Deliverable
UNCLASSIFIED / FOUO

Measure Overview
Baseline Statistics
 VOC / VOB
 Unit (d) or Mean (c)
 Defect (d) or St. Dev. (c)
 DPMO (d)
 PCE: (Cycle Time Only)
 PLT: (Cycle Time Only)
 Sigma Quality Level
 MSA Results: show the percentage result of the GR&R or other measurement systems analysis carried out in the project

Process Capability
[Chart: Process Capability of Delivery Time - Example]
Process Data: LSL 10 | Target 20 | USL 30 | Sample Mean 29.1203 | Sample N 266 | StDev (Within) 2.87033 | StDev (Overall) 2.69154
Potential (Within) Capability: Cp 1.16 | CPL 2.22 | CPU 0.10 | Cpk 0.10 | CCpk 1.16
Overall Capability: Pp 1.24 | PPL 2.37 | PPU 0.11 | Ppk 0.11 | Cpm 0.35
Observed Performance: PPM < LSL 0.00 | PPM > USL 281954.89 | PPM Total 281954.89
Exp. Within Performance: PPM < LSL 0.00 | PPM > USL 379619.67 | PPM Total 379619.67
Exp. Overall Performance: PPM < LSL 0.00 | PPM > USL 371895.18 | PPM Total 371895.18

Baseline “As Is” Performance
[Chart: Summary for Delivery Time - Example]
Anderson-Darling Normality Test: A-Squared 1.95 | P-Value < 0.005
Mean 29.128 | StDev 2.677 | Variance 7.169 | Skewness 0.201075 | Kurtosis -0.471714 | N 266
Minimum 24.000 | 1st Quartile 27.000 | Median 29.000 | 3rd Quartile 31.000 | Maximum 35.000
95% Confidence Interval for Mean: 28.805 to 29.451
95% Confidence Interval for Median: 29.000 to 29.000
95% Confidence Interval for StDev: 2.468 to 2.927

Tools Used
 Detailed process mapping
 Measurement Systems Analysis
 Value Stream Mapping
 Data Collection Planning
 5S
 Generic Pull
 Time Series Plot
 Probability Plot
 Pareto Analysis
 Operational Definitions
 Basic Statistics
 Process Capability
 Histograms
 Control Charts

5

Required Deliverable
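For attribute (defect) data, the DPMO and Sigma Quality Level bullets above can be computed directly from defect and opportunity counts. A minimal Python sketch; the counts below are illustrative, chosen so that 75 nonconforming units out of 266 (one opportunity per unit) reproduces the observed PPM Total on the example capability chart:

```python
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_quality_level(dpmo_value: float, shift: float = 1.5) -> float:
    """Convert long-term DPMO to a short-term sigma level (conventional 1.5-sigma shift)."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

# Illustrative counts only: 75 of 266 units out of spec -> ~281,955 DPMO.
d = dpmo(defects=75, units=266, opportunities_per_unit=1)
print(f"DPMO = {d:.0f}, Sigma Quality Level = {sigma_quality_level(d):.2f}")
```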
UNCLASSIFIED / FOUO

Analyze Overview
Fishbone Diagram
[Cause & Effect (Fishbone) diagram - Effect (Y): PLT = 5 days (too long); cause categories: Materials, Manpower, Facilities & Equipment, Methods, Mother Nature, Measurements. Shown in full on the "Cause & Effect Diagram (Fishbone)" slide in the Appendix.]

Cause and Effect Matrix
[XY Matrix rating each output requirement's importance to the customer (Requirements 1-15) against Process Steps / Process Inputs 1-20. Shown in full on the "XY Matrix (Root Cause Analysis)" slide in the Appendix.]
This table provides the initial input to the FMEA: when an output variable (requirement) is not correct, that represents a potential "Effect"; when an input variable is not correct, that represents a "Failure Mode".

Critical X/Root Causes Analysis
Pareto Chart - Example
Defect | South | North | East | Others
Count | 100 | 50 | 15 | 6
Percent | 58.5 | 29.2 | 8.8 | 3.5
Cum % | 58.5 | 87.7 | 96.5 | 100.0

Root Cause/Effect
 Root cause:
 Effect:
 Root cause:
 Effect:
 Root cause:
 Effect:

6
Required Deliverable
UNCLASSIFIED / FOUO

Solution Selection Matrix

- Example -

Each potential improvement is rated (1-10) for its impact against the weighted criteria/root causes (Root Cause Significance Rating = 10 for each): Time to Issue Invoice, Per Commercial Terms, Presentation, Complete, Accuracy, Level of Effort. The Overall Impact Rating is the weighted sum of the six impact ratings; a separate Risk Rating is assigned to each candidate.

Potential Improvements | Impact Ratings | Overall Impact Rating | Risk Rating
Offshore Costs | 7 1 5 8 1 10 | 320 | 8
Commercial Terms | 8 4 2 5 10 7 | 360 | 7
Quantity of Source Data | 8 5 7 7 1 10 | 380 | 6
Reconciliation | 10 10 10 1 10 10 | 510 | 6
Quality of Source Data | 7 7 7 7 7 7 | 420 | 7
Training | 8 7 10 5 8 10 | 480 | 6
Client (eg RCTI) | 6 2 2 10 8 8 | 360 | 5
Job Setup | 10 4 10 6 10 10 | 500 | 5
Payroll Close Date | 8 10 1 1 1 8 | 290 | 8
Pre-Billing | 10 8 6 8 2 10 | 440 | 5
Client Reporting Requirements | 7 9 4 10 10 10 | 500 | 4
Delivery Method | 9 1 1 5 5 7 | 280 | 9
Job Completion | 7 10 1 7 5 10 | 400 | 3
Job Manager Requirements | 9 7 5 4 8 9 | 420 | 2

* Solutions ranked > 450 have been selected to be implemented
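The Overall Impact Rating above is a weighted sum of each solution's impact ratings, with every criterion weighted by its significance rating of 10. A minimal sketch of that scoring (the order of ratings within each list is illustrative):

```python
# Weighted solution-selection scoring: Overall Impact Rating = sum(weight * rating).
WEIGHTS = [10, 10, 10, 10, 10, 10]  # criterion / root cause significance ratings

def overall_impact(ratings: list[int], weights: list[int] = WEIGHTS) -> int:
    return sum(w * r for w, r in zip(weights, ratings))

# Two rows from the example matrix above.
print(overall_impact([7, 1, 5, 8, 1, 10]))      # Offshore Costs -> 320
print(overall_impact([10, 10, 10, 1, 10, 10]))  # Reconciliation -> 510
```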


7
Required Deliverable
UNCLASSIFIED / FOUO

Implementation Plan

PROJECT NAME DIVISION GREEN BELT DATE

PROJECT SPONSOR SERVICE AREA / FUNCTION / SERVICE BLACK BELT

Solution Number | Improvement Action | Responsible Individual / Solution Owner | Implementation Number | Control Action Number | Issues/Barriers | Risk Mitigation | Target / Actual Complete Date | Current Status / Comments
1
2
3
4
5
6
7
8

8
Required Deliverable
UNCLASSIFIED / FOUO

Pilot Plan
- Example -

Pilot Test: Hand-Chek / Hot-Chek Interface Test
Description:
• Sample Check-in Data Sets to be entered in Hand-Chek device
• Sample Data Sets Transmitted to Hot-Chek System - All Hotel Floors, All Hotel Rooms
• Confirmation Data Received from Hot-Chek to Hand-Chek Device - All Hotel Floors, All Hotel Rooms
Success Criteria: Data Set Entry Accuracy < 3.4 DPMO; Data Set Entry Time < 6 Seconds; Data Set Transmission/Reception Accuracy < 3.4 DPMO
Test Team: BR, KM, plus Hot-Chek tech rep
Schedule: Start 3/1, Complete 3/3

Pilot Test: Check-in Verification Test
Description:
• Sample Guest Data Entered in Hot-Chek System (variety of room requirements)
• "Guests" (Hotel Employees) Walked Through Check-in Process (90% Pre-Registered, 10% Non-Pre-Registered)
• Volume Stress Test - Simulated Arrival of 20 Guests in a "Tour Bus"
• Process Measurements Recorded via Observer (see Design Scorecard); "Guest" Observations Recorded
Success Criteria: Data Set Entry Accuracy < 3.4 DPMO; Data Set Entry Time < 6 Seconds; Data Set Transmission/Reception Accuracy < 3.4 DPMO; Design Scorecard CCRs
Test Team: BR, KM, + 6 Check-in Staff
Schedule: Start 3/6, Complete 3/7

Pilot Test: Check-in Validation Test
Description:
• 25 Guests Invited to Experience New Hotel Check-in Process
• Guests "Pre-Registered" with Their Room Requirements in Hot-Chek System
• Guests Walked Through Check-in Process (90% Pre-Registered, 10% Non-Pre-Registered)
• Process Measurements Recorded via Observer (see Design Scorecard)
• Guests Debriefed Following Experience
Success Criteria: Data Set Entry Accuracy < 3.4 DPMO; Data Set Entry Time < 6 Seconds; Data Set Transmission/Reception Accuracy < 3.4 DPMO; Design Scorecard CCRs
Test Team: BR, KM, + 6 Check-in Staff
Schedule: Start 3/10, Complete 3/10
Recommended Deliverable 9

UNCLASSIFIED / FOUO

Pilot Results
Data Collected:

Measure | Plan Target | Pilot x | Pilot s | Comments
PLT | 1 minute | 0.5 min. | 0.05 min. | Improved PLT
Data Accuracy | < 3.4 DPMO | 100 DPMO | | Decreased DPMO
PCE | < 10 % | < 15 % | | Improved PCE

Pilot Observations: 1) Data Entry sequence was confusing

GAP Analysis/Root Causes: 1) SOP wasn’t clear; need to lay it out better before implementation
2) Order of questions needs to be reevaluated

Follow-up Actions: 1) Revise SOP on order of questions asked and flow, and run pilot again

- Example -
Recommended Deliverable 10

UNCLASSIFIED / FOUO

Failure Mode Effects Analysis (FMEA)


Process Step / Input | Potential Failure Mode (X) | Potential Failure Effects (Y) | SEV | Potential Root Causes | OCC | Current Controls | DET | RPN | Actions Recommended | Resp. | Actions Taken | SEV | OCC | DET | RPN

Column guidance: What is the process step and input under investigation? In what ways does the Key Input go wrong? What is the impact on the Key Output Variables (Customer Requirements)? What causes the Key Input to go wrong? What are the existing controls and procedures (inspection and test) that prevent either the cause or the Failure Mode? What are the actions recommended for reducing the occurrence of the cause or improving detection? What completed actions were taken, with the recalculated RPN?

Updating Tollgates / Ineffective templates (SEV 5):
• Effect: Discrepancies: POI vs Templates | Cause: Ineffective reviews | OCC 4 | Controls: None | DET 5 | RPN 100 | Action: Adjust templates to match POI | Resp: PMO | Recalculated SEV/OCC/DET/RPN: 5/2/2/20
• Effect: Users and leaders don't buy in to LSS | Cause: Slide purposes not clear | OCC 4 | Controls: None | DET 4 | RPN 80 | Action: Adjust slide titles and notes | Resp: PMO | Recalculated OCC/DET/RPN: 1/2/10
• Cause: Redundant and NVA slides | OCC 3 | Controls: None | DET 4 | RPN 60 | Action: Eliminate NVA slides | Resp: PMO | Recalculated OCC/DET/RPN: 1/2/10
• Cause: Incomplete SOP or "Help" within PS | OCC 3 | Controls: None | DET 5 | RPN 75 | Action: Develop "read me" slides | Resp: PMO | Recalculated OCC/DET/RPN: 1/2/10

Updating Tollgates / Inefficient updating (SEV 3):
• Effect: Too many steps to build/update PS | Cause: NVA steps | OCC 5 | Controls: None | DET 4 | RPN 60 | Action: Link templates to Phase | Resp: PMO | Recalculated SEV/OCC/DET/RPN: 3/3/2/18
• Effect: Users get frustrated and delay projects | Cause: Too many choices between templates | OCC 4 | Controls: None | DET 4 | RPN 48 | Action: Eliminate NVA; group in folders | Resp: PMO | Recalculated OCC/DET/RPN: 2/2/12
• Cause: Inconsistent file names and locations | OCC 3 | Controls: None | DET 4 | RPN 36 | Action: Simple names; group in folders | Resp: PMO | Recalculated OCC/DET/RPN: 1/2/6

LSS Tool Access / Not all LSS tools & refs in PS (SEV 4):
• Effect: User cannot find tools & references | Cause: Not all tools available in PS | OCC 5 | Controls: None | DET 4 | RPN 80 | Action: Revise list of tools and joggers | Resp: PMO | Recalculated SEV/OCC/DET/RPN: 4/2/2/16
• Effect: Project completion is delayed | Cause: Poor explanation in some references | OCC 3 | Controls: None | DET 3 | RPN 36 | Action: Develop direct access pdf file | Resp: PMO | Recalculated OCC/DET/RPN: 2/2/16

LSS Tool Access / Too many steps to retrieve tools (SEV 2):
• Effect: Inefficient retrieval of LSS tools/refs | Cause: Multiple means for accessing tools | OCC 3 | Controls: None | DET 5 | RPN 30 | Action: "Read me" file; one folder | Resp: PMO | Recalculated SEV/OCC/DET/RPN: 2/2/2/8
• Cause: NVA steps | OCC 3 | Controls: None | DET 5 | RPN 30 | Action: Eliminate NVA steps | Resp: PMO | Recalculated OCC/DET/RPN: 2/2/8

- Example -
11
Required Deliverable
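The RPN columns above are the product Severity x Occurrence x Detection, recalculated after the recommended actions are taken. A minimal sketch using the first row of the example FMEA:

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number = SEV x OCC x DET (each rated 1-10)."""
    return severity * occurrence * detection

# First row of the example FMEA: before and after the recommended action.
before = rpn(severity=5, occurrence=4, detection=5)  # 100
after = rpn(severity=5, occurrence=2, detection=2)   # 20
print(f"RPN before = {before}, after = {after}")
```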
UNCLASSIFIED / FOUO

Descriptive Statistics / Process Capability

“As Is” Process Capability (Process Capability Analysis for Cholesterol)
Process Data: LSL * | Target * | USL 220.000 | Mean 193.133 | Sample N 30 | StDev (ST) 26.0455 | StDev (LT) 22.4931
Potential (ST) Capability: Cp * | CPL * | CPU 0.34 | Cpk 0.34 | Cpm *
Overall (LT) Capability: Pp * | PPL * | PPU 0.40 | Ppk 0.40
Observed Performance: PPM < LSL * | PPM > USL 133333.33 | PPM Total 133333.33
Expected ST Performance: PPM < LSL * | PPM > USL 151146.50 | PPM Total 151146.50
Expected LT Performance: PPM < LSL * | PPM > USL 116152.65 | PPM Total 116152.65

“New” Process Capability (Process Capability Analysis for Control)
Process Data: LSL * | Target * | USL 220.000 | Mean 184.967 | Sample N 30 | StDev (ST) 20.4206 | StDev (LT) 16.3662
Potential (ST) Capability: Cp * | CPL * | CPU 0.57 | Cpk 0.57 | Cpm *
Overall (LT) Capability: Pp * | PPL * | PPU 0.71 | Ppk 0.71
Observed Performance: PPM < LSL * | PPM > USL 0.00 | PPM Total 0.00
Expected ST Performance: PPM < LSL * | PPM > USL 43119.06 | PPM Total 43119.06
Expected LT Performance: PPM < LSL * | PPM > USL 16153.51 | PPM Total 16153.51

- Example -

12
Required Deliverable
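The Expected ST Performance figures above follow from the fitted normal model: the fraction beyond the USL is 1 - Phi((USL - mean) / StDev(ST)), scaled to parts per million. A quick check against the values shown, using Python's statistics.NormalDist:

```python
from statistics import NormalDist

def expected_ppm_above_usl(mean: float, stdev: float, usl: float) -> float:
    """Expected parts per million above the USL under a fitted normal model."""
    z = (usl - mean) / stdev
    return (1 - NormalDist().cdf(z)) * 1_000_000

# "As Is" (Cholesterol) and "New" (Control) short-term fits; compare the printed
# values with the Expected ST Performance PPM > USL figures shown above.
print(round(expected_ppm_above_usl(193.133, 26.0455, 220.0)))
print(round(expected_ppm_above_usl(184.967, 20.4206, 220.0)))
```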
UNCLASSIFIED / FOUO

Control Chart
P Chart for Total Defectives - Example

Feb/Mar data confirms the process has remained in control.

Center line P-bar = 0.03817 | UCL (+3.0SL) = 0.08162 | LCL (-3.0SL) = 0.00
[Proportion defective plotted by sample number, samples 0-150]
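For a p chart like this one, the 3-sigma limits are p-bar +/- 3*sqrt(p-bar*(1 - p-bar)/n) for subgroup size n. A minimal sketch; the subgroup size is not stated on the chart, so the value used below is an assumption chosen to approximately reproduce the plotted UCL:

```python
from math import sqrt

def p_chart_limits(p_bar: float, n: int) -> tuple[float, float]:
    """3-sigma control limits for a p chart with constant subgroup size n."""
    sigma = sqrt(p_bar * (1 - p_bar) / n)
    lcl = max(0.0, p_bar - 3 * sigma)  # proportions cannot go below zero
    ucl = p_bar + 3 * sigma
    return lcl, ucl

# p-bar = 0.03817 from the chart; n = 175 is an assumed subgroup size that
# roughly reproduces the plotted UCL of 0.08162 (the LCL truncates to 0).
print(p_chart_limits(0.03817, 175))
```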
Recommended Deliverable 13

UNCLASSIFIED / FOUO

“As Is” vs. “To Be” Process Map


“As Is” Process

To Be Process

- Example -

Required Deliverable 14

UNCLASSIFIED / FOUO

Related Project Consideration

Multi-Generation Project Plan (MGPP)

Generation 1 (Date) | Generation 2 (Date) | Generation 3 (Date)

Vision

Process Generation

Platforms / Technology
Optional Deliverable 15

UNCLASSIFIED / FOUO

Project Barriers/Issues/Risks
 Barriers

 Issues

 Risks

Required Deliverable
16

UNCLASSIFIED / FOUO

Improve Storyboard
- Example -

Define: Project Charter
• BUS CASE: Be #2 Fin Service Provider
• GOAL: Reduce Loan/Lease CT from 9.2 to 8.0 days by July 1
• FIN IMPACT: $2.7M per year

Measure: CCR Gap of 1.2 days; Sigma Performance Level of 1.3

Analyze: Officer Work & Turnover, Waiting, & Automation affect CT; Job Aids affect variation in CT. "Officer performs both" & "Officers changed" were eliminated as contributors to high cycle time.

Improve: Pilot Plan
• Loan or Lease Screen Entry
• Color Printouts
• Rewards & Recog
• Flex Time

Required Deliverable
17

UNCLASSIFIED / FOUO

8-Step A3 Project Summary Report


Company: Department: Date: Prepared by:
1. Define the problem situation
2. Problem Analysis
3. Action plans to correct problems
4. Results of Activity
5. Future Steps

18

UNCLASSIFIED / FOUO

NG CPI Tollgate Tool


8-STEP PROCESS
1. Validate Problem | 2. Identify Gaps | 3. Set Targets | 4. Find Root Cause | 5. Develop C-Ms | 6. See C-Ms Through | 7. Confirm Results | 8. Standardize

Define:
 Project Charter (Problem Stmt, Defect Definition, Goal Statement, Project Scope, Business Impact, Strat Alignment)
 Sponsor & Team
 Replication Check
 Measurable Y
 Voice of Customer
 Customer Specs
 Voice of Business
 Project Timeline
 Communication Plan (Stakeholder Analysis)
 High Level Process Map (SIPOC)

Measure:
 Detailed As Is Process Map
 Value Stream Map
 Key Process Metrics
 Data Collection Plan
 Measure Systems Analysis
 Data Collected
 Baseline Data Analysis (Descriptive Stats, Graphs, Pareto Charts)
 Est Financial Benefits
 Control Charts (as needed)
 Process Capability

Analyze:
 Potential Xs (Brainstorming, 5 Whys, Fishbone, Affinity Diagram)
 Critical Xs (Cause & Effect Matrix, Hypothesis Testing, Regression, Time Studies, Theory of Constraints)
 FMEA (Risk Analysis, Risk Mitigation Plan)

Improve:
 “To Be” Process Map
 Solution Generation / Prioritization
 Improvement Strategy (Improvement Model, Implementation Plan, Pilot, “X” Improvement Target)
 Mistake Proofing
 FMEA (Risk Analysis, Risk Mitigation Plan)
 Control Charts
 Process Capability

Control:
 Process Control Plan
 Process Owner Accountability
 Updated Financial Benefits
 Replication Opportunities
 Project Documentation of revised policies, SOP’s, procedures, and training
 Visual Process Control Tools (Optional)

I accept the Define Tollgate | I accept the Measure Tollgate | I accept the Analyze Tollgate | I accept the Improve Tollgate | I accept the Control Tollgate
(Sponsor) | (Sponsor) | (Sponsor) | (Sponsor) | (Sponsor)
(Process Owner) | (Process Owner) | (Process Owner) | (Process Owner) | (Process Owner)
(MBB) | (MBB) | (MBB) | (MBB) | (MBB)
(Finance Owner) | (Finance Owner)
19

UNCLASSIFIED / FOUO

Appendix

20

UNCLASSIFIED / FOUO

Cross Functional Team


Team Members
Name Role Affiliation DACI
Black Belt Driver
Master Black Belt Driver
Project Sponsor Approver
Process Owner Approver
Contributor
Contributor
Contributor
Contributor
Inform
Inform
Inform
Inform

21
Required Deliverable
UNCLASSIFIED / FOUO

Replication Check
I confirm that:
 DEPMS (DoD Enterprise Performance Management System) has been
searched for similar projects: Yes / No

 Replication: List relevant initiatives / potential replication projects found (if any):
• Project 1: (DEPMS # or other tracking tool project number)
• Project 2:

 Collaboration: Identify organizations that can/should be considered for working this project collaboratively:
• Organization 1:
• Organization 2:

22
Required Deliverable
UNCLASSIFIED / FOUO

Strategic Alignment
The Define Tollgate requires a linkage to organizational strategy.

 Include the organizational metric(s) that your project will help improve
 Refer to your organization’s Strategic Plan and/or other referenced documents

23
Required Deliverable
UNCLASSIFIED / FOUO

Business Impact
 Insert as much information as possible about the potential operational
and/or financial benefits
 Include any assumptions upon which these estimates are based

Example: Operational benefits – This project is expected to reduce PLT by 35%, improve SQL from 1.2 to 3.0, and save 20 man-hours per shift

Example: Financial benefits – This project is expected to save $xx in FYxx

24
Required Deliverable
UNCLASSIFIED / FOUO

High-Level Process Map (SIPOC)


Suppliers Inputs Process Outputs Customers

Measurable Y:
25
Required Deliverable
UNCLASSIFIED / FOUO

Voice of Customer / Voice of Business


Voice of the Customer | Key Customer Issue(s) | Critical Customer Requirement
What does the customer want from us? | We need to identify the issue(s) that prevent us from satisfying our customers. | We should summarize key issues and translate them into specific and measurable requirements.

Voice of the Business | Key Process Issue(s) | Critical Business Requirement
What does the business want/need from us? | We need to identify the issue(s) that prevent us from meeting strategic goals/missions. | We should summarize key issues and translate them into specific and measurable requirements.

26
Required Deliverable
UNCLASSIFIED / FOUO

Stakeholder Analysis
Stakeholder Name/Group | Project Impact on Stakeholder (H, M, L) | Stakeholder’s Level of Influence on Success of Project (H, M, L) | Stakeholder Current Attitude Toward Project (+, 0, -) | Explanation of Stakeholder Current Attitude | Stakeholder Score (H=3, M=2, L=1, +=3, 0=1, -=-3) | Action Plan For Stakeholder (list)

Recommended Deliverable 27

UNCLASSIFIED / FOUO

Communication Plan
Audience | Media | Purpose | Topics of Discussion / Key Messages | Owner | Frequency | Notes/Status

28
Required Deliverable
UNCLASSIFIED / FOUO

Detailed “As Is” Process Map


- Example -

Required Deliverable - VSM or Process Map or Both

29

UNCLASSIFIED / FOUO

Value Stream Map


- Example -

[Value Stream Map: customer phone calls trigger Order Mgmt; the Order Mgmt Supervisor provides a weekly update, and suppliers provide manual updates]

Service lead time = 384 min | Customer call time = 24 min | Lost calls = 10%

Customer segments: Large Business (6 customers) | Small Business (5 customers, 20 orders) | Home (3 customers)

Process steps (staffing | process time | error rate):
Screen for Acct Mgr | P/T = 3 min | Volume = 1200
Order Mgmt - Customer Info | 4 people | P/T = 2 min | Error Rate = 2%
Order Mgmt - Product Need | 4 people | P/T = 6 min | Error Rate = 0%
Order Mgmt - Pricing Info | 4 people | P/T = 6 min | Error Rate = 2%
Order Mgmt - Shipping | 4 people | P/T = 2 min | Error Rate = 1%
DIST - Pick, Pack & Ship | 10 people | P/T = 120 min | Error Rate = 1%
Volumes: 1200 calls screened; 800-1200 items per downstream step

Timeline: process times of 3, 2, 6, 6, 2, and 120 min with queue/wait times of 5 and 240 min (service lead time = 384 min)

Data box captured for each step: Trigger | Completion Criteria | Cycle Time | Takt Time | Number of People | Number of Approvals | Items in Inbox | % Rework | # of Iterations (cycles) | # of Databases | Top 3 Rework Issues (1, 2, 3)
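Service lead time and Process Cycle Efficiency (PCE) roll up directly from the step data on the map. A minimal sketch using the example times above, treating the process (touch) times as value-add time:

```python
# Process (touch) times and queue/wait times from the example VSM, in minutes.
process_times = [3, 2, 6, 6, 2, 120]
wait_times = [5, 240]

lead_time = sum(process_times) + sum(wait_times)   # 384 min service lead time
pce = sum(process_times) / lead_time               # Process Cycle Efficiency

print(f"Lead time = {lead_time} min, PCE = {pce:.1%}")
```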

Required Deliverable - VSM or Process Map or Both 30

UNCLASSIFIED / FOUO

Key Input, Process, and Output Metrics

Suppliers Inputs Process Outputs Customer


Start, Step 1, Step 2, Step 3, Step 4, Step 5

VOC/VOB | Input Metrics | Process Metrics | Output Metrics
Quality |  |  |
Speed |  |  |
Cost |  |  |

Required Deliverable 31

UNCLASSIFIED / FOUO

Operational Definitions
 Define each of the Key Input, Output, Process Metrics from your SIPOC that you are going to
collect data on (via the Data Collection Plan) as well as any other terms that need clarification
for the data collectors and everyone else on the team.
 Examples:
 Award Process PLT: The time from when a Director submits the Award recommendation to
the time when the employee is presented the Award in a ceremony.
 Number of Claims Processed: The number of Claims processed per weekday (M-F).
 Total Hours Worked: The total number of hours worked in the facility including weekends
and holidays.
 Number of Personnel: The total number of military and civilian personnel working (not
including contractors).

 Include other unique terms that apply to your project that require clear operational definitions
for those collecting the data and for those interpreting the data.

Required
32

UNCLASSIFIED / FOUO

Data Collection Plan


Performance Measure | Operational Definition | Data Source and Location | How Will Data Be Collected | Who Will Collect Data | When Will Data Be Collected | Sample Size | Stratification Factors | How will data be used?

- Example -

1. Ability to update projects and build tollgate reviews | X – Steps to update projects | In DEPMS | By counting steps | Name | ASAP | 1 | None | To find VA, BNVA, NVA
2. Ability to update projects and build tollgate reviews | X – Tollgate template slides that match POI | In DEPMS | By determining % of activity steps identified in “Introduction to _____” modules in POI that are adequately addressed in templates | Name | ASAP | 40 | None | To determine consistency with POI
3. Easy Access to LSS tools and references | X – Availability of LSS tools and references | In DEPMS | By determining the percentage of tools, with their references, listed on DMAIC Road Map slides that can be found in PS | Name | ASAP | 63 | None | To determine availability of tools and references
4. Easy Access to LSS tools and references | X – Steps required to find tools and references | In DEPMS | By counting # steps required to find the tools and their references | Name | ASAP | 37 | None | To find VA, BNVA, NVA

33
Required Deliverable
UNCLASSIFIED / FOUO

Measurement Systems Analysis


The measurement system used to collect data has been calibrated and is considered to have no potential for significant errors. The data collection tool is reliable, has good resolution, shows no signs of bias, and is stable.

- Example -

Type of Measurement Error | Description | Considerations for this Project
Discrimination (resolution) | The ability of the measurement system to divide measurements into “data categories” | Work hours can be measured to < .25 hours. Radar usage measured to +/- 2 minutes.
Bias | The difference between an observed average measurement result and a reference value | No bias - work hours and radar start-stop times are consistent through the population.
Stability | The change in bias over time | No bias of work hours and radar usage data.
Repeatability | The extent to which variability is consistent | Not an issue. Labor and radar usage data is historical and felt to be accurate enough for insight and analysis.
Reproducibility | Different appraisers produce consistent results | Remarks in usage data were deemed not reproducible, and therefore were not considered in determining which radars were used in each op.
Variation | The difference between parts | N/A to this process.

34
Required Deliverable
UNCLASSIFIED / FOUO

Measurement Systems Analysis

Gage R&R (ANOVA) for Response
Gage name: | Date of study: | Reported by: | Tolerance: | Misc:

Source | VarComp | %Contribution (of VarComp)
Total Gage R&R | 0.0015896 | 3.70
  Repeatability | 0.0005567 | 1.29
  Reproducibility | 0.0010330 | 2.40
    Operator | 0.0003418 | 0.79
    Operator*Part | 0.0006912 | 1.61
Part-To-Part | 0.0414247 | 96.30
Total Variation | 0.0430143 | 100.00

Source | StdDev (SD) | Study Var (6 * SD) | %Study Var (%SV)
Total Gage R&R | 0.039870 | 0.23922 | 19.22
  Repeatability | 0.023594 | 0.14156 | 11.38
  Reproducibility | 0.032140 | 0.19284 | 15.50
    Operator | 0.018488 | 0.11093 | 8.91
    Operator*Part | 0.026290 | 0.15774 | 12.68
Part-To-Part | 0.203531 | 1.22118 | 98.13
Total Variation | 0.207399 | 1.24439 | 100.00

Number of Distinct Categories = 7

[Charts: Components of Variation; Response by Part; R Chart by Operator (UCL = 0.1073, R-bar = 0.0417, LCL = 0); Response by Operator; Xbar Chart by Operator (UCL = 9.8422, X-double-bar = 9.7996, LCL = 9.7569); Operator * Part Interaction]

The Measurement System is acceptable, with the Total Gage R&R % Contribution < 10%.

- Example -
35
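The %Contribution column is each variance component divided by the total variation, and the acceptance call on this slide uses the common guideline of Total Gage R&R %Contribution under 10%. A minimal sketch reproducing the headline figure from the variance components shown:

```python
# Variance components from the Gage R&R (ANOVA) table above.
var_components = {
    "Total Gage R&R": 0.0015896,
    "Part-To-Part": 0.0414247,
}
total_variation = sum(var_components.values())  # 0.0430143

pct_contribution = 100 * var_components["Total Gage R&R"] / total_variation
print(f"%Contribution of Total Gage R&R = {pct_contribution:.2f}%")  # ~3.70%, acceptable (< 10%)
```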
Optional BB Deliverable
UNCLASSIFIED / FOUO

“As Is” Baseline Statistics


Summary for Workdays - Example

Anderson-Darling Normality Test: A-Squared 12.65 | P-Value < 0.005
Mean 44.814 | StDev 61.251 | Variance 3751.674 | Skewness 2.87329 | Kurtosis 9.54577 | N 118
Minimum 1.000 | 1st Quartile 12.000 | Median 22.000 | 3rd Quartile 52.000 | Maximum 365.000
95% Confidence Interval for Mean: 33.647 to 55.981
95% Confidence Interval for Median: 17.000 to 29.123
95% Confidence Interval for StDev: 54.308 to 70.246

 The current process has a non-normal distribution (P-Value < 0.05)
 Mean = 44 days
 Median = 22 days
 Std Dev = 61 days
 Range = 365 days
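The summary statistics and the Anderson-Darling normality check above can be reproduced from the raw data with scipy. A minimal sketch, assuming the 118 workday values are available as an array (random placeholder data is used below):

```python
import numpy as np
from scipy import stats

# Placeholder: replace with the 118 observed workday values.
workdays = np.random.default_rng(0).lognormal(mean=3.1, sigma=1.1, size=118)

print(f"N = {workdays.size}, Mean = {workdays.mean():.1f}, Median = {np.median(workdays):.1f}")
print(f"StDev = {workdays.std(ddof=1):.1f}, Range = {workdays.max() - workdays.min():.1f}")

# Anderson-Darling test against the normal distribution; a statistic well above the
# 5% critical value indicates a non-normal distribution, as concluded on this slide.
ad = stats.anderson(workdays, dist="norm")
print(f"A-Squared = {ad.statistic:.2f}, 5% critical value = {ad.critical_values[2]:.2f}")
```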
Required Deliverable

36

UNCLASSIFIED / FOUO

Process Control Chart


I-MR Chart of Delivery Time
Individuals chart: UCL = 37.70 | X-bar = 29.13 | LCL = 20.56
Moving Range chart: UCL = 10.53 | MR-bar = 3.22 | LCL = 0
[Observations 1-244 plotted on both charts]

 The current baseline delivery time is stable over time, with both the Moving Range (3.22 days) and the Individual Average (29.13 days) experiencing common cause variation
 255 data points collected with zero subgroups, thus the I&MR control chart was selected
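The I-MR limits shown follow the standard moving-range formulas: the individuals chart uses X-bar +/- 2.66 * MR-bar, and the moving-range chart UCL is 3.267 * MR-bar. A quick check against the plotted values:

```python
def imr_limits(x_bar: float, mr_bar: float) -> dict:
    """Control limits for an I-MR chart from the mean and average moving range."""
    return {
        "I_UCL": x_bar + 2.66 * mr_bar,
        "I_LCL": x_bar - 2.66 * mr_bar,
        "MR_UCL": 3.267 * mr_bar,
        "MR_LCL": 0.0,
    }

# X-bar = 29.13 and MR-bar = 3.22 from the chart -> UCL ~37.7, LCL ~20.6, MR UCL ~10.5
print(imr_limits(29.13, 3.22))
```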

- Example -
Required As Applicable


37

UNCLASSIFIED / FOUO

Process Capability
Process Capability of Workdays (Calculations Based on Lognormal Distribution Model) - Example

Process Data: LSL 0 | Target * | USL 15 | Sample Mean 44.8136 | Sample N 118 | Location 3.09501 | Scale 1.26378
Overall Capability: Z.Bench -0.31 | Z.LSL 3.07 | Z.USL -0.02 | Ppk -0.01
Observed Performance: % < LSL 0.00 | % > USL 65.25 | % Total 65.25
Exp. Overall Performance: % < LSL 0.00 | % > USL 62.03 | % Total 62.03

 118 data points collected
 Non-normal distribution
 Mean = 44 days
 Lower Cust Spec = 0 days
 Upper Cust Spec = 15 days
 65% of observations outside customer spec
 Z Bench = -.31
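Under the lognormal model, the expected fraction above the USL comes from the fitted Location and Scale parameters (the mean and standard deviation of ln(X)), and Z.Bench is the standard normal quantile of the expected conforming fraction. A minimal sketch reproducing the expected overall performance shown:

```python
from math import log
from statistics import NormalDist

def lognormal_capability(location: float, scale: float, usl: float) -> tuple[float, float]:
    """Expected % above USL and Z.Bench for a lognormal fit (location/scale of ln X)."""
    nd = NormalDist()
    p_above_usl = 1 - nd.cdf((log(usl) - location) / scale)
    z_bench = nd.inv_cdf(1 - p_above_usl)  # no exceedances below the LSL expected here
    return 100 * p_above_usl, z_bench

# Location 3.09501, Scale 1.26378, USL 15 -> ~62% above USL, Z.Bench ~ -0.31
print(lognormal_capability(3.09501, 1.26378, 15.0))
```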

Required Deliverable

38

UNCLASSIFIED / FOUO

Process Constraint ID Analysis


 Takt Rate Analysis compares the task time of each process (or process step) to other steps and to customer demand to determine whether the time trap is the constraint

Takt Time = Net Process Time Available / Number of Units to Process
Takt Rate = Customer Demand Rate = Number of Units to Process / Net Process Time Available

- Example -
[Chart: Value Add Analysis - Current State. Task Time (seconds) for Tasks 1-10, broken into CVA Time, NVA-R Time, and NVA Time, plotted against a Takt Time line of 55 seconds]
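A minimal sketch of the takt time calculation and the constraint check against individual task times; the available time, demand, and task times below are illustrative, with only the 55-second takt time taken from the example chart:

```python
def takt_time(net_available_time: float, units_to_process: int) -> float:
    """Takt time = net process time available / number of units to process."""
    return net_available_time / units_to_process

# Illustrative figures: 27,500 seconds available for 500 units -> 55 s takt time.
takt = takt_time(27_500, 500)

# Task times (seconds) for tasks 1-10; any task slower than takt is a candidate constraint.
task_times = [40, 55, 30, 70, 45, 60, 25, 50, 35, 65]
constraints = [i + 1 for i, t in enumerate(task_times) if t > takt]
print(f"Takt time = {takt:.0f} s, constraint candidates: tasks {constraints}")
```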
39
BB Optional Deliverable
UNCLASSIFIED / FOUO

Pareto Plot Analysis


Pareto Chart - Example

Defect | South | North | East | Others
Count | 100 | 50 | 15 | 6
Percent | 58.5 | 29.2 | 8.8 | 3.5
Cum % | 58.5 | 87.7 | 96.5 | 100.0

The South and North contain over 80% of the defects. Our
project will focus here and not on the East and West.
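The Percent and Cum % rows follow directly from the counts. A minimal sketch using the counts from the example chart:

```python
# Defect counts by region from the example Pareto chart.
counts = {"South": 100, "North": 50, "East": 15, "Others": 6}

total = sum(counts.values())
cumulative = 0.0
for region, count in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    percent = 100 * count / total
    cumulative += percent
    print(f"{region:7s} count={count:4d}  percent={percent:5.1f}  cum%={cumulative:6.1f}")
```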
40
Optional Deliverable
UNCLASSIFIED / FOUO

Cause & Effect Diagram (Fishbone)


- Example -

Effect (Y): PLT = 5 days (too long)

Cause categories: Materials | Manpower | Facilities & Equipment | Methods | Mother Nature | Measurements

Causes identified on the diagram include: Wrong Location; Space; Lack of Seats; No Standardization of Seats; Lack of Knowledge; Old Buildings; Inequality in Seats; New Codes; Lack of Funds; Lack of Controls; Not Suited for Current Mission (Type of Space); Senior Leader “Dedicated” to Task; No Suitable Space to Assign; Getting Seats Takes Time; Vague Reqmts; Lack of Database; People; Unplanned Programs; Multiple Paths; Facilities Location (Competing for Same Space); Senior Leadership; Delays in Elevating Impasse Issues; Collocation; Approvals Too Long (Time); Mold; HVAC Crashes; CAO/IPT Time Avail; Unforeseen Circumstances; Funding; Decision to Wait; Competency vs. PMA

41
Required Deliverable
UNCLASSIFIED / FOUO

XY Matrix (Root Cause Analysis)

[Blank XY Matrix template: a "Rating of Importance to Customer" row over output Requirements 1-15, Process Step / Process Input rows 1-20, a weighted Total for each input (all zeros until populated), and Lower Spec / Target / Upper Spec rows for each requirement]

This table provides the initial input to the FMEA. When an output variable (requirement) is not correct, that represents a potential "Effect"; when an input variable is not correct, that represents a "Failure Mode".

1. List the Key Process Output Variables
2. Rate each output variable on a 1-to-10 scale for importance to the customer
3. List the Key Process Input Variables
4. Rate each input variable's relationship to each output variable on a 1-to-10 scale
5. Select the top input variables to start the FMEA process; determine how each selected input variable can "go wrong" and place that in the Failure Mode column of the FMEA.

42
Required Deliverable
UNCLASSIFIED / FOUO

Hypothesis Test Summary


Hypothesis Test (ANOVA, 1- or 2-sample t-test, Chi-Squared, Regression, Test of Equal Variance, etc.) | Factor (x) Tested | p Value | Observations/Conclusion
Example: ANOVA | Location | 0.030 | Significant factor - 1 hour driving time from DC to Baltimore office causes ticket cycle time to generally be longer for the Baltimore site
Example: ANOVA | Part vs. No Part | 0.004 | Significant factor - on average, calls requiring parts have double the cycle time (22 vs 43 hours)
Example: Chi Squared | Department | 0.000 | Significant factor - Department 4 has digitized addition of customer info to ticket and less human intervention, resulting in fewer errors
Example: Pareto | Region | n/a | South region accounted for 59% of the defects due to their manual process and distance from the parts warehouse

- Example -
Optional BB Deliverable


Describe any other observations about the root cause (x) data

43

UNCLASSIFIED / FOUO

One-Way ANOVA: Root Cause Verification


- Example -

[Boxplot: Part / No Part Impact on Ticket Cycle Time - Net Hours Call Open by Part/No Part, means indicated by solid circles]

 After further investigation, possible reasons proposed by the team are OEM backorders, lack of technician certifications, and the distance from the OEM to the client site. It is also caused by the need for technicians to make a second visit to the end user to complete the part replacement. The next step will be for the team to confirm these suspected root causes.

 Because the p-value <= 0.05, we can be confident that calls requiring parts do have an impact on the ticket cycle time.

Analysis of Variance for Net Hour
Source    DF     SS    MS     F      P
Part/No    1   7421  7421  8.65  0.004
Error     69  59194   858
Total     70  66615

Level      N   Mean  StDev
No Part   27  21.99  19.95
Part      44  43.05  33.70
Pooled StDev = 29.29 (individual 95% CIs for the means shown on the original output)
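A one-way ANOVA like the one above can be reproduced with scipy.stats.f_oneway. A minimal sketch, assuming the ticket cycle times are split into "Part" and "No Part" groups (random placeholder samples are used below):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Placeholder samples standing in for the 27 "No Part" and 44 "Part" ticket cycle times.
no_part = rng.normal(loc=22, scale=20, size=27).clip(min=0.5)
part = rng.normal(loc=43, scale=34, size=44).clip(min=0.5)

f_stat, p_value = stats.f_oneway(no_part, part)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p <= 0.05 -> parts are a significant factor
```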
44
Optional BB Deliverable
UNCLASSIFIED / FOUO

Linear Regression
 94.1% of the variation in “Wait Time” is explained by the “Qty of Deliveries” (R-Sq = 94.1%)

Fitted Line Plot - Example
Wait Time = 32.05 + 0.5825 Deliveries
S = 1.11885 | R-Sq = 94.1% | R-Sq(adj) = 93.9%
[Scatter plot of Wait Time (approx. 35-55) versus Deliveries (approx. 10-35) with the fitted regression line]
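A fitted line of this form can be produced with scipy.stats.linregress. A minimal sketch, assuming the delivery counts and wait times are available as arrays (random placeholder data, generated to be roughly consistent with the fitted equation above):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Placeholder data roughly consistent with the fitted line above.
deliveries = rng.uniform(10, 35, size=40)
wait_time = 32.05 + 0.5825 * deliveries + rng.normal(0, 1.1, size=40)

fit = stats.linregress(deliveries, wait_time)
print(f"Wait Time = {fit.intercept:.2f} + {fit.slope:.4f} * Deliveries, R-Sq = {fit.rvalue**2:.1%}")
```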

45
Optional BB Deliverable
UNCLASSIFIED / FOUO
